In 1975, a twenty-five-year-old engineer at Kodak named Steven Sasson built the first digital camera. It was the size of a toaster, weighed eight pounds, took twenty-three seconds to capture a single image, and stored that image on a cassette tape at a resolution of 0.01 megapixels. Terrible by any standard. Also, by any standard, the future.
Sasson brought it to his bosses. Showed them the prototype, walked them through the technology, the trajectory, the implications. He even calculated the timeline: based on Moore's Law and the rate of sensor improvement, digital image quality would match film quality within fifteen to twenty years. That put the crossover point somewhere between 1990 and 1995.
Kodak's leadership looked at the prototype, looked at the projections, and made a decision that would cost the company its existence. They buried it. Not because they didn't understand the technology. Sasson later recalled that the executives grasped the engineering immediately. They buried it because the conclusion threatened everything they believed about their business. Confirmation bias, the brain's tendency to seek information that supports existing beliefs and filter out information that contradicts them, had already decided what the data meant before anyone finished reviewing it. Kodak made its money on film, on chemical processing, on the physical prints that customers picked up at drugstores. Digital photography didn't just represent a new product line. It represented the elimination of the old one. The brain that built a $10 billion empire on film chemistry could not process data that predicted film's death, so it filtered the data instead of updating the model.
Over the next two decades, Kodak watched the exact timeline Sasson had predicted unfold, year by year, just as he'd said it would. Digital cameras improved. Film sales peaked around 2000 and began declining. By 2003, digital camera sales had overtaken film camera sales globally. And Kodak, the company that invented the technology, filed for bankruptcy in January 2012. The prediction was right. The company that heard it first died anyway. Confirmation bias didn't make Kodak's leaders stupid. It made them deaf to information their brains had already decided was wrong.
What Is Confirmation Bias?
In 1960, a British psychologist named Peter Wason designed an experiment so simple it almost didn't seem worth running. He gave participants a sequence of three numbers: 2, 4, 6. Their job was to figure out the underlying rule. They could propose their own sequences of three numbers, and Wason would tell them whether each sequence fit the rule or not. When they felt confident, they could announce the rule.
Almost every participant came up with the same hypothesis: the rule is "ascending even numbers." Then they tested it the way human brains are designed to test hypotheses, which is to say they tested it in the worst possible way. They proposed 8, 10, 12. Wason said yes, it fits. They proposed 20, 22, 24. Yes. They proposed 100, 102, 104. Yes.
Confident now, they announced the rule: ascending even numbers.
They were wrong. The actual rule was simply "any three ascending numbers." The sequence 1, 2, 3 would have fit. So would 5, 97, 1000. But almost nobody tried a sequence like that, because trying a sequence that might disprove your hypothesis feels wrong. It feels unproductive. The brain doesn't want to waste a guess on something that might fail. So it spends all its guesses on confirmations, which tell you nothing you didn't already believe.
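Wason's setup is small enough to simulate. The sketch below is purely illustrative (the function names are invented, not from the study): it encodes the hidden rule and the typical participant hypothesis as predicates, and shows why positive tests can never distinguish them while a single disconfirming probe can.

```python
# Illustrative sketch of Wason's 2-4-6 task (names are hypothetical).
# Both the hidden rule and the guess are predicates on a triple of numbers.

def hidden_rule(a, b, c):
    # The actual rule: any three ascending numbers.
    return a < b < c

def participant_hypothesis(a, b, c):
    # The typical guess: ascending even numbers.
    return a < b < c and a % 2 == b % 2 == c % 2 == 0

# Positive tests: sequences chosen BECAUSE they fit the hypothesis.
positive_tests = [(8, 10, 12), (20, 22, 24), (100, 102, 104)]

# Every positive test satisfies both rules, so it carries zero
# information about which rule is actually in force.
for seq in positive_tests:
    assert hidden_rule(*seq) and participant_hypothesis(*seq)

# A disconfirming probe: a sequence the hypothesis predicts should fail.
# It fits the hidden rule anyway, so the hypothesis is falsified.
probe = (1, 2, 3)
print(hidden_rule(*probe))             # True: fits the real rule
print(participant_hypothesis(*probe))  # False: the guess rejects it
```

One disconfirming probe does what any number of confirmations cannot: it separates the two rules.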
Wason's experiment demonstrated what later researchers named confirmation bias, and it has since become one of the most studied and most replicated findings in all of cognitive psychology. Sixty-five years of research, hundreds of studies, and the core finding has never been overturned. When human beings form a belief, they preferentially seek information that supports it and avoid, discount, or fail to notice information that contradicts it. Not occasionally. As a default mode of cognition.
The part that matters for your business: confirmation bias doesn't feel like bias. It feels like due diligence. The Kodak executives who dismissed Sasson's digital camera weren't sitting in a boardroom thinking "let's ignore the evidence." They were thinking "we've looked at this carefully and it doesn't change our strategy." The looking was the problem. They looked only at data that confirmed what they already knew, and since Kodak knew film, the data they noticed was always about film.
How Confirmation Bias Works in the Brain
In 2006, a psychologist named Drew Westen at Emory University published what happened when he put thirty men in an fMRI scanner during the 2004 presidential race between George W. Bush and John Kerry. Fifteen were committed Democrats. Fifteen were committed Republicans. Each man was shown a set of statements by both candidates that were clearly contradictory, situations where the candidate they supported had obviously flip-flopped or said something inconsistent.
Westen wanted to answer one question: what does the brain do when it encounters information that threatens a deeply held belief?
The scans made it obvious. When the partisans evaluated contradictions from their own candidate, the reasoning centers of the brain went quiet. The dorsolateral prefrontal cortex, the region associated with conscious, deliberate evaluation, showed no increased activation. The brain wasn't thinking through the contradiction. It was managing the emotional threat the contradiction represented.
What did activate was a different circuit entirely: the ventromedial prefrontal cortex, the lateral orbital cortex, the anterior cingulate, and the posterior cingulate, regions associated with emotion processing and conflict resolution. And once the brain found a way to dismiss the threatening information, the ventral striatum lit up, the same reward circuit that fires when you eat chocolate or win a bet.
The partisans weren't just failing to reason objectively. Their brains were paying them not to. Confirming what you already believe feels good, neurochemically good, a hit of the same reward chemistry that fires when you close a deal. Disconfirming information registers as threat. And the brain's response to threat is never "let me think about this more carefully." It's "let me make this go away."
Westen published the results in the Journal of Cognitive Neuroscience. His summary was blunt: "We did not see any increased activation of the parts of the brain normally engaged during reasoning." Instead, he wrote, "it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it, with the elimination of negative emotional states and activation of positive ones."
That machinery is running underneath every strategic decision your company makes. Every product review meeting, every customer feedback session, every competitive analysis. The brain that formed the hypothesis is the same brain evaluating the evidence, and it is structurally biased toward confirming what it already believes.
Confirmation Bias Examples in Business
Kodak isn't an outlier. The pattern repeats across industries, decades, and company sizes.
Blockbuster and Netflix. In 2000, Reed Hastings flew to Dallas to propose a partnership with Blockbuster. Netflix would run Blockbuster's online brand; Blockbuster would promote Netflix in its stores. John Antioco, Blockbuster's CEO, reportedly laughed at the proposal. Netflix was losing money. It had 300,000 subscribers. Blockbuster had nearly 8,000 stores and $5 billion in annual revenue. The data that confirmed Blockbuster's dominance was everywhere: store traffic, franchise revenue, late-fee income that alone exceeded $800 million per year. The data that predicted disruption (growing broadband adoption, declining physical media sales, Netflix's accelerating subscriber growth) was available but contradicted the model. Blockbuster filed for bankruptcy in 2010.
Eli Lilly and the Zyprexa crisis. Throughout the early 2000s, Eli Lilly received increasing reports that its blockbuster antipsychotic drug Zyprexa was associated with significant weight gain and diabetes risk. Internal documents later revealed in litigation showed that the company had data suggesting the metabolic risks were higher than publicly acknowledged. Eli Lilly's internal narrative was built around Zyprexa as a breakthrough medication, which it was. But that narrative created a confirmation filter: data supporting efficacy received full attention, while data about side effects was interpreted through the lens of the drug's therapeutic value. The company eventually paid over $1.4 billion in settlements related to its marketing of Zyprexa, one of the largest pharmaceutical settlements in history at the time.
The small-scale version. You don't need to be Kodak to run this machinery. A founder who surveyed fifty potential customers about their product idea is already deep in confirmation bias if they led with "Would you use a product that does X?" instead of "What's the biggest problem you face in Y?" The first question generates agreement. The second generates information. Most founders ask the first one, because the first one feels more productive, and it does feel productive, right up until the product launches and nobody buys it.
And it connects to every validation mistake covered in other frameworks. The mom test problem, where friends and family tell you your idea is great, is confirmation bias dressed in social clothing. The sunk cost fallacy, where you can't abandon a failing project because of what you've already invested, is confirmation bias protecting a prior belief: "this was a good decision." Product-market fit surveys that ask leading questions are confirmation bias baked into the methodology.
How Confirmation Bias Hijacks Your Strategy
The dangerous version of confirmation bias isn't the dramatic kind, the Kodak kind, where the contradicting evidence is obvious in hindsight. It's the quiet kind, where the bias is invisible because it operates through selection rather than rejection.
You don't reject contradicting data. You just never encounter it, because the questions you ask, the people you talk to, the metrics you track, and the meetings you attend are all structured around the hypothesis you've already formed. It's the same dynamic that makes teams filter out dissent rather than surface it.
A founder who believes their product's main value proposition is convenience will track convenience-related metrics: time saved, steps eliminated, ease-of-use scores. If the real value proposition is something else entirely, say status or identity or community, they'll never see it, because they're not measuring it. They're not ignoring the signal. They're standing in a room where the signal can't reach them.
Confirmation bias sits upstream of every other cognitive bias. Loss aversion makes you flinch at the wrong moment. The endowment effect makes you overvalue what you have. Overconfidence prevents you from recognizing the gap between what you believe and what's actually true. But confirmation bias determines which information reaches your decision-making process in the first place. By the time you're weighing options, confirmation bias has already filtered out the options that would have changed your mind.
Try This: The Disconfirmation Protocol
The goal here is simple: force your brain to seek evidence against its own hypothesis.
- Write down your strongest current belief about your business. Not your mission statement. Your operating assumption. "Our customers care most about price." "Our product is better than the competition." "This feature is what's driving retention." Whatever you'd bet money on if someone challenged you over dinner. That's the belief to test.
- Ask one question: "What would I expect to see if this belief were wrong?" If your customers don't actually care most about price, what behavior would you observe? If your product isn't better, where would the evidence show up? Write down three specific, observable indicators that would exist in a world where your belief is false.
- Go look for those indicators. Not casually. Actively. Talk to the five customers most likely to disagree with you, the ones who almost churned, the ones who use one feature and ignore the rest, the ones who switched from a competitor you thought was inferior. Read the negative reviews. Pull the metrics you've never tracked because they weren't related to your hypothesis.
- Set a pre-commitment threshold. This is the same logic behind the pre-mortem technique used in high-stakes decisions. Before you look at the data, decide what would change your mind. "If three out of five churned customers say the same thing, I'll revisit the strategy." "If the conversion rate on the alternative value proposition exceeds X, I'll run a formal test." Without the pre-commitment, the brain will find a way to explain away whatever it finds. That's what brains do. The pre-commitment is the only thing that overrides the machinery.
- Schedule this quarterly. Confirmation bias isn't a one-time error. It's a continuous process. The brain rebuilds the filter every day. One disconfirmation exercise won't fix it permanently, the same way one workout won't keep you fit for a year. The protocol works by repetition, not by revelation.
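The pre-commitment step can be made mechanical. The sketch below is a hypothetical illustration (the class, belief text, and threshold are invented, not from the protocol above): the decision rule is frozen before any evidence is examined, so it can't be bent after the interviews come back.

```python
# Hypothetical sketch of a pre-commitment threshold (names invented).
# Key property: the threshold is recorded BEFORE the evidence is seen,
# and the record is immutable, so the rule can't be edited afterward.

from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: the commitment cannot be modified later
class PreCommitment:
    belief: str
    disconfirming_signal: str
    threshold: int  # how many matching signals force a strategy revisit

    def decide(self, matching_signals: int) -> str:
        if matching_signals >= self.threshold:
            return "revisit strategy"
        return "belief survives this round"

# Written down in advance, before any interviews happen.
commitment = PreCommitment(
    belief="Our customers care most about price",
    disconfirming_signal="churned customer cites a non-price reason",
    threshold=3,
)

# Later: five churned-customer interviews, four cited a non-price reason.
print(commitment.decide(matching_signals=4))  # -> revisit strategy
print(commitment.decide(matching_signals=2))  # -> belief survives this round
```

The frozen dataclass is the point of the design: the threshold lives in a record the evaluating brain can read but not quietly rewrite once the data arrives.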
Confirmation bias survives every debunking and every training program because the bias itself prevents you from noticing it. It's the one cognitive error that defends itself. The brain has already decided what's true, and it rewards you for finding evidence that agrees. Fighting it requires something the brain doesn't do voluntarily: seeking out the evidence that would prove you wrong, before you need it.
Chapter 1 of Wired explains the machinery underneath confirmation bias, the prediction engine that runs every perception, every decision, and every belief you hold. It's the same system that made Kodak's executives unable to process what their own engineer was showing them, and it's the same system that made a group of Emory University partisans feel pleasure when their brains successfully dismissed contradictory facts. The chapter covers a case that's more unsettling than either: a woman who identified a man in a police lineup with absolute certainty, testified against him in court, and helped send him to prison for eleven years for a crime he didn't commit. Her brain had built a prediction, and once it locked in, no amount of contradicting evidence could unlock it. The part that will change how you think about your own certainty is on page 12.
FAQ
What is confirmation bias? Confirmation bias is the brain's tendency to seek, interpret, and remember information that confirms existing beliefs while ignoring or discounting information that contradicts them. First demonstrated by psychologist Peter Wason in 1960, it has become one of the most replicated findings in cognitive psychology. In a business context, it means founders and executives systematically filter out data that challenges their strategy while overweighting data that supports it.
What are common confirmation bias examples in business? Kodak invented the digital camera in 1975 but buried the technology because it contradicted their film-based business model. They filed for bankruptcy in 2012. Blockbuster dismissed Netflix's partnership proposal in 2000 because their $5 billion revenue confirmed the strength of their physical rental model. They filed for bankruptcy in 2010. At a smaller scale, founders who survey customers with leading questions like "Would you use a product that does X?" are running confirmation bias through their validation process.
What does confirmation bias look like in the brain? Drew Westen's 2006 fMRI study at Emory University showed that when people evaluate contradictory information about a candidate they support, the brain's reasoning centers go quiet. Instead, emotion-processing regions activate to manage the threat, and the reward circuit fires once the contradiction is dismissed. The brain doesn't just fail to reason objectively about confirming beliefs. It actively rewards you for not reasoning objectively.
How can you overcome confirmation bias in business decisions? The most effective technique is a structured disconfirmation protocol: explicitly identify your strongest business assumption, ask what observable evidence would exist if that assumption were wrong, then actively seek that evidence. Setting a pre-commitment threshold before examining the data (deciding in advance what would change your mind) prevents the brain from explaining away whatever it finds. This needs to happen quarterly, not once, because the bias rebuilds continuously.
Works Cited
- Sasson, Steven J. "We Had No Idea." Plugged In (Kodak blog), 2007. Cited in Estrin, James. "Kodak's First Digital Moment." The New York Times, August 12, 2015. https://lens.blogs.nytimes.com/2015/08/12/kodaks-first-digital-moment/
- Wason, P. C. (1960). "On the Failure to Eliminate Hypotheses in a Conceptual Task." Quarterly Journal of Experimental Psychology, 12(3), 129–140.
- Westen, D., Blagov, P. S., Harenski, K., Kilts, C., & Hamann, S. (2006). "Neural Bases of Motivated Reasoning: An fMRI Study of Emotional Constraints on Partisan Political Judgment in the 2004 U.S. Presidential Election." Journal of Cognitive Neuroscience, 18(11), 1947–1958.
- Keating, Giles. "The Rise and Fall of Kodak." The Economist, January 14, 2012.
- "Blockbuster (entertainment)." Wikipedia. https://en.wikipedia.org/wiki/Blockbuster_(entertainment)
- Berenson, Alex. "Eli Lilly Said to Play Down Risk of Top Pill." The New York Times, December 17, 2006. https://www.nytimes.com/2006/12/17/business/17drug.html
- "Eli Lilly and Company Agrees to Pay $1.415 Billion." U.S. Department of Justice, January 15, 2009. https://www.justice.gov/archive/opa/pr/2009/January/09-civ-038.html
- Nickerson, R. S. (1998). "Confirmation Bias: A Ubiquitous Phenomenon in Many Guises." Review of General Psychology, 2(2), 175–220.