In the spring of 1943, a group of military officers walked into a nondescript building at 401 West 118th Street in Morningside Heights, Manhattan, carrying a problem they couldn't solve. The building housed the Statistical Research Group, a classified wartime program that had assembled some of the most brilliant mathematicians in America. Milton Friedman was there. So were Leonard Jimmie Savage, Frederick Mosteller, and Norbert Wiener. The program operated like a quieter, less photogenic Manhattan Project, except the weapons being developed were equations.
The officers brought data. American bombers were getting shredded over Europe, and the military needed to add armor plating to protect them. But armor is heavy. Too much and the planes can't fly far enough or fast enough. The question was where to put it. So they'd done the logical thing: they examined the bombers that came back from missions and mapped every bullet hole. The data showed a clear pattern. The fuselage was riddled, nearly two bullet holes per square foot on average. The fuel system was chewed up almost as badly. The engines, though, showed barely more than one hit per square foot. The officers' conclusion seemed obvious. Armor the fuselage and the fuel system. That's where the bullets are landing.
They presented this analysis to a forty-year-old Hungarian mathematician named Abraham Wald. Wald had been born in Cluj (then Austria-Hungary, now Romania) in 1902, the grandson of a rabbi and the son of observant Jewish parents who homeschooled him rather than send him to classes on Saturdays. He'd earned a PhD in mathematics, but antisemitic laws barred him from academic positions in Austria. When the Nazis annexed the country in 1938, economist Oskar Morgenstern helped Wald flee to the United States. He ended up at Columbia, technically classified as an "enemy alien" and barred from reading classified documents, yet functioning as the group's most formidable mathematical mind.
Wald looked at the bullet-hole data and said something the officers hadn't considered. You're looking at the wrong planes. The holes in the returning bombers didn't represent the most dangerous places to get hit. They represented the places a plane could get hit and still make it home. The bombers with engine damage weren't in the data set. They were scattered across the fields of Europe, in pieces. The military was studying the survivors and mistaking their wounds for the whole story. Armor the engines, Wald said. Armor the places where the returning planes aren't hit, because those are the shots that are killing the planes you never see again.
The military implemented his recommendation. The insight saved an unknown number of aircraft and crews through the rest of World War II, through Korea, through Vietnam. And it gave a name to one of the most dangerous thinking errors in business, investing, and life: survivorship bias, the systematic mistake of drawing conclusions from winners while ignoring the silent, invisible evidence of everyone who did the same things and failed.
The Invisible Graveyard of Startups
Every year, somewhere around 500,000 new businesses launch in the United States alone. Roughly ninety percent of startups fail. That means for every success story that lands on a podcast or in a bestselling business book, there are nine graveyards' worth of companies that did many of the same things and got nothing but debt, divorce, and a domain name nobody will ever type again.
Yet the advice industry runs almost entirely on survivor testimony. A founder who built a billion-dollar company gets interviewed, and the interview always follows the same structure. What did you do? What habits did you have? What was your morning routine? The implicit promise is: do these things and you'll get these results. The problem is that nobody interviews the other ninety percent. Nobody asks the founder who also woke up at 5 AM, also moved fast and broke things, also pivoted three times, and still ended up shuttering the company. Their data doesn't exist in the public record. Their evidence is silent.
Nassim Nicholas Taleb, the former options trader and author of The Black Swan, gave this phenomenon a name that cuts deeper than survivorship bias alone. He called it "silent evidence." In the book, he retells an ancient story about the Greek philosopher Diagoras of Melos. Someone showed Diagoras a set of painted portraits hanging in a temple, depicting worshippers who had prayed to the gods and survived a shipwreck. The paintings were offered as proof that prayer worked. Diagoras asked the question that no one in the temple had thought to ask: "Where are the portraits of those who prayed and drowned?"
That question is the knife that cuts through every startup success narrative ever published. Where are the portraits of the founders who followed the same advice and drowned?
Consider Y Combinator, widely regarded as the most selective and successful startup accelerator in the world. The acceptance rate hovers around 1 to 2 percent. Even among this hyper-selected group, roughly 40 percent of companies from the early batches are now inactive. If the most carefully vetted startups in the world fail at 40 percent, what does that tell you about the advice extracted from the ones that survived? It tells you that the advice is contaminated. The signal you're getting from survivors is tangled up with luck, timing, market conditions, and a thousand variables that the founder herself may not be aware of, much less able to articulate in a thirty-minute podcast interview.
The "drop out of college like Steve Jobs" narrative is the clearest case study. Jobs dropped out of Reed College. Bill Gates left Harvard. Mark Zuckerberg did the same. The story writes itself: formal education is optional, maybe even a hindrance. Except the story only writes itself because you never hear from the millions of college dropouts who didn't build Apple, Microsoft, or Facebook. Bureau of Labor Statistics data consistently shows that college graduates earn significantly more over a lifetime than those without degrees, and that dropouts face 70 percent higher unemployment rates. Jobs, Gates, and Zuckerberg aren't the rule that proves education is unnecessary. They're the exceptions so luminous they blind you to the rule itself.
Why Does Your Brain Fall for This?
The question isn't really whether survivorship bias exists. Anyone who thinks about it for thirty seconds can see the logic. The real question is why we keep falling for it even after we know about it. The answer lives in the intersection of two cognitive mechanisms that psychologists Daniel Kahneman and Amos Tversky spent their careers mapping.
The first is the availability heuristic. In a landmark 1973 paper published in Cognitive Psychology, Tversky and Kahneman demonstrated that people estimate the frequency of events based on how easily examples come to mind, not on actual statistical frequency. In one elegant experiment, they read participants lists of names containing either nineteen famous women and twenty less-famous men, or nineteen famous men and twenty less-famous women. When asked which gender appeared more often, the majority got it wrong. They picked the gender whose names were more famous, because those names were more available in memory. Fame made the names easier to recall, and easier recall was mistaken for greater frequency.
This is exactly what happens with startup advice. You can name five college-dropout billionaires off the top of your head. Can you name five college-dropout failures? Of course not. Their stories never made it into your memory in the first place. The availability heuristic doesn't just make you overestimate the success rate of unconventional paths. It makes the failures literally invisible to your reasoning process. You're not ignoring the evidence. The evidence was never available to ignore.
The second mechanism is what Taleb calls the narrative fallacy: the human brain's compulsive need to take a sequence of events and weave them into a story with causes and effects. Neuroscientific research confirms that the brain actively seeks patterns and causation to make sense of incoming information, a hardwired preference that leads us to favor vivid stories over statistical data even when the statistics are more accurate. When you hear that Steve Jobs dropped out of college, audited a calligraphy class, and later built a computer company famous for its typography, your brain can't help constructing a causal chain. The dropout led to the calligraphy, which led to the Macintosh, which led to the empire. The narrative is irresistible. It has the structure of a fable, complete with moral.
But strip the narrative away and look at what's left. A person dropped out of college. Many years later, a company was built. Between those two events lies an ocean of contingency, luck, privileged access to technology and mentors, and economic conditions that had nothing to do with the dropout decision. The narrative fallacy fills that ocean with causation. Survivorship bias makes sure you never learn about the thousands of people who had a similar ocean and sank in it.
Here's the napkin version: your brain overweights what it can easily remember and underweights what it never learned. That's survivorship bias in a sentence.
How Survivorship Bias Kills Scaling Companies
Survivorship bias doesn't just infect the advice you read. It infects the decisions you make inside your own company, especially as you try to scale.
The most common version looks like this. You study your best customers, the ones who renewed, who upgraded, who referred three friends, and you build your product roadmap around their feedback. This feels like good practice. Listen to your happiest users. But you're making the same mistake the military officers made in 1943. You're studying the bombers that came back. The customers who churned, who signed up and never logged in again, who used the product for a week and quietly disappeared, are your missing planes. Their reasons for leaving are the bullet holes you can't see.
A/B testing can fall into the same trap. You run ten experiments. Three show statistically significant lifts. You implement those three and call it a win. But what about the seven that "failed"? Those failed tests contain information about what your users don't want, what confused them, what made them bounce. Ignoring failed experiments is ignoring crashed bombers. The winners in your testing program are not a representative sample of your users' psychology. They're the survivors of a process that selected for a very specific outcome.
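To feel how strong this filter is, here is a minimal simulation sketch in Python. It assumes a hypothetical testing program in which every experiment has the same small true lift, and only experiments whose measured lift clears an arbitrary threshold get called winners; every number in it is invented for illustration, not drawn from any real program.

```python
import random
import statistics

# Hypothetical testing program: every experiment has the SAME small true lift,
# but measurement noise decides which ones clear the "winner" bar.
random.seed(42)

TRUE_LIFT = 0.01       # real effect of every variant: +1% conversion (assumed)
NOISE_SD = 0.02        # run-to-run measurement noise (assumed)
WIN_THRESHOLD = 0.03   # measured lift needed to be declared a winner (assumed)
N_TESTS = 10_000

measured = [random.gauss(TRUE_LIFT, NOISE_SD) for _ in range(N_TESTS)]
winners = [m for m in measured if m >= WIN_THRESHOLD]

print(f"True lift in every test:        {TRUE_LIFT:.1%}")
print(f"Average lift across all tests:  {statistics.mean(measured):.1%}")
print(f"Average lift among 'winners':   {statistics.mean(winners):.1%}")
print(f"Tests declared winners:         {len(winners) / N_TESTS:.1%}")
```

Under these made-up numbers, the surviving "winners" report several times the lift that actually exists, even though every experiment was identical underneath. The gap is pure selection.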
The mutual fund industry has turned survivorship bias into an art form. Funds that perform poorly are quietly merged into better-performing funds or shut down entirely. The historical record of the surviving funds looks spectacular, because the failures have been erased from the data set. Academic research has estimated that survivorship bias inflates the average reported return of US equity mutual funds by nearly 1 percent per year. Over a decade, that phantom performance gap means the difference between a fund that looks like it beat the market and one that clearly didn't.
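To put a rough number on that decade, here is a back-of-the-envelope sketch. The 7 percent baseline return is an assumption chosen purely for illustration; only the roughly 1-percent-per-year gap comes from the survivorship research cited above.

```python
# Back-of-the-envelope: compound a ~1%/year survivorship premium over a decade.
# The 7% baseline is an assumed illustrative return, not a market figure.
true_growth = 1.07 ** 10       # what the full fund population actually delivered
survivor_growth = 1.08 ** 10   # the same record once the dead funds are erased

print(f"True 10-year growth of $1:     ${true_growth:.2f}")
print(f"Survivor-only 10-year growth:  ${survivor_growth:.2f}")
print(f"Phantom performance:           {survivor_growth / true_growth - 1:.1%}")
```

Roughly ten percent of the apparent decade-long gain never happened to any investor; it belongs to funds that were merged away or shut down.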
If you're making hiring decisions based on the traits of your star performers without studying why your failed hires didn't work out, you're running on survivorship bias. If you're studying your competitors' successful product launches without looking at the launches that flopped, you're reading the bullet holes on the fuselage and ignoring the engines. If your confirmation bias is making you cherry-pick data that supports your hypothesis, survivorship bias is the mechanism making sure the contradicting data never reaches your desk in the first place.
What Would Abraham Wald Do With Your Data?
Wald's genius wasn't mathematical complexity. It was the discipline to ask a simple question that nobody else was asking: what am I not seeing? The hardest part of survivorship bias isn't understanding it. It's building the habit of looking for the missing planes.
In your business, the missing planes are everywhere. They're the leads who visited your pricing page and left without converting. They're the users who completed onboarding and never came back. They're the job candidates who turned down your offer. They're the product ideas your team killed before they reached the customer. Each one of these is a data point from a crashed bomber, and each one contains information about where your real vulnerabilities are.
The Dunning-Kruger effect amplifies this problem. The less you know about why customers leave, the more confident you feel about why customers stay. That confidence feels like strategic clarity. It's actually the same blind spot the military officers had in 1943: a confident interpretation of incomplete data.
First principles thinking is the antidote. Instead of studying what successful companies did and reverse-engineering a playbook, start from the raw constraints of your specific market, your specific customer, your specific problem. Wald didn't look at what other statisticians recommended about armor. He went back to first principles: the data set is biased because it only contains survivors. What conclusions can I draw given that bias? That reframe changed everything.
The second napkin line for this post: the most important data in your business is the data you don't have.
Try This: The Survivorship Bias Audit
- Run an exit interview sprint. For the next thirty days, contact every customer who cancels, every lead who goes cold, and every user who signed up but never activated. Ask three open-ended questions: What were you hoping this product would do? Where did it fall short? What did you switch to instead? You're reverse-engineering the crashed bombers. The answers will almost certainly contradict the story your active users are telling you, and the contradictions are where your actual product gaps live.
- Conduct a failure autopsy on your last three strategic decisions that didn't pan out. Not the ones that succeeded. The ones you quietly shelved, pivoted away from, or stopped talking about in team meetings. Document what you thought would happen, what actually happened, and what variables you underweighted. Most companies have a process for celebrating wins. Almost none have a process for studying losses. Build one.
- Reverse your competitive analysis. Take the three competitors you admire most and research their failures instead of their successes. Every company you're benchmarking against has shut down products, lost major customers, and made public bets that flopped. Those failures are the engine damage the company survived. Study them with the same intensity you study the company's wins and you'll learn more about the actual dynamics of your market than any success narrative can teach you.
- Stress-test your hiring model. Pull the profiles of your last ten hires who didn't work out, whether they were fired, quit within six months, or underperformed. Compare their interview scores, backgrounds, and attributes to those of your top performers. If the two groups look surprisingly similar, your hiring criteria are selecting for fuselage traits: things that correlate with getting the job, but not necessarily with performing well in it. (A minimal comparison sketch follows this list.)
- Before you follow any piece of startup advice, ask the Diagoras question: where are the founders who did this exact thing and failed? If you can't find them, it's not because they don't exist. It's because survivorship bias erased them from the record. The absence of counter-examples is not evidence that none exist.
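For the hiring stress-test above, here is a minimal sketch of the comparison, using entirely made-up records and field names (interview_score, years_experience, referral); swap in whatever attributes your own hiring data actually tracks.

```python
from statistics import mean

# Made-up hire records for illustration only; replace with your real data.
hires = [
    {"outcome": "top",    "interview_score": 4.5, "years_experience": 8, "referral": 1},
    {"outcome": "top",    "interview_score": 4.2, "years_experience": 5, "referral": 0},
    {"outcome": "top",    "interview_score": 4.6, "years_experience": 7, "referral": 1},
    {"outcome": "failed", "interview_score": 4.4, "years_experience": 6, "referral": 1},
    {"outcome": "failed", "interview_score": 4.3, "years_experience": 9, "referral": 0},
    {"outcome": "failed", "interview_score": 4.5, "years_experience": 4, "referral": 1},
]

# Compare the average of each attribute across the two groups.
for attr in ("interview_score", "years_experience", "referral"):
    top = mean(h[attr] for h in hires if h["outcome"] == "top")
    failed = mean(h[attr] for h in hires if h["outcome"] == "failed")
    print(f"{attr:>18}: top hires {top:.2f} vs failed hires {failed:.2f}")
```

If the two columns come out nearly identical for every attribute, the traits you screen for predict getting hired, not succeeding once hired.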
Abraham Wald died in a plane crash in southern India in 1950, at the age of forty-eight. The irony is almost too precise. The man who spent his career thinking about the planes that didn't come back was killed in one. He was traveling with his wife Lucille to deliver a series of lectures. Neither survived.
But the thinking framework he left behind outlives him in every field it touches. In medicine, researchers now routinely account for survivorship bias in clinical trials by tracking patients who drop out, not just those who complete treatment. In finance, analysts now rely on survivorship-bias-free databases that retain the records of funds that closed or merged, so the failures can no longer be quietly erased. And in entrepreneurship, the founders who build companies that last are almost always the ones who develop the discipline to study what they can't see, to interview the customers who left, to learn from the experiments that failed, to seek out the evidence that would disconfirm their favorite narrative.
The most dangerous story in business is the one told only by the people who won. It sounds like wisdom. It feels like a roadmap. And it's missing exactly the information you need most. Chapter 7 of What Everyone Missed covers the specific strategic blind spots that kill companies during scaling, the places where survivorship bias, confirmation bias, and the Dunning-Kruger effect converge to create confident leadership teams making decisions on half a data set. If you've ever felt certain about a strategy and turned out to be wrong, that chapter explains how the certainty itself was the warning sign.
FAQ
What is survivorship bias and how does it affect decision-making? Survivorship bias is the cognitive error of drawing conclusions from a data set that only includes successes while ignoring the failures that were filtered out. The canonical illustration traces back to Abraham Wald's work during World War II, when he demonstrated that studying only the bombers that returned from missions gave military planners a dangerously incomplete picture of aircraft vulnerability. In decision-making, survivorship bias causes you to overestimate the likelihood that a particular strategy will work, because you're modeling your approach on the handful of visible winners rather than the vast, invisible majority who tried the same approach and failed. It affects everything from startup strategy to hiring to investment decisions.
Why do people keep falling for survivorship bias even when they know about it? Two mechanisms reinforce the bias even after you learn its name. The first is the availability heuristic, identified by Tversky and Kahneman in 1973, which causes you to estimate the probability of an outcome based on how easily you can recall examples of it. Success stories are published, shared, and discussed. Failures are quiet. So the ease of recalling successes distorts your sense of how common success actually is. The second is the narrative fallacy, described by Nassim Taleb, which is the brain's compulsive tendency to weave events into causal stories. When you hear that a founder dropped out of college and later built a billion-dollar company, your brain constructs a causal chain even though the two events may be only loosely connected. Together, these mechanisms make survivorship bias feel like pattern recognition rather than a statistical error.
How can I identify survivorship bias in my own business decisions? Ask one question about every data set you use: what is missing from this sample and why? If you're studying your best customers, ask what your churned customers would tell you. If you're analyzing your most successful product launches, ask what your shelved or failed launches would reveal. If you're benchmarking against competitors, ask about the competitors who attempted similar strategies and no longer exist. The signature of survivorship bias is a data set that feels clean, compelling, and actionable. Reality is rarely that tidy. When your data tells a neat story, it's worth checking whether the messiness was removed by a selection process you didn't account for.
What is the connection between survivorship bias and confirmation bias? Confirmation bias is the tendency to seek and favor information that supports what you already believe. Survivorship bias is the mechanism that pre-filters your information environment so that confirming evidence is easier to find. A founder who believes that "hustle culture" drives success will find an endless supply of survivor testimonials confirming the belief, because hustle-culture failures don't get interviewed or published. Survivorship bias doesn't just feed confirmation bias. It makes confirmation bias feel rational, because the available evidence genuinely does seem to support the conclusion. The missing evidence, the counter-examples that would challenge the belief, was never available in the first place.
How is survivorship bias related to the availability heuristic? The availability heuristic, first described by Tversky and Kahneman, is the cognitive shortcut where you judge the probability of an event based on how easily you can recall examples. Survivorship bias is one of the forces that determines which examples are available to recall. Successful people write books, give talks, and appear on podcasts. Failed people do not. This creates an information environment where success stories are disproportionately available in memory, which the availability heuristic then converts into an inflated estimate of how likely success is. The two biases don't just coexist. They form a feedback loop: survivorship bias curates the examples, and the availability heuristic mistakes that curated sample for the full population.
Works Cited
- Wald, A. (1943). "A Method of Estimating Plane Vulnerability Based on Damage of Survivors." Statistical Research Group, Columbia University. Center for Naval Analyses CRC 432. https://apps.dtic.mil/sti/citations/ADA091073
- Ellenberg, J. (2014). How Not to Be Wrong: The Power of Mathematical Thinking. Penguin Press.
- Taleb, N. N. (2007). The Black Swan: The Impact of the Highly Improbable. Random House.
- Tversky, A., & Kahneman, D. (1973). "Availability: A Heuristic for Judging Frequency and Probability." Cognitive Psychology, 5(2), 207–232. https://doi.org/10.1016/0010-0285(73)90033-9
- Rohleder, M., Scholz, H., & Wilkens, M. (2011). "Survivorship Bias and Mutual Fund Performance: Relevance, Significance, and Methodical Differences." Review of Finance, 15(2), 441–474. https://doi.org/10.1093/rof/rfq023
- Elton, E. J., Gruber, M. J., & Blake, C. R. (1996). "Survivorship Bias and Mutual Fund Performance." The Review of Financial Studies, 9(4), 1097–1120. https://doi.org/10.1093/rfs/9.4.1097
- Bureau of Labor Statistics. (2023). "Employment Projections: Education Pays." U.S. Department of Labor. https://www.bls.gov/emp/chart-unemployment-earnings-education.htm