During World War II, the United States lost almost 95,000 planes. Worldwide, hundreds of thousands of airmen lost their lives.
It is a truly harrowing statistic, and one that marks a sacrifice that should never be forgotten. But had an unassuming Jewish mathematician named Abraham Wald not sought refuge in the United States, and had he not had the courage to speak up, it could have been much, much worse.
After moving to the United States, Wald became a member of the Statistical Research Group (SRG) at Columbia University during World War II. Along with his colleagues, he would apply his statistical skills to a range of wartime problems. And perhaps the most pressing of these problems was the number of aircraft the United States was losing.
As planes returned, the most-hit areas became clear. The wings, tail and parts of the fuselage were sustaining serious damage. To the US military, the conclusion was simple: bolster the armour in these areas to protect the planes from damage.
But Wald had a different view.
The military, you see, had only considered planes that survived their missions. Planes destroyed by damage were, by definition, excluded from the assessment. To Wald, then, the damage on surviving planes marked the areas where a plane could be hit and still make it home. It followed that the military needed to armour the areas of the surviving planes that showed no damage – areas like the engine, the fuel tank and the propeller. The planes that weren't returning were the ones being hit there.
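Wald's reasoning can be sketched with a toy simulation (purely illustrative numbers and areas, not historical data): if hits land uniformly across a plane but hits to one area are fatal, the planes that return will show damage everywhere except that area.

```python
import random

random.seed(1)

# Toy model of conditioning on survival (illustrative, not historical):
# each plane takes one hit in a random area. Hits to the engine are
# fatal; hits elsewhere are survivable.
AREAS = ["engine", "wings", "tail", "fuselage"]
FATAL = {"engine"}

returned_hits = {area: 0 for area in AREAS}
lost = 0

for _ in range(10_000):
    hit = random.choice(AREAS)  # hits land uniformly across areas
    if hit in FATAL:
        lost += 1               # plane never returns; its hit goes unobserved
    else:
        returned_hits[hit] += 1

# Looking only at returning planes, the engine appears untouched --
# exactly the pattern that misled the military.
print("Hits observed on returning planes:", returned_hits)
print("Planes lost (hits unobserved):", lost)
```

The observed sample shows zero engine hits, not because engines are never hit, but because every engine hit removes its plane from the sample.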
Wald had exposed a remarkably common cognitive bias and with it saved countless lives. We will never know how many, but his work starkly reveals the dangers that unchecked survivorship bias can pose.
What Is Survivorship Bias?
The US military had fallen victim to a remarkably common type of cognitive bias. Survivorship bias is the tendency to base conclusions on the visible people or things that made it past some form of selection process (the survivors) and to overlook those that did not.
This tendency to ignore failures and focus exclusively on the characteristics of visible successes can drastically impact our choices. It can provoke misplaced optimism and encourage misplaced energy as we seek to follow the “proven path”. And it can distort our analysis of the probability of success in following that path.
Real-World Examples of Survivorship Bias
#1: Evaluating Strategies: When deciding on a business strategy, our immediately visible reference points are surviving businesses. The long list of businesses that followed a similar strategy but ended up bankrupt requires much more digging to discover. We are therefore naturally biased towards the strategies that appear to underpin the success of survivors. A commonly cited example is college dropouts. For every Bill Gates and Steve Jobs, there are thousands of dropouts whose garage businesses quietly failed.
#2: Comparing Investments: Comparisons of active investment funds often exclude failed funds from historical analysis. This falsely inflates the average performance of funds, ignoring the possibility of failure and provoking excess confidence. Research has shown this to be a particular issue with smaller mutual funds. Similarly, sector comparisons often "backtest" using the sector's current list of constituent companies, simulating historical performance as if today's survivors had always made up the sector. Again, this ignores the companies in that sector that failed along the way.
#3: Comparing People: Social media encourages us to reach conclusions without a complete picture. When we see timelines full of friends holidaying abroad four times per year, we assume everyone does the same. We start to experience Fear of Missing Out (or “FOMO”). But this is a result of focusing on the “survivors”: in this case, the people who actively and regularly use social media. Of course, this doesn’t provide a complete picture.
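The fund-comparison effect in example #2 can be sketched with a toy simulation (hypothetical parameters, not calibrated to real fund data): averaging only the funds that survive to the end of the period overstates what the full cohort of funds actually delivered.

```python
import random

random.seed(42)

# Simulate 1,000 funds over 10 years (illustrative parameters only).
# Each year a fund earns a random return; a fund whose value falls
# below half its starting capital is closed and disappears from later
# "historical" comparisons.
N_FUNDS, YEARS = 1000, 10

def simulate_fund():
    value = 1.0
    for _ in range(YEARS):
        value *= 1 + random.gauss(0.05, 0.20)  # mean 5%, stdev 20% per year
        if value < 0.5:
            return None  # fund closed: a "non-survivor"
    return value

results = [simulate_fund() for _ in range(N_FUNDS)]
survivors = [v for v in results if v is not None]

# Average final value across ALL funds (counting each closed fund at
# its 0.5 closing value) vs. across survivors only.
all_avg = (sum(survivors) + 0.5 * (len(results) - len(survivors))) / N_FUNDS
surv_avg = sum(survivors) / len(survivors)

print(f"Funds that survived: {len(survivors)} / {N_FUNDS}")
print(f"Average final value, survivors only: {surv_avg:.2f}")
print(f"Average final value, all funds:      {all_avg:.2f}")
```

The survivors-only average always comes out higher, because every fund dropped from the sample was, by construction, a poor performer.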
Dive Beyond the Visible
It is perhaps only natural that we celebrate and study survivors to understand their formula for success. Failures don’t receive equal attention for obvious reasons.
People aren’t interested in the 10,000 college dropouts who failed to grow their garage businesses. They are interested in how Steve Jobs and Bill Gates built multi-billion-dollar businesses after dropping out of college.
People aren’t informed of the thousands of books rejected by publishers every year. They are invited to buy those that reached shelves and enticed by the possibility of creating successful work of their own.
It is ironic that now, in the most information-abundant age in our history, this asymmetry between the visibility of success and failure is arguably greater than ever before. Modern-world media homes in on immediately visible success from more angles than ever. It’s our personal responsibility to find and understand the silent majority that failed.
Rolf Dobelli puts this rather fittingly in The Art of Thinking Clearly:
“The media is not interested in digging around in the graveyards of the unsuccessful. Nor is this its job. To elude the survivorship bias, you must do the digging yourself.”
In other words, when confronted with survivors, we should ask ourselves: what of those who failed? We must dive beneath the immediately visible tip of the iceberg to understand its full size. Only then can we assess whether the survivors’ success is an expected value or an outlier.
Put another way, failures are an enormous blind spot. At the bare minimum, our choices must recognise that success is never the full story.
Hack Your Own Path
And even when we are confident that we have found the winning formula – even when we believe we have determined an accurate expected value – our decisions must recognise that the world is a dynamic place. What worked over the last ten years will not necessarily work over the next ten.
If the formula for success were as simple as copying what is visible in those who have succeeded, everyone would do it, and the divide between success and failure would blur.
We would do better to beware the hidden dangers that lurk beneath the iceberg, and to remember that even its visible part can melt away in the heat of innovation.
But a critical point: understanding these realities is not an endorsement of paralysis by analysis. It is not a case for doing nothing for fear of hidden dangers and the ever-shifting nature of success.
Instead, it is a welcome reminder that copying a model for success carries its own risks. A welcome reminder to consider the stories that aren’t told. And a welcome reminder that we may be better off investing our energy in something we love and hacking our own path.
After all, many of the new batch of survivors will have fought battles we haven’t even thought of yet.