One of the R.A.F.’s top priorities during WW2 was improving the odds that its notoriously expensive and strategically crucial bombers would make it back from bombing runs over Germany.
The returning airplanes were rigorously scrutinised, and military experts were busy designing improved armouring along the wings and the rear gunner’s station – the bits of the planes that tended to have the most holes in them.
This seemed like a sensible thing to do. Until a statistician by the name of Abraham Wald came along.
“You fucks”, I imagine him saying, “the bits where the planes that came back have holes are precisely the bits where it doesn’t matter. The planes are weakest where the ones you’re looking at are unscathed – precisely because the ones that got hit there didn’t come back. Reinforce the armouring where the planes that came back are immaculate.”
And substantially more planes started coming back.
Obvious. Ex post facto. Ex ante, practically invisible. Most people would make the same error as the experts.
When the “fails” – the things that didn’t make it through some selection process – are missing from the sample, which is most of the time, identifying the decisive factors and points of failure becomes nontrivial: they cannot be divined from the “wins” alone, the things that did make it, without serious lateral thinking.
This principle is hard at work whenever we try, for instance, to isolate the common features of successful athletes, artists, scientists, therapies, investment vehicles, companies or anything else in order to imitate them. And then reinforce the armouring along the wings.
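The bomber story can be sketched as a toy simulation. Every number and zone name below is invented for illustration: planes take hits uniformly across three zones, but only engine hits are fatal, so the survivor sample shows plenty of wing and rear-gunner holes and an engine that looks immaculate.

```python
import random

random.seed(42)

# Toy model (all parameters are assumptions for illustration):
# each plane takes 1-5 hits spread uniformly across three zones,
# and only a hit to the engine brings a plane down.
ZONES = ["wings", "rear gunner", "engine"]
FATAL = {"engine"}

def fly_mission():
    hits = [random.choice(ZONES) for _ in range(random.randint(1, 5))]
    survived = not any(h in FATAL for h in hits)
    return hits, survived

observed = {z: 0 for z in ZONES}  # holes counted on RETURNING planes only
actual = {z: 0 for z in ZONES}    # holes across ALL planes, returned or not

for _ in range(10_000):
    hits, survived = fly_mission()
    for h in hits:
        actual[h] += 1
        if survived:
            observed[h] += 1

print("holes on returning planes:", observed)
print("holes on all planes:      ", actual)
# In the survivor sample the engine zone has zero holes -- not because
# it is never hit, but because every plane hit there never came back.
```

Counting only the survivors, the engine column is exactly zero, which is precisely the armour-the-wings mistake: the zone with the fewest observed holes is the one that needs reinforcing.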
This was my reaction when I figured this out and appreciated it fully:
[reaction image – © Warner Bros]
This, friends, is survivorship bias.