Looking past what survived to what fell

Published by Tony Quinlan

Among the various podcasts I listen to is You Are Not So Smart – an educational and entertaining podcast much of the time. I caught the latest episode, titled Survivorship Bias, while I was travelling, and it's highly pertinent to a couple of projects I'm working on at the moment. From the summary:

The Misconception: You should focus on the successful if you wish to become successful.

The Truth: When failure becomes invisible, the difference between failure and success may also become invisible.

It's the tendency we have to look for ways of improving a system by examining what has come through the system, not what failed along the way. It's prevalent in the hundreds of articles on the "10 characteristics of successful businessmen" and in far more besides. But the real problem is brilliantly illustrated in the podcast, with this example from World War II, where statistician Abraham Wald was asked to help decide how and where to put armour plating on aircraft to improve their (and their crews') chances of survival in a hostile environment:

How, the Army Air Force asked, could they improve the odds of a bomber making it home? Military engineers explained to the statistician that they already knew the allied bombers needed more armor, but the ground crews couldn’t just cover the planes like tanks, not if they wanted them to take off. The operational commanders asked for help figuring out the best places to add what little protection they could. It was here that Wald prevented the military from falling prey to survivorship bias, an error in perception that could have turned the tide of the war if left unnoticed and uncorrected.

The military looked at the bombers that had returned from enemy territory. They recorded where those planes had taken the most damage. Over and over again, they saw the bullet holes tended to accumulate along the wings, around the tail gunner, and down the center of the body. Wings. Body. Tail gunner. Considering this information, where would you put the extra armor? Naturally, the commanders wanted to put the thicker protection where they could clearly see the most damage, where the holes clustered. But Wald said no, that would be precisely the wrong decision. Putting the armor there wouldn’t improve their chances at all. 

The mistake, which Wald saw instantly, was that the holes showed where the planes were strongest. The holes showed where a bomber could be shot and still survive the flight home, Wald explained. After all, here they were, holes and all. It was the planes that weren’t there that needed extra protection, and they had needed it in places that these planes had not. The holes in the surviving planes actually revealed the locations that needed the least additional armor. Look at where the survivors are unharmed, he said, and that’s where these bombers are most vulnerable; that’s where the planes that didn’t make it back were hit.
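
To make the logic concrete, here is a minimal toy simulation of that inversion. It is my own illustrative sketch, not anything from the podcast or from Wald's actual analysis, and the zone names and probabilities are invented for the example: planes are hit evenly across three zones, but only hits to one zone are likely to bring a plane down. Counting damage only on the planes that return makes that zone look almost untouched.

```python
import random

# Toy sketch of survivorship bias (hypothetical numbers, not Wald's method):
# hits land evenly across three zones, but only "engine" hits are likely fatal.
ZONES = ["wings", "fuselage", "engine"]
LOSS_PROBABILITY = {"wings": 0.05, "fuselage": 0.05, "engine": 0.6}

def fly_mission(hits_per_plane=3):
    """Return (survived, hit_locations) for one simulated sortie."""
    hits = [random.choice(ZONES) for _ in range(hits_per_plane)]
    survived = all(random.random() > LOSS_PROBABILITY[zone] for zone in hits)
    return survived, hits

observed = {zone: 0 for zone in ZONES}  # damage visible on returning planes
actual = {zone: 0 for zone in ZONES}    # damage across the whole fleet

random.seed(42)
for _ in range(100_000):
    survived, hits = fly_mission()
    for zone in hits:
        actual[zone] += 1
        if survived:
            observed[zone] += 1

print("Share of hits seen on survivors vs. across the whole fleet:")
total_obs, total_act = sum(observed.values()), sum(actual.values())
for zone in ZONES:
    print(f"  {zone:8s} survivors: {observed[zone]/total_obs:.0%}"
          f"   fleet-wide: {actual[zone]/total_act:.0%}")
```

Fleet-wide, each zone takes roughly a third of the hits, but among the survivors the engine zone barely shows up, precisely because planes hit there rarely make it home. Read only the survivors and you would armour the wings and fuselage; the missing planes say otherwise.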

It's an easy mistake to make: focusing on the evidence in front of us, the successful ones. But the risks are huge. If we're in an ordered system, where things repeat predictably and the context or environment is not shifting and changing, then it might be appropriate. But in many situations things are more complex, and copying others' success is a poor strategy. Avoiding others' mistakes is better.

If you're working with a big system and trying to improve the overall results, as in an aid project or government policy, then focusing on the successes is even more dangerous. Understanding the obstacles facing those who did not succeed is more critical: doing more of what successful people say they want rarely coincides with what those left behind need in order to start succeeding.


So, if we're looking to increase the incidence of success, first we need to understand the points at which failure happened, and we can only find that out by looking in the corners. Looking at the survivors will mislead, and the effect could be disastrous.