During WWII, the US Military Devised a Strategy to Protect Its Planes, but a Mathematician Proved Them Wrong

Very often we tend to focus on our successes and overlook our past failures. This is a common mistake that partially distorts our view of the world: we focus on what went well and neglect what went wrong. This mechanism is called survivorship bias (or survivor bias). Let’s see what it is.

Survivorship bias is a logical error we can make in many areas: when evaluating a situation, we take into account only the positive cases (the “survivors”) and neglect the negative ones (the “non-survivors”). The result is a distorted perception of reality, based only on the evidence that happens to be available. Survivorship bias is in fact a cognitive bias, and it does not only affect how we judge people: we draw the same flawed conclusions about machines or companies, considering only the successful examples and statistically ignoring those that failed.

Image credit: Martin Grandjean, Wikipedia

One of the best-known examples of survivorship bias dates back to World War II. The mathematician Abraham Wald was called in by the US military, who asked him to study the best way to protect planes against enemy fire: their fear was that armoring the entire plane would make it too heavy to fly properly. So they started examining each returning plane to determine the riskiest points, with the idea of reinforcing those areas. But the mathematician quickly realized that they were making an error of judgment: they were victims of survivorship bias.

The US military had not taken into account the planes that had been hit but had not returned: it ignored the non-survivors and based its analyses solely on the survivors. As a result, it was trying to protect the wrong parts of the planes. The points where the returning planes had been hit were not weaknesses but strengths: even though those planes had been hit in these areas, they were still able to fly home, so those areas did not need additional reinforcement. The areas that did need armor were the ones that looked untouched on the survivors, because planes hit there rarely made it back.
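To see how looking only at the survivors distorts the picture, here is a minimal Monte Carlo sketch in Python. The sections and lethality numbers are invented for illustration, not taken from Wald’s data: hits land uniformly across the airframe, but hits to the engine or cockpit are far more likely to down the plane.

```python
# Minimal sketch of survivorship bias (hypothetical numbers).
# Hits are spread uniformly over the airframe, but planes hit in
# critical sections rarely return, so the surviving sample looks skewed.
import random
from collections import Counter

random.seed(42)

SECTIONS = ["engine", "cockpit", "fuselage", "wings"]
# Assumed probability that a single hit in this section downs the plane.
LETHALITY = {"engine": 0.8, "cockpit": 0.7, "fuselage": 0.1, "wings": 0.15}

all_hits = Counter()        # hits on every plane that flew
surviving_hits = Counter()  # hits only on planes that made it back

for _ in range(100_000):
    # Each plane takes a few hits, distributed uniformly over the sections.
    hits = random.choices(SECTIONS, k=random.randint(1, 4))
    survived = all(random.random() > LETHALITY[s] for s in hits)
    all_hits.update(hits)
    if survived:
        surviving_hits.update(hits)

print(f"{'section':<10}{'% of hits (all planes)':>25}{'% of hits (survivors)':>25}")
for s in SECTIONS:
    pct_all = 100 * all_hits[s] / sum(all_hits.values())
    pct_surv = 100 * surviving_hits[s] / sum(surviving_hits.values())
    print(f"{s:<10}{pct_all:>24.1f}%{pct_surv:>24.1f}%")

# Among the survivors, the engine and cockpit show very few hits -- not
# because they are rarely hit, but because planes hit there rarely come home.
```

Run on all planes, the hit distribution is roughly uniform; run only on the returning planes, the critical sections appear almost untouched, which is exactly the illusion that misled the military’s initial analysis.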

There are many other cases, even in our daily life, where we can observe this mechanism. Gyms show you the results of people who signed up for their classes, but they will never show you the people who, despite following the experts’ instructions, got no results. Knowing about this mechanism can therefore help us avoid basing our understanding of events solely on the people or cases that beat the odds: they do not constitute a representative sample and risk giving us a wrong, or distorted, picture of the situation.
