In data analytics, you can’t see the whole picture if you look only at those who “survived”! Discover why this cognitive bias has a huge impact on the decisions that shape human resources.
What is it?
You fall prey to survivorship bias when you draw hasty conclusions from an incomplete dataset. It’s a type of selection bias that consists of overestimating the chances of success of something by focusing all your attention on those subjects that succeeded (the “survivors”). But these “survivors” should be seen for what they are: statistical exceptions rather than representative cases. In other words, survivorship bias leads you into a logical error because you can’t see the whole picture.
The most famous illustration of survivorship bias concerns the Allied bombers sent during World War II over the zones occupied by the Nazis. In view of the damage suffered by the planes that came back to base, a group of experts recommended that the parts of the plane that were riddled with bullets be armored further. But statistician Abraham Wald, determined to reduce the losses caused by enemy fire, identified a logical flaw in that reasoning. Indeed, these “experts” had only taken the “surviving” planes into account, not those shot down by enemy fire. According to him, it was the parts of the planes that were least riddled with bullets that needed to be armored, because, he said, the surviving planes showed where a plane could suffer damage and still survive. A plane hit anywhere else was unlikely to make it back at all. So those were the areas that needed to be shielded!
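Wald’s insight can be demonstrated with a short simulation. The sketch below is purely illustrative: the region names and loss probabilities are hypothetical assumptions, chosen only to show how conditioning on “survivors” inverts the picture. Every region is hit equally often, yet among the planes that return, hits appear concentrated in the least vulnerable regions.

```python
import random

random.seed(42)

REGIONS = ["fuselage", "wings", "engine", "cockpit"]
# Hypothetical probability that a single hit in each region downs the plane.
LOSS_PROB = {"fuselage": 0.1, "wings": 0.1, "engine": 0.8, "cockpit": 0.7}

all_hits = {r: 0 for r in REGIONS}       # what actually happened
survivor_hits = {r: 0 for r in REGIONS}  # what the analysts at the base saw

for _ in range(10_000):
    region = random.choice(REGIONS)      # one hit per sortie, uniform over regions
    all_hits[region] += 1
    if random.random() > LOSS_PROB[region]:  # the plane makes it home
        survivor_hits[region] += 1

print("hits actually taken:     ", all_hits)
print("hits seen on survivors:  ", survivor_hits)
```

Looking only at `survivor_hits`, the engine and cockpit seem almost untouched, which is exactly the flawed dataset the experts were reasoning from; `all_hits` shows the bullets were spread evenly all along.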
When it comes to interpreting data, we constantly fall prey to survivorship bias. You always have to wonder where the data come from, how they were collected and selected, and what they really mean. For example, should you necessarily interpret a reduction in the number of reported rapes in a given country as a positive thing? Not if the reduction means fewer victims report the crime for fear of being exposed. Conversely, an increase in the number of reported crimes could mean there is a climate of trust and more victims feel safe to speak openly and press charges.
What does it mean for human resources?
Data analytics is becoming increasingly critical in human resource management. Consequently, HR analytics is constantly threatened by flawed reasoning and biases. Numerous biases are in fact incorporated into the algorithms designed to select candidates.
As far as HR is concerned, artificial intelligence can only utilise the data collected for analysis, often “survivor” data marked by a long history of human bias. In some companies, the data collected are far from representative of the population at large. In tech and finance in particular, historical data include few women. Survivorship bias is therefore perpetuated, and often even amplified, by AI.
How can it be overcome?
There are ways to neutralise some of the biases by “deactivating” the criteria that might be problematic (for example gender, ethnic origin, or age) when some group categories are underrepresented in the sample. Most of all, it is essential to start collecting more representative data so the biases won’t be reproduced by the machine. A machine is only as intelligent as the data it is fed. AI tools should be regarded as ways to assist and complement humans within systems designed to neutralise biases and limit discrimination. To find out more about those bias-neutralising systems, you should read this article about Iris Bohnet’s must-read book What Works: Gender Equality by Design.
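In practice, “deactivating” a criterion simply means removing it from the features a screening model is allowed to see. Here is a minimal sketch of that step; the candidate records and field names are entirely hypothetical.

```python
# Hypothetical candidate records; field names are illustrative only.
candidates = [
    {"years_experience": 3, "test_score": 82, "gender": "F", "age": 29},
    {"years_experience": 7, "test_score": 91, "gender": "M", "age": 41},
]

# Criteria to "deactivate" before any automated screening.
SENSITIVE = {"gender", "age"}

def deactivate(record, sensitive=SENSITIVE):
    """Return a copy of the record with sensitive criteria removed."""
    return {k: v for k, v in record.items() if k not in sensitive}

screened = [deactivate(c) for c in candidates]
print(screened[0])  # → {'years_experience': 3, 'test_score': 82}
```

Note that dropping fields is only a first step: other features can still correlate with the removed ones, which is why collecting more representative data in the first place matters so much.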