Article prepared: 10 July 2014
On May 10th, 1996, five mountain climbers were caught in a blizzard and died on Mount Everest during their summit attempt. Two of the climbers were world-renowned mountaineers, both with extensive experience climbing at high altitude. Another had guided nearly forty climbers to the summit over the previous six years. There was no single explanation for why such experienced climbers died; rather, it was the interaction of several wrong decisions made on the day. The climbers chose to ignore the weather warnings, as there had been no severe spring storm in several years. They chose not to turn around, despite failing to reach the summit by 2:00pm, because they had invested too much time and money. Lastly, as all were experienced climbers, they were overconfident and underestimated the likelihood of trouble.
You may wonder how such expert leaders and mountain climbers could make misguided decisions that led to a catastrophic end. However, misguided decisions are more common than you might think – we make them, and experience their effects, frequently.
Because it takes a great deal of energy to consciously work through different possibilities and reach a decision, we normally rely on subconscious shortcuts. These shortcuts help us cope with complexity and manage new information by aligning it with similar information from the past. Though such shortcuts often work well, they can also lead us to make the wrong decisions. These impairments in judgement are called cognitive biases, and they arise automatically in decision making processes.
Cognitive Biases for Safety Leaders
It is not uncommon for safety leaders to make decisions without complete consideration of cognitive biases and how they affect decision making. However, given that safety leaders are in charge of making critical decisions, it is important that they understand and appreciate the role cognitive biases play in their decision making. Cognitive biases can cause leaders to underestimate the risks that may be present or overestimate the capability of safety systems to manage hazards. One small wrong decision may be insignificant by itself, but a series of small wrong decisions can create a path to disaster.
Some types of cognitive bias common in the workplace safety context that safety leaders should be aware of are:
- Recency bias: the tendency to focus on information that is most recent and thus easiest to remember. This type of bias can arise when making crucial workplace safety decisions.
e.g. A company may have had no accidents in the last two years, and because of recency bias, safety leaders may use this information to justify putting less effort into safety operations in the future.
- Attribution bias: the tendency to attribute one’s own success to personal ability rather than to surrounding factors, while attributing the outcomes of others to their personal attributes rather than to their circumstances.
e.g. A worker is injured on site. The safety leader may attribute this injury to the worker’s carelessness, rather than the potential flaws of the safety management system in place.
- Overconfidence bias: the tendency to over-estimate one’s abilities and knowledge. This bias can be dangerous in high risk situations.
e.g. Safety leaders may believe that, because they have managed safety within the company for many years without incident, they alone should make all future safety decisions.
- Confirmation bias: the tendency to interpret and remember information in such a way that confirms one’s existing views over those that challenge them.
e.g. A safety leader may believe that their team operates safely and will therefore search for information that confirms this notion while ignoring any information that opposes it.
Dealing with Cognitive Bias
The unfortunate truth is that we cannot completely switch off or ignore our cognitive biases. What we can do is manage them and reduce the influence they have on our decision making.
By using the following strategies, we can ensure that our decisions are well informed and as free from bias as possible:
- Improve awareness and understanding of cognitive biases: If you are aware of the cognitive biases that exist, you are less likely to let them influence your decision making. One strategy to promote awareness is to use a case study activity. Think of a tragedy that has resulted from human error (e.g. the Gretley mining disaster, the Challenger shuttle disaster) and focus on the cognitive biases that might have played a contributing role.
- Check assumptions and actively seek disconfirming evidence: Intentionally seeking information to determine if your assumptions are wrong can lead to more effective decision making. If you believe that your team operates safely, check this assumption by looking for information available that may contradict this notion.
- Collaborate with colleagues or management: When making critical safety decisions, team up with other people during the decision making process. It is usually easier to identify cognitive biases in others than in yourself, and in turn, other people can lead you to view information in a different light.
- Review your past work: Write down all the important decisions you have made in the past year and try to identify if any cognitive biases may have played a role. Understanding what cognitive biases normally impact your decisions can help you be more aware of them cropping up the next time you make similar decisions.
How PSB Solutions Can Help
PSB Solutions understand human behaviour and the influence of cognitive biases in safety-related decision making. PSB Solutions can assist your organisation by providing Safety Leadership and Front-line Leadership Development workshops that deal with human factors and the errors that may arise in decision making. For more information, or if you would like to discuss what you’ve read today, please contact us on (08) 6272 3900 or email us.
References and Further Reading:
Campbell, M. P. (2010). Battling Cognitive Bias. The Stepping Stones, 38.
Krause, T. R. (2005). Leading with Safety. New Jersey: John Wiley & Sons.
Roberto, M. A. (2002). Lessons from Everest: The interaction of cognitive bias, psychological safety and system complexity. California Management Review, 45.