The benefits of investigating incidents – near misses as well as accidents – are widely recognised. The UK HSE reminds us that ‘Health and safety investigations form an essential part of the monitoring process that you are required to carry out. Incidents, including near misses, can tell you a lot.’
When companies end up in court following a serious injury, penalties are harsher where there is evidence that the organisation failed to learn from earlier near misses and minor accidents.
The HSE provides detailed advice on how to investigate incidents in HSG245. This explains that ‘the investigation should be thorough and structured to avoid bias and leaping to conclusions.’
We all like to think we are not biased – that we are fair, objective and reasonable. But our minds play tricks on us, and unconscious bias can affect how we review an incident report, and the conclusions we come to.
We’ll consider a typical incident investigation and use it to illustrate four common biases we are all prone to – and what we can do to overcome bias when investigating a safety incident.
1. Hindsight bias
We hear that Joe and Pete were involved in a lifting operation near miss on a building site. Scaffolding tubes had to be moved onto a platform. Joe attached a sling around the scaffolding tubes and stayed nearby to monitor and signal, while Pete operated the overhead crane. As Pete raised the load it struck the platform. The poles spilled from the sling and landed next to a team of scaffolders. A little to the left, and someone might have been badly injured.
Hindsight bias tricks us into thinking that Joe and Pete should have known when they started the task everything that we now know after the incident. They should have realised that the load could hit the platform, and that the poles could fall. It’s obvious to us that they should have cleared the area underneath the lift, and that the scaffolders should have moved as soon as they saw the crane nearby.
To combat hindsight bias we have to ask: what did everyone know before the incident? The scaffolders knew that cranes came back and forth all day, and they were unlikely to interrupt their work every time they saw one; Joe and Pete had seen successful lifting operations before, and didn’t expect this to be any different.
If risk assessments and permits are completed using an online system, we'll have a more reliable source of information about what Pete and Joe and their colleagues knew before the accident.
2. Availability bias
Our minds seek short cuts, to avoid having to think too much. This made sense when we were hunter-gatherers – we didn’t have time to ponder whether to chase our prey or run from predators. But even when we have time to ponder an incident investigation, our minds are very ready to jump on the most available solution.
Imagine you’ve just read about a prosecution where someone was injured because lifting equipment hadn’t been inspected. Then you hear about Joe and Pete’s near miss. Ah, perhaps the lifting equipment hadn’t been inspected?
If you spend a lot of time looking for evidence to support one hypothesis, because it is more ‘available’, you could expend a lot of resources without success.
Overcoming this bias requires time and determination. If detail is collected consistently, as close in time to the incident as possible, there is less time for memories to fade, or for the availability of explanations to impact how the incident is reported.
During the investigation, consult other people, and ask for their explanations. Sometimes, you need to give an investigation a break – go for a walk to clear your mind, or swap to a different task, and come back to the investigation with a fresh mind.
3. Illusory correlations
It is useful to form associations between things or actions, and responses. Knowing that touching a flame hurts stops us putting our hand into a flame.
But sometimes we form unhelpful connections – known as illusory correlations. This includes stereotyping people from a particular group.
Illusory correlations include halo effects around a particular individual. Because you saw Pete wearing his hearing protection when other workers didn’t, you can’t believe this near miss is his fault. But Joe on the other hand never wears his safety gloves, so of course, he must have made a mistake.
One way to overcome this is to anonymise your reports. If the investigator sees only “person A” and “person B” they can’t apply any stereotypes they have developed. However, this can only be successful for a remote investigation. Once you start interviewing people, you will know who they are.
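As a minimal sketch of how that anonymisation step might work in practice – the function name and the “person A”/“person B” labelling scheme here are illustrative assumptions, not a description of any particular reporting system:

```python
import re

def anonymise(report: str, names: list[str]) -> str:
    """Replace each named individual with a neutral label (person A, person B, ...).

    Illustrative sketch only: a real incident-reporting system would also need
    to handle nicknames, job titles and other identifying details.
    """
    for i, name in enumerate(names):
        label = f"person {chr(ord('A') + i)}"
        # \b ensures whole-word matches, so 'Joe' doesn't alter 'Joey'.
        report = re.sub(rf"\b{re.escape(name)}\b", label, report)
    return report

text = "Joe attached the sling while Pete operated the crane."
print(anonymise(text, ["Joe", "Pete"]))
# -> person A attached the sling while person B operated the crane.
```

The labels are assigned in the order the names are supplied, so the same list of names must be used consistently across all documents from one incident if “person A” is to refer to the same worker throughout.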
Widening the investigation team can help – the more diverse the team, the more you can counter each other’s illusory correlations. As widening the team adds to the complexity of an investigation, make sure you have a secure way of sharing incident information, and a system that tracks actions assigned and taken.
4. Confirmation bias
Confirmation bias is the way our mind tricks us into thinking that whatever we first thought of is correct. It reinforces availability bias and illusory correlations.
We think it must be Joe’s fault, so we look for evidence to confirm this. Did he fail to check the lifting gear? Did he give Pete the wrong signals? Did he forget to ask the scaffolders to move out of the lifting zone? All the questions seek to confirm the opinion we already have.
Having standard, open questions in your investigation process helps to overcome this bias. These will need to be tailored for each type of incident, but starting with open questions such as ‘What did you see?’ or ‘What instructions were you given before you started this task?’ can be applied in nearly all investigations, and allow for better quality information to be collected.
Sometimes, even the way you ask a question reveals your bias. You suspect Joe and feel sorry for Pete. You ask them the same question: ‘What did you do?’ Is your tone of voice the same in each case?
In the early stages of an investigation you can reduce bias if the questions are presented on a screen, with Joe and Pete given the time to consider their responses without feeling they are being accused.
Overcoming bias in investigation is easier with the right tools and the right attitude. Having access to the risk assessments and permits completed before the job helps us to see what the workers saw, rather than relying on hindsight.
Rapid reporting of near misses, using a consistent set of questions, reduces the power of availability and confirmation bias to take us off course. Having a reliable system for tracking actions across a team makes sure the investigation is completed and reported, that lessons are learned and improvements are made.
Hopefully, that means you avoid ending up in court, but if you do, the evidence trail will be your best defence.