The biggest challenge for systems thinking in understanding and managing safety is a fixation on ‘human error’ and the behaviour of workers at the so-called sharp end, according to the University of the Sunshine Coast.
It is common to see human error cited as the primary cause in up to 90 percent of all accidents in most safety-critical domains, said Professor Paul Salmon, director of the Centre for Human Factors and Sociotechnical Systems at the University of the Sunshine Coast.
“This is both misleading and unhelpful for OHS,” he said.
“While errors might occur, they are consequences of interacting conditions within the broader work system – workload pressures, tools, procedures, training, management decisions, culture, financial pressures, and so on.
“Errors are not a cause of anything; they are a consequence of other interacting factors across the work system.”
Professor Salmon, who was speaking ahead of the 2021 Dr Eric Wigglesworth AM Memorial Lecture on Thursday 17 June, said a great example of this is the tragic Air France 447 crash that occurred in 2009.
The aircraft’s autopilot disconnected after it began receiving spurious airspeed data, and in response the pilot placed the aircraft into a climb.
It subsequently stalled and crashed into the Atlantic Ocean, killing all 228 people on board.
“I have spoken to many aviation safety researchers who blame the incident entirely on the pilot,” he said.
“This is based on the belief that he had ‘lost situation awareness’ and made an erroneous response to the autopilot disconnection, which caused the aircraft to stall and eventually crash.”
Viewing the incident through a systems thinking lens and instead asking “why did the pilot’s actions make sense to him at the time?” gives a very different perspective. Professor Salmon said this shows that the incident was in fact created by a set of interacting factors relating to the weather, the aircrew and their communications, the design of the cockpit, its displays and its alerts, the autopilot, air traffic control, and the training received by the aircrew.
“The actions taken by the pilot in fact made sense to them at the time based on their experience, the training they had received, and what information the cockpit was giving them,” he said.
“So, without looking at the broader work system and all of its interacting parts, it is not really possible to understand and manage risks.”
Professor Salmon said this is because, when an incident is attributed to human error, interventions focus on the worker (such as retraining or reprisals) while the interacting conditions that created the ‘error’ in the first place are left untouched.
“For this reason, I have spent much of my career in OHS trying to kill off human error as a viable safety concept,” he said.
“It is my strong view that it is not at all useful for OHS professionals.”
Beyond this, Professor Salmon said there is also a research-practice gap where state-of-the-art safety science methods are not being applied in practice.
“What we are seeing here is that, whilst there is a strong appetite for systems thinking in OHS, there are few who are actually applying appropriate methods,” he said.
This gap is also driving the focus on human error, as many of the methods being applied in practice date from the human error era; Professor Salmon said they focus on sharp-end workers and the errors they could or did make.
Professor Salmon explained systems thinking is a way of thinking about the world that allows us to understand why workers, teams, organisations and systems behave as they do.
“It is useful for OHS professionals because it helps us to understand work systems, their many components, and how they interact with one another to influence behaviour and safety,” he said.
“Proactively, this enables us to understand where and why hazards exist and how we can manage them; then, when accidents occur, it enables us to understand how they were allowed to happen and what we can do to prevent similar things from happening in future.”
Professor Salmon said one of the biggest lessons he has learned about OHS management is that trying to blame somebody for an incident or trying to find the root cause is not only pointless, but it can actually be dangerous.
“This kind of approach simply prevents learning. In the case of incident reporting, for example, if the information is used to try and find a root cause or pin an incident on a particular worker, then nobody will report incidents and the organisation won’t learn anything,” he said.
“Instead, if a systems approach is adopted, and all incidents are treated as learning opportunities with no fear of blame or reprisals, then it is possible to develop a reporting culture where the organisation generates extremely useful safety data. A no-blame approach is critical in OHS.”
There are also a number of emerging challenges for OHS, and Professor Salmon said the increasing use of advanced automation in work systems will be a key one for OHS professionals in years to come.
“Advanced automation, robots, and artificial intelligence are changing the nature of work and, at the same time, safety and risks,” he said.
“The problem here is that these technologies are not being designed with full consideration of how they will interact with human workers and how they could potentially behave under different conditions.
“They are ‘unruly technologies’ in the sense that they will often behave in ways not foreseen by designers and not expected by OHS professionals.”
Driverless vehicles and the Boeing 737 MAX’s MCAS (Maneuvering Characteristics Augmentation System) are good examples of this, according to Professor Salmon, who said it will be “extremely important” to apply systems thinking to try to forecast some of the risks that might emerge once these new advanced technologies are introduced into work systems.
“The key here is to do this before they are introduced; we have now seen many examples where advanced technologies have been introduced with catastrophic consequences,” he said. “Looking even further down the track, these technologies may become so advanced that they could even decide to get rid of humans altogether. That is another story though.”
Professor Salmon will be presenting at the 2021 Dr Eric Wigglesworth AM Memorial Lecture on Thursday 17 June from 7:30-9:30 AEST. The event will be held at Pullman & Mercure Brisbane King George Square, Corner Ann & Roma St, Brisbane City QLD 4000, and also live-streamed. For more information call (03) 8336 1995, email events@aihs.org.au or visit the event website.
Article originally published by the Australian Institute of Health and Safety.