All of the 72 deaths caused by the fire at Grenfell Tower in 2017 were avoidable. The same could be said of the 346 people who died in two separate Boeing 737 MAX crashes in 2018 and 2019. And of the 47 people killed in 2013 by a train derailment in Canada, or the 116 children and 28 adults killed in the Aberfan mine disaster in 1966.
These were not unforeseeable events. In all of these cases, and many, many more, people knew that things could go wrong, but nothing, or too little, was done.
With Grenfell Tower, for example, it was no secret that the building’s cladding material was a fire hazard. At Boeing, at least some managers and engineers knew that the aircraft’s flight control system was a cause for concern. Residents of Aberfan had also expressed concerns to the authorities.
So given that some people had an inkling that things might go wrong, why did no one take preventative action?
In some instances it may be that people did not fully understand the implications of their decisions. In the case of Boeing, aeroplane technology is complex, and regulatory standards rest on assumptions that are discussed and debated (albeit not in public).
Reasonable people can disagree on whether a particular safety measure is adequate or not. Understanding the link between those standards and potential future events can be difficult, and questions about technology are not easily settled.
As far as the Grenfell Tower fire is concerned, there have been accusations of deceit and a lack of due diligence from various groups.
It is also possible that corporations (in any industry) can fall into a “confidence trap” – a mental bias whereby companies become overly confident because past risk-taking has not ended in disaster. If those risks have not led to any serious issues, the reasoning goes, why change?
Each instance of successful risk-taking causes them to discount new information suggesting they might be wrong. For example, Boeing saved money by eliminating hundreds of quality control inspections, and the company may have just believed it was “too big to fail”.
Accountability
Meanwhile, the public regulators tasked with keeping a check on things are starved of resources and do not possess the expertise to unpack the implications of every decision that private contractors take.
Some are even expected to raise funds from the industry they regulate, while others have to reduce staff levels.
So what can be done to prevent similar tragedies happening again? Certainly, we cannot rely on self-policing and honesty on the part of private contractors.
Nor can we realistically expect government agencies to monitor everything that private contractors do, or keep up with every industrial and technological development.
Perhaps part of the solution lies in injecting real accountability by giving executives skin in the game. Pinning down blame is notoriously difficult.
For even when blame is apportioned after disaster strikes, companies rather than executives tend to be held responsible. Boeing is paying hundreds of millions of dollars in fines while the CEO is leaving with tens of millions of dollars in remuneration. The Grenfell inquiry revealed various contractors trying to pass the buck.
As a result, CEOs and senior executives tend not to experience any downsides to poor decision making when it comes to improving safety. But they may have all the upsides of improved financial performance as a consequence of undermining it, whether that’s by using cheap materials or cutting jobs in quality control.
It seems strange, then, that leaders are often severely punished for transgressions in their personal lives, even when the consequences of those actions are far less damaging than plane crashes or burning buildings. Former US president Bill Clinton’s affair, which led to his impeachment, is perhaps the most famous case.
Closer to home, BP clawed back £1.8 million from sacked CEO Bernard Looney, who left after failing to disclose personal relationships with colleagues to the company’s board following a tip-off from a whistleblower.
Apparently, while such personal “failings” can lead to harsh penalties, the same is not true when the failure leads to the loss of lives. But if financial clawbacks can be imposed over a CEO’s love life, surely a similar consequence could be introduced when it comes to disasters.
Holding executives accountable in this way might just make them reconsider their approach to decision-making, and encourage them to pay more attention to safety instead of just profit. And perhaps that could prevent another tragedy from unfolding.
Akhil Bhardwaj does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
This article was originally published on The Conversation. Read the original article.