I have just finished reading a dense but interesting book on the 1986 Challenger space shuttle disaster (1). In a nutshell, the craft was destroyed 73 seconds after launch by a seal failure on one of the solid rocket boosters. There had been concerns about the seal for years, and it had been the subject of intense investigation. By adhering strictly to rigorous NASA procedures for multi-stakeholder review, the many people involved were genuinely persuaded (with a couple of exceptions) that the shuttle was safe to launch. And the few who had doubts were not sufficiently convinced of their reservations to overcome the cultural constraints on speaking up.
The book concludes that none of the individuals were to blame, nor was it easy to blame the NASA system, which had been deliberately set up to encourage four levels of rigorous review through a semi-adversarial process with great reliance on robust science and engineering. The net effect was that deviance (off-spec performance of the seals) had become normalised. This conclusion contrasts with the commonly held view that the engineers and managers involved made a calculated and immoral decision to accept the risk in the interests of furthering NASA's goals.
Reading some of the quotes from the various enquiries, you can sense both the honesty and the anguish of key players who never dreamed that they were accepting the risk that led to the disaster. If that sounds implausible, it is because it is hard to summarise such a complex matter; the full story in the book is quite convincing.
A key extract:
The answer to the question of “good” people and “dirty” work … is that culture, structure and other organisational factors … may create a worldview that constrains people from acknowledging their work as “dirty”.
In other words, the NASA and contractor engineers did not set out to cheat the system. On the contrary, because they complied so comprehensively with their highly rigorous procedures, they simply never recognised that the decisions they were making were "deviant"; deviance had become normalised.
All of which makes me wonder about the culture of the pipeline industry and our approach to risk management. I think we are doing all that we can and don't have deviant practices, but if we are embedded in a system and culture that blinds us, then we wouldn't know, would we?
Even though most of us think the safety management study approach and its outputs are right, we should continue to question whether that is true, because it would be much better to work out any deviance for ourselves than to have it explained to us by a commission of enquiry (or a sociology researcher) after a catastrophe.
(1) Diane Vaughan, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, University of Chicago Press, 1996.