"The fatally flawed decision to launch the shuttle did not result from negligence or wrongdoing, but from a tendency by engineers and managers to follow the accepted rules of procedure too blindly."

— Sociologist Diane Vaughan

Risk Analysis and Common Sense

Ten years ago, the space shuttle Challenger exploded after takeoff in a disaster that seared the nation's psyche. The cause was incomplete understanding of the reaction of the rubbery O-ring seals to temperature changes, but even more the inability of managers to think beyond the rigidities of their administrative rules.

The presidential investigating committee concluded that the decision to launch "was based on incomplete and sometimes misleading information, a conflict between engineering data and management judgments, and a NASA management structure that permitted internal flight safety problems to bypass key shuttle managers."

Richard Feynman, the Nobel-prize-winning physicist on the investigating committee, was more blunt. His theory, discussed at length in his book "What Do You Care What Other People Think?", is that pressures on managers to get approval of the project from Congress and to attain a successful flight led them to downplay (maybe subconsciously; maybe consciously) the significance of negative engineering reports.

My guess is he's right; Feynman usually was. People often follow rules without thinking much, but more important, they are strongly influenced by their subconscious desires — even more than we may realize. These desires deeply influence judgments about probability.

Engineers had come up with a probability estimate of failure as one in 100,000 — a figure which Feynman, in characteristically non-technical terms, called "a bunch of crap." This is not to say that probability estimates are worthless, but that they must be based on data that give some credence to the figure, and that the range of possible errors be taken into account.
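One way to see why the flight record could not support a figure like one in 100,000 is the statistician's "rule of three": if no failures have been observed in n independent trials, an approximate 95% upper confidence bound on the per-trial failure probability is 3/n. The sketch below applies it to an illustrative flight count (the number 24 here is only an assumption for the example, not a figure from the article):

```python
def rule_of_three_upper_bound(n_trials: int) -> float:
    """Approximate 95% upper confidence bound on failure probability
    when zero failures have been observed in n_trials independent trials."""
    if n_trials <= 0:
        raise ValueError("need at least one trial")
    return 3.0 / n_trials

# Hypothetical example: roughly two dozen successful flights with no
# failures only bounds the failure rate at about 1 in 8 -- nowhere near
# enough data to justify a claim of 1 in 100,000.
print(rule_of_three_upper_bound(24))  # 0.125
```

Supporting a one-in-100,000 claim by observation alone would require on the order of 300,000 consecutive failure-free flights — which is the sense in which the estimate had no data behind it.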

Formal risk analysis is now being given more attention at NASA. One of its technical consultants says: "There is an inherent range of uncertainty in risk analyses, which are couched in terms of probabilities and averages. But the studies can show trends and help managers determine where they need to place more attention." Which is to say they must be put in context and modified by common sense.

It's no different in our type of risk management. Though most of our decisions do not involve life or death, some safety decisions do. When such decisions are faced, the probability estimate must be based on data whose degree of reliability is known, and the decision should not be made by someone who could be affected by the outcome.

Copyright © 1996 by David Warren