A real emergency: suicide for fear of death

–What else can we do, senior executives and company boards tell themselves all the time, when the business is entirely on the line? In an emergency like this, we have to risk failure in order to succeed!

But what if the business is in a critical service sector? Here, when upper management seeks to implement risk-taking changes, they rely on middle-level reliability professionals, who take risks only in order to reduce the chances of failure. To these reliability-seeking professionals, the risk-taking activities of their upper management look like a form of suicide for fear of death.

–When professionals are compelled to reverse practices they know and have found to be reliable, the results are deadly. Famously, in the Challenger accident, engineers had been required, up to the day of that flight, to show why the shuttle was safe to launch; on that day, the decision rule was reversed to one requiring them to show why the launch could not take place.

Once it was good bank practice to hold capital as a cushion against unexpected losses; capital requirements now mandate that banks hold capital against the losses expected from their high-risk lending. Mortgage brokers traditionally made money on the performance and quality of the mortgages they made; in the run-up to the 2008 financial crisis, their compensation shifted to one based on the volume of loans originated and then passed on.

Originally, the Deepwater Horizon rig had been drilling an exploration well; that status changed when, on April 15, 2010, BP applied to the U.S. Minerals Management Service (MMS) to convert the site to a production well. The MMS approved the change. The explosion occurred five days later.

–In brief, ample evidence exists that decision-rule reversals, ones that required professionals in high-stakes situations to turn inside out the way they managed for reliability, have led to system failures and more: NASA was never the same; we are still trying to get out of the 2008 financial mess and the Great Recession that followed; the MMS disappeared from the face of the earth.

Forcing cognitive flips on reliability professionals and operators, that is, exiling them to conditions they do not know but are told they must nonetheless be skilled for, is the surest way to throw acid in the face of high reliability management.

–“But that’s a straw man,” you counter. “Of course we wouldn’t deliberately push reliability professionals into unstudied conditions, if we could avoid it.”

Really?

The oft-recommended approach, Be-Prepared-for-All-Hazards, looks like the counsel of wisdom. It is, however, dangerous if it flips mandates around: it requires emergency organizations to cooperate around many more variables, using information they will not have or cannot obtain, for all manner of interconnected scenarios which, if treated with equal seriousness, produce considerable modeling and analytic uncertainties.
