–Formal risk management frameworks are, in my experience, apt to reduce the complex of “reliability standard, system and possibility” to “risk, asset and probability,” and in the process commit a major category mistake. It’s as if, in talking about water, you were immediately asked to think “H2O,” to separate the oxygen from the hydrogen, and to measure each as best you can—all the while assuming that this analysis enables you to talk about water as water per se, e.g., as having the property of “wetness.”
–“Safety culture” is a redundant phrase, and misleadingly so, I believe. “Safety” is at its most problematic as a noun, when instead “it” is better thought of as a set of adverbial properties associated with really-existing behavior and practices, e.g., “the control room is managing reliably and safely at the same time.” Yes, the behavior and practices in question constitute a culture of sorts, but that culture just is the set of practices, not something in addition to be fostered, nor something prior to those practices.
In this way safety is no different from, say, democracy or intelligence. Here too democracy is not so much a noun as an adverb (“s/he is behaving democratically in that s/he evidences the following practices: voting in elections, paying taxes and more”), and intelligence is “thinking intelligently” (“s/he is doing so by virtue of behaving in x, y and z ways”). If I am right, this matters, because to think of safety, democracy and intelligence otherwise is like thinking you can make fish from fish soup.
–One major problem with “start simple and then scale up” is that each scale or level is complex in its own right. The map smooths out the shoreline, but visit the shore and you will find nothing so smooth about any such border there. To start simple and scale up makes as much sense as trying to pinpoint the shoreline through the eye of a needle.
In reality, there’s nothing more difficult than being simple about complexity. “A maximum of simplicity goes with a maximum of difficulty… Being simple is not simple; it is attempting the impossible,” wrote the French writer Georges Perros.
–What crisis scenario do I have in mind? The earth releases gases into the atmosphere that are then triggered by sunlight into storms, droughts and other natural disasters. No, not global climate change, but Aristotle’s theory of comets. I read that a new advance in science and technology threatens to set grey goo loose. But which grey goo? The one predicted from the recombinant DNA experiments at Harvard in the 1970s, from the genetically engineered “ice-minus” bacterium for Berkeley strawberry fields in the 1980s, from the genetically engineered crops of the 1990s, from the nanotechnology of the 2000s, or from something newer?
You think I’m implying that we shouldn’t worry. Wrong. The point here is that we are once again back to a key narrative discrepancy in crisis scenarios: between the stated urgency to innovate and experiment on one side, and the stated requirement for reliability and safety on the other, with both claims underwritten by appeals to unpredictability at the same scale of analysis, the system level. This is a narrative discrepancy because it can’t be written off or talked out of; it can, however, be managed as one of the messes we are in.
–We hanker after the old language, like that of Baroque music or Mozart, and keep asking why we can’t have more of the same old good. But it’s not only that the language has changed, that we can’t go back, and that new language is needed once the new meanings are pushed further. We want more Bach because that way we don’t have to think about the new meanings or the older changes.