When I started out in rural development in the early 1970s, one challenge was to manage for optimal ignorance: Professionals should learn only up to the point beyond which anything more is not worth knowing. Learning need take them only to where what they don’t know neither adds nor subtracts value for their acting now. Managing for optimal ignorance and its variants got a good deal of press from a range of writers at that time, notably the social scientists Warren Ilchman and Norman Uphoff, the development scholar Robert Chambers, and the sociologist Peter Berger.
The appeal of optimal ignorance waned when I implemented projects that I had helped plan. On those occasions, I’d find myself mulling over what my first boss, the district commissioner, told me when I arrived in rural Botswana: “A piece of advice, my dear boy. Either stay in the kitchen all the time or never go in.” Nothing major gets implemented as planned, and only by staying in implementation (later, management) did I appreciate how little I knew with my formal education in public policy analysis.
My view is that “optimize” should be banned, as a term, from policymaking and management. Like the dog returning to its vomit, optimality criteria are never satisfied in the imperfection of circumstance. But I didn’t fully understand that until later, when I started researching large critical infrastructures, their control rooms and control operators. These large sociotechnical systems are so complex that their managers cannot really “know” their inevitably unstudied conditions, and real-time inexperience and difficulties are permanent reminders of this. Optimizers with whom I’ve worked, on the other hand, seemed to think it better to burn the building down to save the rest of us the trouble of repairing it.
Yes, of course, studying and adapting to unknown unknowns are important, and that’s why the idea of “chipping away at ignorance” is not all just hubris. But control room operators are attuned to staying out of unstudied conditions not because some things are not worth knowing but for the opposite reason: No way can these professionals afford to be in prolonged ignorance when the safe and continuous provision of critical services, like water and electricity, is paramount. “[I]f the grid fails and there are blackouts, people die,” one control room executive told us. Control rooms put up with uncertainties they can live with in order to avoid unknown unknowns they can’t or mustn’t tolerate.
But you press: What could be more respectful of complexity than managing and learning adaptively? Change course as uncertainties are reduced and more is learned. No one can be against learning, right?
That may be true as far as it goes, but even then it doesn’t go far enough. Here’s a story from my time as an advisor in Kenya. I had oversight responsibilities for a handful of integrated rural development projects in the country’s arid and semi-arid districts. One of the worst projects, in my judgment, centered on soil and water conservation measures. Ask villagers there what their three most important development priorities were and they’d say: water, water, water. Water for drinking, water for cooking, water for their livestock, water for everything that mattered in their daily lives. Yet the donor was spending a fortune on ditches and bunds to prevent soil erosion on the hillsides, primarily for crop purposes, without any direct increase in water supply for the households.
Unsurprisingly, villagers just wouldn’t “participate” in the project: Food-for-work schemes didn’t work, giving them hoes or such didn’t work, nothing worked. Later on, I tracked down one of the project’s designers and asked: “Why ever was the project designed that way? Absolutely no one there was for soil and water conservation.” It was like he’d been waiting years for someone to ask him that question. He leaned forward: “But who can be against soil and water conservation?”
So too for managing adaptively: Who, really, can be against it? Why, that would be like arguing against norms of rationality, the scientific method, or evidence-based policymaking, or, worse yet, against trial-and-error learning.
Yet, as with soil and water conservation and other projects, we must ask: managing adaptively for what? And here too, what is often desired is its own version of high reliability: water, water, water—reliable water for urban use, for agricultural use, for ecosystem rehabilitation and the environment; for ports, for shipping lanes, for recreation, for hydropower, for…you name it, water is needed for it. And a very great deal of that provision depends on large-scale water supplies, electricity supplies and other infrastructures—which is why I keep coming back to their importance.
Obviously, control room operators of large infrastructures (and not all critical infrastructures have control rooms) are from time to time pushed into unknown unknowns by contingent events. (In case it needs saying, unknown unknowns have been contingent in the myriad ways past “unobservables” have been, e.g., Epicurean atomism, phlogiston, and aether.) But it’s too easy to confuse being pushed into ignorance unwillingly with a much-valorized, albeit half-blind, trial-and-error learning that places a premium on intentionally “testing” unstudied conditions. (Not to worry when being “fully creative, imaginative, and inspired” means being bereft of vigilance and self-reflection at the same time!)
For control operators, real time is too important to experiment in when their first error could end up being our final system trial. The last thing we want is our airplane pilots “embracing failure” mid-flight, notwithstanding all those anodyne business and management articles on the virtues of error, failure and unstudied conditions. Too much of that privileging borders on modern-day priestcraft, miracle-mongering, and the criminal.