More on “control”

–Like the poverty premium (where poor people have to pay more for key services, such as insurance, credit, energy, shelter), people seeking full control of uncertain task environments pay a “control premium”: Control strategies cost them—and us—more than would be the case were they able to cope ahead or manage the uncertainty.

When their control excesses make the lives of others difficult or worse, this isn’t an externality to be corrected by taxing them or having the rest of us bribe them to become better uncertainty managers. Instead, their controlling behavior shifts the costs onto us. They might as well be demanding money from us with menaces.

–Here’s a different analogy to reinforce the point. Compare algorithmic decision-making (ADM) with the current gene-editing technology known by the acronym CRISPR. When it comes to ADM, the worry is that we don’t know how the algorithm works: what, we ask, is happening because of the cultural biases imported into the algorithm via its original data? As for CRISPR, the worry is that, even when we know this rather than that gene is being edited, we’re still not sure it’s the right thing to do.

Suppose we had a CRISPR for ADM: we could go into the algorithm and excise cultural bias. But even then we’d worry that, e.g., what is bias to some is not bias to others. For that matter, is there any doubt that a new mechanism promising greater control over one worry will produce another worry, equally if not more important?

The short and not-too-sweet is: Control cannot answer the questions control poses.

–So what? It’s hard to believe, for example, that all the talk about artificial intelligence (AI) “controlling” behavior will not need to be far more nuanced and contextualized, when it comes to really-existing policy and management implications.

Consider underwater oil and gas exploration. Alarms produced by autonomous systems have often turned out to be false alarms triggered in turbulent seas. Indeed, operating at a higher level of autonomy while having to cope with indiscriminate false alarms may no longer permit real-time operators to revert, just in time, to lower levels of autonomy, e.g., managing via more manual operations, as and when nothing else works in the context under consideration.
