“Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”

More than a year ago, the Center for AI Safety issued a joint statement consisting of the single sentence quoted above. Famously, more than 350 AI experts and public figures signed it.

Now, of course, we cannot dismiss the actual and potential harms of artificial intelligence.

But, just as clearly, these 350 signatories must be among the last people on Earth you’d turn to for pandemic and nuclear-war scenarios detailed enough to appraise their AI crisis scenarios against.
