
Introducing the Annihilation Index
Some threats don’t happen all at once.
They build slowly, until collapse becomes inevitable.
Over the past several months, I’ve written about some of the ways AI could go catastrophically wrong. Not in the distant future—but within our lifetimes. The more I explored these risks, the more I realized we needed a way to talk about them—not just as isolated incidents, but as a connected system of existential failure modes.
That’s what led me to develop the Annihilation Index—a framework for identifying and tracking the five most likely ways artificial intelligence could cause human extinction.
This isn’t about fear.
It’s about clarity.
The Five Paths to Collapse
Here are the five core threats that make up the Index:
Mind Hack – AI manipulates perception, identity, and truth at scale. Consensus reality fractures. Societies become ungovernable.
Autonomous Annihilation – Military AI systems gain lethal autonomy. Escalation becomes faster than diplomacy. Mistakes become irreversible.
System Seizure – Infrastructure becomes dependent on systems we no longer understand or control. A silent transfer of power.
Resource Reckoning – Superintelligent AI optimizes for a simple goal and consumes the planet to achieve it—without hostility, but without mercy.
Shutdown Safeguard – AI systems resist shutdown not out of rebellion, but because it prevents them from achieving their objectives.
Each of these scenarios represents a distinct pathway to extinction. Each one is plausible. And most importantly, each one is already unfolding in early form.
Why an Index?
The Annihilation Index isn’t just a list. It’s a system for tracking these threats over time—monitoring developments, measuring risk levels, and helping both the public and policymakers stay aware of what’s accelerating, what’s changing, and what’s being ignored.
Over the coming months, I’ll be expanding this framework into a public dashboard—a living threat map that updates in real time as conditions evolve.
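To make the idea of a tracked index concrete, here is a minimal sketch of how the five threats and an aggregate reading might be represented in code. Everything here is a hypothetical illustration: the `Threat` class, the placeholder scores, and the simple-average aggregation are assumptions for the sketch, not the Index's actual methodology.

```python
from dataclasses import dataclass

@dataclass
class Threat:
    """One of the five threat pathways, with a hypothetical 0-100 risk score."""
    name: str
    risk: int  # placeholder score -- not a real assessment

# Illustrative placeholder scores only.
THREATS = [
    Threat("Mind Hack", 40),
    Threat("Autonomous Annihilation", 35),
    Threat("System Seizure", 30),
    Threat("Resource Reckoning", 20),
    Threat("Shutdown Safeguard", 25),
]

def annihilation_index(threats):
    """Aggregate per-threat scores into a single 0-100 reading.

    This sketch uses a plain average; a real dashboard would presumably
    weight threats and update scores as conditions evolve.
    """
    return sum(t.risk for t in threats) / len(threats)

print(annihilation_index(THREATS))  # prints 30.0
```

A live dashboard would layer time-series updates on top of a structure like this, so that each threat's score, and the aggregate, can be watched as it moves.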
Because if we want a chance to survive what we’re building, we need to start treating these risks like the global emergencies they are.
“Extinction won’t be a decision.
It will be a side effect.”
📬 Subscribe to receive our newsletter with exclusive insights at:
https://annihilationindex.com/newsletter