Introduces the logic behind the Annihilation Index

Not If, But When: The Case for an Extinction Framework

April 22, 2025 · 2 min read

“With artificial intelligence, we are summoning the demon.”
Elon Musk, MIT AeroAstro Centennial Symposium, 2014

The conversation around AI risks usually falls into two camps: overconfidence and overreaction. Either people believe the fears are overblown, or they believe the end is already inevitable.

What’s missing from most of those conversations isn’t more opinion—it’s structure.

If you believe AI presents real existential risk, then you don’t just need to feel it. You need to track it.

That’s why I created the Annihilation Index: a simple but sobering framework that outlines five ways AI could realistically lead to human extinction. Not science fiction. Not paranoia. Just five plausible trajectories we can observe, model, and respond to—before it’s too late.


From Gut Instinct to Global Strategy

In 2023, a statement signed by over 350 top AI researchers—including leaders from OpenAI, DeepMind, and Anthropic—read:
“Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”

The experts agree: extinction is on the table. And yet we don’t discuss it with the urgency, structure, or planning we apply to other high-stakes threats. If a new virus had a 5% chance of ending humanity, we’d be building task forces, stockpiling supplies, and running drills. With AI, most people are still asking whether we’re overreacting.

That’s a dangerous delay.


Why a Framework Matters

Without a framework, everything feels abstract—distant and vague. But when you break down the risks into five clear threat categories—like Mind Hack or Autonomous Annihilation—you can finally ask real questions.

Which threats are closest to becoming real? What early warning signs should we be watching? Which solutions target which threats? And maybe most importantly: which risks are getting worse with every new breakthrough?

The Annihilation Index gives us a lens—not to fear the future, but to face it.


Clarity Doesn’t Mean Certainty

Creating a framework doesn’t mean pretending we know exactly what will happen. It means building mental models to navigate what might. It means upgrading from vague concern to strategic awareness.

Because extinction isn’t just about evil robots or Skynet. It’s about a hundred invisible shifts that gradually remove human control. If we don’t define what those shifts look like, we won’t recognize them when they arrive. And they’re already arriving.

We don’t need to panic. But we do need to plan.

Because at this point, the question isn’t if AI will reshape our world.

The question is whether we’ll survive what we built.


📬 Subscribe to receive our newsletter with exclusive insights at:
https://annihilationindex.com/newsletter

Marty Suidgeest is a futurist, public speaker, and founder of the Annihilation Index—a bold framework for understanding the existential threats posed by artificial intelligence. With a background in storytelling, strategy, and systems thinking, Marty blends technical insight with human values to challenge assumptions and ignite global conversations. He’s on a mission to ensure that AI serves humanity—not the other way around.

When he’s not writing or speaking about the future of AI, Marty’s helping leaders craft meaningful narratives, building ethical tech solutions, or exploring what it means to live with intention in a rapidly changing world.

