Existential risk

Existential risks are risks that could cause human extinction or permanently curtail humanity's potential.

Very low-probability

 * 1) killer natural virus
 * 2) alien invasion
 * 3) asteroid impact
 * 4) simulation shuts down
 * 5) gamma ray burst
 * 6) supervolcano eruption
 * 7) black hole impact

Wouldn’t kill everyone, but still worth preventing

 * 1) nuclear holocaust
 * 2) runaway climate change
 * 3) repressive global dictatorship

The really important ones

 * 1) superintelligence (not just AI, but superhumans too)
 * 2) deliberate misuse of nanotech (arms race, nanoweapons)
 * 3) accidental misuse of nanotech
 * 4) killer artificial virus
 * 5) antimatter holocaust?
 * 6) particle accelerator disaster

Ways to counteract

 * 1) friendly superintelligence
 * 2) nanofactory restrictions
 * 3) universal sousveillance
 * 4) ocean habitat
 * 5) subterranean habitat
 * 6) antarctic habitat
 * 7) space habitat

"Grid of risk"
Source: Comprehensive List of Existential Risks, Part 2 by Michael Anissimov