Existential risks

From H+Pedia

An existential risk or existential threat is a potential development that could drastically (or even totally) reduce the capabilities of humankind. According to the Global Challenges Foundation, a typical person could be five times more likely to die in a mass extinction event than in a car crash.[1] The Global Priorities Project later issued a retraction of the statement.[2]

Dystopia and Risk map

Map created by Deku-shrub from the data below.


Key risks and scenarios

The following list of risks encompasses long-term dystopian scenarios in addition to traditionally defined existential risks.



  • Nuclear war
  • Examples: Anything from a single accidental detonation or a rogue state up to the collapse of MAD and multi-national warfare
    • Causing: Accelerating warfare or environmental collapse
  • Biological warfare or another catastrophic virus
  • Examples: Nation-state attacks through to a 28 Days Later-style ultra-transmissible and deadly plague
  • Causing: Destroying humanity, accelerating warfare, natural disasters or environmental collapse
  • Cyberwarfare, terrorism, rogue states
  • Examples: Contemporary nation-state hacking, WannaCry[4]-type events, terrorist use of social media, terrorist attacks, contained biological or nuclear attacks
  • Causing: Accelerating warfare due to national existential risks[5], social collapse[6] or reactionary governments[7]


  • Accelerating climate change
  • Examples: Already happening :(
    • Causing: Further natural disasters, depletion of the world's resources or driving dangerous bio or geo engineering projects
  • Errant geo-engineering or GMOs
    • Examples: Solar radiation management, carbon dioxide removal projects
    • Causing: Depletion of resources, further environmental disasters, nano-pollution or environmental collapse
  • Natural disasters
    • Examples: Asteroid strike, deadly cosmic radiation, ice shelf collapse, bee extinction, crop plagues
    • Causing: Resource depletion, social or environmental collapse
  • Resource exhaustion
    • Examples: Fossil fuels, clean water, serious air quality impact, rare earths, localised overpopulation
  • Causing: Social or environmental collapse
  • Nano pollution
    • Examples: Gray goo
  • Causing: Destruction of humanity, natural disasters or environmental collapse

Future computing and intelligence



Stances

Stances that can be adopted towards existential risks include:

  • Inevitabilism - the view that, for example, "the victory of transhumanism is inevitable"
  • Precautionary - emphasising the downsides of action in areas of relative ignorance
  • Proactionary - emphasising the dangers of inaction in areas of relative ignorance
  • Techno-optimism - the view that technological development is likely to find good solutions to all existential risks
  • Techno-progressive - the view that existential risks must be actively studied and managed[disputed]

Things that are not existential risks

  • Declining sperm count in men, whilst problematic for fertility, will not spell the end of humanity[10]
  • Increasingly liberal sexual attitudes will not lead to a Sodom and Gomorrah[11] scenario
  • The coming of a religious apocalypse or end times is unlikely
  • Sex robots, no matter what Futurama says
  • Overpopulation caused by life extension will not lead to Soylent Green-type scenarios, though it could potentially cause localised instabilities and resource depletion[12]

References

  1. Human Extinction Isn't That Unlikely
  2. Errata to Global Catastrophic Risks 2016
  3. https://en.wikipedia.org/wiki/Knowing_(film)
  4. https://en.wikipedia.org/wiki/WannaCry_ransomware_attack
  5. e.g. Russian interference in the 2016 United States elections
  6. e.g. spread of Islamism
  7. e.g. Theresa May's mass surveillance programmes
  8. https://en.wikipedia.org/wiki/Khan_Noonien_Singh
  9. https://en.wikipedia.org/wiki/Dune:_The_Butlerian_Jihad
  10. http://www.bbc.co.uk/news/health-40719743
  11. https://en.wikipedia.org/wiki/Sodom_and_Gomorrah
  12. Deconstructing overpopulation for life extensionists