An existential risk is a possible future event or scenario in which Earth-originating intelligent life goes extinct or has its potential for desirable future development drastically and permanently curtailed.[1] Existential risks are a particularly severe subset of global catastrophic risks.

Some potential sources of risk

Probability of an existential catastrophe

The following are examples of individuals and institutions that have made probability estimates for existential events. Some risks, such as that of an asteroid impact, which carries roughly a one-in-a-million chance of causing humanity's extinction in the next century,[3] have had their probabilities estimated with considerable precision (though some scholars claim the actual rate of large impacts could be much higher than originally calculated).[4] Similarly, the frequency of volcanic eruptions large enough to cause catastrophic climate change, such as the Toba eruption, which may have nearly caused the extinction of the human race,[5] has been estimated at about one per 50,000 years.[6] The relative danger posed by other threats is much more difficult to calculate. In 2008, a group of "experts on different global catastrophic risks" at the Global Catastrophic Risk Conference at the University of Oxford suggested a 19% chance of human extinction over the next century. However, the conference report cautions that the methods used to average responses to the informal survey are suspect because of their treatment of non-responses.[citation needed] The probabilities estimated for various causes are summarized below.

Risk | Probability of human extinction before 2100
Molecular nanotechnology weapons | 5%
Superintelligent AI | 5%
Wars | 4%
Engineered pandemic | 2%
Nuclear war | 1%
Nanotechnology accident | 0.5%
Natural pandemic | 0.05%
Nuclear terrorism | 0.03%
Table source: Future of Humanity Institute, 2008.[7]
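
The per-century figures above can be related to the recurrence frequencies quoted earlier. The following is a minimal illustrative sketch, not drawn from the cited sources: it assumes each natural hazard follows a constant-rate Poisson process, so that the chance of at least one event in a century with mean recurrence interval T years is 1 - exp(-100/T).

    import math

    def per_century_probability(mean_interval_years: float) -> float:
        """Chance of at least one event in 100 years, assuming a constant-rate Poisson process."""
        return 1.0 - math.exp(-100.0 / mean_interval_years)

    # Toba-scale super-eruption: roughly one per 50,000 years (estimate cited above)
    print(f"Super-eruption within a century: {per_century_probability(50_000):.2%}")  # about 0.20%
    # Extinction-level asteroid impact: roughly one in a million per century (cited above)
    print(f"Asteroid-driven extinction within a century: {1e-6:.4%}")

The eruption figure concerns how often such events occur, not the conditional probability that an occurrence actually causes extinction; the survey entries in the table reflect both considerations.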

During the Cuban Missile Crisis, then-President John F. Kennedy estimated that the probability of nuclear war was between one in three and one in two.[clarification needed]

There are significant methodological challenges in estimating these risks with precision. Most attention has been given to risks to human civilization over the next 100 years, but forecasting over even this length of time is difficult. The types of threats posed by nature may prove relatively constant, though new risks could be discovered. Anthropogenic threats, however, are likely to change dramatically with the development of new technology; while volcanoes have been a threat throughout history, nuclear weapons have only been an issue since the 20th century. Historically, the ability of experts to predict the future over these timescales has proved very limited. Man-made threats such as nuclear war or nanotechnology are harder to predict than natural threats, owing to the inherent methodological difficulties of the social sciences. In general, it is hard to estimate the magnitude of the risk from these or other dangers, especially as both international relations and technology can change rapidly.

Existential risks pose unique challenges to prediction, even more than other long-term events, because of observation selection effects. Unlike most events, the failure of a complete extinction event to occur in the past is not evidence against its likelihood in the future: every world that has experienced such an event has no observers, so regardless of their frequency, no civilization observes existential risks in its history.[8] These anthropic issues can be avoided by looking at evidence that does not have such selection effects, such as asteroid impact craters on the Moon, or by directly evaluating the likely impact of new technology.
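
The effect of such selection bias can be illustrated with a small simulation. This is an illustrative sketch rather than material from the cited sources; the risk levels, number of simulated worlds, and time horizon are arbitrary assumptions.

    import random

    def surviving_worlds(risk_per_century: float, centuries: int = 600, worlds: int = 10_000) -> int:
        """Count simulated worlds whose observers survive every century."""
        survivors = 0
        for _ in range(worlds):
            if all(random.random() > risk_per_century for _ in range(centuries)):
                survivors += 1
        return survivors

    for p in (0.0001, 0.001, 0.003):
        n = surviving_worlds(p)
        # Every surviving observer's history contains zero extinctions, whatever p is,
        # so the survivors' past record cannot distinguish these risk levels.
        print(f"true per-century risk {p}: {n} of 10,000 worlds still have observers")

Worlds with higher risk simply produce fewer surviving observers; none of the survivors sees evidence of the risk in its own past, which is why historical frequency is a misleading guide for this class of events.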

Fermi paradox

Given the frequency of extra-solar planets, the speed with which the Earth spawned life, and the size of the observable universe, it may seem likely that life would have independently arisen on other planets. However, our planet has not been colonized by aliens, and we cannot see any large-scale astro-engineering projects anywhere in the known Universe; this is known as the Fermi paradox. One proposed, but not widely accepted, explanation for this paradox is that many planets develop civilizations similar to ours, but almost none of these civilizations survive to colonize space.[8][9]

Moral importance of existential risk

Some scholars strongly favor reducing existential risk on the grounds that it greatly benefits future generations. Derek Parfit argues that extinction would be a great loss because our descendants could potentially survive for a billion years before the expansion of the Sun makes the Earth uninhabitable.[10] Nick Bostrom argues that there is even greater potential in colonizing space: if future humans colonize space, they may be able to support a very large number of people on other planets, potentially lasting for trillions of years.[11] Therefore, reducing existential risk by even a small amount would have a very significant impact on the expected number of people who will exist in the future.
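
The underlying expected-value arithmetic can be made concrete with a toy calculation. In the sketch below, the figure for the number of future people is a hypothetical placeholder rather than an estimate from the cited sources; only the 19% figure comes from the survey quoted earlier.

    # Illustrative arithmetic only; N_FUTURE and the 1-point reduction are assumptions.
    N_FUTURE = 1e16          # assumed number of future people if humanity survives
    baseline_risk = 0.19     # 19% chance of extinction this century (survey figure above)
    reduction = 0.01         # a one-percentage-point reduction in that risk

    expected_baseline = (1 - baseline_risk) * N_FUTURE
    expected_reduced = (1 - baseline_risk + reduction) * N_FUTURE
    print(f"expected additional future people: {expected_reduced - expected_baseline:.1e}")
    # The gain equals reduction * N_FUTURE, i.e. 1e14 people under these assumptions,
    # which is why even small reductions in risk carry large expected value.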

Little has been written arguing against these positions, though some scholars would disagree; for example, exponential discounting might make these future benefits much less significant.[citation needed]
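
The discounting objection rests on a standard formula: a benefit of value V realized t years from now is worth V / (1 + r)^t today at annual discount rate r. The sketch below uses an assumed 3% rate and arbitrary horizons for illustration.

    def present_value(future_value: float, annual_rate: float, years: int) -> float:
        """Exponential discounting: PV = FV / (1 + r)^t."""
        return future_value / (1.0 + annual_rate) ** years

    for years in (100, 1_000, 10_000):
        pv = present_value(1.0, 0.03, years)
        print(f"present value of 1 unit of benefit {years:>6} years out at 3%: {pv:.3e}")
    # At any positive rate, benefits more than a few centuries away contribute
    # almost nothing to present value, which is the crux of this objection.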

Some economists have discussed the importance of global catastrophic risks, though not existential risks specifically. Martin Weitzman argues that most of the expected economic damage from climate change may come from the small chance that warming greatly exceeds mid-range expectations, resulting in catastrophic damage.[12] Richard Posner has argued that humanity is doing far too little, in general, about small, hard-to-estimate risks of large-scale catastrophes.[13]
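
Weitzman's point, that expected damage can be dominated by low-probability extreme outcomes, can be illustrated with a toy Monte Carlo calculation. The warming distribution and damage function below are arbitrary illustrative assumptions, not Weitzman's model.

    import random

    random.seed(0)
    samples = 100_000
    # Heavy right tail in warming (degrees C), and damages convex in warming (fraction of output).
    warming = [random.lognormvariate(1.0, 0.5) for _ in range(samples)]
    damages = sorted(0.001 * t ** 3 for t in warming)

    mean_damage = sum(damages) / samples
    tail_share = sum(damages[int(0.95 * samples):]) / sum(damages)  # share contributed by the worst 5%
    print(f"mean damage: {mean_damage:.3f}; share from the worst 5% of outcomes: {tail_share:.0%}")

Under these assumptions a disproportionately large share of the expected damage comes from the worst 5% of outcomes, even though such outcomes are individually unlikely.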

Scope insensitivity influences how bad people consider the extinction of the human race to be. For example, when people are motivated to donate money to altruistic causes, the quantity they’re willing to give does not increase linearly with the magnitude of the issue: people are as concerned about 200,000 birds getting stuck in oil as they are about 2,000.[14] Similarly, people are often more concerned about threats to individuals than to larger groups.[15]


Existential risk reduction organizations

References

  1. ^ Nick Bostrom; Milan M. Cirkovic (29 September 2011). Global Catastrophic Risks. Oxford University Press. ISBN 978-0-19-960650-4.
  2. ^ Nick Bostrom; Milan M. Cirkovic (29 September 2011). Global Catastrophic Risks. Oxford University Press. pp. 504–. ISBN 978-0-19-960650-4.
  3. ^ Matheny, Jason Gaverick (2007). "Reducing the Risk of Human Extinction". Risk Analysis. 27 (5).
  4. ^ Asher, D.J.; Bailey, M.E.; Emel’yanenko, V.; Napier, W.M. (2005). "Earth in the cosmic shooting gallery". The Observatory. 125: 319–322.
  5. ^ Ambrose 1998; Rampino & Ambrose 2000, pp. 71, 80.
  6. ^ Rampino, M.R.; Ambrose, S.H. (2002). "Super eruptions as a threat to civilizations on Earth-like planets". Icarus. 156: 562–569.
  7. ^ Future of Humanity Institute (2008). Global Catastrophic Risks Survey. Technical report.
  8. ^ a b Cirkovic, Milan (2008). Observation Selection Effects and Global Catastrophic Risks.
  9. ^ Ventrudo, Brian (5 June 2009). "So Where Is ET, Anyway?". Universe Today. Retrieved 10 March 2014. Some believe [the Fermi Paradox] means advanced extraterrestrial societies are rare or nonexistent. Others suggest they must destroy themselves before they move on to the stars.
  10. ^ Parfit, Derek (1984). Reasons and Persons. Oxford University Press. pp. 453–454.
  11. ^ Bostrom, Nick. "Astronomical Waste: The opportunity cost of delayed technological development". Utilitas. 15 (3): 308–314.
  12. ^ Weitzman, Martin (2009). "On modeling and interpreting the economics of catastrophic climate change". The Review of Economics and Statistics. 91 (1): 1–19.
  13. ^ Posner, Richard (2004). Catastrophe: risk and response. Oxford University Press.
  14. ^ Desvousges, W.H.; Johnson, F.R.; Dunford, R.W.; Boyle, K.J.; Hudson, S.P.; Wilson, N. (1993). "Measuring natural resource damages with contingent valuation: tests of validity and reliability". In Hausman, J.A. (ed.), Contingent Valuation: A Critical Assessment, pp. 91–159. Amsterdam: North Holland.
  15. ^ Yudkowsky, Eliezer (2008). Cognitive Biases Potentially Affecting Judgments of Global Risks.
  16. ^ The Lifeboat Foundation
  17. ^ "About the Lifeboat Foundation". The Lifeboat Foundation. Retrieved 26 April 2013.