Talk:Global catastrophic risk/Archive 2

obesity

Do you think obesity would be something plausible to add to the list? Sancro 02:40, 20 October 2006 (UTC)

Great one! I am in favour.--Daanschr 09:03, 20 October 2006 (UTC)
Certainly greed and gluttony aided the collapse of the Roman Empire, so I can't see why our current civilization would be any different. However, prior to making any edits I would suggest that the above discussion is resolved, as there is a debate whether entries such as this should be added to this article or societal collapse. nirvana2013 17:30, 11 November 2006 (UTC)
So long as you have a source to show that society is at risk of a future collapse from obesity (and I doubt one exists). If you want to show that past societies failed for that reason, and have a source, that would go in societal collapse. -- Stbalbach 18:41, 11 November 2006 (UTC)

Misunderstanding of the nature of the problem

The end of civilisation is completely misunderstood by this article. Civilisations are complex, unstable societies and are highly prone to collapse. There have been between 20 and 40 civilisations on planet Earth to date, and every one except the modern Western globalisation has either been absorbed by its neighbours or collapsed - so the end of civilisations is a regular occurrence. Civilisation as a form of human society only came into existence about 5,000 years ago, and for more than 95% of the period of modern human existence on Earth there were no civilisations. As a transitory feature, we will eventually move to a post-civilised condition as different from civilisation as pre-civilised communities were. Why doesn't this article really talk about the end of civilisation instead of all these catastrophist scenarios about cometary impacts and the like? John D. Croft 10:58, 8 November 2006 (UTC)

Well, we are discussing the naming issue above. We already have an article that discusses societal collapse. This article is representative of the existing literature out there that discusses "doomsday" events (however you define and name that). -- Stbalbach 15:05, 10 November 2006 (UTC)
Absolutely John, I agree with you 100 percent. The article title and content are completely at odds with each other. However, following Stbalbach's call for consensus to be reached, he has unfortunately been unwilling to engage in the process of finding consensus. He is currently a lone voice in asking for no changes. nirvana2013 18:41, 13 November 2006 (UTC)
For those of you interested in exploring that topic, please see the stub "Fall of a civilisation". John D. Croft 10:27, 9 January 2007 (UTC)

Dysgenics

I removed the following, citing WP:V:

  • Dysgenics. A lack of natural selection and the tendency of the more intelligent to have fewer children would lower the average health and intelligence enough to lead to an eventual collapse of civilization, associated with eugenics theories.<ref name="dysgenics">[[Richard Lynn]], ''Dysgenics: Genetic Deterioration in Modern Populations'' (1996). Cf. Colum Gillfallen (1970), "Roman Culture and Dysgenic Lead Poisoning" in: ''The Fall of Rome: Can it be Explained?''.</ref>

The citation Dysgenics: Genetic Deterioration in Modern Populations, Richard Lynn, Praeger Publishers, 1996, is not a reliable source, in my opinion. WP:V is Wikipedia policy and states, "'Verifiable' in this context means that any reader should be able to check that material added to Wikipedia has already been published by a reliable source." This book is not available from most libraries, even major university libraries in English-speaking countries, such as Columbia, Ohio State, University of Texas at Austin, University of Washington, Winnipeg, University of Oxford, Exeter and University of Glasgow (please see ISBN 0275949176). Also, it is not stocked by Amazon. [1] (I don't think that volunteer editors should have to purchase references to verify content, but I include the link for completeness.) This book is not available to any reader and does not satisfy WP:V. This title is absent from university libraries because its topic is not one of serious academic scholarship or research. Please see related discussions at Talk:Dysgenics and Talk:Richard Lynn. Walter Siegmund (talk) 15:13, 10 November 2006 (UTC)

It does seem to be a rare book, probably in part because it cost $60 for 237 pages when it came out - which is typical for academic books on very specialized subjects with small print runs. Here is a review of the book which appeared in the Journal of Social, Political, and Economic Studies, Volume 23, Number 2, Summer 1998. Here is another review from Population and Development Review, Vol. 23, No. 3 (Sep., 1997), pp. 664-666 -- I think the fact it was reviewed in academic journals says something. Richard Lynn is of the University of Ulster in Northern Ireland, so he's not a self-publisher and has academic credentials. It is a controversial view on a controversial topic, but that's OK; we can talk about controversial views. Also, the book is available on Amazon; although currently out of print, it could be purchased on the used market, and Lynn seems to be a fairly well published author with many recent titles to his name. I really don't see why this is an unverifiable source; there are many sources used on Wikipedia that are even more difficult to come by (19th century books). BTW I'm not trying to defend Lynn; there are other ways to deal with the problem of non-mainstream views without removing his points of view entirely. Also, the linked journal reviews above are pretty detailed and provide verifiability for the basic sentence written here. If you want, we can replace the link to the book with a link to the journal reviews. -- Stbalbach 16:11, 10 November 2006 (UTC)
I'll place those links in the dysgenics article when I have time. The main issue here is that User:MONGO and User:Wsiegmund are trying to delete the dysgenics article, and in the process are deleting all links leading to dysgenics and trying to discredit and remove sources so they can strip the dysgenics article of any useful information. --Zero g 16:21, 10 November 2006 (UTC)
Hmm.. if true, that's not good. Not that I agree with dysgenics. Verifiability, Not Truth.. -- Stbalbach 16:40, 10 November 2006 (UTC)
Chris Brand is notable for being fired by Edinburgh University in 1997 for conduct that had allegedly "brought the university into disrepute." According to the web link, a personal web site not meeting the minimal standards of WP:V, the book review was first published in the Internet magazine PINC (Politically Incorrect).[2] The editorial policy of PINC is, in part, "Material that is published here is published because the editors consider it interesting and worthy of debate, not because they believe it to be 'correct' or 'virtuous'."[3]
I agree that WP:V is Wikipedia policy and should govern our work here. Also, I agree that it is easy to find examples where this policy has not been followed. My opinion is that dysgenics may be included in the article if the sentence "This theory has no scientific basis." is added. This is stating the obvious, that no citations of peer-reviewed scientific literature have been provided.
Zero g misrepresents my efforts on Dysgenics and has tried to personalize our differences. I try to encourage my fellow editors to follow Wikipedia policies such as WP:V, WP:NOT and WP:NOR. Also, I don't know why I shouldn't remove links to Dysgenics from scientific articles, when Dysgenics has no scientific basis. --Walter Siegmund (talk) 19:32, 11 November 2006 (UTC)
Actually when I first saw the dysgenics entry appear, I'm pretty sure I added a disclaimer, but it appears to have been deleted along the way. You can add it back, but I'm not sure how to ensure its "stickiness" so someone doesn't delete it in the future on POV grounds. -- Stbalbach 19:40, 11 November 2006 (UTC)
If the others concur, we can restore it with a disclaimer. By the way, cf. in the text that I moved to this talk page (above) is used to refer the reader to a work that is cited more completely nearby. In this case, the full citation should be given since it is the only reference to the work and "cf." should not appear. Walter Siegmund (talk) 19:51, 11 November 2006 (UTC)
I think a disclaimer would be POV pushing because other theories aren't treated in the same manner. After all, dysgenics has a better foundation in literature than most of the theories presented in this article.
Regarding my misrepresentation, I made the following observations:
  • Walter Siegmund asked for help on the talk pages of several people sharing his POV before any attempt at discussion on the talk page of the dysgenics article.
  • The arguments for their case are very poor, mostly consisting of the demonisation of authors, claims that sources do not meet standards, and a seeming inability to quote sources (which could be included in the article) that back up their POV.
  • The unwillingness to improve the actual article.
  • The attempt to delete every single link on Wikipedia leading to the dysgenics article, as well as the way MONGO went about it, leaving several pages with grammatically incorrect lines and the odd argument 'no science' in each edit summary.
I think it's important for people to take these issues into account, and not decide to violate Wikipedia's neutrality for the sake of avoiding a nasty edit war. --Zero g 18:43, 12 November 2006 (UTC)
When something is controversial and non-mainstream, particularly on sensitive topics like racial issues (such as scientific racism), a disclaimer is very appropriate. This is also a middle-ground compromise - I would hope you would recognize that this is a non-mainstream, controversial view? -- Stbalbach 22:15, 12 November 2006 (UTC)
Evolution is a non-mainstream controversial view as well, disbelieved by over a billion Muslims and Christians. One problem I have here is that Siegmund and Co seem unable to find sources that disprove the dysgenic theory. I'm actually not sure what exact ideology is causing them to push their POV. Besides, dysgenics fits perfectly into the mainstream evolutionary theory, which is another thing that puzzles me. I was actually of the opinion there is broad acceptance of the fact that genetic deterioration kicks in the moment selective pressure vanishes. I feel like I'm discussing evolution with my Christian neighbor; please explain to me how exactly this entire argument isn't ridiculous? --Zero g 10:38, 13 November 2006 (UTC)
Evolution is mainstream within the scientific community, which is what we are talking about on Wikipedia, not what the man on the street in New Delhi thinks (unreliable source). What is "genetic deterioration"? That sounds like a social value judgment. Also, selective pressures never "vanish", they just change; another value judgment. BTW I am not Christian and believe evolution is accurate. -- Stbalbach 16:13, 13 November 2006 (UTC)
No, these are simple logical value judgments. I understand where you are coming from, but not everyone, including scientists, is a supporter of relativism. So let's make this clear: some people believe value judgments are just, others don't, and leave it at that.
Assuming you have the intellectual capability to think outside of your relativistic state of mind, you should understand the point of view that dysgenics is a logical concept.
As an interesting side note, when I choose to examine this problem from a relativistic approach, I don't see why a relativist would bother to edit Wikipedia to begin with. Care to enlighten me? --Zero g 18:55, 14 November 2006 (UTC)
I assume when you say "relativism" you are talking about Moral relativism? That certainly would be an interesting discussion which many people have had throughout history and probably will forever, but in the context of Wikipedia, you have not shown there is a mainstream acceptance among scientists of the idea of "genetic deterioration" as being anything but an opinion. -- Stbalbach 13:55, 15 November 2006 (UTC)

Dysgenics - the whole paragraph is ridiculous, please can somebody remove this. —Preceding unsigned comment added by 84.74.154.198 (talk) 12:33, 9 August 2008 (UTC)

Done.--Ramdrake (talk) 12:54, 9 August 2008 (UTC)

I would strongly object to the bit about dysgenics being included, at least if there isn't a disclaimer saying that this theory has no mainstream scientific endorsement.--Ramdrake (talk) 17:50, 16 March 2009 (UTC)

It's POV pushing, pure and simple, and not the first time for Verwoerd. I've warned him again. --Ckatzchatspy 21:02, 16 March 2009 (UTC)

NASA quarantine

Kwarizmi placed a cite request on end of civilization for the quarantine claim, and I was wondering what he would like to do. I don't know who wrote it, but I tried to find a source. I looked at The Quarantine and Certification of Martian Samples which contains a history and analysis of past techniques, and yet, nothing definitive except for proposals. Should I just remove the text to talk? —Viriditas | Talk 05:52, 13 January 2007 (UTC)

Looking at End_of_civilization and the section in question, it may appear to the reader that NASA agrees with the position that extraterrestrial life poses a danger to humanity and for this reason all materials returning from space are sterilized.
  • I'm not so sure that all materials returning from space are sterilized, which is why I put the [citation needed] there. Sterilization (in the biological sense) implies some rather harsh treatment using temperature and/or radiation which I'm not certain NASA would perform upon the hypervaluable space-stuff as a matter of course.
  • I'm also not liking the way the paragraph reads in toto, since it implies that NASA acknowledges the risk of extraterrestrial life and sterilizes as a defense mechanism.

Kwarizmi 21:39, 13 January 2007 (UTC)

Article needs a historical context

This article could really benefit from some summaries of world regions whose civilisation has been largely lost. The Mayans would be an excellent example. Besides adding breadth, this treatment would give the article a sense of realism and remove it one step from some of the low-probability cosmic risks that undermine the importance of the subject. Anlace 19:13, 14 January 2007 (UTC)

This is a future studies article. Historical events are covered in societal collapse. -- Stbalbach 22:38, 14 January 2007 (UTC)

Important name change occurred without consensus

An editor changed the name of End of Civilization without consensus. I have no objection to creating a new article under the name chosen, and I have no objection to using much of the text from End of civilization; however, there is a need for a separate article on End of civilization, since that is a distinct and separate notion from the extinction of all planetary biota (and in fact much more likely), and also distinct from the destruction of the planetary mass integrity. I propose we restore the End of civilization article and refine it to focus on scenarios that involve civilisation's end; it can be a sub-article to the Existential risks to civilization, humans and the planet. regards Anlace 16:35, 18 January 2007 (UTC)

Check out the article Fall of a civilisation. nirvana2013 16:32, 21 January 2007 (UTC)

See protracted discussions above; the move was made to address previous confusion over the article title. No need to re-create those problems with YADA (yet another doomsday article). I'm starting to lose count (still waiting for End of mammals). -- Stbalbach 18:11, 21 January 2007 (UTC)

RS

In some of the inline comments in the article text, Discover magazine is being used as an RS. I quote the lead paragraph of the WP article on it: "Discover is a science magazine that publishes articles about science for a general audience. The monthly magazine was launched in October 1980 by Time Inc. It was later sold to the Walt Disney Company in 1991, but in October 2005 was sold again to Bob Guccione Jr., founder of Spin and Gear magazines and son of Penthouse founder Bob Guccione." DGG 22:15, 3 March 2007 (UTC)

Telomere Length Degradation

This is blatantly false and should be removed. This is not occurring, and is so fringe most people have never even heard of it. 129.59.52.135 04:45, 24 March 2007 (UTC)

Verifiability, Not Truth. -- Stbalbach 01:17, 25 March 2007 (UTC)

Order Listing of "Future scenarios"

Because this is an emotionally charged topic that many people look quickly to discredit, wouldn't it be more effective as a learning tool to list "Future scenarios" in order of most likely, such as "Human" causes first, rather than "Cosmic" causes? - Steve3849(talk) 23:27, 30 March 2007 (UTC)

Systemic collapse

Why does this article make no mention of the possible threat to human civilisation (though not particularly the species itself) of a staggered, widespread collapse of communication and utility systems, as described, for example, by Roberto Vacca in 'The Coming Dark Age'? Even if the advance of computerisation has left Vacca's predictions somewhat inaccurate (he assumed civilisation would have degraded to medieval levels by the beginning of the twenty-first century), surely our current reliance on the internet and similar networks makes such events more likely (Vacca proposed a scenario in which the widespread failure of one communication system or utility could lead to the consequent collapse of others, pushing people past the 'three hot meals' point in an escalating spiral of social collapse). Are there any futurists looking at something similar today? --KharBevNor 17:58, 7 April 2007 (UTC)

If you can find a source, then possibly. I think it would be quite unlikely that loss of the internet would lead to the collapse of humanity; if you're thinking of overreliance on technology, or technology advancing beyond the requirement to serve humanity, this is covered under technological singularity. --Jw2034 16:07, 28 October 2007 (UTC)

Contrary to popular belief, people can survive and prosper without cellphones and the internet. If mass digital communication systems break down, we would still be left with radio, television, mail, and other forms that have been around for much longer than "teh intarwebs", and civilizations throughout history did just fine without even half of that. Civilization, and indeed humanity, would not collapse - it would simply be different. DySWN (talk) 00:30, 27 September 2008 (UTC)

Vacca's proposal was more complex, and was made far before the invention of the internet or cellphones: he pointed out that much of modern society is dependent on networks of utilities that are dependent on each other. For example, a widespread loss of electrical power could result in a loss of pressure in water systems, leading to their contamination, with no power to boil the water and hospitals disabled. Breakdowns in chains of supply could also lead to cascading effects, e.g. no diesel fuel means no trains, with commuters and goods unable to get to their destinations, etc. He posited a sort of tipping point where a large enough collapse in one large economy might cause one or more networks of global trade and communication to collapse, leading to economic strife and war which would trigger further system collapses, etc. I wish I could furnish more details but I lost the book. I admit that in practice the highly global nature of the world and the seeming robustness of many utility systems make his predictions unlikely; I was simply wondering (two years ago, wow) whether the increasingly integrated nature of many of our systems and utilities (for example, medical records being stored in centralised databases rather than at individual hospitals and doctors' surgeries) could make us more vulnerable to this sort of thing, and if any modern scholars, perhaps more high-profile ones, are talking about this idea. --KharBevNor 22:44, 11 March 2009 (UTC)

Einstein bee quote

I removed the mention of this probably spurious quote, as it has also been removed from the Bee article; also see Snopes. OttoMäkelä 15:03, 8 May 2007 (UTC)

Redundant definition

This article's first line reads:

Risks to civilization, humans and planet Earth are existential risks that would imperil humankind as a whole and/or have major adverse consequences for the course of human civilization, human extinction or even the end of planet Earth.

So the reader clicks over to the article Existential risk, which begins:

In future studies, an existential risk is a risk that is both global and terminal. Nick Bostrom defines an existential risk as a risk "where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential."

By that definition, how can there be any existential risk that would not "imperil humankind as a whole and/or have major adverse consequences [etc.]"? It's as if the article is admitting up front that it covers the same content as Existential risk, which is a good sign a merge is in order. If there is a real difference between these articles' intended subjects, it should be made clearer in the intro. -- Schaefer (talk) 16:16, 16 May 2007 (UTC)

Uranium 238 Half-life

What about when the majority of Uranium 238 decays in just a short while based on its half-life which now approaches the age of Earth? Does this qualify? —Preceding unsigned comment added by Wonkipedian (talkcontribs) 16:12, 7 June 2007 (UTC)

I've removed this section; it was based on a flawed understanding of radioactive decay: just because U-238 has a half-life approaching the age of Earth does not mean it will all decay at once. Slug 03:16, 10 June 2007 (UTC)
Correct: U-238 is already decaying all around us. Solution: don't be near it or handle it, and definitely do not ingest it! :) Thank you for being bold and adding in your information, even if it has since been cut out. Sláinte! --Bossi (talkgallerycontrib) 03:29, 10 June 2007 (UTC)
I didn't suggest it would decay all at once. I'm suggesting the statistical probability increases as time approaches the half-life marker, with increasing decay as it gets closer to the half-life. Of course U-238 is decaying all the time. Any further explanation why this doesn't qualify? Wonkipedian 16:01, 10 June 2007 (UTC)
Also, where any radioactive particulate is introduced into a biosphere, such as DU in Bosnian, Afghan and Iraqi villages, there is also the matter of increased concentration of toxins through the food chain. There are many reports of ghastly birth defects already in such areas. These populated areas will likely remain inhabited. It is scientifically uncharted territory whether the uranium will disperse over time or become more concentrated through the food chain. Any references on this topic would be appreciated. - Steve3849 talk 16:58, 21 June 2007 (UTC)
Wonkipedian, that's simply not how radioactive decay works. The U-238 atoms don't have little clocks in them that measure how long it's been since they were created. They don't know what time it is, and the rate of decay will not increase "as time approaches the half-life marker". The probability of any given atom decaying before time t+1, given that it has not decayed at time t, is constant for all t (with t and 1 in whatever time units you like). Consider reading Exponential decay for more. Also, information must be attributed to a verifiable source to be included in a Wikipedia article, so even if your theory were valid it would need attribution. -- Schaefer (talk) 03:31, 19 August 2007 (UTC)
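
A short derivation of that memoryless property may help (standard exponential-decay mathematics, added here for illustration): a single atom with decay constant <math>\lambda = \ln 2 / t_{1/2}</math> is still intact at time <math>t</math> with probability <math>e^{-\lambda t}</math>, so

<math>P(\text{decays in } [t, t+\Delta t] \mid \text{intact at } t) = \frac{e^{-\lambda t} - e^{-\lambda (t+\Delta t)}}{e^{-\lambda t}} = 1 - e^{-\lambda \Delta t}</math>

which depends only on the interval length <math>\Delta t</math>, not on the elapsed time <math>t</math>: nothing special happens as a sample's age approaches its half-life.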

Telomeres

I removed this material because it seems completely implausible. All organisms with linear chromosomes have telomeres and I have never heard of any suggestion that long-lived species such as coelacanths have a problem with this. It needs a reliable source.

"Telomere : Some researchers theorize a tiny loss of telomere length from one generation to the next, mirroring the process of aging in individuals. Over thousands of generations the telomere erodes down to its critical level. Once at the critical level we would expect to see outbreaks of age-related diseases occurring earlier in life and finally a population crash;[1] however, this possibility may not result in extinction due to the self-reinforcing effects of natural selection."

As a note, I can find this hypothesis article (link), but there doesn't seem to be anything apart from a suggestion from a single scientist that it might happen. This idea seems neither plausible nor notable. Tim Vickers 19:50, 19 October 2007 (UTC)

Earth's core freezing over

I have read in an Australian science magazine that in approximately 8 billion years, as the Earth's core gradually cools down, the Earth will decline to a temperature where life would not survive. The Inner Core (which is cooler) is gradually growing larger, and the Outer Core is shrinking - in the event that the core becomes entirely solid, the Earth would no longer be able to sustain life. HOWEVER, by this time, the Sun would have already transformed into a Red Giant. Thus impossible. Benlisquare 08:30, 2 November 2007 (UTC)

The sheer amount of weight and gravity pushing material from the crust down into the core would create everlasting friction, heat, magma, etc. The core will never cool down.

-G —Preceding unsigned comment added by 70.24.148.249 (talk) 22:01, 22 June 2008 (UTC)

Essay

While well referenced, can this article be anything more than a postulative essay rather than an encyclopedic article?

Attributable statements are not enough to warrant inclusion of such a hypothetical topic--ZayZayEM (talk) 02:26, 28 November 2007 (UTC)

I agree. I did some copyediting to make it read more smoothly, but I have to stop here (no time). My hope is that someone else will pick up from here. But my general impression is that this article is quite unorganized (it mixes short-term scenarios with end-of-the-universe scenarios; scenarios based on scientific models with pure speculation; etc.). The article seems to be the baby of a few editors who cling to it. Inviting serious outside editors would be very helpful. And the article definitely needs more citations. Northfox (talk) 23:33, 26 February 2008 (UTC)

Peak oil?

The article states that oil production peaked in December of 2005, but I do not believe this is the case. At any rate, it does not cite an article that backs up the statement. I think oil production has continued to increase. 75.72.48.36 (talk) 00:39, 9 December 2007 (UTC) Jason

other Y2K style days

I've heard some expected September 9, 1999 to cause many problems (as older electronics often output 99999.. as an error code), and recall some fax machines and copiers not working properly on the day. There's also the 2038 timestamp bug. Should more information be added about these in the article? --NEMT (talk) 03:55, 7 January 2008 (UTC)

Apophis

Apophis is a strong possibility. It is worthy of this article. It is set to hit Earth on Easter Sunday of 2036. —Preceding unsigned comment added by 76.252.18.100 (talk) 04:16, 18 February 2008 (UTC)

No original research. Carl.bunderson (talk) 04:39, 18 February 2008 (UTC)

Death subsection

Isn't this section unneeded in this article? This article is about humans and risks like greenhouse global warming. Fate is of little relevance to this article, and we cover this in the future of the Solar System, Earth, and Sun. What about removing all fate subjects from this article? --Freewayguy (Webmail) 19:51, 1 April 2008 (UTC)

No. The article is about "Risks to civilization, humans and planet Earth", i.e. not only humans but also the Earth. Otolemur crassicaudatus (talk) 11:18, 8 April 2008 (UTC)

Non-global-terminal risks

I removed "Gulf Stream shutdown" and "Famine" as there's no claim that they're global terminal risks (also, the former would logically be subsumed under the "Humanity" section with global warming). (Actually, I see physics disasters, at least, are also duplicated.) If they're to remain, then other endurable risks should be included, and the "Types of risks" section amended. I wonder if this would make the article uselessly broad or not. Including global endurable risks might not, but then is Gulf Stream shutdown a global risk at all? ~~ N (t/c) 16:05, 16 May 2008 (UTC)

IPv4 address exhaustion

It is obvious that IPv4 address exhaustion, while it may be global, is certainly not a terminal risk (and the cited article doesn't support such an interpretation). I tried removing this section, but JCDenton2052 apparently thinks it belongs. Jbshaler (talk) 03:56, 27 May 2008 (UTC)

Andromeda

As the reference itself states, it is unlikely that you will see the Earth's orbit disrupted:

Calculations indicate that the Andromeda Galaxy is on a collision course with the Milky Way.[2] Andromeda is approaching at an average speed of about 140 kilometres (87 miles) per second and thus impact is predicted in about 3 billion years. This merging could eject the solar system in a more eccentric orbit and an unwanted position in the merged galaxy causing our planet to become uninhabitable, even if an actual collision does not take place.

Roadrunner (talk) 05:03, 11 July 2008 (UTC)

Though, looking at the articles on stellar evolution and on the Sun, they seem to indicate that the Sun's own life-cycle would have already rendered the Earth uninhabitable. --Alphamone (talk) 08:07, 5 December 2008 (UTC)

Sinking of California

The Edgar Cayce article mentions that "He may have been the source for the idea that California would fall into the Pacific ocean (though he never said exactly this)." I don't think that we already have an article on this idea (catastrophic subsidence of California), but I think that it's been mentioned enough times in pop culture -- by Cayce, by Madame Blavatsky and the Theosophists, in Curt Gentry's 1977 book Last Days of the Late, Great State of California -- to justify creating one. -- 201.17.36.246 (talk) 23:02, 11 September 2008 (UTC)


Zombies

Why did you guys remove the Zombies scenario? It was a credible threat under scientific reasoning. 99.132.162.242 (talk) 20:18, 3 November 2008 (UTC)


Renaming

Maybe we should just rename it Global Crisis; it links here anyway and is a clear term. The article should focus on current risks to the world, because there are many future risks (say, 10,000-plus years out) and past risks like the Cuban Missile Crisis. The only problem is keeping it NPOV. I suggest anything that is POV be verified as such. We could add a Theories section on what is causing this as a global crisis, streamlining the article with a focused, narrow time frame and scope. There are several articles on the Economic Crisis, the Financial Crisis, and the Food Crisis. They all have a better scope, as they deal with a specific time frame and subject. The main problem with writing an article like this is that it is a very current topic. For example, if Wikipedia had been around in 1929, how long would it have been before there was an article on the Great Depression? That said, I will be working on finding some sources that are apt and reliable to place in this article. User:Empireheart (talk) 11:55, 2 December 2008 (UTC)

I do agree that renaming the article might be a good idea simply because it's incredibly long. The term 'Global Crisis', though, is perhaps not the best choice. Something shorter than the current title but not misleading. Sparkstarthunderhawk (talk) 00:54, 25 March 2009 (UTC)
Title length is less of an issue than accuracy and clarity. There is no commonly accepted shorthand phrase for this subject. It would be original research to create a phrase like "Global Crisis" - the first thing any editor would do is Google that phrase, and since there is little supporting evidence for it being used this way, there would be a problem - or editors would try to make this article fit that phrase by privileging sources that happen to use it. Both of these would result in weird editing of the article trying to make it fit the title, the tail wagging the dog. The current title has been stable for a long time, unlike all the previous shorter titles that this article once had. Accuracy and clarity are better than shorthand when there is no commonly accepted shorthand. Green Cardamom (talk) 03:04, 30 March 2009 (UTC)

Political Mass Movement

Risks to civilization should feature the possibility of a highly destructive political mass movement (as defined by philosopher Eric Hoffer). The Taliban in Afghanistan, with its rejection of art and destruction of historical structures, gives some hint. An example from science fiction can be found in Arthur C. Clarke's story about a perfectly designed organization that starts out as a women's sewing circle and takes over the planet. DonPMitchell (talk) 02:06, 14 April 2009 (UTC)

Pole shift theory removal

I'm being bold here and removing the "pole shift theory" section. The article currently miscites the given source (the source talks about orbital cycles, and the Wiki article talks about abrupt changes), and the source says nothing about abrupt extinctions (only about cyclic ones). I won't oppose some kind of restoration of the section that is true to the science. Awickert (talk) 03:39, 24 June 2009 (UTC)

LHC

Those arguments are complete speculation. I'd like to see this section removed in the same way 'zombies' was removed; if it has no credibility, why is it on the page? —Preceding unsigned comment added by 147.188.248.115 (talk) 10:28, 26 January 2010 (UTC)

The article is incomplete

The article misses a LARGE number of scientifically accepted possible scenarios. Some of the most important ones are the theories on how life could have gone extinct on planet Mars, had it harbored any in the distant past. These include the weakening of the planet's magnetic field and a sudden increase in solar activity. Also, the geological activity (geothermal activity in particular) of a planet is crucial in sustaining microbial life, which in turn is the most resilient and abundant form of life. Subh83 (talk) 15:28, 22 April 2010 (UTC)

Rename to "Risks to Earth and human civilization"?

I think we can agree that the current article title is too long and unwieldy; the question is what we should rename it to. I propose Risks to Earth and human civilization. The word "planet" seems unnecessary, as it's clear from the capitalization of "Earth" that we're talking about the planet. Additionally, "risks to humans" seems like it could refer to small groups of humans, which is not what this article addresses. In fact, just Risks to Earth and civilization might work also. Thoughts? —tktktk 20:41, 1 May 2010 (UTC)

After looking over previous discussion, it seems like these three words were chosen to represent three different levels of potential impact. However, my previous two criticisms (about the superfluity of "planet" and the ambiguity of "humans") still stand. I suggest Risks to Earth, humanity and civilization, which solves both issues and is still a bit shorter than the current title. If there are no objections within a couple of weeks, I'll be bold and move the article myself. —tktktk 00:54, 14 May 2010 (UTC)

Statements of clarity

The article needs some expanding in a couple of areas:

  1. Under 'other cosmic threats', the example of Mercury colliding with Earth is given. It should be pointed out that such a collision has already occurred early in Earth's history, when a Mars-sized object collided with the proto-Earth, destroying both. The resulting matter reformed into today's Earth-Moon system. This theory is very widely accepted within the scientific community.
  2. Under 'global pandemic', such a pandemic need not affect just humans to have a catastrophic effect - if a pandemic occurred in, for instance, grain crops, the effect would be just as severe. Examples include the current decline in the honeybee population, and blights affecting banana/plantain crops (a staple crop in tropical regions) [compare with the much smaller-scale Irish famine and potato blight].
  3. Under 'climate change and global warming' and 'climate change and ecology' (yep, it's mentioned twice!) it states: Around 70 percent of disasters are now climate related – up from around 50 percent from two decades ago... In the last decade, 2.4 billion people were affected by climate related disasters, compared to 1.7 billion in the previous decade... but does not say why. The reason is twofold: a) the increase in the human population overall; and b) the tendency of humans to aggregate densely in areas where there is increased risk, e.g. coastal ports, which are subject to tsunamis, flooding, sea-level rise, hurricane impact, etc.
  4. Supervolcanoes are mentioned, but not mass flood-basalt eruptions, which have occurred a number of times in Earth's history. These are even more catastrophic than supervolcanoes.
The Yeti (talk) 13:55, 4 July 2010 (UTC)

Even if (an) effect of global warming might be a "Risk to civilization, humans, and planet Earth", that doesn't mean it fits in the category. — Arthur Rubin (talk) 09:30, 29 August 2010 (UTC)

Now that is more how you use "Even if ...", as any proper Anglophone would know, pompous AR. 99.190.88.77 (talk) 09:39, 29 August 2010 (UTC)

Merge Future scenarios with Possible scenarios section of Human extinction

The Future scenarios section should be merged with the Possible scenarios section of Human extinction. --Chealer (talk) 16:42, 30 July 2010 (UTC)

Some merger with human extinction seems called for. These articles are about very similar topics. K. the Surveyor (talk) 08:32, 2 November 2010 (UTC)

In a sentence

"The two greatest risks to Civilization, Humans, and the Planet Earth are Civilization and Humans"

How many thousands of times in how many thousands of years in how many thousands of ways has this been said?

Or at least civilization is the greatest risk to those others. Why are we lumping these three ideas together in one article? Wolfdog (talk) 17:12, 28 December 2013 (UTC)
There are potential risks stemming from our larger cosmos (e.g., asteroid or comet impacts, gamma-ray bursts from supernovae, the aging and expansion of our star, galactic collisions, potentially even entropy) that pose grave risks to humans and civilization, so I wouldn't necessarily agree with the statement that the two greatest risks to civilization, humans, and planet Earth are civilization and humans. — Preceding unsigned comment added by Annaproject (talkcontribs) 00:48, 7 March 2014 (UTC)

Near Tautology

How about shortening that to "the Greatest Risk is Risk"? Human civilization per se does not pose a risk, unless we (wrongly) claim that every civilization that is not a globalized, industrial, forced-growth, money-focused civilization is not one. In that case we could equally well argue that this "globalized industrial forced-growth money-focused" monstrosity is not a civilization but something entirely different (see the definition of Civilization). It is only this particular brand of human civilization (of which there have been many) that poses existential risks, not only to itself but to the Biosphere at large. The Khmer Civilization, the Inka, Mayans and countless other civilizations did not pose any risk to the planet.

Suggestions

  • Change "Planet Earth" to "Earth's Biosphere". The (rocky) planet itself is rather unimpressed by human activities.
  • Change "Risk to civilization" to "Risk to human livelihood" (Civilizations constantly change - that is not a risk but normal cultural evolution)
  • Change "Civlization" on the risk side into "Perpetual growth based industrial civilization"

Wassermensch 15:29, 12 April 2014 (UTC)

Hyperinflation

Hyperinflation and economic collapse should be included as a cause of and contributor to civilizational collapse. Mustang19 (talk) 02:54, 26 July 2013 (UTC)

Can you provide any citations from the peer-reviewed literature? Rolf H Nelson (talk) 17:41, 6 August 2013 (UTC)

Distance Conversions: How many significant digits?

Converting 1, 3, 10 and 140 kilometres into miles: Since the kilometre figures are accurate to only one or two significant digits (not five or six), I have so rounded the converted miles. - Glenn L.
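
For anyone checking the rounding, here is a minimal sketch of the computation (illustrative only; it assumes the standard factor 1 km ≈ 0.621371 mi):

    from math import floor, log10

    KM_TO_MI = 0.621371  # standard km-to-mile conversion factor

    def round_sig(x, sig):
        # Round x to `sig` significant figures.
        return round(x, -int(floor(log10(abs(x)))) + sig - 1)

    for km, sig in [(1, 1), (3, 1), (10, 2), (140, 2)]:
        print(km, "km ->", round_sig(km * KM_TO_MI, sig), "mi")
    # 1 km -> 0.6 mi, 3 km -> 2.0 mi, 10 km -> 6.2 mi, 140 km -> 87.0 mi

The last case reproduces the "140 kilometres (87 miles)" figure quoted in the Andromeda section above.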

End of the Earth

People say that the world is going to end on the 21st of December 2012.


http://www.washington.edu/newsroom/news/2003archive/01-03archive/k011303a.html

(reply to unsigned comment) Hmm, we're still here! 2601:1:9280:155:E179:39EF:33A1:CDAB (talk) 03:17, 17 April 2014 (UTC)

Self collapse

This user did not use original research for this: "Self-collapse - Permanent settlement must always end in a collapse. The idea depends on infinite resources, which we simply do not have. There is a great dependency on future generations or scientists to overcome this; however, reliance on this may not be the best idea. Civilizations have crashed over intervals of mere hundreds of years throughout history, and the next one may be defined by running out of oil, or even water. Civilization has expanded for 10,000 years since the Neolithic revolution, and has been dependent on the expansion of agriculture. When there is not enough water to support more agriculture, or the larger populations that inevitably follow, there could indeed be a crash. 70% of the fresh water available today goes toward agriculture, with 17-20% more expected by 2020. <http://news.bbc.co.uk/2/hi/science/nature/755497.stm>" but I will agree to have it removed until I add sources and wikify it.

-Ishmaelblues —The preceding unsigned comment was added by Ishmaelblues (talkcontribs).

What is the focus of this article?

According to the intro: "The risks discussed in this article are at least Global and Terminal in intensity." So why is there a section on "Climate change and global warming" and another section on "Climate change and ecology"? First, why are there two, and second, those are neither global nor terminal. (Just because some people die does not make it "terminal" by the usage in this article; otherwise old age should be listed.) This entire article is littered with minor threats and needs a large trimming. Ariel. (talk) 09:12, 24 January 2011 (UTC)

Global warming has global/terminal potential. Green Cardamom (talk) 07:52, 7 February 2011 (UTC)
Global Warming, however, does not. The only actual extinction risks here are AI, Grey Goo, the cosmic risks and possibly Nuclear Winter. Civilization collapse is obviously easier (though most of these aren't even that). 82.11.1.60 (talk) 21:41, 7 May 2011 (UTC)

Proposed article split

I think this article should be split into natural and artificial risks (to civilization, etc.). I think that areas like the possibility of humans knocking a meteor out of a collision course with Earth counts as a natural risk.--Meximore (talk) 09:03, 28 March 2011 (UTC)

Good idea. 99.181.133.112 (talk) 21:32, 28 April 2011 (UTC)

It might be preferable to sort risks by probability. On a log scale the rough probabilities would sort according to their order of magnitude. The important risks would stand out. TeddyLiu (talk) 15:25, 14 April 2013 (UTC)

Reorganize the articles?

I think this article tries to put too many things together. It could become more manageable if it was divided into:

  • Existential risk - risks to civilization/end of civilization (humans maybe still existing, but not in a civilized society - à la Mad Max - but one could argue that this too is a type of civilization. OK, maybe we need a definition)
  • Human extinction (no more humans anywhere)
  • end/destruction of planet Earth (humans on another planet/space station or extinct) [article to be created?]

...and I'm tempted to use Doomsday event as the main article. Has anyone got other suggestions? --Dia^ (talk) 20:46, 17 August 2011 (UTC)


I strongly recommend the avoidance of the term "existential threat". This term is incorrect, and basically an error. If it is intended to mean "threat to existence" then that is what should be written. "Existential" means something entirely different.203.184.41.226 (talk) 02:17, 19 August 2012 (UTC)

I agree, the article is way too long and broad. I'd say a split between Existential Risk and Global Catastrophic Risk would be best, as this is currently how it's discussed in the field. "There are two commonly used definitions of 'existential risk.' The first definition is a threat that could cause the extinction of all humans. The second definition is broader: an event that could greatly diminish the accomplishments of humanity and humanity's descendants. The second definition is the one used by Nick Bostrom of the Future of Humanity Institute at Oxford. 'Existential risks are catastrophes that end humanity's existence or have a drastic permanent disruptive effect on the future potential of human-derived civilization.'"[3] This seems like a logical categorisation of risks. I'd be happy to help someone reorganize, if they wanted. This is something that does need to be done, though. KenTancwell (talk) 06:52, 27 March 2015 (UTC)

@203.184.41.226 - yes, this phrase is correct. "Existential threat" means "threat to the existence of..." (it is often used in a specific form, e.g. "existential threat to America," or "existential threat to the Middle East," or "existential threat to the world economy"). If you are referring to "existentialism" (the philosophical topic), that is a term of art specific to the field of philosophy. The word "existential" means what colloquial rules of English suggest it would mean: related to existence. See, e.g., Merriam-Webster, or Cambridge University (which uses the similar term "existential risk"). — Preceding unsigned comment added by 24.130.70.182 (talk) 04:42, 17 January 2014 (UTC)

References

  1. ^ "What a way to go", The Guardian (April 14, 2005). See External links.
  2. ^ Hazel Muir, "Galactic merger to 'evict' Sun and Earth," New Scientist 4 May 2007
  3. ^ GiveWell Labs Report on Global catastrophic risks

The unsourced section removed by Special:Contributions/Arthur_Rubin (see below) can have elements added.

Escalating Societal Disparity

Right now we are witnessing a greater disparity than there has been for centuries. The ability for small financial elites to make themselves richer through access to expanding technologies is considerable. This danger has been well documented by Marshall Brain, Martin Ford, Jeremy Rifkin and Noam Chomsky. Noam Chomsky writes:

"In this possible terminal phase of human existence Democracy and Freedom are more than 'values to be treasured' - they may well be essential to survival"

The problem is not affluence - it is the ability of the few to acquire exclusive means of getting richer, or to consolidate this power, at the expense of everyone else. A phenomenon very common in the third world is the very richest few percent of society buying all the agricultural land and using it to grow export crops, often being the root cause in these countries of widespread poverty and cyclical famines. It is very difficult to make this argument, as it is constantly opposed by far-right or pro-market ideologues. Key technologies are likely to contribute to this escalation of wealth, namely 3D printing and nanoreplication, robotics, and ubiquitous computing. The idea of a Singularity makes assumptions that are equally valid for what has become known as a 'disparity hockey stick'. While mass-reproduced products benefit many, only a very small selection of humans has been enjoying the extreme riches that come with owning widespread automated infrastructure for the production of goods or services. This point is emphasized by Martin Ford in the book The Lights in the Tunnel and has been touched upon by Jeremy Rifkin in The End of Work. These two both argue strongly that in the next decade most paid labor will become automated or done by robots. This will end widespread societal access to jobs. That in itself creates a faster pace of irreversible unemployability or "jobless growth" than we have ever seen before in human history. Already the middle class is evaporating in most western societies. In every previous society in human history, disparity and power asymmetry comparable to the current epidemic have led to massive killings or revolt. It is no surprise we have in fact been seeing some early types of revolts in the Arab world, and more recently in London.

In our currently prevailing macro-economic model there is no solution to such a hypothetical crisis, other than the masses realizing this would imply the cancellation of the social contract and engaging in open revolt against the state. The troubling question remains how easy it would be to force out a hyper-empowered elite, since they could use the same automated technologies to protect their property rights and interests in arguably ruthless ways.

99.35.12.88 (talk) 02:48, 27 August 2011 (UTC)

This was removed by Special:Contributions/Arthur_Rubin ... "rift between the poor and the wealthy widens." where Economic inequality wikilinks "between the poor and the wealthy widens." with only the comment "Inappropriate edits.". Why inappropriate edits, Art? 97.87.29.188 (talk) 19:26, 27 August 2011 (UTC)
Why inappropriate edits, indeed. Yours is a clear WP:EGG. Perhaps "[[economic inequality|rift between the poor and wealthy]] widens" might be appropriate, although I don't think so. What you have is just absurd. — Arthur Rubin (talk) 22:49, 27 August 2011 (UTC)
The current issue of Foreign Policy magazine might be useful Rich Country, Poor Country; The economic divide continues to expand by Joshua E. Keating (September/October 2011). 99.181.138.168 (talk) 06:12, 28 August 2011 (UTC)
I suppose we have to assume that is correct, even though editorial-like and contradicted by the real evidence. That doesn't support your particular Wikilinking, though; mine might be close to the truth as seen in that article. — Arthur Rubin (talk) 06:27, 28 August 2011 (UTC)
"The Truth"? Maybe see Wikipedia:Truth ... Special:Contributions/Arthur_Rubin? 99.181.139.210 (talk) 01:53, 29 August 2011 (UTC)

Clarify "Runaway greenhouse effect" with Runaway climate change (Earth habitability specific). 216.250.156.66 (talk) 18:38, 29 August 2011 (UTC)

Mercury vs Venus == lots of big chunks too close for comfort?

If Mercury collides with Venus, isn't there a risk that lots of significantly sized chunks will be flung out and stay close enough to hit Earth? --TiagoTiago (talk) 05:11, 10 November 2011 (UTC)

That is about as likely as the Earth falling into Sol. Orbits don't just change. Venus's orbit would have to magically decay at an incredible rate to somehow be near enough to Mercury to come remotely close. Mercury would probably become a moon, in that event. Much more likely is a moon-sized extra-solar-system object hitting one of the planets. And even that is pretty rare. In fact, the Moon is suspected by some to be the result of such a collision. Note that even 'mere' asteroids like the one that hit at the last mass extinction event ('ELE' or extinction-level event) are rarer and rarer every day as the solar system ages. It's kind of like if you start with a drawer full of random utensils and keep picking spoons out. Eventually, it gets pretty hard to find a spoon. In other words, there's only a finite number of asteroids above a certain size, and once they become part of a planet, they can't hit another. 2601:1:9280:155:E179:39EF:33A1:CDAB (talk) 03:12, 17 April 2014 (UTC)

{{Portal box|Extinction|Society|Environment}} 99.181.156.221 (talk) 01:21, 18 November 2011 (UTC)

Why? Extinction seems reasonable, but not Society or Environment. — Arthur Rubin (talk) 03:29, 18 November 2011 (UTC)

From Talk:Planetary_boundaries#resource.3F ... Acidifying oceans helped fuel mass extinction; Great die-off 250 million years ago could trace in part to waters' change in pH by Alexandra Witze October 8th, 2011; Vol.180 #8 (p. 10) Science News

99.181.134.6 (talk) 06:39, 18 November 2011 (UTC)

Identical sections of text in article

The following text, occurring both in section 2.1.3 Climate change and ecology (3rd paragraph) and section 2.1.5 Climate change and global warming, is identical. One of them needs to be removed or at least rewritten:


Around 70 percent of disasters are now climate related – up from around 50 percent from two decades ago.[30] These disasters take a heavier human toll and come with a higher price tag.[29] In the last decade, 2.4 billion people were affected by climate related disasters, compared to 1.7 billion in the previous decade and the cost of responding to disasters has risen tenfold between 1992 and 2008.[30] Destructive sudden heavy rains, intense tropical storms, repeated flooding and droughts are likely to increase, as will the vulnerability of local communities in the absence of strong concerted action.

117.206.41.60 (talk) 11:32, 19 November 2011 (UTC)

Not about eschatology or eschatological questions

   I've removed the following language --

The concept is expressed in various phrases such as "End of the World", "Doomsday", "Ragnarök", "Judgment Day", "Armageddon", "the Apocalypse", "Yawm al-Qiyāmah" and others.

-- whose markup reads

The concept is expressed in various phrases such as "[[Eschatology|End of the World]]", "[[Doomsday event|Doomsday]]", "[[Ragnarök]]", "[[Last Judgment|Judgment Day]]", "[[Armageddon]]", "the [[Apocalypse]]", "[[Yawm al-Qiyāmah]]" and others. <!--[is it really pertinent to definite Future studies here?]The prediction of future events is known as [[futures studies]].-->

Actually (contrary to the piping of its first link) "End of the World" does not mean eschatology, though that study does embrace many non-scientific approaches to the question

What will happen in the future, foreknowledge of which would put my mind at ease by making all this anxiety and/or torturous suffering in my own life seem worth enduring?

The actual relation between various end-of-the-world accounts and eschatology is this: Those who ask the eschatological question are more likely to be satisfied with answers that involve the end of the world, which is why the other actual 6 phrases (and likely the unspecified "others") tend to involve EOTW events.
   More to the point, that 2nd of the lead 'graph's two sentences has nothing to do with the entire remainder of the article (including the title and the lead sentence), which is about big events entirely subject to scientific inquiry. How the eschatology sentence got into the article is a tantalizing question, but not worth pursuing IMO. It just needs to go.
--Jerzyt 04:19, 20 November 2011 (UTC)

   BTW, note that the scope is accurately stated in the first titled section, "Types of risks": It does not extend its ambit to theories like Steady state universe, Big Crunch, Heat death, and coasting on the momentum of the Big Bang to arbitrarily large, arbitrarily rarified, and arbitrarily interactionless voids, each of which anyone who thinks there is value in eschatology should feel obliged to either address or explicitly deny the reality of; that failure is further evidence of the article's topic and eschatology passing like two ships on a foggy night.
--Jerzyt 04:19, 20 November 2011 (UTC)

What is the probability that any one of these fatal risks could happen?

If we consider flipping a penny or getting a 6 from a die: 1/2 + 1/6 = 8/12 = 2/3. So the question is, how many events could lead to a fatal scenario? And what are their probabilities?

I think that is what this article is trying to get at. 24.25.237.226 (talk) 20:50, 5 December 2011 (UTC)
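
(A side note on the arithmetic above: probabilities of independent events combine by inclusion-exclusion rather than simple addition, otherwise summing enough risks would exceed 1. For the coin/die example: <math>P(A \cup B) = P(A) + P(B) - P(A)P(B) = \tfrac{1}{2} + \tfrac{1}{6} - \tfrac{1}{12} = \tfrac{7}{12}</math>.)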

This is important, and poorly presented. We can ignore the eventual burn-out of the Sun if the human species is most likely to go extinct before then. I think it would be valuable to assign a rough order-of-magnitude probability to each possibility so that they can be sorted by importance. TeddyLiu (talk) 15:29, 14 April 2013 (UTC)
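
A minimal sketch of that sorting idea (the probabilities below are hypothetical placeholders, not sourced estimates):

    import math

    # Hypothetical annual probabilities, for illustration only.
    risks = {"asteroid impact": 1e-7, "nuclear war": 1e-3, "supervolcano": 1e-5}

    # Sort descending and report each risk's rough order of magnitude.
    for name, p in sorted(risks.items(), key=lambda kv: kv[1], reverse=True):
        print(name, "~ 10 ^", math.floor(math.log10(p)))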

Human survival

This article, as well as Human extinction, presents a rather gloomy view of the future. I'd like to suggest we contemplate putting together a complementary article called "Human survival" that summarizes information about potential technological solutions to the survival threats that humans face. It could cover the material in this article's "Precautions and prevention" section, as well as material from some other articles where there is coverage of the topic; namely: Space and survival, Geoengineering, and Planetary engineering. I've seen interest in this idea in other forums, so I think it can fly.

Here is one possible organization for the content:


Human survival in the future may depend on cultural adaptation to changes in the environment and the judicious use of technology to compensate for significant risks.

  1. Adapting to the environment
    1. Modified crops
    2. Genetic engineering of humans
    3. Alternative energy
    4. Renewable resources
  2. Adapting the environment to us
    1. Geoengineering
    2. Displacement of the Earth
    3. Space mining
    4. Modification of the Sun
  3. Changing to a new environment
    1. Space colonization
    2. Transhumanism
  4. Recovering from a massive die-off

What do you think? Regards, RJH (talk) 20:06, 22 June 2012 (UTC)

Proposal: Create a Portal for 'Existential Risk, Human Survival, and the Future of Complex Life in the Universe'

In response to RJH (talk) 20:06, 22 June 2012 (UTC):

You have an excellent idea there, and since the work being done on Existential Risk (see for instance the Cambridge Centre for the Study of Existential Risk prospectus - http://www.cser.org/) covers a wide range of crucial concepts, I believe it would make for an incredible Portal project.

I would propose, further, that the range of this Portal should include research on Existential Risk factors (sterilizing asteroids, etc), as well as the Fermi Paradox and the unresolved question of how widespread complex life is in the universe.  I have particular interest in sustainability and in forward-looking methods for safeguarding human survival, such as the building of Arcologies to minimize our footprint and provide stable havens from catastrophe, or archives for rebooting society.  And, should we survive, this theme merges into the interests of Transhumanism, as only by surviving the 'Great Filter' (if there is one) could humanity give rise to anything beyond itself.

(Apologies for not inline-linking these references; I'm on my iPad and it's a bit hard.  Will flesh out links from home.)

What might we need to do to proceed on such a project?

http://wiki.riteme.site/wiki/Fermi_paradox http://wiki.riteme.site/wiki/Arcology http://wiki.riteme.site/wiki/Transhumanism http://wiki.riteme.site/wiki/Great_filter

--Aqaraza (talk) 18:34, 16 July 2012 (UTC)

Thank you, Aqaraza. With regards to Portals, anybody can put one together per Wikipedia:Portal. You might try fleshing one out on a sandbox page in user space, then asking for feedback from some of the appropriate WikiProjects. Regards, RJH (talk) 21:26, 16 July 2012 (UTC)

Add book?

The Fate of the Species: Why the Human Race May Cause Its Own Extinction and How We Can Stop It, by Fred Guterl, the executive editor of Scientific American. Here is an op-ed by Guterl: Searching for Clues to Calamity, July 20, 2012, NYT. 108.195.138.171 (talk) 07:05, 22 July 2012 (UTC)

ignition of atmosphere nonsense

Hi guys, I have just found this statement in the Article:

"Experimental accident: Investigations in nuclear and high energy physics could conceivably create unusual conditions with catastrophic consequences. For example, scientists worried that the first nuclear test might ignite the atmosphere"

It is a well-known fact that the burning of nitrogen is endothermic... — Preceding unsigned comment added by 91.146.151.79 (talk) 11:28, 16 September 2012 (UTC)

If you're interpreting "ignition" to mean "oxidation," Teller's concern was not with oxidation but fusion of nitrogen, ostensibly an exothermic process when radiation losses are neglected. --Vaughan Pratt (talk) 11:27, 5 April 2014 (UTC)

"Existential"?

It has become common to refer to a "existential threat" to humanity. But that is a misuse of the word existential, surely. "A threat to human existence" is a very different thing.203.184.41.226 (talk) —Preceding undated comment added 00:27, 30 September 2012 (UTC)

"Existential" can mean "...relating to existence..." (Webster). So "Existential risk" and "existential threat" are semantically correct and also (to me) clear (obviously it means a threat to someone or something's existence), and anyway seems to be a prominent term in the peer-reviewed literature. The exact phrase "existential threat" also pops up in geopolitics, usually in the context of "existential threats to Israel".Rolf H Nelson (talk) 21:12, 15 June 2013 (UTC)

Article Name

Ok, I know this doesn't affect the quality, but does anyone else find the title a little bit cheesy? Surely, couldn't one think of one that didn't sound like a sci-fi novel? Like, I don't know, "List of potential dangers that could lead to mass death or annihilation"? All right. That title's not that great. But can somebody please think of something? It's kind of bugging me. 101.165.7.81 (talk) 06:16, 6 December 2012 (UTC)

I agree. How about "Survivability of humankind". I don't know how to edit titles, or I would. Anybody?
Will9194 (talk) 04:56, 8 December 2012 (UTC)
To change the title, the page must be moved. Detailed instructions can be found here. Mysterious Whisper (SHOUT) 14:52, 8 December 2012 (UTC)

I disagree the title should be changed. The title was discussed years ago (see talk archives) and it was very difficult to arrive at something people can live with. Remember that the title needs to reflect what is contained in the article. The suggestions above do not reflect the contents of this article. They reflect "human extinction", which is only one possibility. Other possibilities include the end of civilization while humans remain, or the destruction of the entire planet with no life remaining. There is no easy short way to put it, unfortunately; it has to be literally spelled out. -- Green Cardamom (talk) 17:33, 8 December 2012 (UTC)

I suspected as much, which is why I simply provided the guide for controversial moves. Consensus can change (particularly from "years ago"), especially after a major rewrite (as is being proposed below), although I too think the current title is more appropriate than either suggested. Mysterious Whisper (SHOUT) 17:58, 8 December 2012 (UTC)
Here are the previous titles. The current literal title has had the fewest complaints over time (no title has been complaint-free). I personally think 'Existential risk' is the most academic and appropriate, if we were to rename, though it doesn't make it as clear as the current literal title. Concerned about the 'major rewrite' for the same reasons you raised. -- Green Cardamom (talk) 18:34, 8 December 2012 (UTC)

I think "humans" should be replaced with "humanity". Jruderman (talk) 05:04, 1 January 2013 (UTC)

Theories that don't believe in global warming

You know, not everyone believes in the whole "end of the world due to man-made global warming" thing, and plenty of them have good evidence. Shouldn't someone point that out, or at least acknowledge that some people don't think that it will happen? — Preceding unsigned comment added by 101.165.7.81 (talk) 06:29, 6 December 2012 (UTC)

What is the "it" in your "some people don't think that it will happen"? And for that "it", did this article mention anyone who is just as certain that "it" will happen? Vaughan Pratt (talk) 06:54, 2 September 2014 (UTC)

Proposed major revision

I'm composing a major revision to this article. Would like your comments before posting it.

  • Article correctly states that man-made (and man-exacerbated) risks greatly exceed natural risks. However, scenarios for natural hazards get sub-sub-sections while the important man-made hazards are relegated to sub-sub-sub-sections. Man-made and natural hazards should be two section headings to emphasize that the former is serious while the latter is light reading. This is in accord with Meximore's suggestion.
  • I shall include a summary of the book Apocalypse When? (Springer, 2009) by Wells. Of the 5 or 6 books on this topic from mainstream publishers this is the only one that provides a top-down mathematical formula for a best estimate of survivability.
  • Let's omit "Cambridge identified the "four greatest threats" to the human species: artificial intelligence, climate change, nuclear war and biotechnology." I think irresponsible geoengineering is a greater threat than nuclear war because the latter would be confined to the Northern Hemisphere, where the nuclear powers are. Tierra del Fuego and South Island, NZ, are isolated by two atmospheric Hadley cells. A rogue geoengineer, Russ George, has already illegally dumped 110 tons of iron into the Pacific. Another serious threat is 'miscellaneous', the hundreds of tiny risks that become serious in aggregate.
  • Article twice recites the names of rivers in southern Asia. Once is enough, or maybe zero.
  • I would remove Wiki links to ordinary words: food, arctic, carbon dioxide, Africa, California.
  • Add examples of hazards that have already happened: rogue astronomers sent signals to potentially hostile exoplanets, mousepox experiment, Stuxnet.
  • Add the Lifeboat Foundation and the think tank Global Catastrophic Risk Institute to the list of concerned organizations.

Will9194 (talk) 07:06, 8 December 2012 (UTC)

So, you want it to look something like this. As was suggested, the content is largely OK, but the tone is a bit off. Check out the Manual of Style, specifically WP:TONE, WP:NPOV, & MOS:LEAD. More references wouldn't hurt, although what's there is probably adequate. (I haven't been able to peruse the entire revision (yet), but this is what I've gathered from the first third or so.) Perhaps more pressingly, did you just state that you are the author of Apocalypse When?? Mysterious Whisper (SHOUT) 14:52, 8 December 2012 (UTC)

Will9194, you have a serious WP:COI (conflict of interest) and maybe should not be editing this article at all. You attempted to re-write this article in a way that is favorable to the POV of your book and its ideas and concepts. That's not how Wikipedia works (even if the article currently favors the POV of Bostrom, it was not written by Bostrom, and it was done years ago when Bostrom was the main reliable source available). You may be an expert on this topic, but by including your book and ideas in this article you crossed the line and there has been a breach of trust. I have no problem including the POVs of Gott, Rees, Leslie, Wells, Posner, Joy and anyone else so long as it is handled neutrally, as Mysterious Whisper said. The best way to do it is to create sub-sections for each author and their POV; for example, see the theories of the Decline of the Roman Empire - each author and theory has its own sub-section and summary. --Green Cardamom (talk) 18:15, 8 December 2012 (UTC)

Well, I was going to wait for more information... but, yes, what Green Cardamom said. We take conflicts-of-interest very seriously because it is almost impossible to write from a neutral point of view while influenced by one. Which isn't to say you're barred from contributing; you're still free to edit this and other articles, but I'd recommend you create your sandbox (go here), and draft any suggested changes (to articles that deal with your book or related topics) there.
As for Green Cardamom's suggested layout, I don't think dividing all the topics by author would work well here, however, if the missing viewpoint(s) can be summarized into a dedicated section(s), that could then be added to what's already there, that would probably be the best way to include them (for now). It would help greatly if there is evidence of other published authors sharing the same views. Mysterious Whisper (SHOUT) 19:01, 8 December 2012 (UTC)
Green Cardamom & Mysterious Whisper,
It never occurred to me that my edit might be regarded as a conflict of interest, because the mainstream book authors who estimated survivability, namely Rees, Leslie, Wells, get very similar results and so does the median opinion at the Global Catastrophic Risk Conference. (You might go to my original submission of 30 November and search for the words 'eerily' and 'eerie'.) So what we have is not conflict but closure. Maybe synergy is a better word for the following reason: Each of the other authors does a bottom-up estimate in which he studies individual hazards until his intuition settles on some numerical measure of survivability. By contrast I used a top-down approach based on a general principle that transcends the list of individual hazards. Thus, our agreement is synergistic, hence all the more important to present to the public.
Wikipedia strives for articles that are verifiable. You can slog through Apocalypse When? and verify it mathematically step by step. Well, almost; some assumptions approximate the real world but don't match it perfectly. Alternatively, you can go to the book's website and find a digest.
Finally, I daresay that many of Wikipedia's edits and articles describe the author's own work. They just have the good sense to keep quiet about it. I suspect that these are among the best articles because these authors have incentive to hold the reader's interest. I agree that they are not neutral but suspect that other editors soon neutralize them. I've already had a small experience like that. On 1 May 2010 I edited the satellite de-spin mechanism to credit myself and two others with the original invention. I was immodest by displaying our names prominently in the main text. Two hours later LouScheffer neutralized it. Overall, it seems to me that Wikipedia is a wonderful system of checks and balances if you just turn it loose and let it run its course.
Will9194 (talk) 06:12, 10 December 2012 (UTC)
This does seem to happen often, though the guideline on it is dreadfully short.
I appreciate the disclosure, but the fact that one of your few other edits was also self-promotion doesn't help much.
It is immaterial whether persons of interest have edited other articles to their own benefit without being caught. If you had made several smaller revisions, used a more neutral tone, and focused less on your book, nobody would have cared if you were the author of one of the sources, as the result would have been inherently acceptable.
Whether or not the material in your book is definitively and demonstrably true also matters not. WP:Verifiability, used in this context, means simply that it must have been published previously by a reliable source. Hence, there is a difference between "verifiable" and "true". (Wikipedia concerns itself primarily with the former.)
Filling the article with 'connections', eerie or otherwise, between your and others' works, unless others have made and written about said connections, is unnecessary WP:Synthesis.
As for "let it run its course", that's what's happening now. You made a big edit with enough problems that the most efficient way to "neutralize" it was to undo it, at least temporarily. This reversion is followed by detailed discussion. It's a well-defined cycle. Your smaller edit was simply fixed because it was so simple to fix.
As I said, you are free to continue editing the article, although your contributions on this topic will likely be given special scrutiny. An alternate method is to draft edits in your sandbox (here is the bare code of your 30 November revision), although this will be tricky as you are trying to rewrite the bulk of the article. It would, however, allow you to continuously improve while soliciting community input. What you don't want to do is change the entire article, all in one edit, with little-to-no discussion; such might be reverted even when flawless.
Also note that Wikipedia aims to summarize significant opinions with representation in proportion to their prominence. Which is to say, the more you talk about your theories in the article, the more evidence there should be of other, independent sources talking about your theories. Mysterious Whisper (SHOUT) 16:10, 10 December 2012 (UTC)
Mysterious Whisper, Premium Mobile, & Green Cardamom,
Two issues about my proposed major edit; see bullet list above in my post of 8 Dec. Except for the second item, the only criticisms were matters of style. I've studied the rules and believe I can implement these corrections easily. Now that 9 days have elapsed, would this be a good time to go ahead? MW, you suggested I use my sandbox. Is that because you have access to it and can make critiques? Or is my sandbox just my private place to make rough drafts?
The major issue pertains to Item 2 since I am the author of the book to be summarized. (Don't buy the book; you can find a detailed digest here.) MW, you wrote, "The more you talk about your theories in the article, the more evidence there should be of other, independent sources talking about your theories." Unfortunately, I’m only aware of a couple: some fans in the physics department at U. Alaska, Fairbanks, and a lovely PowerPoint presentation from U. of Western Ontario. The latter is no longer online, but I can attach it to an email. These citations may not be adequate, in which case I’d like to apply for a waiver for the following reasons:
  • Highly favorable book reviews, summary and links at the book's website.
  • Springer's reputation as a world-class science publisher.
  • Supreme importance of the topic, human survivability. The public should know that humanity's risk is greater than most people would suppose.
  • Absence of any other mathematical treatment.
  • My own qualifications in applied math; go here and find my name (W.H. Wells) in red in the sidebar on the right.
  • Mathematical results substantiate educated guesses by important authorities: Martin Rees, John Leslie, Stephen Hawking, Bill Joy, and attendees at the Global Catastrophic Risk conference, Oxford U.
Will9194 (talk) 22:34, 17 December 2012 (UTC)
I'm sorry, Will. I don't know enough about the rules concerning COI to give you an informed answer. Like I said before, I only objected to the writing style being too narrative. I didn't have a problem with the substance of the edits. Primium mobile (talk) 15:41, 21 December 2012 (UTC)
If everything else (tone, weight, POV, etc) is perfect, we can ignore the COI.
If few others have talked about your presentation of your theories, and there are no other similar theories ("Absence of any other mathematical treatment"), it sounds a lot like a fringe theory. Any mention of such must be roughly in proportion to the number of reliable sources mentioning it (thus, quite brief). I'm afraid we don't make exceptions to this for any of your bulleted points. And again, no matter how well your results agree with those of others, you can't fill the article with such connections unless others have explicitly made them.
Wikipedia:Best practices for editors with conflicts of interest recommends that you not make any major edits to relevant articles; "Instead, make suggestions on article talk pages and let others decide whether to implement them", hence why I suggested you draft large changes in your sandbox first. But if you've read, understand, and can implement WP:NPOV, WP:V, WP:WEIGHT, WP:SYNTH, WP:TONE, & WP:LEAD, as well as the points made above, there's nothing stopping you from editing the article. At the very least, I'd still make several smaller changes instead of one major revision (if only to better facilitate discussion of individual points). Mysterious Whisper (SHOUT) 17:02, 21 December 2012 (UTC)

Dust

68.188.203.251 (talk) 14:15, 23 December 2012 (UTC) Please include dust. During this last pass through the galactic equator the heliosphere was seemingly open to cosmic dust. Electrostatic charge builds up on dust here on Earth, so cosmic dust build-up could possibly be the source of a catastrophic stripping of Earth's electricity and hence its magnetism. The Earth could develop such a charge that space currents could be drawn to it and discharge. This appears a probable occurrence in the past, for example when there was claimed to be no Moon. k sisco

Sources? Mysterious Whisper (SHOUT) 15:47, 23 December 2012 (UTC)

Footnote 104 does not seem to support the claim brought forward o.O The article says 'The solar system passing through a cosmic dust cloud, leading to a severe global climate change', but the article used as a resource only writes about a recent increase in dust particles and doesn't even mention the climate. 2001:630:12:242C:88BD:5444:49F1:DDA2 (talk) 23:09, 28 February 2013 (UTC) Mo

The Stars, Like Dust? — Arthur Rubin (talk) 00:49, 1 March 2013 (UTC)

Article Defaced

Under the heading of Chances of an Existential Catastrophe, the opening line appears to have been defaced: it reads "Some doo dars assasinated by flying pigs such as that from asteroid impact, with a one-in-a-million chance of causing humankind extinction in the next century,"

I would correct it, but I do not know enough about editing Wikipedia articles to do so, nor do I know what the correct wording of this sentence was to begin with. 76.0.14.23 (talk) 11:11, 10 April 2013 (UTC)

Thank you for bringing this to attention. The passage in question was vandalized by 176.35.156.199 (talk · contribs) on 10:14, 10 April 2013‎, and restored by Wannabemodel (talk · contribs) on 11:24, 10 April 2013‎. - Mysterious Whisper 14:04, 10 April 2013 (UTC)


Published Research

I've been reading the literature on human survival and notice that some important research is not reported in this article. Somewhere (I'd prefer early on) we need a comparative discussion of the big picture, not just the identified risks discussed one at a time as in Section 4. Some of this is already included: Note 9, which refers to a conference at FHI, and Note 23, Richard Posner's book. However, we need to include (listed here in chronological order) the following …

  • Gott, Implications of the Copernican … Nature 363, p. 315, 1993
  • Leslie, The End of the World, Routledge, 1996
  • Posner, Catastrophe, Oxford, 2004
  • Rees, Our Final Century, Arrow Books, 2004
  • FHI conference, Oxford, 2008
  • Bostrom & Cirkovic, editors, Global Catastrophic Risks, 24 chapter authors, 2008
  • Wells, Apocalypse When?, Springer, 2009
  • Guterl, The Fate of the Species, Bloomsbury, 2012
  • Casti, X-Events (Extreme Events), William Morrow, 2012

Any more we should add?

Some comparisons: Unlike the others, Judge Posner puts strong emphasis on cost-benefit analysis. Each threat has some probability of happening and some number of casualties if it does. The product of the two is called the expected number of casualties. So divide the cost of prevention by the expected number to get the cost/benefit ratio. Posner would allocate resources where this ratio is least. He advocates expensive public programs to reduce the risk to humankind. Reviewers of this book are skeptical whether the public will bear the expense for benefits not apparent during their own lifetime. [link to Amazon's reviews]
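To make the arithmetic concrete, here is a minimal sketch of the cost/benefit ratio described above; the program cost, probability, and casualty figures are invented for illustration, not taken from Posner:

    # Posner-style ratio: prevention cost divided by expected casualties averted
    def cost_benefit_ratio(prevention_cost, probability, casualties):
        expected_casualties = probability * casualties
        return prevention_cost / expected_casualties

    # Hypothetical program: $1 billion to avert a one-in-a-million disaster
    # that would kill 6 billion people -> ~$167,000 per expected death averted.
    print(cost_benefit_ratio(1e9, 1e-6, 6e9))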

There are two ways to analyze human survival, top-down and bottom-up. Bottom-up begins with a list of threats and then synthesizes the resultant overall risk. Top-down finds some principle that transcends individual threats, thus avoiding the need to make a complete list of them.

Gott's top-down analysis begins with an observer who determines the age A of some entity (Homo sapiens in our case). If there is nothing special about the time of observation, a reasonable best estimate would assume that the observer arrives at a random time within the life of the entity. In this case Gott shows that the probability that the entity will still be alive at a future time F is

Prob(F|A) = A/(A+F).

This formula gives 1.0 at the time of observation (F=0) and then decays to zero in the infinite future. It works for many entities such as stage plays. But then Gott applied his formula to mankind, A=2000 centuries, just as though there were nothing special about the present. However, the other scholars on the list above think our time is very special because dangerous new technology is proliferating at an increasing rate. This would invalidate Gott's assumption and put us not at a random fraction of humanity's lifetime, but much closer to the end.
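A minimal sketch evaluating the formula as stated; A = 2000 centuries is Gott's figure above, and F = 51 centuries is chosen to reproduce the 97.5% figure quoted further down:

    # Gott's estimate: P(survive another F, given current age A) = A / (A + F)
    def gott_survival(age, future):
        return age / (age + future)

    print(gott_survival(2000, 51))  # ~0.975, i.e. 97.5% over 51 more centuries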

Wells developed arguments that alter Gott's formula and show that it will apply to humanity if we replace calendar time with some measure of cumulative risk exposure. Wells then estimated risk exposure based on measures of dangerously rapid "progress", especially statistics of U.S. patents issued and the number of papers published in science and engineering. These are reasonably consistent with other indicators, such as the number of pages published in Nature magazine and gross world product (pp. 77 & 78). Some issues of statistical weight degrade Wells' accuracy considerably, maybe a factor of two, but ultimately he derives a best estimate for survival probability expressed as a mathematical formula.

Guterl's book is strictly bottom-up and confined to orthodox threats, the ones nearly everybody recognizes: superviruses, big natural events (asteroid strikes, volcanoes, and the like), climate change, ecosystem degradation, synthetic biology, machines and artificial intelligence. He approaches these subjects like a journalist, describing pertinent real-life events and interviews with people involved. Wells' approach could not be more different. He interviews nobody, but he does stress the importance of many small inconspicuous threats, which in aggregate may comprise a big threat, perhaps as big as some of the orthodox ones. He gives a couple of examples. This inability to list all significant threats undermines the bottom-up approach and conveniently highlights Wells' own top-down approach.

Four of the references listed above give numerical probabilities of survival. Gott's is the outlier by far for the reason explained above: 97.5% chance of survival for 51 centuries. Rees estimates 50% probability that civilization will suffer a major setback during the next 100 years, which might kill billions. Leslie thinks the probability of extinction is about 30% after 5 centuries. Since extinction is much less probable than civilization's collapse, these two estimates agree fairly well. A poll of experts at the Oxford FHI conference asked the chance of extinction during the next 100 years. Their median answer was 19%, again in good agreement. These three estimates are bottom-up. The participants studied individual threats and then decided on numbers that seemed intuitively about right.

By contrast, Wells used his formula and found the current risk rate for civilization's collapse to be about 9% per decade and the half-life of civilization to be about 9 billion people-centuries. (Human life expressed in population-time is analogous to labor to do a job expressed in man-hours.) If the average world population over that period is 9 billion people, then the half-life of civilization is 1 century, in agreement with Rees. However, Wells evaluated some parameters from empirical data, which gave him some wiggle room to adjust his answer. Moreover, his formulation contains simplifying assumptions that do not exactly match physical reality. Thus his answers could be off by a factor of two. Still, this can be regarded as decent agreement for a quantity as slippery as human survival.
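As a rough check on that consistency, a constant risk rate converts to a half-life just as in radioactive decay. A minimal sketch, assuming the hazard really is a constant 9% per decade (a simplification; Wells' actual formulation is more involved):

    import math

    # Survival after n decades = (1 - r)**n; the half-life solves 0.5 = (1 - r)**n
    def half_life_decades(risk_per_decade):
        return math.log(0.5) / math.log(1.0 - risk_per_decade)

    print(half_life_decades(0.09))  # ~7.3 decades, i.e. roughly a century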

TeddyLiu (talk) 22:30, 15 April 2013 (UTC)

We can report on these things. If you are familiar with this field, perhaps you could summarize the major writers, works and theories? See for example Decline of the Roman Empire#Theories of a fall, decline, transition and continuity for how it is done in a field with many competing POVs and no clear correct answer. -- Green Cardamom (talk) 23:48, 15 April 2013 (UTC)

Looking at your edit today, some suggestions:

Rolf H Nelson (talk) 04:52, 19 June 2013 (UTC)

→Reverted the edit until it can get more discussion. Also, TeddyLiu, are you Wells, or someone with a connection to Wells? If so you should disclose that. Rolf H Nelson (talk) 05:15, 19 June 2013 (UTC)

I agree with Rolf H Nelson on all points, including the similarity to and POV of Willard Wells, who was here a few months ago as Will9194 (talk · contribs), proposing similar changes. As Rolf said, if you have any connection to Wells, you should disclose it immediately.
Additional notes on your proposed addition:
  • Lose the "as of May 2013"; don't ask the reader questions; avoid "More discussion below" and similar (the text "below" may later change); don't link section titles (as you did with many authors' names); in general, see WP:Manual of Style
  • We could work with summaries by author, but they should be shorter, more closely grouped by concept, and more focused on the concepts, rather than the authors themselves.
  • Have other major publications categorized the relevant literature as either "Descriptive" or "Analytic"? If not, we should probably avoid doing so.
  • Ensure the works you cite are reliable sources. As Rolf indicated, only about half of the "references" you provided are reliable sources, and even then, some are being used inappropriately.
  • At over 70kB, this is already a long article. Your addition brings it to nearly 100kB. The rule of thumb is that at these lengths, we should be more concerned with splitting the article apart, rather than adding to it. Make any additions as concise as possible.
  • There are not nearly enough references, which indicates that large portions of your text are original research. Further, the style and tone suggest original synthesis. We cannot accept either. I even see direct quotations without attribution. Be especially wary of making "connections" between (or even comparing) unrelated works (which have not already been made in reliable sources). We do not deal in the "ironic", "coincidental", etc.
You're clearly well read on this topic, and I'm sure this page could benefit from your work, but the addition you've presented is unacceptable. Please look through our core content policies and manual of style, and continue editing it in your sandbox. Mysterious Whisper 12:48, 19 June 2013 (UTC)


Re: Organizing on person vs idea, the suggested model of Decline of the Roman Empire#Theories of a fall, decline, transition and continuity is mostly organized on idea first, person second. However in this case, what we have here is each author presenting multiple original ideas. It's difficult to imagine how to provide a full survey of the various POV's of this topic without breaking it down along author lines. -- Green Cardamom (talk) 15:03, 19 June 2013 (UTC)

→Cost-benefit ratios are discussed both by Posner and by Bostrom (somewhere). Space habitats are advocated by Hawking and Rees. Curtailing civil liberties (not exactly a crowd-pleaser) is apparently advocated by Posner and Rees, and opposed by Casti. If Casti is the only one who talks about complexity, then his overall thesis might be fringe, but we can still quote him briefly to provide an alternate POV on the civil liberties thing. So I guess I'm opposed to a literature review per se, but am in favor of integrating all the literature found into the article. And again, I'm envisioning that the notable books can get their own brief pages. But, that's just my opinion; and if we get to a good draft doing it author-first I'll change my mind. Rolf H Nelson (talk) 02:22, 20 June 2013 (UTC)

I addressed the above on http://wiki.riteme.site/wiki/User:TeddyLiu/sandbox. Please make further comments there.TeddyLiu (talk) 02:29, 4 July 2013 (UTC)

Organizations studying existential risk

I've been looking at the list of organizations in Section 6. I followed the first link for U. Cambridge and found it merely a place holder for a proposed research centre, CSER. I first heard of it in July 2012, and I'm underwhelmed by their pace of progress. Suggest we delete them from the list of organizations until they get underway and have progress to report.

However, the site does have a nice quote by Prof. Huw Price: "It seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology. When this happens we're no longer the smartest things around and will risk being at the mercy of machines that are not malicious, but machines whose interests don't include us." We could add that to the Quotations section following the one by Nick Bostrom.

I'd much rather see the Lifeboat Foundation listed among the organizations. [4] They have thousands of members and a scientific advisory board that includes every relevant specialty. Seems like the public should be aware of them.

GGGudex (talk) 00:31, 26 April 2013 (UTC)

As long as they have an article, I have no problem mentioning them there (In fact, I just added a mention). However, I see that said article is in jeopardy due to WP:NOTABILITY concerns. If the article gets deleted, we can still mention them here, although in that case I'd like to see a few good references. (Although, a few good references would also keep that article from getting deleted...) Also, while I'll assume good faith now, note that knowing the founder (as you stated at Talk:Lifeboat Foundation) presents a conflict of interest, and you should avoid appearing to advertise for them to prevent any misunderstanding. (Coming to the talk page, instead of just adding the information, was the best thing you could have done, and for that I thank you.) Mysterious Whisper 01:08, 26 April 2013 (UTC)
I did a Google search for the proposed Cambridge group, Centre for the Study of Existential Risk, and found myriad citations that meet your criteria, even though they are not underway yet. To me this suggests that the number of references is a poor way to evaluate "notability". It favors the elite and those who are already entrenched or well connected.
Incidentally, Jaan Tallinn, co-founder of CSER, jaan.tallinn@gmail.com, is a member of the advisory board of Lifeboat Foundation.
GGGudex (talk) 01:45, 4 May 2013 (UTC)
If you don't like the notability guideline, here is not the place to try to change it. In any event, "The notability guideline does not determine the content of articles, but only whether the topic should have its own article." I mentioned it only because 'whether the topic [has] its own article' is part of what appears to be the established basis for inclusion in that list (notice that every other entry has a main article), to prevent it from becoming an indiscriminate collection of information.
As I said, I don't necessarily mind mentioning them here even if the main article gets deleted, but in that case I'd like to see some references. Since my initial comment here, I've twice attempted a search for references, and the extreme lack of WP:RS coverage is astounding.
I currently support the list as it stands (with both the Centre for the Study of Existential Risk and the Lifeboat Foundation mentioned), although I'm becoming more disillusioned with the Lifeboat Foundation the more I look into it.
Mysterious Whisper 03:01, 4 May 2013 (UTC)

Proposal: change this article focus to 'global catastrophic risks', move 'human extinction' risks to 'human extinction' article

"Risks to civilization" is a more vague and much more broad category than "human extinction", and human extinction already has more than enough material to fill its own article. Currently there's significant overlap between the two articles.

Also, I'm confused as to what this article is about. The title includes 'Risks to civilization', but the header says it's about "existential risk". So is it about:

1. Human existential risk, as in the straightforward (and usual?) meaning of 'risks that the human race will no longer exist'?

2. Existential risk as used (coined?) by Bostrom 2002, who includes (idiosyncratic?) concerns about civilization being "permanently crippled" in the definition?

3. Risks to civilization, in the sense of global catastrophic risks that could kill billions but would not *directly* lead to human extinction or permanent crippling of the human race?

My proposal is that we:

1. Rename this article to 'Global catastrophic risks' or 'Risks to civilization'

2. Move all 'Human extinction' risks, except perhaps a brief summary, into the human extinction article, to reduce duplication.

Thoughts? Rolf H Nelson (talk)

This article is about risks to civilization and/or humanity and/or planet Earth. It covers the full scale of possibilities. We need a master top-level article for disaster scenarios. If the article becomes too long, split along logical lines into sub-articles (by way of subsections with "main article" links). However, if you read this article, the sub-section on humanity is about risks from humanity, the cause of the disaster, not the scope of the disaster. Big difference. We don't try to slice and dice a scenario as only affecting humanity, or civilization, or Earth - because we can't know the scope of many disasters that have never happened before. And each scenario could have an impact on multiple scales. That is the fundamental problem with the existence of human extinction: it tries too hard to be only about one thing and then runs into a problem with its sources, which are more inclusive. BTW that article was created basically as a fork of this one not long after this one was created. Its editors were never cooperative with a merger and they never really developed the sourcing for it. IMO it should be merged into this one, then we can discuss ways to create sub-articles, if needed. -- Green Cardamom (talk) 06:18, 2 July 2013 (UTC)
It sounds like one thing we're in agreement on, then, is that the parts of this page that say "this article is about risks that are global and terminal" should be removed, as the scope of this article is broader than that. If there are no objections, I'll take them out, if nobody else beats me to it. Any thoughts on the article title? It seems to me that "risks to civilizations, humans, and planet Earth" is redundant; isn't a risk to humanity also a risk to civilization? Rolf H Nelson (talk)

Heat death

Just edited the bit about intelligence surviving the heat death. Right now, the consensus is that it can't, but there are prominent physicists out there who think otherwise. — Preceding unsigned comment added by 82.22.36.36 (talk) 06:35, 28 June 2013 (UTC)

I believe the Omega Point hypothesis is non-peer-reviewed WP:Fringe, and as a bonus has been discredited in the peer-reviewed literature. Rolf H Nelson (talk) 01:24, 1 July 2013 (UTC)

What I added has nothing to do with the Omega Point? — Preceding unsigned comment added by 82.22.36.36 (talk) 22:46, 1 July 2013 (UTC)

You're right, my bad. The cited URL www.aleph.se/Trans/Global/Omega/dyson.txt and the content confused me into thinking it was about Tipler's Omega Point and Final Anthropic Principle. Sorry about that. Let me start over then:
Dyson's 1979 paper is legitimate and isn't pseudo-science, but wouldn't you agree it has since been completely invalidated by the other paper you cited (http://adsabs.harvard.edu/abs/2000ApJ...531...22K), and by the discovery of the cosmological constant? (As an aside, I am not a physicist, but the argument in the second paper that "Eventually, the probability of a catastrophic failure induced by quantum mechanical fluctuations resulting in a loss of consciousness becomes important" seems definitive.) The paper concludes that "we find that eternal sentient material life is implausible in any universe"; is there evidence that this is still an open issue in physics post-1999? If not, I don't see a strong reason to reference the Dyson paper in this article. But, that's just my opinion.Rolf H Nelson (talk) 02:53, 2 July 2013 (UTC)

Yeah, you're right. However, I think it's important to at least mention the other proposed possibilities of life surviving in an expanding universe... as I said, trying to "program" or find a ready-made wormhole that is connected to another Universe is something that an advanced civilization shouldn't have terrible difficulty doing. But it's speculation.

Can you clarify? What other notable possibilities currently exist besides a wormhole, given 2000ApJ? Also, the physics that would be required to escape into baby universes could be expanded on, presumably in Ultimate_fate_of_the_universe#Life_in_a_mortal_universe, which is currently short on citations. It would also be good to have, somewhere on wikipedia, information on what physics is required for traversable wormholes in general, since it comes up often in pop science writings about science fiction. I assume you have to postulate multiple additions to the current (non-quantum-gravity) accepted laws of physics to make escape possible, it'd be good to itemize what those postulated hypotheses are. Rolf H Nelson (talk)

Bostrom's risk graph

Looks like it was deleted for some reason[5]. Anyone know what happened (or how to view the reason deleted?) -- Green Cardamom (talk) 13:39, 22 August 2013 (UTC)

Hmm.. another copy on Commons: File:X-risk chart.jpg .. -- Green Cardamom (talk) 13:41, 22 August 2013 (UTC)
What happened was that I bothered to get permission from the copyright holder to post the graph rather than just pretending I'm the copyright holder when I uploaded it like most people do. My theory is that this is so rare that nobody at Wikipedia knows how to handle this situation. I sent them an email within the 7 days required to avoid deletion, but all I got was a response by either a bot that didn't understand my email, or a human who didn't seem to actually read my email (I can't tell which). It's a minor pity, since the one I uploaded was a nice scalable SVG file. Rolf H Nelson (talk) 22:38, 25 August 2013 (UTC)

Inappropriate title

The name of this article (which specifically includes "planet Earth") contradicts with the second and third sentences of the first paragraph, which state that this article will not cover global (i.e. planet Earth-scale) issues. It appears that we need to decide just what exactly this article is about. Also... why has the title "existential risk(s)" been passed up? Wolfdog (talk) 05:19, 27 August 2013 (UTC)
You're correct, the lead is wrong and confusing. Further, it does not reflect the content and scope of the article, which catalogs all events that threaten humanity on all scales: end of civilization; and/or human extinction; and/or planetary catastrophe. The term "existential risk" has been used by some writers, but it's not well established what it means; it is usually understood within the context of something else, i.e. existential risk from technology means something different from existential risk from a planet-destroying meteorite. In the end we still have to define the scope of the disaster, as Bostrom's chart shows. That's why the current title defines all scopes of any disaster type and thus covers all bases. I think if we renamed to "existential risk" there would be dispute over what existential risk means; for example, Bostrom's chart only includes a narrow window in the top right, and scenarios in this article would arguably have to be excluded from the definition (then the article would have to be renamed "Bostrom's existential risks"). We need a central article on Wikipedia to include all these scenarios in one place, and the current title serves that. -- Green Cardamom (talk) 06:05, 27 August 2013 (UTC)
There's going to be even more dispute of what constitute a "Risk to civilization, humans, or planet Earth". But the article should stay broader than just existential risk because that's what the scope of the article has been. Rolf H Nelson (talk) 19:20, 28 August 2013 (UTC)
Wolfdog, should the article on human extinction be renamed to "existential risk"? I would argue no, because "human extinction" is less jargon-y. Rolf H Nelson (talk) 19:20, 28 August 2013 (UTC)
I don't think being swallowed by the Sun in 6 billion years is a "risk to planet Earth" in the media or literature or the English language. A risk implies a probability or a choice. Also, we shouldn't give WP:UNDUE weight to things that happen in billions of years compared with things that happen in the foreseeable future. For example, an article on the "future of Northern Ireland" wouldn't mention that Northern Ireland will be destroyed by continental drift in millions of years, because that's not the focus of mainstream discussion. By my logic, "risks" to Earth are a very small subcategory of risks to humanity, consisting solely of LHC-style threats that reliable sources give low weight to, and therefore "and planet Earth" should be dropped from the title as redundant and WP:UNDUE. Rolf H Nelson (talk) 19:20, 28 August 2013 (UTC)
A large-enough meteorite would in fact be a risk to humanity and planet earth, at possibly any time. Since these events have never happened it's very difficult to slice and dice based on scope or time of disaster. We just don't know when or how big the disaster could be. Thus it makes sense to be as inclusive as possible. -- Green Cardamom (talk) 19:56, 28 August 2013 (UTC)
A giant meteorite is usually framed as a risk to civilization or humanity or the biosphere, not the Earth. Unless you mean anything with a negative and global effect across the entire Earth is a "risk to the Earth", in which case we'd have to include things like "light pollution" and "extinction of the dolphins" as risks to the Earth. -- Rolf H Nelson (talk) 21:41, 28 August 2013 (UTC)
Let's at least remove the Earth being swallowed by the Sun et al if nobody objects, since we know the minimum timeframe of that disaster. -- Rolf H Nelson (talk) 21:41, 28 August 2013 (UTC)

Thoughts on what to do with the "Potential Sources of Existential Risk" section, which currently includes non-existential risks? Does that become "Potential Sources of Catastrophic Risks", or should there be two sections, one for existential risks, and one for catastrophic risks? Rolf H Nelson (talk) 21:59, 31 August 2013 (UTC)

Expansion of the Sun

The following statement cannot be true: "Ignoring tidal effects, the Earth would then orbit 1.7 AU (250,000,000 km) from the Sun at its maximum radius." 1.7 AU is 85% of the *diameter* of the Earth's orbit. The Earth would have to move to a higher orbit in order for this to be true. Perhaps the author meant to say 0.7AU? This still seems too large (see The Sun as a red giant.) — Preceding unsigned comment added by 96.237.189.88 (talk) 22:32, 2 September 2013 (UTC)

As the article states, the Sun loses mass during the red giant phase. This causes the radius of the Earth's orbit to increase from 1 AU to 1.7 AU. Rolf H Nelson (talk) 03:08, 4 September 2013 (UTC)
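For the curious, the standard result for slow (adiabatic) mass loss is that a circular orbit's radius scales inversely with the remaining central mass. A minimal sketch; the 59%-remaining figure is simply whatever reproduces 1.7 AU, not a sourced value:

    # Adiabatic mass loss: r_final = r_initial * (M_initial / M_final)
    def final_orbit_au(r_initial_au, mass_fraction_remaining):
        return r_initial_au / mass_fraction_remaining

    print(final_orbit_au(1.0, 0.59))  # ~1.69 AU if the Sun retains 59% of its mass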


Andromeda and the Milky Way collide

When Andromeda and the Milky Way collide, this may present a threat to planet Earth (if planet Earth still exists). — Preceding unsigned comment added by Rvam1378 (talkcontribs) 22:56, 1 November 2013 (UTC)


A major flaw in this article

Many of the threats outlined in this article cannot truly be construed as 'existential risks'.

-Events such as nuclear warfare, dramatic climate change, collapse of agriculture, etc. would most likely kill a significant percentage of human beings and would cause the destruction of human civilization, but most likely would not result in a complete eradication of the human species.

-More serious threats described here (mostly of cosmic origin) such as impact events, gamma ray bursts, etc. will most likely destroy nearly all of the human population (99%+). However, unless the event is truly catastrophic enough to wipe out the entire vertebrate phylum, the fact that current human populations are so widely distributed will allow small isolated pockets of several individuals to survive.

-Even a global pandemic with truly exceptional virulence (i.e. HIV that evolves to spread airborne) will probably fail to cause human extinction, due to the fact that humans have tremendous genetic diversity. Any such pandemic will most likely be survived by a small minority of individuals.


Humans not only have a high population and global distribution, but also have advanced technology and intelligence at their disposal. Therefore the only legitimate 'existential risks' I see in this article are:

Near future

-artificial intelligence

-experimental accidents

-biotechnology (engineered pandemic)

-single cosmic event (GRB, impact event etc) of apocalyptic magnitude

-multiple cataclysmic events occurring back to back (i.e. within a few thousand years of each other) without giving humans a chance to recover

Far future

-expansion of the sun (if humans have not colonized other star systems)

-heat death of the universe

This article includes catastrophic risks to civilization, not just existential risks; there's a separate article more narrowly about Human Extinction. BTW don't forget to sign your comments with four tildes. Rolf H Nelson (talk) 23:48, 24 November 2013 (UTC)

Removed "clarification needed"

"Some risks, such as that from asteroid impact, with a one-in-a-million chance of causing humanity's extinction in the next century, have had their probabilities predicted with considerable precision" has been tagged "clarify|reason=How can it be stated "considerable precision" if we can't test "precisely" what the odds really are?" This sounds more like a criticism or counterpoint than a request for clarification, so I removed the tag. Of course, as always, the criticism can be added within the article if desired, if a source is given. If something's actually unclear, rather than contested, please clarify the clarification request here on the talk page. Rolf H Nelson (talk) 06:30, 11 December 2013 (UTC)

Title - dubious

As has been repeatedly raised in many of the discussions above, this article's focus and thus its title are dubious. I have tagged the article with a "disputed title" template, and have also recommended that the article undergo some kind of splitting process. A risk to civilization is very different from a risk to humans, which is very different from a risk to the whole planet Earth.

Civilization, humanity, and planet Earth are items that are not necessary to the existence of one another (except perhaps in one direction, and I don't, by the way, mean to imply that they aren't related). For example, a (hypothetical) peaceful, intentional dismantling of hierarchical social structures around the world is a "risk" to civilization, but not at all necessarily to humanity or to the planet. A large meteorite is a risk to all three, but then shouldn't that just be placed under an article titled something like "Risks to planet Earth"? It's not a risk specific enough to civilization, for example, to be placed under an article called "Risks to civilization." A meteorite is also a risk to horses, trees, house flies, diamonds, cardboard, works of art, ecosystems as a whole, or specific human individuals. But each of those doesn't deserve its own "Risks to..." article. If there is enough literature on risks to civilization, for instance, then that idea deserves its own independent article. Civilization, humans, and Earth, however, should not be sloppily lumped together like this in one article. Some writers, apparently wanting to lump even more items here, have mentioned "the annihilation of... even the entire universe" -- just demonstrating how easily out of hand this is getting. We seriously need some coherence here and probably some splitting. Wolfdog (talk) 17:10, 28 December 2013 (UTC)

If there isn't sufficient literature on "risks to civilization" as a category to merit an article, then the problem solves itself; we can just delete much of the current article as insufficiently sourced. However, I suspect there is sufficient literature, and I suspect that said literature includes risks of human extinction to be within its purview. I know the FHI's Global Catastrophic Risks included human extinction. Rolf H Nelson (talk) 03:55, 1 January 2014 (UTC)
What would you propose should go in a "Risks to planet Earth" article besides (arguably) metorites? Rolf H Nelson (talk) 03:55, 1 January 2014 (UTC)
If that is the only sourced risk to planet Earth, then this article hardly needs to include "planet Earth" in its title. There doesn't, in that case, need to be any article about risks to planet Earth. We can just discuss such an issue under "Meteorites." Wolfdog (talk) 23:30, 5 January 2014 (UTC)

There are plenty of sources that cover the topic as a whole, so there is no problem with the topic itself, nor a need to split the article. As mentioned previously, the article title is just a placeholder to help readers understand what the article is about. If you go to "What links here" and look at the "Redirects" you will see dozens of alternatives; the others are less clear and more ambiguous. Wolfdog .. how familiar are you with existential studies and scenarios? The article doesn't say the things you are saying; it doesn't say that one thing is dependent on another. Nor does the literature make such neat and fine distinctions between civilization and other risks .. we have no idea how far-ranging these risks may be, they are just simply "risks" that could impact things on the various scales. The title should not be interpreted so literally, to the point of trying to split it into fine categories. The literature itself makes no such categorized distinctions, nor should we. The article doesn't make that distinction either. It could read "Risks to civilization and/or humans and/or planet Earth", but that would be long and awkward. I'm afraid the confusion here is an overly-literal interpretation of the title without reading the lead section, which is more nuanced, and where the actual scope of the article is defined. -- GreenC 16:34, 1 January 2014 (UTC)

We do not need to split the article if, as you say, it is sufficiently sourced and referred to in the sources as one unit. But doesn't an article about risks to three different (and specifically named) issues intend to cover those three as if they are similar enough to fall under a single page? If the article isn't about "such neat and fine distinctions" regarding risks, but rather about "just simply 'risks'" of various (but basically huge) scales, then the title still seems awkward and inappropriate. What is your position regarding other users' past suggestions that this title be moved back to the name "Existential risks"? Of course, the problem with this title in the past was that it seemed too unclear, a quality you mention as being undesirable. However, the phrase "existential risk (or catastrophe)" does recur quite often throughout the article (and without any set definition). How about something clearer but still broad, like "Global catastrophic risks" (which also shows up in the article)? (However, this seems biased to only one of the three risks aforementioned, which I feel is inevitable with most titles; that's why I suggested splitting to begin with.) I'm not an expert on existential risks, but I agree that since "the literature itself makes no such categorized distinctions, nor should we." However, I disagree with your comment that "The article [title(?)] doesn't make that distinction either." I feel that the title makes three blatant distinctions (perhaps arbitrarily chosen, but then sloppily so). If those are not the only three items in question, then the title is (amazingly) too narrow and not broad enough. What are we really talking about here in this article? Wolfdog (talk) 23:30, 5 January 2014 (UTC)
Is your current proposal "global catastrophic risks"? I'm in favor as I've stated before, but I'd like to hear what Green Caradamom and/or others think. My main personal concern is if anyone would find "global catastrophic risks" unclear to laymen or too jargon-y, I think it's fairly straightforward though. "Global catastrophic risks" gets about 10x hits as "Risks to civilization" in a quick Google Scholar search, though admittedly most of those uses are from Nick Bostrom and associates. Rolf H Nelson (talk) 02:54, 8 January 2014 (UTC)
This is a topic without a commonly accepted term; there are many terms, at least 42 of them, so there will always be a lack of clarity no matter what title we choose. Looking at Bostrom, who wrestled with this same problem: he titled his book Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards .. using that as an example, we could call it "Global catastrophic risk scenarios and related hazards" .. the "related hazards" leaves the title open to interpretation (i.e. for more information, read the article), although this too can cause complaint about lack of clarity. Ultimately it's up to us editors to decide what the scope of this article will be, and scope is defined in the lead section, not by the title, so any title is defensible so long as it mirrors the article. But any title will be open to complaint since there is no commonly accepted term for it. -- GreenC 04:43, 8 January 2014 (UTC)
Interesting points. I was simply observing that, again and again, users have attacked the title as too broad, vague, etc. (and, of course, 42 other options doesn't help the fact that we must come to a conclusion on one name). I guess I would, yes, personally go with "Global catastrophic risks," though I too would like to hear other users' input since, again here, I'm not an expert on the issue at all. It's occurring to me now that what's making me most uneasy, I suppose, is that "civilization" is the least like the other two. If we can clarify the title even a little bit, why not? I'm looking now at the various risks (or sources of risks) themselves. Of them, almost all (warfare and mass destruction, man-made global warming, eco-disaster, population/agricultural crisis, experimental accident, biotechnology, pandemic, ice age, volcanism, etc.) in particular concern the destruction of planet Earth or enough of Earth's environment that it can be assumed humans could very suddenly go extinct. To me, to throw "civilization" in the mix as well is trivial and possibly misleading. The only two issues I can see that fairly specifically focus on the destruction of civilization(s?) are "Artificial intelligence" and "Extraterrestrial invasion." And even these two can be simply considered generalized global risks. So, to make this long story short: civilization appears the most out of place in the title. Wolfdog (talk) 22:11, 8 January 2014 (UTC)
"warfare and mass destruction, man-made global warming, eco-disaster, population/agricultural crisis, experimental accident, biotechnology, pandemic, ice age, volcanism" Is your point just that any risk to civilization could, in theory, cause humans to go extinct if we had enough bad luck? I think most sources would disagree that an ice age, for example, would likely somehow kill everyone off every last human being alive. Rolf H Nelson (talk) 04:24, 9 January 2014 (UTC)
Well, now you're bringing in a whole new debate and, I think, a new set of premises. Calling an ice age a "risk to civilization" seems trivial (just as we wouldn't say that an ice age is a risk to computer technology or a risk to the literary arts or a risk to vacation resorts -- these are all true, but they're trivial). What is relevant is that an ice age is a "risk to human existence (as a whole, or at least in large part)" or a "risk to much of the life on planet Earth." And in response to your doubts about an ice age killing "every last human being alive," I must say that a risk that leaves only a handful of organisms of some species alive is, from an ecological point of view, still a risk to that whole species. It is ecologically highly unlikely that 10 surviving red foxes will reproduce successfully enough in a newly devastated ecosystem to continue existing as a species; in other words, those red foxes (and so their whole species) are in the midst of an extinction event. (Likewise, an ice age is certainly a risk to human existence.) Wolfdog (talk) 05:16, 9 January 2014 (UTC)
Now here are a few titles I'm proposing: [1] Global catastrophic risks | [2] Existential risks (despite this title's vagueness, at least it doesn't throw "civilization" bizarrely into the mix -- its definition can be discussed more fully in the article itself, and perhaps even explained as controversial with many interpretations) | [3] Risks to humanity (or to the human race or the human species) and planet Earth (this one, personally, still seems too specific; since we have not defined this article and are allowing the opening paragraphs to flesh it out, it is probably best that we have a broader rather than narrower title). Other ideas, or do any of these suffice? Wolfdog (talk) 05:29, 9 January 2014 (UTC)
Given that the concept of "risks to civilization" passes WP:NOTABILITY, changing the set of items covered *in this specific article* to exclude risks to civilization, without placing them in a specific alternative article, is not an option. Direct, rather than indirect, risks and threats of human extinction (which include the Sun's expansion) are covered in human extinction. The article on risks to civilization can't exclude human extinction risks, because the sources cited don't appear to exclude them (unless, *maybe*, the article gets too large and needs to be split). So I don't see the article scope changing. Rolf H Nelson (talk) 05:04, 10 January 2014 (UTC)
I'm fine with Existential risk; it's more obtuse and academic, but this is Wikipedia -- we are supposed to be academic. Bostrom's book is called Existential Risks, so there is support in the sources. The definition of existential risk, according to Bostrom, covers the current title (civilization, humanity, planet Earth), so there is no problem with article scope. I think "global catastrophic risks" works also, but that would be my second choice, because "global" will become a source of contention (a certain disaster scenario may not be global in scale yet still count as an existential risk). -- GreenC 15:59, 10 January 2014 (UTC)
I think there's some (understandable) confusion here. By Bostrom 2002's definition, most risks to civilization aren't existential risks; only risks that would permanently cripple civilization, rather than ones it could recover from within a generation, are existential risks. "Repressive totalitarian global regime" is one example he gives. Outside of Bostrom, the scope of "existential risk" seems to be narrower, focusing more squarely on human extinction. Bostrom et al.'s Global Catastrophic Risks has a broader focus, more along the lines of what most people would characterize as "risks to civilization" (although a bit broader, including catastrophes that kill a lot of people but don't imperil civilization). Please let me know if you disagree; this is a pretty important consideration for making the article consistent and understandable. Rolf H Nelson (talk) 23:10, 11 January 2014 (UTC)
I see. Sorry for the confusion. You're right: Bostrom's book is called Global Catastrophic Risks, the Oxford-based Future of Humanity Institute has a http://www.global-catastrophic-risks.com/ (chaired by Bostrom), and there is the http://gcrinstitute.org/ (Global Catastrophic Risk Institute) founded by Seth Baum .. so those are some evidence-based reasons to use it. The danger would be a POV problem in supporting a particular researcher's coinage, but it helps that it's not just Bostrom. -- GreenC 04:51, 12 January 2014 (UTC)
Perhaps the problem for me here is that the sources all use "civilization" and then rarely define it, though it may have various definitions (and though I have a very strict definition of it in my head, used for example in social sciences like anthropology: a civilization is a type of human society based on densely populated settlement(s), agriculture as the almost total source of food supply, importation of resources, and constant expansion). If the sources are unclear, I guess there is nothing to do about the title; it merely reflects those references that loosely throw the term around. The way I see it, the term "Risks to civilization" is used in contradiction with the sources, which seem to use it almost simply to mean "Risks to a large portion of humanity." Is that the case? Wolfdog (talk) 02:03, 13 January 2014 (UTC)
I agree that our sources don't generally distinguish between "risks to a large portion of humanity" and "risks to civilization". Perhaps this is a "contradiction", or perhaps it's just pragmatic because most risks to one are also a risk to the other. Rolf H Nelson (talk) 01:51, 23 January 2014 (UTC)

Requested move

The following discussion is an archived discussion of a requested move. Please do not modify it. Subsequent comments should be made in a new section on the talk page. Editors desiring to contest the closing decision should consider a move review. No further edits should be made to this section.

The result of the move request was: Page moved to Global catastrophic risks Rolf H Nelson (talk) 04:42, 29 January 2014 (UTC) (non-admin closure)



Risks to civilization, humans, and planet Earth → Global catastrophic risks – Nick Bostrom et al.'s term "Global catastrophic risks" is a more apt name. Although it may be considered a broader title than the current one, this is appropriate for the expanse of the article's content. Discussions and multiple citations here demonstrate Bostrom's position as the foremost scholar in this area; he has himself contended with the controversies of naming such a class of risks. [6] Wolfdog (talk) 23:39, 20 January 2014 (UTC)

  • Regardless of any scholar's naming policy, we follow Wikipedia's here. But WP:CONCISE seems to lead me to strongly support. Red Slash 03:54, 21 January 2014 (UTC)
  • Support. -- GreenC 04:43, 21 January 2014 (UTC)
  • Support. Makes sense, including for succinctness. DA Sonnenfeld (talk) 14:56, 21 January 2014 (UTC)
  • Support. Moving since there's no evidence of controversy. Edit: actually I didn't; the 'move' button is missing -- probably it gets suppressed while a Requested Move discussion is open. Rolf H Nelson (talk) 02:01, 23 January 2014 (UTC)
  • Support per conciseness as mentioned. Current title could conceivably allow for any and all human risks. Not sure the suggested title is ideal as I don't think it's a phrase in common usage but I lack a better suggestion. (Also it's nice to avoid the US English and Oxford comma.) benmoore 22:07, 23 January 2014 (UTC)
  • Support. Better than the current title, although I would prefer "Risks of Human Extinction" or "Existential Risks." Although far less likely to end humanity than the classically included x-risks, generic GCRs can still be in this category, since they could push us towards a more fragile state or even prevent long-term "technological maturity." Anthropic Optimist (talk) 13:30, 27 January 2014 (UTC)
  • Support. Better title. GCR usage in the literature does encompass creeping risks besides the dramatic catastrophes, so the article contents still fit. I think it might be helpful later to separate out the issues and explanations unique to existential risks (finality, special moral weight) into a separate section or their own page, but the base risks do belong here. Anders Sandberg (talk) 02:10, 29 January 2014 (UTC)
The above discussion is preserved as an archive of a requested move. Please do not modify it. Subsequent comments should be made in a new section on this talk page or in a move review. No further edits should be made to this section.

Ice age section incorrect?

I believe there were civilizations, primitive by today's standards, that were nonetheless organized into societies and had navigation and architecture. The irony is that, if that's correct, civilization has technically already ended before. Without a citeable reference, I'll leave this out of the article. No, Ancient Aliens doesn't count! 2601:1:9280:155:E179:39EF:33A1:CDAB (talk) 03:22, 17 April 2014 (UTC)

Splitting off existential risk

An editor has been attempting to fork this article, creating a new article existential risk. This is a significant re-arrangement of ideas that needs consensus. I disagree with the change, as it's confusing and unnecessary. This article is already about existential risk, and it's too difficult and confusing to have two articles on the same topic. If required, we can entertain renaming this article to existential risk, but we just had a major discussion on what to name this article (see above) and there was no consensus for naming it existential risk. -- GreenC 13:45, 13 June 2014 (UTC)

Merger proposal

I propose that Human extinction be merged into this article. The content of the two articles overlaps completely. Jytdog (talk) 20:40, 23 July 2014 (UTC)

  • Oppose. The nominator has the evidence on their side, but I wish to make an objection. "Human extinction" was created in February 2005, with only one-third of its scope dedicated to global catastrophic risks.[7] In March 2005, a user forked out a discussion of the catastrophic risks into a new article, "Global catastrophic risks", which framed these risks as "human extinction scenarios".[8] So the original article was not supposed to discuss just scenarios, but rather the historical roots of the idea, observations and perceptions of the risk, and related ideas. Over time, however, the two articles have come to overlap as the nominator suggests, but "Human extinction" is still focused on the human side of the equation. If "Human extinction" is to survive (!) as a separate article, it should take a philosophical approach similar to The World Without Us and Our Final Hour, with less focus on the risks themselves and more on how the planet would change. It could also cover actual human extinctions, such as those of indigenous peoples and their cultural practices, including language. Viriditas (talk) 22:34, 24 July 2014 (UTC)
  • Mild oppose. The topics overlap; so do the topics for 'Dog' and 'Mammal', but both have their own Wikipedia page because the former can focus more specifically on its subtopic. Rolf H Nelson (talk) 04:24, 29 July 2014 (UTC)
  • Support The article as written is a fork (this one is broader in scope and more inclusive, so it should be the parent). If it were redone in a way that didn't fork so much content and focused on original material, I'd be more inclined to keep it. I agree with Viriditas that if it stays it should take a new and original tack along the lines of The World Without Us, looking at what the world would be like after a human extinction event, based on sources such as that book. Specific extinction scenarios should be left to this article; doing so would remove half the article (some brief discussion could remain with a "main article" link to here). -- GreenC 14:15, 29 July 2014 (UTC)
  • Oppose - While there may be a lot of overlap, that can be fixed with time and editing, and we aren't in a rush. At their core, they are two different topics, even if they don't look that way now. Human extinction doesn't require destruction of the planet in any way, although that is one route. To be topical, if the Ebola virus wiped out all of humankind, the planet as a whole system would hardly notice, except for cleaner air and water. By virtue of its title, Global catastrophic risks should be broader and less homo-centric, focusing on entire systems more than on individual species. In the end, one article is about a singular species (Homo sapiens sapiens); the other is about a biosphere called Earth, including all living things within it (Gaia hypothesis and others). And of course there will be overlap; one lives within the other. That said, I agree they need improvement and refinement, but I think the solution is accomplished by editing, not by merging titles that have legitimate reasons to be separate. Dennis 14:46, 11 October 2014 (UTC)

Global catastrophic risk vs risks

Speaking of this class of risks in the singular is awkward and unusual. The article is about a class of risks that includes many risks; by making the title singular, it is no longer about a class but about a single risk, grammatically speaking. WP:SINGULAR says "Exceptions include .. the names of classes of objects (e.g. Arabic numerals or Bantu languages)." It's the same here: this topic includes many risks, and the title should reflect that. We don't say "Arabic numeral" unless the topic is about a single number. -- GreenC 22:18, 30 January 2014 (UTC)

Speaking about any class in the singular is awkward and unusual. We don't say "dog" or "cat" when talking about the breeds or when talking about all cats or dogs. We say "dogs" or "cats". By making any class singular, it is no longer about the class but a single type from that class, but that is how Wikipedia articles are named. Types are named in the singular. The exception applies to sets that are usually referred to as sets. Crown jewels, for instance. Examples of types of things include human, planet, extinction event, war, pandemic, disaster, earthquake, tropical storm, risk, financial risk, market risk, existential risk, catastrophe, catastrophic illness, catastrophic injury, catastrophic failure, and global catastrophic risk. WP:Singular applies. The Transhumanist 18:35, 8 November 2014 (UTC)
Agreed. There's no reason to make an exception for this article. And the singular form is hardly unheard of when talking about risks as a class, anyway (e.g., the Centre for the Study of Existential Risk). ∴ ZX95 [discuss] 19:10, 8 November 2014 (UTC)

Probability of an existential catastrophe

Most of the material in this section seems *extremely* speculative and, yet, oddly precise. As for the table listing probabilities for human extinction by 2100... well, this seems silly to me. I'd be in favor of removing a lot of this material. Thoughts, anybody? Grandma (talk) 03:17, 9 December 2014 (UTC)

One-off extreme risks are, by their nature, intrinsically speculative; nonetheless some sources consider attempting to estimate them important. Is there a change that would make it more clear to you and others that they are imprecise estimates? Rolf H Nelson (talk) 08:14, 9 December 2014 (UTC)

The first sentence of the section reads "The following are examples of individuals and institutions that have made probability predictions about existential events." This is how Wikipedia is supposed to work: presenting multiple points of view from reliable sources. If there is a sentence or fact in the section that needs better sourcing, one option is to mark it with a {{fact}} tag. -- GreenC 12:54, 9 December 2014 (UTC)
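For illustration, a minimal wikitext sketch of such a tag (the sentence and date below are placeholders, not taken from the article):

Some experts estimate the overall probability at 19%.{{fact|date=December 2014}}

The tag renders as a [citation needed] superscript and files the article in a dated cleanup category, so other editors can find and source the claim.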

Graph on BBC website

Interesting chart with scenarios on BBC. I think we have most of these covered. -- GreenC 14:30, 5 April 2015 (UTC)

Recent changes

The recent changes were significant and there is no consensus for them. For starters, "existential risk" means 100% annihilation of every human on the planet, and none of the scenarios are inherently existential risks (except the very long range ones) .. they could be or could not be .. all scenarios carry the possibility of "existential risk", but none of them guarantee it, and it's impossible to predict - so categorizing them as existential vs. non-existential is meaningless. -- GreenC 01:33, 28 April 2015 (UTC)

The distinction between existential risks and other GCR's already exists on the page, e.g. in the "classifications of risk" section and the "precautions and prevention" section. Introducing the same distinction in section 4 would make the page more consistent. (The existing division into anthropogenic and non-anthropogenic risks is not precise either, e.g. global pandemic could be anthropogenic as well, but it is informative nonetheless.) Some scenarios carry a significantly higher probability of human extinction than others, and it would be more informative to have them categorized separately than to have all of them lumped together. Would it work to categorize the scenarios as "most likely existential risks" vs "less likely existential risks"? Polar Mermaid (talk) 17:45, 28 April 2015 (UTC)
How do we know which specific scenarios count as existential or not? It's based on predicting the future and our own original "research" (opinion). Nor can we assign relative probabilities, as that again would be original research (opinion). You're correct that there is overlap with global pandemic. That could be resolved the same way global warming is, with one section covering the human-caused version and the other the naturally caused one: for global pandemic, one section could be on natural pandemics, the other on terrorism. These are good distinctions, as they involve different scenarios and solutions, even if the ultimate mechanism of death is the same. -- GreenC 14:08, 29 April 2015 (UTC)
Concerning the issue of probabilities: there are a set of events in the original catastrophic scenario, and a set of follow-up events that have some (unknown) probability of occurring. For example, the artificial intelligence scenario assumes that superintelligent AI has arisen. It then considers a number of follow-up events that have certain (unknown) probabilities, some leading to human extinction. Other scenarios, such as megatsunamis, do not include global human extinction as a possible outcome. Hence, there is a clear categorical difference between AI and megatsunamis. Similar to the suggestion by Polar Mermaid, we could categorize risks as "potentially existential" and "not potentially existential." This would avoid the use of probabilities and also allow us to include this important distinction, which is already highlighted throughout the article (Section 1, "Classification of risks," and Section 3, "Moral importance of existential risk").Davearthurs (talk) 18:48, 7 May 2015 (UTC)
"No potential" according to who? Therein is the problem. Also, there is no hard definition of existential risk. The term is sometimes used meaning 100% extinction - but it doesn't have to mean that. It is one of context - who uses it, what they mean by it. We use it here to mean 100% extinction, but it has a flexible meaning. -- GreenC 19:48, 7 May 2015 (UTC)

Also, why does this article need a "discredited scenarios" section containing only the Mayan end-of-the-world scenario? This religious apocalypse scenario really does not fit in with the scientific style of the rest of the page and pulls down its credibility. And why single out the Mayan scenario among the many religious predictions of apocalypse? Polar Mermaid (talk) 17:57, 28 April 2015 (UTC)

Seems relevant to discuss history, not just current thinking. Granted, there is probably a better article for that in detail (millennialism, apocalyptic thinking), but we should still have a subsection on history. Right now it's an out-of-place single scenario, but it could be redone as a historical overview of the field. -- GreenC 14:08, 29 April 2015 (UTC)

Stewart Quote

@Drbogdan: I fail to see the significance of the Stewart quote restored to the article in this diff [9]. I'll admit it's dramatic dialogue, and I would expect to see it included in, say, a novel, but it needs to be pertinent as well. See WP:QUOTEFARM. I'm aware that you have sources for it, but the existence of sources is not in itself justification for content, per WP:INDISCRIMINATE. Not all information is of equal value, and inclusion of insignificant information degrades the usefulness of an article. Geogene (talk) 22:09, 4 June 2015 (UTC)

@Geogene: Thank you *very much* for your comments - they're all *greatly* appreciated - however, imo the quote (see below) seems relevant, pertinent and significant in the "Global catastrophic risk" article - it helps readers understand, from a reliable and official source, that a quick (and/or real?) solution to a possible asteroid impact event, as perhaps suggested in the popular media, may not be possible at the moment or in the foreseeable future - (afaik, we may be no better off now than the dinosaurs 65 million years ago confronting a similar asteroid event) - the quote seems to present our current predicament *very* well (& realistically) imo - however, I'm aware that, per "WP:OWN", "All Wikipedia content ... is edited collaboratively", so there may be other opinions from other editors about this - therefore - other opinions are welcome of course - in any case - Thanks again for your own comments - and - Enjoy! :) Drbogdan (talk) 22:57, 4 June 2015 (UTC)
If we want the readers to know this, then we need to tell them this ourselves, in Wikipedia's voice, cited to reliable secondary sources. Geogene (talk) 23:10, 4 June 2015 (UTC)
Thank you for your suggestion - Yes - your suggestion may also be ok - however, the present quote (see above) seems sufficient - and a better way of presenting the notion at the moment afaik - as before, opinions from others are welcome - iac - Enjoy! :) Drbogdan (talk) 23:32, 4 June 2015 (UTC)

@Green Cardamom: I oppose your reversion of my edit here [10] removing external links from the body of the article. According to the relevant guideline, WP:EL, external links should not normally be placed in the body of an article. I see no justification in that guideline to maintain a directory of websites of organizations that are tangentially related to the article's subject. Also see WP:ELNO #1, #4, and #19, and WP:ELBURDEN. Geogene (talk) 22:43, 4 June 2015 (UTC)

They are not "external links", they are references linking to the relevant organizations. They should be moved into citation templates not deleted. -- GreenC 01:00, 5 June 2015 (UTC)
From the first sentence of the guideline I cited above: Wikipedia articles may include links to web pages outside Wikipedia (external links), but they should not normally be placed in the body of an article. All external links must conform to certain formatting restrictions. Request you self-revert. Geogene (talk) 01:26, 5 June 2015 (UTC)
WP:ELPOINTS #1: "This guideline does not apply to inline citations or general references" .. those links are references per WP:GENREF. They should be moved to citation templates, but that is not a reason to delete the entire section just because no one has yet added cite templates. -- GreenC 03:10, 5 June 2015 (UTC)
So, linkspam is a "reference" now? Some of those might be okay. Geogene (talk) 03:27, 5 June 2015 (UTC)
I've removed those individually, giving a specific justification for each in an edit summary. One seems obscure and appears to get a disproportionate percentage of its traffic from this article, which looks spammy to me. The others were all variations on a theme of "about us" and "how to donate", which aren't useful as references (or generally as external links). I also removed an external link to a paper that was about individual rather than collective mortality, and a link to a webarchive of somebody's personal essay. Geogene (talk) 03:46, 5 June 2015 (UTC)
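For reference, a minimal wikitext sketch of what moving one of the surviving organization links into a citation template, as suggested above, might look like; the organization name, URL, and access date here are placeholders, not taken from the article:

<ref>{{cite web |url=https://example.org/about |title=About the Example Risk Institute |publisher=Example Risk Institute |access-date=5 June 2015}}</ref>

Formatted this way, the link is attached to the specific claim it supports rather than sitting in the body as a bare external link.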

Proposal: Add moral implications of existential risk reduction and reasons why more effort isn't being dedicated to decrease it

More specifically, would it be okay to discuss reasons for decreasing existential risks along with the cognitive biases and economic reasons why more effort isn't being devoted to doing so? Nick Bostrom's paper "Existential Risk Prevention as Global Priority" discusses this extensively. — Preceding unsigned comment added by Y2N1-09631 (talk • contribs) 04:04, 1 July 2015 (UTC)

Maybe. Here's an article from The Atlantic: We're Underestimating the Risk of Human Extinction: "Bostrom, who directs Oxford's Future of Humanity Institute, has argued over the course of several papers that human extinction risks are poorly understood and, worse still, severely underestimated by society. ... If you have that moral point of view that future generations matter in proportion to their population numbers, then you get this very stark implication that existential risk mitigation has a much higher utility than pretty much anything else that you could do." — Jeraphine Gryphon (talk) 12:31, 1 July 2015 (UTC)

OK, I added more cognitive biases affecting the perception of existential risks, and added economic reasons why more effort isn't being spent to decrease them. --Y2N1-09631 (talk) 19:19, 20 August 2015 (UTC)

Revising classification of risk section

I organized this section to reflect that there are different ways of classifying risk, such as the different axes used in Bostrom's scope/severity chart. A section on severity of risk should be added.

In the future, it would be good to bring discussion of the different GCR scenarios into a more organized treatment of their classification, rather than having them in a disorganized list. Right now they are only classified as anthropogenic vs. natural, but there are other meaningful distinctions. Davearthurs (talk) 18:54, 3 October 2015 (UTC)

We've had this discussion many times; see the history. It's hard to deal with on Wikipedia because as soon as you classify something, an editor will complain about the classification being subjective. The solution has been a simple, non-ambiguous break (anthropogenic vs. non-anthropogenic) and leaving the splitting and chunking to the experts, who can classify according to whatever aspect they are trying to emphasize. Classification presumes scenarios will play out a certain way; it requires an argument and a position. We can, though, report on other experts' classification schemes. -- GreenC 19:49, 3 October 2015 (UTC)
Also, Bostrom makes the split between natural and anthropogenic existential risks in a presentation given to the United Nations.[11] -- GreenC 16:32, 15 October 2015 (UTC)

Definition of existential risk

Here is the Future of Life Institute's definition of an existential risk.[12] It mirrors our own attempts at defining what this article is about. -- GreenC 16:37, 15 October 2015 (UTC)

Please add {{Authority control |VIAF=xxxxxx |LCCN=n/xx/xxxxxx |ISNI=xxxx xxxx xxxx xxxx |ORCID=xxxx-xxxx-xxxx-xxxx |GND=xxxxxx |SELIBR=xxxxxx |SUDOC=xxxxxxxxx |BNF=xxxxxx |BPN=xxxxx |RID=xxxxx |BIBSYS=xxxxx |ULAN=xxxxx |MBA=xxxxxx |NLA=xxxxxxx |NDL=xxxxxxxx}}....? — 73.47.37.131 (talk) 19:31, 1 January 2016 (UTC)

Not done. Not relevant here. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 19:08, 24 March 2016 (UTC)

Catastrophic risks in development that our current civilization has already passed by

Some catastrophes have, luckily, not taken place.

Human societies could have depleted the easily available non-renewable resources, like minerals (and/or other resources that renew only over the relatively long term, such as wood), before reaching a higher level of technology such as ours, one that allows, inter alia, the use of natural resources that are more difficult to reach, and access to new technologies (such as solar modules as a replacement for fire). Human societies could, for example, have developed to a level of technology and civilization similar to the Roman Empire, but then further development could have stalled and humanity could have fallen back to permanent stone-age societies. The same could already have happened after copper ores were depleted, had iron ore not been found as a more readily available resource. Once global society had lost its original experience of refining metals, a similar technological development would have been far more difficult to achieve again, on geological time-scales, because the readily available resources would already have been used up. Such a hypothetical development should also be counted as a "global catastrophic risk" that, in any case, would have ended the technological development of humanity.

While no longer a threat, such developments would have to be factored into questions like the Drake equation, i.e. the chance of intelligent species developing into technologically advanced societies. — Preceding unsigned comment added by Meerwind7 (talk • contribs) 17:49, 25 February 2016 (UTC)
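For reference (the equation below is added for context and is not part of the comment above), the Drake equation in its standard form is

<math>N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L</math>

where N is the number of detectable civilizations in the galaxy, R_* the rate of star formation, f_p the fraction of stars with planets, n_e the number of potentially habitable planets per such star, f_l the fraction of those on which life arises, f_i the fraction of those that develop intelligence, f_c the fraction of those that produce detectable technology, and L the length of time such civilizations remain detectable. A permanently stalled technological development of the kind described above would presumably enter as a reduction of the f_c term, though attributing it to one term is an interpretive reading.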

Introducing the Subfield of Agential Riskology

"Introducing the Subfield of Agential Riskology". Interesting perspectives on ways to look at existential risks. -- GreenC 15:38, 27 February 2016 (UTC)

Hello fellow Wikipedians,

I have just added archive links to one external link on Global catastrophic risk. Please take a moment to review my edit. You may add {{cbignore}} after the link to keep me from modifying it if I keep adding bad data, but formatting bugs should be reported instead. Alternatively, you can add {{nobots|deny=InternetArchiveBot}} to keep me off the page altogether, though this should be used only as a last resort. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).

checkY An editor has reviewed this edit and fixed any errors that were found.

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—cyberbot II (Talk to my owner: Online) 05:54, 29 March 2016 (UTC)

Global Challenges Foundation annual report 2016

This is an excellent report.[13]

-- GreenC 16:11, 30 April 2016 (UTC)

Putting actual estimated risk in the lead

Per WP:LEAD, the lead should contain a summary of what makes the article interesting. Adding the information that you are five times more likely to die from a near-extinction event than from a car crash makes the article so much more interesting. What do you think? Daniel.Cardenas (talk) 18:00, 30 April 2016 (UTC)

It's just one report; there are many others. Why are you trying to push that one into the lead section? It's also fairly trivial in nature and limited to Americans only, and it's a singular POV from a single report. We have a whole section on probabilities, plus many other reports that should be in the article. -- GreenC 18:24, 30 April 2016 (UTC)
Add a better report if you have one. Why are you pushing so hard for its removal when it makes the article so much more interesting? Daniel.Cardenas (talk) 18:44, 30 April 2016 (UTC)
Excerpt from wp:lead:
A good lead section cultivates the reader's interest in reading more of the article...
Daniel.Cardenas (talk) 18:53, 30 April 2016 (UTC)
The lead is a summary of the article's contents; it is not a place to put original content just because you want to push it up to where people can see it. That's a problem with WP:WEIGHT. Also, you are messing up the references. And you don't have consensus for these edits, so stop edit warring. -- GreenC 20:07, 30 April 2016 (UTC)
Also, this report just came out yesterday (Atlantic article) - it's brand new, making some really far-out, sensational statistical claims, and it's had no time to establish consensus or further discussion in the community of scholars in this field. Calling it "interesting" is what newspapers do with sensational headlines, which we are not supposed to do per LEAD. -- GreenC 20:21, 30 April 2016 (UTC)
What makes you think it is original content? Feel free to fix any reference issue. You don't have consensus, so stop edit warring. The claim is much less sensational than others prominently listed in the article. Is it a couple of orders of magnitude less sensational than this?:
Risk: estimated probability of human extinction before 2100 (from an expert survey)
  • Overall probability: 19%
  • Molecular nanotechnology weapons: 5%
  • Superintelligent AI: 5%
  • Non-nuclear wars: 4%
  • Engineered pandemic: 2%
  • Nuclear wars: 1%
  • Nanotechnology accident: 0.5%
  • Natural pandemic: 0.05%
  • Nuclear terrorism: 0.03%
Since it is much less sensational, there is no basis for the revert. WP:BRD is not a valid excuse for reverting good-faith efforts to improve a page simply because you don't like the changes. Daniel.Cardenas (talk) 22:49, 30 April 2016 (UTC)
There is plenty of room for multiple POVs in the article; we could list hundreds of them. There is not plenty of room for multiple POVs in the lead section. You're picking the one you personally favor to highlight in the lead section and edit warring to keep it there. Your bias is made clear by your assertion that it is "interesting"; what's more, you deleted the content from the article body, so it only exists in the lead section! Do you realize how broken that is? Lead sections are not supposed to have original content; they duplicate content from the body in summary format. I already fixed the broken refs, but you keep re-breaking them with your reverts. You don't have consensus; per WP:BRD the burden is on you, but you obviously don't respect BRD. If you're unable to get consensus then I will.. but I'll give it some time to see if anyone wants to comment first; hopefully we can just take care of this quickly with a 2-against-1 situation, otherwise it will be a longer haul to bring in outside editors to comment. -- GreenC 00:08, 1 May 2016 (UTC)
What makes you think I personally favor this one? Have you proposed something else? Something to add or contrast or replace might be good. Others may add to the interest for the article. Didn't intend to delete anything from the article body. Will review and revert if I did. Daniel.Cardenas (talk) 02:35, 1 May 2016 (UTC)
Didn't find where I deleted that content in the body and put it in the lead. Here is the only thing I found that I deleted from body: https://wiki.riteme.site/w/index.php?title=Global_catastrophic_risk&type=revision&diff=717934052&oldid=717933799 02:39, 1 May 2016 (UTC)

Do I really need to answer this? If anyone is interested, look at Daniel.Cardenas's most recent edit to see the content he deleted but is apparently unwilling or unable to see. [14] He also messed up the refs: refs #4 and #5 are the same as #121 and #122. I tried to fix this, but he keeps reverting. More significantly, the lead section is not the place for a small, not-well-known organization that just released a new report; it is undue WP:WEIGHT and possibly marketing. -- GreenC 05:00, 1 May 2016 (UTC)

Didn't see that, because I was looking for changes with deleted content and that was part of an add. Fixed now. Daniel.Cardenas (talk) 12:39, 1 May 2016 (UTC)
You made an attempt to fix it, but you are not paying attention and broke it further. Take your time. Look closely. Examine the references. Notice how it's in THREE different sections: the lead section (the part that comes first, before the first sub-section), the sub-section called "Likelihoods", and the sub-section called "Organizations". Furthermore, when a reference is used more than once in an article (note: this reference is used THREE times), you define it only once; the other uses (two in this case) employ the name system, for example "<ref name=whatever/>". You've been on Wikipedia for 10 years, according to your page; you should know the basics of how to do references. -- GreenC 20:09, 2 May 2016 (UTC)
Thanks! Daniel.Cardenas (talk) 02:19, 3 May 2016 (UTC)
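To illustrate the named-reference system described above, a minimal wikitext sketch (the ref name, title, and URL are placeholders, not from the article):

First use: <ref name="report2016">{{cite web |url=https://example.org/report |title=Example Annual Report 2016 |publisher=Example Foundation |access-date=1 May 2016}}</ref>
Later uses: <ref name="report2016"/>

Defined once this way, the same footnote can be reused in the lead, "Likelihoods", and "Organizations" sections without creating duplicate reference entries.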

Hello fellow Wikipedians,

I have just modified 2 external links on Global catastrophic risk. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

This one failed, but it doesn't appear to be a key source. Rolf H Nelson (talk) 12:20, 10 July 2016 (UTC)

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).

checkY An editor has reviewed this edit and fixed any errors that were found.

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—cyberbot II (Talk to my owner: Online) 06:10, 2 July 2016 (UTC)

Overrepresentation of AI/nanotech

Does it seem to anyone else that this article overstates the risk of nanotechnology and artificial intelligence? I'm as much of a sci-fi buff as anybody, but these two seem pretty far-fetched compared to, e.g., climate change and pandemics; given that the current state of these technologies basically boils down to carbon nanotubes and a liberal application of gradient descent, respectively, they seem roughly as likely to end civilization as a hostile alien invasion.

Maybe it would be more appropriate to have a brief "unforeseen technological disasters" section with links to the respective pages? I find both subjects interesting but their inclusion detracts from the credibility of the article. Bhlev (talk) 04:51, 22 August 2016 (UTC)

The point of this field of study (futurism) is not where things are today but where they are possibly headed in the future. There are plenty of sources (indeed, specialized institutions) that discuss the future potential of AI and nanotechnology to cause a catastrophe. You're welcome to disagree, but we shouldn't impose our own opinions when there are so many sources. Your opinion contradicts Stephen Hawking and Elon Musk, among others, who say AI is the greatest threat to humanity. Are they right? Are you right? I don't know, and it doesn't matter; we go by what reliable sources say. Given the huge number of sources on this topic, there is no way it's being over-represented. Other than global warming, it probably has more sourcing available than any other topic. -- GreenC 13:10, 22 August 2016 (UTC)

Hello fellow Wikipedians,

I have just modified 4 external links on Global catastrophic risk. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 03:30, 13 January 2017 (UTC)

Hello fellow Wikipedians,

I have just modified 5 external links on Global catastrophic risk. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 15:08, 22 May 2017 (UTC)