Existential risk

Artist's impression of a large asteroid impact. The impact of a sufficiently large asteroid may have been responsible for the extinction of the dinosaurs.

An existential risk for humanity is an event capable of extinguishing intelligent life that originated on Earth, or of drastically and permanently curtailing its desired development. Existential risks differ from other forms of risk both in their global scope and in the extreme severity of the damage. Natural disasters such as supervolcanoes and asteroids can pose existential risks if powerful enough, but events of human origin such as nuclear war, bioterrorism, artificial intelligence or catastrophic climate change can likewise threaten the continued existence of intelligent life on Earth.

While individual threats such as nuclear war or climate change are being intensively investigated, the systematic analysis of existential risks, despite their enormous importance, began only in 2002. This may also be due to the high degree of interdisciplinary knowledge required to research existential risks.

Possible scenarios

Various events could pose an existential risk to humanity or even to all life on Earth:

Authoritarian world government

A world government dominated by a totalitarian regime could threaten the continued existence of all humanity through a global genocide (omnicide), or at least exterminate a large part of the world's population, while suppressing resistance through mass surveillance and the restriction of civil rights. Large and powerful states are likely to pose a greater threat than small rogue states, which could seriously endanger humanity's survival only with weapons of mass destruction, whereas a world government would be capable of a creeping omnicide.

Such scenarios are often explored in alternate history fiction. Aldous Huxley's Brave New World and George Orwell's 1984 are the best-known representatives of the genre. In Philip K. Dick's novel The Man in the High Castle, the victorious National Socialists have murdered all Black Africans by 1962 and processed them into grease.

Nuclear holocaust

Even a single nuclear warhead can destroy a city and extinguish thousands of lives. The use of even a fraction of the world's arsenals could kill millions or even billions of people through direct effects, while the aftermath of the explosions could ultimately exterminate humanity:

Destroyed infrastructure, poisoned water, germline mutations, rising cancer rates, a collapsing ecosystem, the breakdown of social institutions and a nuclear winter could reduce the survivors of the explosions to the point of extinction.

Environmental disasters

The loss of biodiversity through, among other things, hunting, invasive species (neobiota) and environmental pollution could, via certain key events, lead to the end of humanity in the form of a global ecocide.

Pandemics

A global pandemic surpassing HIV, the Black Death or the Spanish flu in virulence and lethality could mean the end of humanity. The Quaternary extinction event may have been triggered by such a global plague, an explanation known as the "hyperdisease hypothesis". It was developed by Ross MacPhee (American Museum of Natural History) and Preston Marx (Aaron Diamond AIDS Research Center and Tulane University), who argue that a highly infectious and fatal disease may have played a key role in the decimation of the North American megafauna. Such diseases are, however, difficult to substantiate with paleontological evidence. The extinction of the endemic rat species of Christmas Island may have been caused directly by diseases spread by the fleas of introduced rats; DNA analyses support this assumption.

Bioterrorism

Developments in biotechnology, genetics and genomics have lasting effects on global security. A hyperdisease pathogen in the hands of terrorists could mean the end of humanity. The following so-called category A pathogens can be used as biological weapons even in their natural form:

  1. Bacillus anthracis (anthrax)
  2. Clostridium botulinum toxin (botulism)
  3. Yersinia pestis (plague)
  4. Variola major (smallpox) and related pox viruses
  5. Francisella tularensis (tularemia)
  6. Viral hemorrhagic fevers:
     - Arenaviruses: Junín, Machupo, Guanarito, Chapare, Lassa, Lujo
     - Bunyaviruses
     - Hantaviruses
     - Flaviviruses: Dengue
     - Filoviruses: Ebola, Marburg virus

Genetically modified pathogens in biological weapons (e.g. "Ebolapox") would be particularly suited to triggering a final pandemic.

Gray goo

Molecular nanotechnology could conceivably lead to the construction of bacterium-sized robots that replicate themselves and require inorganic or organic compounds for their replication. By destroying or poisoning the biosphere or blocking out sunlight in this exponentially accelerating process, they could jeopardize the continued existence of humanity and of all life on Earth. Safety measures such as shielded containment, restricting the robots' "food" to rare elements, or programmed locks could hypothetically reduce the risk of such a catastrophe; however, a design accident or the misuse of the nanorobots as weapons could bypass these barriers.
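
How quickly such exponential replication could overwhelm the biosphere can be illustrated with a back-of-the-envelope calculation (the doubling time and biomass figures here are illustrative assumptions, not values from the literature). A population of replicators that doubles every τ seconds grows as

N(t) = 2^(t/τ)

With an assumed doubling time of τ = 1000 s, about 135 doublings, i.e. roughly a day and a half, would give 2^135 ≈ 4 × 10^40 robots, comparable to the estimated number of carbon atoms in the Earth's entire biomass (on the order of 5 × 10^14 kg of carbon). Unchecked exponential growth thus exhausts any finite resource on very short timescales.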

Escalating global warming

There is a possibility that the release of greenhouse gases into the atmosphere could turn out to be a strongly self-reinforcing feedback process in which, beyond a certain point of no return, neither avoidance strategies nor geoengineering could intervene. The Earth's atmosphere could then come to resemble that of Venus, and life would be impossible.
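
The idea of a point of no return can be made precise with the standard linearized feedback formalism (a textbook sketch, not a statement about the actual state of the climate). If an initial warming ΔT₀ triggers feedbacks that add a further fraction g of warming per increment, the total warming is the geometric series

ΔT = ΔT₀ · (1 + g + g² + …) = ΔT₀ / (1 − g), for g < 1

The sum remains finite as long as g < 1 but diverges as g → 1; a runaway greenhouse corresponds to the effective feedback gain reaching or exceeding 1, at which point the process amplifies itself regardless of the initial forcing.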

New Ice Age / Second Snowball Earth

The “Snowball Earth” hypothesis holds that the entire Earth froze over several times in its history, possibly causing some of the most severe crises and mass extinctions in the history of life.

Should the current warm period end abruptly, the resulting temperature drop and its impact on the food supply could endanger the continued existence of humanity.

Eruption of a supervolcano

The eruption of an entire volcanic complex could produce effects comparable to a nuclear winter and thus endanger the continued existence of humanity.

Impact of a meteorite

An object more than 500 m in diameter striking the Earth at sufficient speed could bring about the end of humanity.
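
The energies involved can be estimated with simple mechanics (an order-of-magnitude calculation with assumed values for density and velocity). A stony asteroid 500 m in diameter with density ρ ≈ 3000 kg/m³ has a mass of

m = ρ · (4/3)·π·r³ ≈ 2 × 10^11 kg

and at a typical impact velocity of v ≈ 20 km/s its kinetic energy is

E = ½·m·v² ≈ 4 × 10^19 J

roughly 10,000 megatons of TNT equivalent, several hundred thousand times the yield of the Hiroshima bomb, released at a single point on the Earth's surface.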

Gamma-ray bursts

Gamma-ray bursts are waves of intense radiation produced during certain astronomical events that travel through the cosmos at the speed of light. Should such a burst strike the Earth with sufficient intensity, the radiation would not penetrate deep into the atmosphere; however, nitrogen and oxygen in the upper atmospheric layers would react to form nitrogen monoxide, which destroys ozone much as CFCs do, and as a result the ozone layer would likely be irreversibly destroyed.

The Sun's UV light, which could then reach the Earth's surface unchecked, would probably above all exterminate the phytoplankton living just below the water surface, which could lead to the collapse of the entire global ecosystem.

The Ordovician mass extinction may have been caused by a gamma-ray burst near the Earth. The Wolf-Rayet star WR 104 could emit a gamma-ray burst toward the Earth when it dies and sterilize the planet.

Massive solar flares

Solar flares are enormous eruptions on the Sun's surface that bombard the Earth with subatomic particles, which, however, are deflected by the Earth's magnetic field. Main-sequence stars, to which the Sun belongs, have nevertheless been observed to increase their luminosity twenty-fold, and some researchers interpret this as a sign of extraordinary "superflares". It cannot be ruled out that the Sun could also direct such a burst of radiation at the Earth; its effects on terrestrial life could possibly resemble those of a gamma-ray burst.

Entry of the solar system into a dark cloud

A dark cloud is a cloud of interstellar gas and dust that can be far larger than the solar system, measuring up to 150 light-years across. If the solar system were to drift through such a nebula, the cosmic dust could dim the light of the stars. Furthermore, a dark cloud with a density of 100 to 300 molecules per cm³ could strongly compress the heliosphere, allowing its matter to penetrate the inner solar system and even dim the Sun. This could disrupt photosynthesis or make it impossible. Some researchers suspect such a "nebular winter" behind past ice ages and mass extinctions.

Humanity might be able to survive without sunlight, but a dark cloud would limit its ability to avert other existential threats.

Extinction due to the influence of artificial intelligence

A superior artificial intelligence could endanger the existence of all humanity, whether by deliberately exterminating it, or by destroying the basis of human life, the Earth or even the universe in the course of fulfilling its tasks or out of sheer indifference. Certain thought experiments suggest it is not impossible for a superintelligence to transform the solar system into computronium without harboring any directly malicious intentions.

Such an intelligence could be created deliberately through intensive research in neurology and cybernetics, or it could emerge from a technological singularity, when machines begin to improve themselves to such an extent that human understanding can no longer keep up. Such an intelligence could also arise if brain-computer interfaces, nanomedicine or comparable technologies alter the capacities of one or many people so profoundly that a new, different and superior consciousness emerges.

The Blue Brain project, ACT-R, SOAR, Cyc, ALICE and Watson are just a few of the projects that could promote the emergence of artificial intelligence or even aim directly at it.

In popular culture this idea has been elaborated prominently: Skynet, the artificial intelligence of the Terminator films, starts a global nuclear war immediately after becoming self-aware and afterwards does everything it can to eradicate the surviving humans, even with time-traveling combat robots; or Colossus, from the film of the same name, which holds all of humanity in check with nuclear missiles and threatens extermination in the event of resistance.

Extinction due to extraterrestrial influence

Should contact occur between humans and extraterrestrial civilizations, the latter could exterminate humanity, whether through a targeted genocide, however motivated, through the introduction of alien pathogens, through predators, through the plundering of all of Earth's raw materials, or through the extermination of key animal species.

Falsified scenarios

Polywater

In the 1960s, Soviet scientists entertained the idea that polywater was a particularly stable variant of normal water. It was feared that it could convert normal water into itself, and that the converted water's altered properties would be hostile to life, thereby destroying the entire ecosystem. This research was later exposed as a pathological science.

Probability of existential risks

The probabilities of some hazards have been calculated with considerable accuracy. For example, the probability of an asteroid impact causing human extinction within the next hundred years has been put at 0.000001 (one in a million), although later research suggested a much higher figure. Comparable are volcanic eruptions severe enough to cause catastrophic climate change, like the Toba eruption, which may have nearly caused the extinction of the human species; the frequency of such eruptions is estimated at about once every 50,000 years. The probabilities of other threats, however, are much more difficult to calculate. Although experts at the Global Catastrophic Risk Conference estimated the probability of human extinction within the next 100 years at 19%, there was considerable disagreement about the relative importance of the individual hazards.
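
The per-century figures quoted above follow from simple arithmetic on the recurrence intervals (a restatement of the numbers in this section, not an additional estimate). An event with a mean recurrence interval of T years, treated as equally likely in each year, has a probability of

P = 1 − (1 − 1/T)^100 ≈ 100/T, for T ≫ 100

of occurring within a given century. For Toba-scale eruptions with T = 50,000 years this yields P ≈ 0.2% per century, about 2,000 times the quoted one-in-a-million figure for an extinction-level asteroid impact.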

There are significant methodological difficulties in assessing these risks. Most attention has been devoted to risks that could threaten human civilization within the next hundred years, yet forecasting over even this period is very difficult. Threats of natural origin are relatively constant over time, although new ones may still be discovered. Anthropogenic threats, by contrast, are likely to change dramatically as new technologies develop; while volcanoes have been a threat throughout history, nuclear weapons have only played a role since the 20th century. History shows that the ability of past experts to predict the future was very limited, but modern probabilistic methods such as prediction markets, or more traditional approaches such as peer review, could improve the accuracy of forecasts.

Threats of human origin, such as nuclear war or nanotechnology, are even harder to predict because of the inherent methodological problems of the social sciences. During the Cuban Missile Crisis, John F. Kennedy put the probability of a nuclear war at 33% to 50%. Regardless, it is generally very difficult to assess the magnitude of any of these risks correctly, since international relations and technological capabilities can change rapidly.

Due to observation selection effects, existential risks pose a unique challenge for prediction, even more so than other far-future events. Unlike with most other events, the absence of existential catastrophes in the past is no evidence that their probability in the future is low, because if an existential catastrophe had occurred in our past, there would be no humans left to observe it.
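
The argument can be put in Bayesian terms (a minimal formal sketch, not drawn from the cited sources). Let q be the unknown per-century probability of an existential catastrophe and E the observation "no existential catastrophe has occurred in our past". Unconditionally, P(E | q) = (1 − q)^n over n centuries, which would seem to favor small q; but since observers exist only if E holds,

P(E | q, observers exist) = 1, for every value of q

so our survival record, taken by itself, cannot distinguish a safe universe from a dangerous one.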

“In the case of existential risks, our approach cannot be one of practical trial and error. There is no way to learn from mistakes. The reactive approach - simply seeing what happens, limiting dangers and learning from experience - is unworkable. No, we must take a proactive approach. This requires the foresight to anticipate new types of threats and the willingness to take decisive preventive measures and to bear their (moral or economic) costs.”

- Nick Bostrom, Milan Cirkovic

Fermi paradox

Many planets have been discovered outside our solar system, and there are likely many more undiscovered Earth-like planets on which life could arise. Given the relative speed with which life evolved on Earth, and given the enormous size of the observable universe, it seems a priori likely that intelligent life has also evolved on other planets.
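
This line of reasoning is commonly formalized with the Drake equation, which is not part of the text above but is the standard expression for the expected number N of detectable civilizations in our galaxy:

N = R* · f_p · n_e · f_l · f_i · f_c · L

Here R* is the rate of star formation, f_p the fraction of stars with planets, n_e the number of potentially habitable planets per planetary system, f_l, f_i and f_c the fractions of those on which life, intelligence and detectable technology respectively arise, and L the average lifetime of a detectable civilization. All factors beyond the first two remain highly uncertain.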

The lack of any evidence of intelligent life beyond Earth therefore appears to be a paradox. Particularly telling is the absence of large artificial structures visible from Earth, which suggests that few or no civilizations survive long enough to colonize space.

There are a number of explanations for the Fermi paradox, such as the hypothesis that the Earth is part of a galactic zoo or that the emergence of life is simply extremely improbable. Another explanation, however, is that a Great Filter exists: a property of our universe that prevents dead matter from evolving into intelligent life within a sufficiently long period of time and then colonizing the universe. If this filter lies ahead of us - perhaps most civilizations are destroyed in nuclear wars - then, unless humanity is very unusual compared with alien civilizations, it will prevent humanity from colonizing space.
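
The Great Filter can be sketched quantitatively (a simplified illustration in the spirit of Robin Hanson's argument; the decomposition into steps is schematic). If visible space colonization requires k successive steps with independent success probabilities p₁, …, p_k, the overall probability is the product

P(colonization) = p₁ · p₂ · … · p_k

The empty sky implies this product is extremely small, so at least one factor must be tiny. The decisive question is whether that small factor lies behind us (for example the origin of life, a step already passed) or ahead of us (for example surviving one's own technology), since only a filter ahead of us bears on humanity's prospects.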

Cognitive biases

Research into cognitive biases reveals a number of tendencies in human perception, judgment and reasoning that do not conform to the standards of impartial rationality. Many of these biases influence the assessment of existential risk. The availability heuristic, for example, may lead people to underestimate existential risks, since no one has ever observed one. Similarly, hindsight bias makes past events appear more predictable than they actually were, leading to overconfidence in the ability to predict the future.

The conjunction fallacy occurs when people overestimate the probability of conjunctions, for example believing that a political activist is more likely to be a feminist bank teller than simply a bank teller. Conversely, people underestimate the probability of disjunctive events. The threat of existential risk is highly disjunctive, since the cause could be nuclear war or climate change or artificial intelligence or bioterrorism or an asteroid impact, and so the overall plausibility of existential risk tends to be underestimated.
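
The disjunctive structure matters numerically, as a small worked example shows (the probabilities are invented for illustration). If five independent risks each have a probability of only 1% per century, the probability that at least one of them occurs is

P = 1 − (1 − 0.01)^5 ≈ 4.9%

almost five times any individual figure. Intuition anchored on the small individual probabilities therefore systematically understates the combined risk.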

Potential significance of existential risks

Some scientists strongly advocate reducing existential risk, since future generations would benefit enormously. Derek Parfit argues that human extinction would be an immense loss, as our descendants could potentially survive for billions of years before the growing heat of the Sun makes the Earth uninhabitable. Nick Bostrom argues that humanity's options would be even greater with the colonization of space: if our descendants settle space, they could sustain very large numbers of people, possibly for billions of years. Even a small reduction in the probability of an existential catastrophe would therefore have a highly significant effect on the expected number of people who will exist in the future.
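
The underlying expected-value arithmetic is straightforward (round numbers chosen for illustration, in the spirit of Bostrom's argument rather than quoted from it). If a catastrophe-free future would contain N = 10^16 human lives, then reducing the probability of extinction by as little as Δp = 10^−6 changes the expected number of future lives by

ΔE = Δp · N = 10^−6 · 10^16 = 10^10

that is, ten billion expected lives, more than the number of people alive today.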

Some scientists disagree with these arguments. Exponential discounting could make these future benefits much less significant, and some philosophers doubt that ensuring the existence of future generations has value.
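
Exponential discounting works as follows (the standard textbook formula, with an illustrative rate). A benefit of value V realized t years from now is assigned the present value

PV = V / (1 + r)^t

At a discount rate of r = 3% per year, a benefit 500 years away is worth only (1.03)^−500 ≈ 4 × 10^−7 of its face value, which is why even astronomical future benefits can become nearly negligible in present-value terms.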

Some economists have also discussed the importance of existential risks, although most of these discussions use the term "catastrophic risks". Martin Weitzman argues that most of the expected economic costs of climate change may stem from the small probability that warming far exceeds the central estimates and causes catastrophic damage. Richard Posner has argued that, in general, we do far too little about dangers of enormous magnitude whose probability appears small and is difficult to estimate.

References

  1. Nick Bostrom: Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards.
  2. Michael R. Rampino: Super volcanism and other geophysical processes of catastrophic import. In: Nick Bostrom, Milan Cirkovic (eds.): Global Catastrophic Risks. Oxford University Press, 2008.
  3. William Napier: Hazards from Comets and Asteroids. In: Nick Bostrom, Milan Cirkovic (eds.): Global Catastrophic Risks. Oxford University Press, 2008.
  4. Eliezer Yudkowsky: Artificial Intelligence as a Positive and Negative Factor in Global Risk. In: Nick Bostrom, Milan Cirkovic (eds.): Global Catastrophic Risks. Oxford University Press, 2008.
  5. Martin Weitzman: On modeling and interpreting the economics of catastrophic climate change. In: The Review of Economics and Statistics. 91, No. 1, 2009, pp. 1-19.
  6. Nick Bostrom: Existential risk prevention as the most important task for humanity. (PDF; 978 kB) 2011.
  7. N. Bostrom: Existential Risks. In: Journal of Evolution and Technology.
  8. 'Rogue states' as a source of global risk. (Memento of December 4, 2014 in the Internet Archive) Retrieved November 28, 2014.
  9. Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards. Retrieved November 28, 2014.
  10. Humanity Could Not Survive a Nuclear War Using Even a Fraction of Existing Arsenals. Retrieved November 28, 2014.
  11. That doesn't work on a cow skin. In: Die Zeit, No. 46/2005. Between collapse and the turn of the epoch - On the significance of ecocide for the present. (Memento of May 16, 2007 in the Internet Archive)
  12. How Disease can wipe out an entire species. Retrieved November 28, 2014.
  13. How did Hyperdisease cause extinctions? Retrieved November 28, 2014.
  14. Disease Can Cause Extinction of Mammals. Retrieved November 28, 2014.
  15. Genetic engineering and biological weapons. PMC 1326447 (free full text).
  16. NIAID Biodefense Research: NIAID Category A, B, and C Priority Pathogens. Retrieved November 28, 2014.
  17. Caltech Education. Retrieved November 28, 2014.
  18. Who originated the concept of snowball earth? Retrieved November 28, 2014.
  19. Exploding Stars: Is Earth at Risk? Retrieved November 28, 2014.
  20. Christopher Wanjek: Explosions in Space May Have Initiated Ancient Extinction on Earth. NASA, April 6, 2005. Retrieved April 30, 2008.
  21. Ray burst is extinction suspect. BBC, April 6, 2005. Retrieved April 30, 2008.
  22. A. L. Melott et al.: Did a gamma-ray burst initiate the late Ordovician mass extinction? In: International Journal of Astrobiology. 3, No. 2, 2004, pp. 55-61. arXiv: astro-ph/0309415. bibcode: 2004IJAsB...3...55M. doi: 10.1017/S1473550404001910.
  23. A. L. Melott, B. C. Thomas: Late Ordovician geographic patterns of extinction compared with simulations of astrophysical ionizing radiation damage. In: Paleobiology. 35, 2009, pp. 311-320. arXiv: 0809.0899.
  24. P. G. Tuthill, J. D. Monnier, W. C. Danchi: Letters to Nature: A dusty pinwheel nebula around the massive star WR 104. In: Nature. 398, No. 6727, 1999, p. 487. doi: 10.1038/19033.
  25. 20 Ways the World Could End. Retrieved November 28, 2014.
  26. How Will Humans Meet Their End? 5 Cosmic Risks. Retrieved November 29, 2014.
  27. Jason Gaverick Matheny: Reducing the Risk of Human Extinction. In: Risk Analysis. 27, No. 5, 2007.
  28. D. J. Asher, M. E. Bailey, V. Emel'yanenko, W. M. Napier: Earth in the cosmic shooting gallery. In: The Observatory. 125, 2005, pp. 319-322.
  29. Stanley H. Ambrose: Late Pleistocene human population bottlenecks, volcanic winter, and differentiation of modern humans. In: Journal of Human Evolution. Volume 34, No. 6, Oxford 1998, pp. 623-651, doi: 10.1006/jhev.1998.0219. ISSN 0047-2484.
  30. M. R. Rampino, S. H. Ambrose: Super eruptions as a threat to civilizations on Earth-like planets. In: Icarus. 156, 2002, pp. 562-569.
  31. Global Catastrophic Risks Survey, Technical Report. Future of Humanity Institute, 2008.
  32. Record of the Workshop on Policy Foresight and Global Catastrophic Risks. Future of Humanity Institute.
  33. Milan Cirkovic: Observation Selection Effects and Global Catastrophic Risks. 2008.
  34. Nick Bostrom, Milan Cirkovic: Global Catastrophic Risks. Oxford University Press, 2008.
  35. Jean Schneider: Interactive Extra-solar Planets Catalog. In: The Extrasolar Planets Encyclopaedia. 2011. Retrieved June 23, 2011.
  36. hanson.gmu.edu
  37. Katja Grace: Anthropic Reasoning in the Great Filter.
  38. I knew it would happen: remembered probabilities of once-future things. In: Organizational Behavior and Human Performance. 13, pp. 1-16.
  39. A. Sides, D. Osherson, N. Bonini, R. Viale: On the reality of the conjunction fallacy. In: Memory & Cognition. 30 (2), 2002, pp. 191-198.
  40. Tversky, Kahneman: Judgment under Uncertainty: Heuristics and Biases. In: Science. 185, 1974, pp. 1124-1131.
  41. Eliezer Yudkowsky: Cognitive Biases potentially affecting judgments of global risks. 2008.
  42. Derek Parfit: Reasons and Persons. Oxford University Press, 1984, pp. 453-454.
  43. Nick Bostrom: Astronomical Waste: The opportunity cost of delayed technological development. In: Utilitas. 15, No. 3, August 2003, pp. 308-314.
  44. Jan Narveson: Utilitarianism and New Generations. In: Mind. 76, 1967.
  45. Richard Posner: Catastrophe: Risk and Response. Oxford University Press, 2004.