Non-anthropogenic global catastrophic risk

The notion of global catastrophic risk was introduced in 2008 by the philosopher Nick Bostrom to describe a hypothetical future event that would have the potential to damage the well-being of the majority of humankind, for example by destroying modern civilization; as early as 2002 he had proposed calling an existential risk an event that could cause the extinction of humanity.

Potential global catastrophes include climate change, pandemics and nuclear war, risks related to nanotechnology or to a takeover by a hostile artificial intelligence, and cosmic disasters such as meteorite impacts.

A rigorous quantitative study of these risks is difficult, because of the uncertainty about both the final consequences (the stakes) of the hazard triggering the catastrophe and the probability of that hazard, and also because many cognitive biases complicate the analysis; moreover, since no event capable of causing the extinction of humanity or the complete destruction of civilization has ever occurred, the probability of such an event tends to be underestimated, a phenomenon known in statistics as selection bias.

Although the risks of global catastrophe have been the subject of many science-fiction scenarios (often modernizing very old myths, such as Pandora’s) and of alarmist statements from the 1950s onward, it was only at the beginning of the 21st century that various organizations began to study them systematically, especially under the impetus of the transhumanist movements.

Classifications
A major risk is an uncertain event whose realization is often not very probable, but whose negative effects are considerable. Geographers and many specialists after them analyse this notion in terms of three components: the hazard, which is the uncertain event itself; the stakes, which are the socio-economic or ecological values exposed to the effects of the hazard when it occurs; and the vulnerability, which measures the degree to which the stakes are destroyed by those effects. For example, in the case of a river flood risk, the hazard is the flooding of the watercourse, the stakes are the people and goods exposed to the flood, and the vulnerability is measured in particular by taking into account the height, solidity and watertightness of the buildings concerned.
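To make this decomposition concrete, the following minimal sketch combines the three components into a yearly expected loss; the probability, stake value and vulnerability figures are purely hypothetical and are not drawn from the risk literature cited here.

```python
# Illustrative sketch only: a toy expected-loss calculation combining the three
# components of a major risk (hazard, stakes, vulnerability). All figures are hypothetical.

def expected_loss(hazard_probability: float, stakes_value: float, vulnerability: float) -> float:
    """Expected loss = P(hazard occurs) x value of exposed stakes x fraction destroyed."""
    return hazard_probability * stakes_value * vulnerability

# River-flood example from the text, with invented numbers:
p_flood_per_year = 0.01       # hazard: a "hundred-year" flood
exposed_value = 5e8           # stakes: 500 million (currency units) of people and goods exposed
fraction_destroyed = 0.2      # vulnerability: 20% of the exposed value lost if the flood occurs

print(expected_loss(p_flood_per_year, exposed_value, fraction_destroyed))  # -> 1000000.0 per year
```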

Global catastrophe risk and existential risk
The philosopher Nick Bostrom introduced in 2002 the notion of existential risk, and in 2008 the concept of global catastrophic risk, in connection with a classification of risks according to their scope and intensity, the scope ranging from the individual scale to all future generations, and the intensity from “imperceptible” to “maximal”. On this scale, he defines a “global catastrophic risk” as one that is at least “global” in scope (affecting the majority of humans) and of “major” intensity (affecting the well-being of individuals over a prolonged period); an “existential risk” is defined as “transgenerational” and “maximal” (irreversible, and deadly in the short term). Thus, an existential risk destroys humanity (or even all forms of higher life), or at least leaves no chance for the reappearance of a civilization, whereas a global catastrophe, even if it killed the majority of humans, would leave the others a chance to survive and rebuild; Bostrom therefore considers existential risks to be far more significant. He also notes that humanity had scarcely considered existential risks before 1950, and that strategies designed to reduce the risk of global catastrophe are ineffective in the face of threats of complete extinction.

Independently of this work, in Catastrophe: Risk and Response (2004), Richard Posner grouped together events that bring about “complete upheaval or ruin” on a global (rather than local or regional) scale, considering them worthy of special attention in terms of cost-benefit analysis because they could, directly or indirectly, endanger the survival of humanity as a whole. Among the events discussed by Posner are cosmic impacts, runaway global warming, grey goo, bioterrorism, and accidents in particle accelerators.

Almost by definition, global catastrophes are not only major risks: they involve maximum vulnerability and stakes so vast that they are impossible to quantify. This often leads, in this case, to a confusion between risk and hazard.

Classification according to scenarios
Bostrom identifies four types of global catastrophe scenario. The “bangs” are sudden catastrophes (accidental or deliberate); the most likely examples are nuclear war, the aggressive (and uncontrolled) use of biotechnology or nanotechnology, and cosmic impacts. The “crunches” are scenarios of progressive deterioration of social structures in which humanity survives, but civilization is irremediably destroyed, for example through the exhaustion of natural resources or through dysgenic pressures lowering average intelligence. The “shrieks” are dystopian futures, such as totalitarian regimes using artificial intelligence to control the human race. The “whimpers” are gradual declines of values and of civilization. Nick Bostrom considers the last three types of scenario as preventing (more or less definitively) humanity from realizing its potential; Francis Fukuyama believes that this argument, which rests on the values of transhumanism, is not sufficient in itself to classify them as global catastrophic risks.

Potential sources of risk
Some sources of catastrophic risk are natural, such as meteor impacts or supervolcanoes. Some of these have caused mass extinctions in the past. On the other hand, some risks are man-made, such as global warming, environmental degradation, engineered pandemics and nuclear war.

Non-anthropogenic

Asteroid impact
Several asteroids have collided with Earth in recent geological history. The Chicxulub asteroid, for example, is theorized to have caused the extinction of the non-avian dinosaurs 66 million years ago, at the end of the Cretaceous. No sufficiently large asteroid currently exists in an Earth-crossing orbit; however, a comet of sufficient size to cause human extinction could impact the Earth, though the annual probability may be less than 10⁻⁸. Geoscientist Brian Toon estimates that a 60-mile meteorite would be large enough to “incinerate everybody”. Asteroids of around 1 km in diameter have impacted the Earth on average once every 500,000 years; these are probably too small to pose an extinction risk, but might kill billions of people. Larger asteroids are less common. Small near-Earth asteroids are regularly observed and can impact anywhere on Earth, injuring local populations. As of 2013, Spaceguard estimates it has identified 95% of all NEOs over 1 km in size.
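As a rough illustration of what such recurrence intervals mean, and not a figure taken from the sources cited above, an average of one impact per 500,000 years can be converted into a probability over a shorter window by treating impacts as a Poisson process:

```python
import math

# Illustrative sketch: convert an average recurrence interval into the probability of
# at least one event within a given window, assuming impacts follow a Poisson process.
# The 500,000-year interval is the figure quoted above; the 100-year window is arbitrary.

def prob_at_least_one(mean_interval_years: float, window_years: float) -> float:
    rate = 1.0 / mean_interval_years             # expected impacts per year
    return 1.0 - math.exp(-rate * window_years)  # P(at least one impact in the window)

print(prob_at_least_one(500_000, 100))  # ~0.0002, i.e. roughly 0.02% per century for ~1 km asteroids
```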

In April 2018, the B612 Foundation reported: “It’s 100 per cent certain we’ll be hit [by a devastating asteroid], but we’re not 100 per cent sure when.” In June 2018, the US National Science and Technology Council warned that America is unprepared for an asteroid impact event, and has developed and released the “National Near-Earth Object Preparedness Strategy Action Plan” to better prepare.

Extraterrestrial invasion
Extraterrestrial life could invade Earth either to exterminate and supplant human life, enslave it under a colonial system, steal the planet’s resources, or destroy the planet altogether.

Although evidence of alien life has never been documented, scientists such as Carl Sagan have postulated that the existence of extraterrestrial life is very likely. In 1969, the “Extra-Terrestrial Exposure Law” was added to the United States Code of Federal Regulations (Title 14, Section 1211) in response to the possibility of biological contamination resulting from the U.S. Apollo Space Program. It was removed in 1991. Scientists consider such a scenario technically possible, but unlikely.

An article in The New York Times discussed the possible threats for humanity of intentionally sending messages aimed at extraterrestrial life into the cosmos in the context of the SETI efforts. Several renowned public figures such as Stephen Hawking and Elon Musk have argued against sending such messages on the grounds that extraterrestrial civilizations with technology are probably far more advanced than humanity and could pose an existential threat to humanity.

Natural climate change
Climate change refers to a lasting change in the Earth’s climate. The climate has ranged from ice ages to warmer periods when palm trees grew in Antarctica. It has been hypothesized that there was also a period called “snowball Earth” when all the oceans were covered in a layer of ice. These global climatic changes occurred slowly, prior to the rise of human civilization about 10 thousand years ago near the end of the last Major Ice Age when the climate became more stable. However, abrupt climate change on the decade time scale has occurred regionally. Since civilization originated during a period of stable climate, a natural variation into a new climate regime (colder or hotter) could pose a threat to civilization.

In the history of the Earth, many ice ages are known to have occurred. More ice ages are likely at intervals of 40,000–100,000 years. An ice age would have a serious impact on civilization, because vast areas of land (mainly in North America, Europe, and Asia) could become uninhabitable. It would still be possible to live in the tropical regions, but with possible loss of humidity and water. Currently, the world is in an interglacial period within a much older glacial event. The last glacial expansion ended about 10,000 years ago, and all civilizations evolved later than this. Scientists do not predict that a natural ice age will occur anytime soon, as man-made emissions may delay the onset of the next ice age by at least 50,000 years.

Cosmic threats
A number of astronomical threats have been identified. Massive objects, e.g. a star, large planet or black hole, could be catastrophic if a close encounter occurred in the Solar System. In April 2008, it was announced that two simulations of long-term planetary movement, one at the Paris Observatory and the other at the University of California, Santa Cruz, indicate a 1% chance that Mercury’s orbit could be made unstable by Jupiter’s gravitational pull sometime during the lifespan of the Sun. Were this to happen, the simulations suggest a collision with Earth could be one of four possible outcomes (the others being Mercury colliding with the Sun, colliding with Venus, or being ejected from the Solar System altogether). If Mercury were to collide with Earth, all life on Earth could be obliterated entirely: an asteroid 15 km wide is believed to have caused the extinction of the non-avian dinosaurs, whereas Mercury is 4,879 km in diameter.

Another cosmic threat is a gamma-ray burst, typically produced by a supernova when a star collapses inward on itself and then “bounces” outward in a massive explosion. Under certain circumstances, these events are thought to produce massive bursts of gamma radiation emanating outward from the axis of rotation of the star. If such an event were to occur oriented towards the Earth, the massive amounts of gamma radiation could significantly affect the Earth’s atmosphere and pose an existential threat to all life. Such a gamma-ray burst may have been the cause of the Ordovician–Silurian extinction events. Neither this scenario nor the destabilization of Mercury’s orbit are likely in the foreseeable future.

If the Solar System were to pass through a dark nebula, a cloud of cosmic dust, severe global climate change would occur.

A powerful solar flare or solar superstorm, which is a drastic and unusual decrease or increase in the Sun’s power output, could have severe consequences for life on Earth.

If our universe lies within a false vacuum, a bubble of lower-energy vacuum could come to exist by chance or otherwise in our universe, and catalyze the conversion of our universe to a lower energy state in a volume expanding at nearly the speed of light, destroying all that we know without forewarning. Such an occurrence is called vacuum decay.

Geomagnetic reversal
The magnetic poles of the Earth have shifted many times in geologic history. The duration of such a shift is still debated. Theories exist that during such times the Earth’s magnetic field would be substantially weakened, threatening civilization by allowing radiation from the Sun, especially solar wind, solar flares or cosmic radiation, to reach the surface. These theories have been somewhat discredited, as statistical analysis shows no evidence of a correlation between past reversals and past extinctions.

Global pandemic
Numerous historical pandemics have had a devastating effect on large numbers of people. The present, unprecedented scale and speed of human movement make it more difficult than ever to contain an epidemic through local quarantines, and a global pandemic has become a realistic threat to human civilization.

Naturally evolving pathogens will ultimately develop an upper limit to their virulence. Pathogens with the highest virulence, which quickly kill their hosts, reduce their chances of spreading the infection to new hosts or carriers. This simple model predicts that, if virulence and transmission are not genetically linked, pathogens will evolve towards low virulence and rapid transmission (a simplified numerical sketch of this trade-off is given after the list below). However, this is not necessarily a safeguard against a global catastrophe, for the following reasons:

1. The fitness advantage of limited virulence is primarily a function of a limited number of hosts. Any pathogen with high virulence, a high transmission rate and a long incubation time may already have caused a catastrophic pandemic before virulence is ultimately limited through natural selection.

2. In models where virulence level and rate of transmission are related, high levels of virulence can evolve. Virulence is instead limited by the existence of complex populations of hosts with different susceptibilities to infection, or by some hosts being geographically isolated. The size of the host population and competition between different strains of pathogens can also alter virulence.

3. A pathogen that infects humans only as a secondary host, and primarily infects another species (a zoonosis), has no constraints on its virulence in people, since the accidental secondary infections do not affect its evolution.
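The sketch below is a deliberately simplified illustration of the trade-off mentioned above, not a model from the epidemiological literature: it scores hypothetical strains with a crude R0-like ratio in which higher virulence shortens the period during which a host can transmit, so that, when transmission is not linked to virulence, less virulent strains score best.

```python
# Toy illustration of the virulence/transmission trade-off described above.
# All parameter values are invented; this is not a published epidemiological model.

def spread_potential(transmission_rate: float, virulence: float, recovery_rate: float = 0.1) -> float:
    """Crude R0-like score: transmission per unit time divided by the rate at which
    hosts stop transmitting (recovery plus death caused by the pathogen)."""
    return transmission_rate / (recovery_rate + virulence)

strains = {
    "low virulence":    {"transmission_rate": 0.3, "virulence": 0.01},
    "medium virulence": {"transmission_rate": 0.3, "virulence": 0.10},
    "high virulence":   {"transmission_rate": 0.3, "virulence": 1.00},  # kills hosts quickly
}

for name, params in strains.items():
    print(f"{name}: spread potential = {spread_potential(**params):.2f}")

# With transmission held fixed, the low-virulence strain scores highest (about 2.7 vs 0.3),
# matching the prediction that virulence unlinked to transmission tends to evolve downward.
```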

Volcanism
A geological event such as a massive flood basalt eruption, volcanism, or the eruption of a supervolcano could lead to a so-called volcanic winter, similar to a nuclear winter. One such event, the Toba eruption, occurred in Indonesia about 71,500 years ago. According to the Toba catastrophe theory, the event may have reduced human populations to only a few tens of thousands of individuals. Yellowstone Caldera is another such supervolcano, having undergone 142 or more caldera-forming eruptions in the past 17 million years. A massive volcanic eruption would eject extraordinary volumes of volcanic dust and of toxic and greenhouse gases into the atmosphere, with serious effects on global climate, either towards extreme global cooling (a volcanic winter if short-term, an ice age if long-term) or towards global warming (if greenhouse gases were to prevail).

When the supervolcano at Yellowstone last erupted 640,000 years ago, the thinnest layers of the ash ejected from the caldera spread over most of the United States west of the Mississippi River and part of northeastern Mexico. The magma covered much of what is now Yellowstone National Park and extended beyond it, covering much of the ground from the Yellowstone River in the east to Idaho Falls in the west, with some of the flows extending north beyond Mammoth Springs.

According to a recent study, if the Yellowstone caldera erupted again as a supervolcano, an ash layer one to three millimeters thick could be deposited as far away as New York, enough to “reduce traction on roads and runways, short out electrical transformers and cause respiratory problems”. There would be centimeters of thickness over much of the U.S. Midwest, enough to disrupt crops and livestock, especially if it happened at a critical time in the growing season. The worst-affected city would likely be Billings, Montana, population 109,000, which the model predicted would be covered with ash estimated as 1.03 to 1.8 meters thick.

The main long-term effect would come through global climate change, which would reduce global temperature by about 5–15 °C for a decade, together with the direct effects of ash deposits on crops. A large supervolcano like Toba would deposit one or two meters of ash over an area of several million square kilometers (1,000 cubic kilometers is equivalent to a one-meter thickness of ash spread over a million square kilometers). If that happened in a densely populated agricultural area, such as India, it could destroy one or two seasons of crops for two billion people.
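The volume-to-thickness equivalence given in parentheses above can be checked with a simple unit conversion; the short calculation below is illustrative and not taken from the cited study.

```python
# Unit-conversion check of the equivalence quoted above:
# 1,000 km^3 of ash spread over 1,000,000 km^2 gives a layer one meter thick.

ash_volume_m3 = 1_000 * (1_000 ** 3)      # 1,000 km^3 expressed in cubic meters (1e12 m^3)
area_m2 = 1_000_000 * (1_000 ** 2)        # 1,000,000 km^2 expressed in square meters (1e12 m^2)

thickness_m = ash_volume_m3 / area_m2
print(thickness_m)                        # -> 1.0 meter
```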

However, Yellowstone shows no signs of a supereruption at present, and it is not certain that a future supereruption will occur there.

Research published in 2011 finds evidence that massive volcanic eruptions caused massive coal combustion, supporting models for significant generation of greenhouse gases. Researchers have suggested that massive volcanic eruptions through coal beds in Siberia would generate significant greenhouse gases and cause a runaway greenhouse effect. Massive eruptions can also throw enough pyroclastic debris and other material into the atmosphere to partially block out the sun and cause a volcanic winter, as happened on a smaller scale in 1816 following the eruption of Mount Tambora, the so-called Year Without a Summer. Such an eruption might cause the immediate deaths of millions of people several hundred miles from the eruption, and perhaps billions of deaths worldwide, due to the failure of the monsoon, resulting in major crop failures causing starvation on a profound scale.

A much more speculative concept is the verneshot: a hypothetical volcanic eruption caused by the buildup of gas deep underneath a craton. Such an event may be forceful enough to launch an extreme amount of material from the crust and mantle into a sub-orbital trajectory.

Proposed mitigation
Planetary management and respecting planetary boundaries have been proposed as approaches to preventing ecological catastrophes. Within the scope of these approaches, the field of geoengineering encompasses the deliberate large-scale engineering and manipulation of the planetary environment to combat or counteract anthropogenic changes in atmospheric chemistry. Space colonization is a proposed alternative to improve the odds of surviving an extinction scenario. Solutions of this scope may require megascale engineering. Food storage has been proposed globally, but the monetary cost would be high. Furthermore, it would likely contribute to the current millions of deaths per year due to malnutrition.

Some survivalists stock survival retreats with multiple-year food supplies.

The Svalbard Global Seed Vault is buried 400 feet (120 m) inside a mountain on an island in the Arctic. It is designed to hold 2.5 billion seeds from more than 100 countries as a precaution to preserve the world’s crops. The surrounding rock is −6 °C (21 °F) (as of 2015) but the vault is kept at −18 °C (0 °F) by refrigerators powered by locally sourced coal.

More speculatively, if society continues to function and if the biosphere remains habitable, calorie needs for the present human population might in theory be met during an extended absence of sunlight, given sufficient advance planning. Conjectured solutions include growing mushrooms on the dead plant biomass left in the wake of the catastrophe, converting cellulose to sugar, or feeding natural gas to methane-digesting bacteria.

Global catastrophic risks and global governance
Insufficient global governance creates risks in the social and political domain, but the governance mechanisms develop more slowly than technological and social change. There are concerns from governments, the private sector, as well as the general public about the lack of governance mechanisms to efficiently deal with risks, negotiate and adjudicate between diverse and conflicting interests. This is further underlined by an understanding of the interconnectedness of global systemic risks.

Risk perception
According to Eliezer Yudkowsky, many cognitive biases can influence the way in which individuals and groups assess the importance of global catastrophic risks, including scope insensitivity, the availability heuristic, the representativeness bias, the affect heuristic, and overconfidence. For example, scope insensitivity leads people to be more concerned about individual threats than about threats to larger groups (which is why their donations to altruistic causes are not proportional to the magnitude of the problem), and hence not to regard the extinction of humanity as serious a problem as it should be. Similarly, the representativeness bias leads them to downplay disasters that have little in common with those they are already aware of, and to assume that the damage caused will not be much more serious.

It has often been noticed that most of the anthropogenic risks mentioned above correspond to very ancient myths, those of Prometheus, of Pandora and, more recently, of the sorcerer’s apprentice being the most representative. The symbolism of the four Horsemen of the Apocalypse, the last three representing War, Famine and Death, is already found in the Old Testament as the uncomfortable choice offered by God to King David. The various risks of machine revolt appear in the myth of the Golem, and, combined with biotechnologies, in the story of Frankenstein’s monster. On the other hand, it has been suggested that the disaster narratives of various religious traditions (where they are most often related to the wrath of deities) may correspond to memories of real catastrophes (for example, the Flood might be linked to the reconnection of the Sea of Marmara with the Black Sea); under the name of coherent catastrophism, Victor Clube and Bill Napier developed the hypothesis that cataclysmic meteor showers gave birth to many cosmological myths, ranging from the story of the destruction of Sodom and Gomorrah (a thesis also defended by Marie-Agnès Courty) to the descriptions of Revelation; however, their ideas are not widely accepted by the scientific community.

The existence of these “mythical” interpretations, as well as of numerous end-of-the-world prophecies, encourages a partial or total refusal to take these catastrophic risks into account, a phenomenon known as the Cassandra syndrome: while the anthropogenic risks are minimized by attributing them to irrational fears, the catastrophes described in the myths are judged to be exaggerated by ignorance and by the distortion of memories.

The analysis of risks caused by humans suffers from two opposing biases: whistleblowers tend to exaggerate risks in order to be heard, or even to denounce imaginary risks in the name of the precautionary principle, while, conversely, powerful economic interests try to minimize the risks associated with their activities, as shown for example by the case of the Heartland Institute, and more generally by the analysis of the disinformation strategies described in Merchants of Doubt.

Giving a rational interpretation of the myth of the Golden Age, Jared Diamond finally observes that some disasters (Nick Bostrom’s “crunches”) can go unnoticed by the societies that suffer them, for lack of sufficient historical memory; this is how he explains, for example, the ecological disaster suffered by the inhabitants of Easter Island.

Organizations
The Bulletin of the Atomic Scientists (est. 1945) is one of the oldest global risk organizations, founded after the public became alarmed by the potential of atomic warfare in the aftermath of WWII. It studies risks associated with nuclear war and energy and famously maintains the Doomsday Clock established in 1947. The Foresight Institute (est. 1986) examines the risks of nanotechnology and its benefits. It was one of the earliest organizations to study the unintended consequences of otherwise harmless technology gone haywire at a global scale. It was founded by K. Eric Drexler who postulated “grey goo”.

Beginning after 2000, a growing number of scientists, philosophers and tech billionaires created organizations devoted to studying global risks both inside and outside of academia.

Independent non-governmental organizations (NGOs) include the Machine Intelligence Research Institute, which aims to reduce the risk of a catastrophe caused by artificial intelligence, with donors including Peter Thiel and Jed McCaleb. The Lifeboat Foundation (est. 2009) funds research into preventing a technological catastrophe. Most of the research money funds projects at universities. The Global Catastrophic Risk Institute (est. 2011) is a think tank for catastrophic risk. It is funded by the NGO Social and Environmental Entrepreneurs. The Global Challenges Foundation (est. 2012), based in Stockholm and founded by Laszlo Szombatfalvy, releases a yearly report on the state of global risks. The Future of Life Institute (est. 2014) aims to support research and initiatives for safeguarding life considering new technologies and challenges facing humanity. Elon Musk is one of its biggest donors. The Nuclear Threat Initiative seeks to reduce global threats from nuclear, biological and chemical threats, and containment of damage after an event. It maintains a nuclear material security index.

University-based organizations include the Future of Humanity Institute (est. 2005) which researches the questions of humanity’s long-term future, particularly existential risk. It was founded by Nick Bostrom and is based at Oxford University. The Centre for the Study of Existential Risk (est. 2012) is a Cambridge-based organization which studies four major technological risks: artificial intelligence, biotechnology, global warming and warfare. All are man-made risks, as Huw Price explained to the AFP news agency, “It seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology”. He added that when this happens “we’re no longer the smartest things around,” and will risk being at the mercy of “machines that are not malicious, but machines whose interests don’t include us.” Stephen Hawking was an acting adviser. The Millennium Alliance for Humanity and the Biosphere is a Stanford University-based organization focusing on many issues related to global catastrophe by bringing together members of academia in the humanities. It was founded by Paul Ehrlich among others. Stanford University also has the Center for International Security and Cooperation focusing on political cooperation to reduce global catastrophic risk.

Other risk assessment groups are based in or are part of governmental organizations. The World Health Organization (WHO) includes a division called Global Alert and Response (GAR), which monitors and responds to global epidemic crises. GAR helps member states with training and coordination of responses to epidemics. The United States Agency for International Development (USAID) has its Emerging Pandemic Threats Program, which aims to prevent and contain naturally generated pandemics at their source. The Lawrence Livermore National Laboratory has a division called the Global Security Principal Directorate, which researches issues such as bio-security and counter-terrorism on behalf of the government.

Precautions and prevention
The concept of global governance respecting planetary boundaries has been proposed as an approach to disaster risk reduction. In particular, the field of geoengineering envisions manipulating the global environment to counteract anthropogenic changes in atmospheric composition. Comprehensive food storage and conservation techniques have been explored, but their costs would be high, and they could aggravate the consequences of malnutrition. David Denkenberger and Joshua Pearce have suggested using a variety of alternative foods to reduce the risk of starvation related to global catastrophes such as a nuclear winter or sudden climate change, for example by converting biomass (trees and timber) into edible products; however, much progress in this area will be needed before such methods could allow a large fraction of the population to survive. Other risk-reduction suggestions, such as asteroid deflection strategies to deal with impact risks, or nuclear disarmament, prove to be economically or politically difficult to implement. Finally, the colonization of space is another proposal made to increase the chances of survival in the face of an existential risk, but solutions of this type, which are currently inaccessible, will no doubt require, among other things, large-scale engineering.

Precautions actually taken, individually or collectively, include:

The establishment of food reserves (planned to last several years) and of other resources by survivalists, for example in the framework of the construction of nuclear fallout shelters.

The Svalbard Global Seed Vault, an underground vault on the Norwegian island of Spitsbergen, intended to keep seeds of all the world’s food crops safe and secure, thus preserving genetic diversity; some of these seeds should keep for several thousand years. In May 2017, the vault was flooded by melting permafrost attributed to global warming, without damaging the seed supply.

Analyses and criticisms
The importance of the risks detailed in the previous sections is rarely denied, even though the risks to humanity are often minimized; however, Nick Bostrom’s analyses have been criticized from several distinct perspectives.

Technical criticisms
Many of the risks mentioned by Nick Bostrom in his books are considered exaggerated (or even imaginary), or correspond to time scales so vast that it seems somewhat absurd to group them with almost immediate threats. Moreover, calculations of probability, expectation or utility are difficult or ill-defined for this kind of situation, as shown, for example, by paradoxes such as the doomsday argument, and as Nick Bostrom himself acknowledges. In particular, he developed an ethical argument claiming that the exorbitant number of our descendants doomed to nothingness by an existential catastrophe justifies employing every conceivable means to decrease, however slightly, the probability of the accident; however, the calculations on which it is based have been contested, and this argument may well be a fallacy.
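The structure of this expected-value argument can be illustrated with deliberately arbitrary numbers; the figures below are hypothetical placeholders, and it is precisely estimates of this kind that critics of the argument dispute.

```python
# Illustrative sketch of the expected-value reasoning described above.
# Both numbers are arbitrary placeholders, not estimates from Bostrom or his critics.

potential_future_lives = 1e16  # hypothetical count of descendants an existential catastrophe would erase
risk_reduction = 1e-8          # hypothetical decrease in the probability of that catastrophe

expected_lives_saved = potential_future_lives * risk_reduction
print(f"{expected_lives_saved:.0e}")  # -> 1e+08: even a tiny reduction in probability dominates
                                      #    ordinary-scale benefits, which is why both the premises
                                      #    and the arithmetic are contested.
```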

Nick Bostrom and Max Tegmark published in 2005 an analysis of the risk of an instability of the whole universe. Regardless of the validity of their calculations (which tend to show that the risk is very low), one may wonder whether it really means anything to speak of a disaster of which no one would be warned in advance and which would leave no observer; during a similar discussion of the risk of a chain reaction igniting the whole atmosphere, a friend responded to Richard Hamming’s anxieties with “Do not worry, Hamming, there will be no one to blame you”.

Philosophical positions
Nick Bostrom’s analyses are based on transhumanism, an ideology advocating the use of science and technology to improve the physical and mental characteristics of human beings; he therefore considers anything that could prevent humanity from realizing its full potential to be an existential risk.

This position has been severely criticized, in part because it leads to denying the values to which present-day humanity is attached in the name of hypothetical future values. Steve Fuller notes in particular that if a global catastrophe does not destroy all of humanity, the survivors may in some cases legitimately consider that their situation has improved.
