Saturday, September 8, 2018

Human extinction

From Wikipedia, the free encyclopedia

In futures studies, human extinction is the hypothetical end of the human species. This may result from natural causes or it may be the result of human action.

The likelihood of human extinction through wholly natural scenarios, such as a meteorite impact or large-scale volcanism, is generally considered to be extremely low.

For anthropogenic extinction, many possible scenarios have been proposed: human global nuclear annihilation, biological warfare or the release of a pandemic-causing agent, dysgenics, overpopulation, ecological collapse, and climate change; in addition, emerging technologies could bring about new extinction scenarios, such as advanced artificial intelligence, biotechnology or self-replicating nanobots. The probability of anthropogenic human extinction within the next hundred years is the topic of an active debate.

Human extinction needs to be differentiated from the extinction of all life on Earth (see also future of Earth) and from the extinction of major components of human culture (e.g., through a global catastrophe leaving only small, scattered human populations, which might evolve in isolation).

Moral importance of existential risk

"Existential risks" are risks that threaten the entire future of humanity, whether by causing human extinction or by otherwise permanently crippling human progress. Many scholars make an argument based on the size of the "cosmic endowment" and state that because of the inconceivably large number of potential future lives that are at stake, even small reductions of existential risk have great value. Some of the arguments run as follows:
  • Carl Sagan wrote in 1983: "If we are required to calibrate extinction in numerical terms, I would be sure to include the number of people in future generations who would not be born.... (By one calculation), the stakes are one million times greater for extinction than for the more modest nuclear wars that kill "only" hundreds of millions of people. There are many other possible measures of the potential loss—including culture and science, the evolutionary history of the planet, and the significance of the lives of all of our ancestors who contributed to the future of their descendants. Extinction is the undoing of the human enterprise."
  • In 1984, philosopher Derek Parfit made an anthropocentric utilitarian argument that, because all human lives have roughly equal intrinsic value no matter where in time or space they are born, the large number of lives potentially saved in the future should be multiplied by the percentage chance that an action will save them, yielding a large net benefit for even tiny reductions in existential risk.
  • Humanity has a 95% probability of being extinct within 7,800,000 years, according to J. Richard Gott's formulation of the controversial Doomsday argument, which holds that we have probably already lived through half the duration of human history.
  • In 1989, philosopher Robert Adams rejected Parfit's "impersonal" views but argued instead for a moral imperative of loyalty and commitment to "the future of humanity as a vast project... The aspiration for a better society – more just, more rewarding, and more peaceful... our interest in the lives of our children and grandchildren, and the hopes that they will be able, in turn, to have the lives of their children and grandchildren as projects."
  • Philosopher Nick Bostrom argues in 2013 that preference-satisfactionist, democratic, custodial, and intuitionist arguments all converge on the common-sense view that preventing existential risk is a high moral priority, even if the exact "degree of badness" of human extinction varies between these philosophies.
Parfit argues that the size of the "cosmic endowment" can be calculated from the following argument: If Earth remains habitable for a billion more years and can sustainably support a population of more than a billion humans, then there is a potential for 10^16 (or 10,000,000,000,000,000) human lives of normal duration. Bostrom goes further, stating that if the universe is empty, then the accessible universe can support at least 10^34 biological human life-years; and, if some humans were uploaded onto computers, could even support the equivalent of 10^54 cybernetic human life-years.
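The arithmetic behind these figures can be laid out explicitly. The Python sketch below reproduces Parfit's 10^16 estimate from the assumptions stated above plus an assumed normal lifespan of roughly 100 years, applies the expected-value reasoning to a purely hypothetical 0.1-percentage-point reduction in extinction risk, and recovers the 7,800,000-year figure from Gott's delta-t argument mentioned earlier.

```python
# Illustrative arithmetic behind the figures above. The ~100-year lifespan and
# the 0.1-percentage-point risk reduction are assumptions for illustration.

habitable_years = 1e9          # Earth habitable for a billion more years (Parfit)
sustainable_population = 1e9   # more than a billion humans supported at a time
lifespan_years = 100           # assumed normal lifespan

potential_lives = habitable_years * sustainable_population / lifespan_years
print(f"Potential future lives: {potential_lives:.0e}")          # ~1e16

# Parfit-style expected value: a tiny cut in extinction probability still
# corresponds to an enormous number of expected future lives.
risk_reduction = 0.001
print(f"Expected lives saved by a 0.1% risk reduction: {risk_reduction * potential_lives:.0e}")

# Gott's delta-t argument: a 95% confidence interval for humanity's future
# duration runs from 1/39 to 39 times its past duration (~200,000 years).
past_years = 200_000
print(f"Gott interval: {past_years / 39:,.0f} to {39 * past_years:,} years")  # ... to 7,800,000
```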

Possible scenarios

Severe forms of known or recorded disasters

  • Nuclear or biological warfare; for example, a future arms race may result in larger arsenals than those of the Cold War and a hypothetical World War III could cause the total annihilation of humankind.
  • A pandemic involving one or more viruses, prions, or antibiotic-resistant bacteria. Past examples include the Spanish flu outbreak in 1918 and the various European viruses that decimated indigenous American populations. A deadly pandemic restricted to humans alone would be self-limiting as its mortality would reduce the density of its target population. A pathogen with a broad host range in multiple species, however, could eventually reach even isolated human populations, e.g. when using animals as "carriers". U.S. officials assess that an engineered pathogen capable of "wiping out all of humanity", if left unchecked, is technically feasible and that the technical obstacles are "trivial". However, they are confident that in practice, countries would be able to "recognize and intervene effectively" to halt the spread of such a microbe and prevent human extinction.
  • Climate change; for example, global warming caused by human emissions of carbon dioxide may render the planet uninhabitable. According to the Carbon Dioxide Information Analysis Center (CDIAC), the growth of per-capita carbon emissions has closely tracked the growth of the human population (a nearly four-fold increase in the past 100 years). On a longer time scale, Milankovitch cycles, also known as Quaternary climatic oscillations, push the climate toward periods of either extreme cold or extreme heat.
  • Population decline through a preference for fewer children. If developing-world demographics are assumed to converge on developed-world demographics, and if the latter are extrapolated, data suggest an extinction before 3000 AD. John A. Leslie estimates that if the reproduction rate drops to the German level, the extinction date will be 2400 (an illustrative sketch of this kind of extrapolation follows this list). However, evolutionary biology suggests the demographic transition may reverse itself.
  • A geological or cosmological disaster such as an impact event of a near-Earth object, a lethal gamma-ray burst in our part of the Milky Way, a supervolcanic eruption, or natural long-term climate change. Near-Earth objects (NEOs) pose a standing threat to the survival of living species: a single impact event could cause more deaths and destruction than any man-made war or epidemic. This scenario is special among the non-anthropogenic extinction scenarios insofar as it is conceivable to develop countermeasures for a large proportion of such impact events.
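The mechanics of the demographic extrapolation mentioned above can be illustrated with a simple geometric-decline model. The fertility level, generation length, and survival threshold below are illustrative assumptions, not Leslie's actual inputs, so the resulting horizon differs from his date of 2400.

```python
import math

# Toy geometric-decline model of population under sustained sub-replacement
# fertility. All parameter values are illustrative assumptions, not Leslie's.

def years_until_below(start_pop, floor_pop, tfr,
                      replacement_tfr=2.1, generation_years=30):
    """Years until a population shrinking by tfr/replacement_tfr per
    generation falls below floor_pop."""
    shrink_per_generation = tfr / replacement_tfr
    generations = math.log(floor_pop / start_pop) / math.log(shrink_per_generation)
    return generations * generation_years

# German-level fertility of roughly 1.4 children per woman (assumed), starting
# from ~7.6 billion people, down to a nominal remnant of 1,000 individuals.
print(f"{years_until_below(7.6e9, 1_000, tfr=1.4):.0f} years")   # on the order of 1,200 years
```

With these particular numbers the decline would take on the order of a millennium; Leslie's earlier date reflects different assumptions about fertility levels and about what counts as effective extinction.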

Habitat threats

  • Human-induced changes to the atmosphere's composition may render Earth uninhabitable for humans. Carbon dioxide is toxic at high concentrations, causing death by respiratory acidosis (acidification of the blood), and lifelong exposure to even the moderately elevated concentrations projected to occur during the 21st century could cause chronic physical and mental health conditions in all humans. The upper limit, above which human survival and reproduction would be impossible, is still unknown.
  • No other large vertebrate in the history of the planet has grown so much, so fast, or with such devastating consequences for its fellow earthlings. The human population grew rapidly from 1 billion in 1800 to 2 billion in 1930, and has now passed 7 billion. As the population rises, humanity appropriates an ever larger share of the Earth's terrestrial net primary productivity and puts ever more land to use. According to the Center for Biological Diversity, a species that grows beyond the carrying capacity of its environment will eventually crash, and may reduce that capacity below its original level. Evidence suggests birth rates may be rising in the 21st century in the developed world. The work of Hans Rosling, a Swedish medical doctor, academic, statistician and public speaker, predicts the global population peaking at less than 12 billion.
  • In around 1 billion years from now, the Sun's brightness may increase as a result of hydrogen depletion, and the resulting heating of its outer layers may cause the Earth's oceans to evaporate, leaving only minor forms of life. Well before this time, the level of carbon dioxide in the atmosphere will be too low to support plant life, destroying the foundation of the food chains.
  • About 7–8 billion years from now, if and when the Sun has become a red giant, the Earth will probably be engulfed by the expanding Sun and destroyed.

Scientific accidents

Without regulation, scientific advancement has the potential to risk human extinction as a result of the effects or use of entirely new technologies. Some scenarios include:
  • The creators of a "superintelligent" entity could inadvertently give it goals that lead it to "annihilate" the human race.
  • Uncontrolled nanotechnology (grey goo) incidents resulting in the destruction of the Earth's ecosystem (ecophagy).
  • Creation of a "micro black hole" on Earth during the course of a scientific experiment, or other unlikely scientific accidents in high-energy physics research, such as vacuum phase transition or strangelet incidents. There were worries concerning the Large Hadron Collider at CERN as it is feared that collision of protons at a speed near the speed of light will result in the creation of a black hole, but it has been pointed out that much more energetic collisions take place currently in Earth's atmosphere.

Further scenarios of extraterrestrial origin

(Major impact events and gamma-ray bursts in our part of the Milky Way were already mentioned above.)

Evolution of a posthuman species

Normal biological evolution of humanity will continue and change humans over geological time scales. Although this could, in a non-phylogenetic taxonomy, be considered to give rise to a new species, such ongoing evolution would not biologically be considered a species extinction. Given the likelihood that significant genetic exchange between human populations will continue, it is highly unlikely that humans will split into multiple species through natural evolution.

Some scenarios envision that humans could use genetic engineering or technological modifications to split into normal humans and a new species – posthumans. Such a species could be fundamentally different from any previous life form on Earth, e.g. by merging humans with technological systems. Such scenarios do harbor a risk of the extinction of the "old" human species by means of the new, posthuman entity.

Extinction through devolution

The idea that humans are now doing so well that "survival of the fittest" no longer operates on us was widely debated in the 19th century under the headings of devolution and degeneration.

Perception of and reactions to human extinction risk

Probability estimates

Because human extinction is unprecedented, speculation about the probability of different scenarios is highly subjective. Nick Bostrom argues that it would be "misguided" to assume that the probability of near-term extinction is less than 25% and that it will be "a tall order" for the human race to "get our precautions sufficiently right the first time", given that an existential risk provides no opportunity to learn from failure. A little more optimistically, philosopher John Leslie assigns a 70% chance of humanity surviving the next five centuries, based partly on the controversial philosophical doomsday argument that Leslie champions. The 2006 Stern Review for the UK Treasury assumes the 100-year probability of human extinction is 10% in its economic calculations.
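A century-scale figure like the Stern Review's 10% can be interconverted with a constant per-year hazard rate using standard survival arithmetic; the sketch below shows the conversion in both directions (the roughly 0.1%-per-year figure is derived here, not quoted from the Review).

```python
# Converting between a per-century extinction probability and the
# equivalent constant per-year hazard rate (standard survival arithmetic).

def annual_rate(prob_per_century: float) -> float:
    """Constant yearly probability that compounds to prob_per_century over 100 years."""
    return 1 - (1 - prob_per_century) ** (1 / 100)

def century_prob(rate_per_year: float) -> float:
    """Cumulative 100-year probability for a constant yearly rate."""
    return 1 - (1 - rate_per_year) ** 100

print(f"{annual_rate(0.10):.4%} per year")       # ~0.1053% per year for a 10%/century risk
print(f"{century_prob(0.001):.1%} per century")  # a flat 0.1%/yr compounds to ~9.5%
```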

Some scholars believe that certain scenarios such as global thermonuclear war would have difficulty eradicating every last settlement on Earth. Physicist Willard Wells points out that any credible extinction scenario would have to reach into a diverse set of areas, including the underground subways of major cities, the mountains of Tibet, the remotest islands of the South Pacific, and even to McMurdo Station in Antarctica, which has contingency plans and supplies for a long isolation. In addition, elaborate bunkers exist for government leaders to occupy during a nuclear war. Any number of events could lead to a massive loss of human life; but if the last few, most resilient, humans are unlikely to also die off, then that particular human extinction scenario is not credible.

Psychology

Eliezer Yudkowsky theorizes that scope neglect plays a role in public perception of existential risks:
Substantially larger numbers, such as 500 million deaths, and especially qualitatively different scenarios such as the extinction of the entire human species, seem to trigger a different mode of thinking... People who would never dream of hurting a child hear of an existential risk, and say, "Well, maybe the human species doesn't really deserve to survive".
All past predictions of human extinction have proven to be false. To some, this makes future warnings seem less credible. Nick Bostrom argues that the lack of human extinction in the past is weak evidence that there will be no human extinction in the future, due to survivor bias and other anthropic effects.

Some behavioural finance scholars claim that recent evidence is given undue significance in risk analysis. Roughly speaking, "100-year storms" tend to occur every twenty years in the stock market as traders become convinced that the current good times will last forever. Doomsayers who hypothesize rare crisis scenarios are dismissed even when they have statistical evidence behind them. An extreme form of this bias can diminish the subjective probability of the unprecedented.

Research and initiatives

Even though the importance and potential impact of research on existential risks is often highlighted, relatively few research efforts are being made in this field; Bostrom made this point as early as 2001.


Although existential risks are less manageable by individuals than, e.g., health risks, according to Ken Olum, Joshua Knobe, and Alexander Vilenkin the possibility of human extinction does have practical implications. For instance, if the "universal" Doomsday argument is accepted it changes the most likely source of disasters, and hence the most efficient means of preventing them. They write: "... you should be more concerned that a large number of asteroids have not yet been detected than about the particular orbit of each one. You should not worry especially about the chance that some specific nearby star will become a supernova, but more about the chance that supernovas are more deadly to nearby life than we believe."

Multiple organizations with the goal of helping prevent human extinction exist. Examples are the Future of Humanity Institute, the Centre for the Study of Existential Risk, the Future of Life Institute, the Machine Intelligence Research Institute, and the Global Catastrophic Risk Institute (est. 2011).

Omnicide

Omnicide is human extinction as a result of human action. Most commonly it refers to extinction through nuclear warfare or biological warfare, but it can also apply to extinction through means such as a global anthropogenic ecological catastrophe.

Omnicide can be considered a subcategory of genocide.

Proposed countermeasures

Stephen Hawking advocated colonizing other planets within the solar system once technology progresses sufficiently, in order to improve the chance of human survival from planet-wide events such as global thermonuclear war. However, this would create the possibility of interplanetary war, which could be even more destructive.

More economically, some scholars propose the establishment on Earth of one or more self-sufficient, remote, permanently occupied settlements specifically created for the purpose of surviving global disaster. Economist Robin Hanson argues that a refuge permanently housing as few as 100 people would significantly improve the chances of human survival during a range of global catastrophes.

In popular culture

Some 21st century pop-science works, including The World Without Us by Alan Weisman, pose an artistic thought experiment: what would happen to the rest of the planet if humans suddenly disappeared? A threat of human extinction, such as through a technological singularity (also called an intelligence explosion), drives the plot of innumerable science fiction stories; an influential early example is the 1951 film adaptation of When Worlds Collide. Usually the extinction threat is narrowly avoided, but some exceptions exist, such as R.U.R. and Steven Spielberg's A.I.

History of climate change science

From Wikipedia, the free encyclopedia
 
Global Temperature Trends 1880–2017

The history of the scientific discovery of climate change began in the early 19th century when ice ages and other natural changes in paleoclimate were first suspected and the natural greenhouse effect first identified. In the late 19th century, scientists first argued that human emissions of greenhouse gases could change the climate. Many other theories of climate change were advanced, involving forces from volcanism to solar variation. In the 1960s, the evidence for the warming effect of carbon dioxide gas became increasingly convincing. Some scientists also pointed out that human activities that generated atmospheric aerosols (e.g., "pollution") could have cooling effects as well. During the 1970s, scientific opinion increasingly favored the warming viewpoint. By the 1990s, as a result of the improving fidelity of computer models and observational work confirming the Milankovitch theory of the ice ages, a consensus position formed: greenhouse gases were deeply involved in most climate changes, and human-caused emissions were bringing discernible global warming. Since the 1990s, scientific research on climate change has expanded across multiple disciplines, deepening our understanding of causal relations, links with historical data, and the ability to model climate change numerically. Research during this period has been summarized in the Assessment Reports of the Intergovernmental Panel on Climate Change.

Climate change is a significant and lasting change in the statistical distribution of weather patterns over periods ranging from decades to millions of years. It may be a change in average weather conditions, or in the distribution of weather around the average conditions (such as more or fewer extreme weather events). Climate change is caused by factors that include oceanic processes (such as oceanic circulation), biotic processes (i.e., plants), variations in solar radiation received by Earth, plate tectonics and volcanic eruptions, and human-induced alterations of the natural world. The latter effect is currently causing global warming, and "climate change" is often used to describe human-caused impacts specifically.

Regional changes, antiquity through 19th century

From ancient times, people suspected that the climate of a region could change over the course of centuries. For example, Theophrastus, a pupil of Aristotle, told how the draining of marshes had made a particular locality more susceptible to freezing, and speculated that lands became warmer when the clearing of forests exposed them to sunlight. Renaissance and later scholars saw that deforestation, irrigation, and grazing had altered the lands around the Mediterranean since ancient times; they thought it plausible that these human interventions had affected the local weather. Vitruvius, in the first century BC, wrote about climate in relation to housing architecture and how to choose locations for cities.

The most striking change came in the 18th and 19th centuries, obvious within a single lifetime: the conversion of Eastern North America from forest to croplands. By the early 19th century many believed the transformation was altering the region's climate—probably for the better. When farmers, called "sodbusters", took over the Great Plains, they were told that "rain follows the plough." Not everyone agreed. Some experts reported that deforestation not only caused rainwater to run off rapidly in useless floods, but reduced rainfall itself. European professors, alert to any proof that their nations were wiser than others, claimed that the peoples of the Ancient Near East had heedlessly converted their once lush lands into impoverished deserts.

Meanwhile, national weather agencies had begun to compile masses of reliable observations of temperature, rainfall, and the like. When the figures were analyzed they showed many rises and dips, but no steady long-term change. By the end of the 19th century, scientific opinion had turned decisively against any belief in a human influence on climate. And whatever the regional effects, few imagined that humans could affect the climate of the planet as a whole.

Paleoclimate change and theories of its causes, 19th century

Erratics, boulders deposited by glaciers far from any existing glaciers, led geologists to the conclusion that climate had changed in the past.

Prior to the 18th century, scientists had not suspected that prehistoric climates were different from the modern period. By the late 18th century, geologists found evidence of a succession of geological ages with changes in climate. There were various competing theories about these changes, and James Hutton, whose ideas of cyclic change over huge periods of time were later dubbed uniformitarianism, was among those who found signs of past glacial activity in places too warm for glaciers in modern times.

In 1815 Jean-Pierre Perraudin described for the first time how glaciers might be responsible for the giant boulders seen in alpine valleys. As he hiked in the Val de Bagnes, he noticed giant granite rocks that were scattered around the narrow valley. He knew that it would take an exceptional force to move such large rocks. He also noticed how glaciers left stripes on the land, and concluded that it was the ice that had carried the boulders down into the valleys.

His idea was initially met with disbelief. Jean de Charpentier wrote, "I found his hypothesis so extraordinary and even so extravagant that I considered it as not worth examining or even considering." Despite Charpentier's initial rejection, Perraudin eventually convinced Ignaz Venetz that it might be worth studying. Venetz convinced Charpentier, who in turn convinced the influential scientist Louis Agassiz that the glacial theory had merit.

Agassiz developed a theory of what he termed "Ice Age" — when glaciers covered Europe and much of North America. In 1837 Agassiz was the first to scientifically propose that the Earth had been subject to a past ice age. William Buckland had led attempts in Britain to adapt the geological theory of catastrophism to account for erratic boulders and other "diluvium" as relics of the Biblical flood. This was strongly opposed by Charles Lyell's version of Hutton's uniformitarianism, and was gradually abandoned by Buckland and other catastrophist geologists. A field trip to the Alps with Agassiz in October 1838 convinced Buckland that features in Britain had been caused by glaciation, and both he and Lyell strongly supported the ice age theory which became widely accepted by the 1870s.

Joseph Fourier

In the same general period that scientists first suspected climate change and ice ages, Joseph Fourier, in 1824, found that Earth's atmosphere kept the planet warmer than would be the case in a vacuum. Fourier recognized that the atmosphere transmitted visible light waves efficiently to the earth's surface. The earth then absorbed visible light and emitted infrared radiation in response, but the atmosphere did not transmit infrared efficiently, which therefore increased surface temperatures. He also suspected that human activities could influence climate, although he focused primarily on land use changes. In an 1827 paper Fourier stated, "The establishment and progress of human societies, the action of natural forces, can notably change, and in vast regions, the state of the surface, the distribution of water and the great movements of the air. Such effects are able to make to vary, in the course of many centuries, the average degree of heat; because the analytic expressions contain coefficients relating to the state of the surface and which greatly influence the temperature."

Eunice Newton Foote studied the warming effect of the sun, including how this warming was increased by the presence of carbonic acid gas (carbon dioxide), and suggested that the surface of an Earth whose atmosphere was rich in this gas would have a higher temperature. Her work was presented by Prof. Joseph Henry at the American Association for the Advancement of Science meeting in August 1856 and published as a brief note, but scientists failed to take notice.
Foote: “The highest effect of the sun’s rays I have found to be in carbonic acid gas. ... An atmosphere of that gas would give to our earth a high temperature; and if, as some suppose, at one period of its history, the air had mixed with it a larger proportion than at present, an increased temperature from its own action, as well as from increased weight, must have necessarily resulted.”
John Tyndall took Fourier's work one step further in 1859 when he investigated the absorption of infrared radiation in different gases. He found that water vapor, hydrocarbons like methane (CH4), and carbon dioxide (CO2) strongly block the radiation. Some scientists suggested that ice ages and other great climate changes were due to changes in the amount of gases emitted in volcanism. But that was only one of many possible causes. Another obvious possibility was solar variation. Shifts in ocean currents also might explain many climate changes. For changes over millions of years, the raising and lowering of mountain ranges would change patterns of both winds and ocean currents. Or perhaps the climate of a continent had not changed at all, but it had grown warmer or cooler because of polar wander (the North Pole shifting to where the Equator had been or the like). There were dozens of theories.

For example, in the mid 19th century, James Croll published calculations of how the gravitational pulls of the Sun, Moon, and planets subtly affect the Earth's motion and orientation. The inclination of the Earth’s axis and the shape of its orbit around the Sun oscillate gently in cycles lasting tens of thousands of years. During some periods the Northern Hemisphere would get slightly less sunlight during the winter than it would get during other centuries. Snow would accumulate, reflecting sunlight and leading to a self-sustaining ice age. Most scientists, however, found Croll’s ideas—and every other theory of climate change—unconvincing.

In 1876, Peter Kropotkin wrote about his observations that since the Industrial Revolution, Siberian glaciers were melting.

First calculations of human-induced climate change, 1896

In 1896 Svante Arrhenius calculated the effect of doubling atmospheric carbon dioxide to be an increase in surface temperatures of 5–6 degrees Celsius.
 
T.C. Chamberlin

By the late 1890s, American scientist Samuel Pierpont Langley had attempted to determine the surface temperature of the Moon by measuring infrared radiation leaving the Moon and reaching the Earth. The angle of the Moon in the sky when a scientist took a measurement determined how much CO2 and water vapor the Moon's radiation had to pass through to reach the Earth's surface, resulting in weaker measurements when the Moon was low in the sky. This result was unsurprising given that scientists had known about infrared radiation absorption for decades.

In 1896, the Swedish scientist Svante Arrhenius used Langley's observation of increased infrared absorption where the Moon's rays pass through the atmosphere at a low angle, encountering more carbon dioxide (CO2), to estimate an atmospheric cooling effect from a future decrease of CO2. He realized that the cooler atmosphere would hold less water vapor (another greenhouse gas) and calculated the additional cooling effect. He also realized the cooling would increase snow and ice cover at high latitudes, making the planet reflect more sunlight and thus further cool down, as James Croll had hypothesized. Overall Arrhenius calculated that cutting CO2 in half would suffice to produce an ice age. He further calculated that a doubling of atmospheric CO2 would give a total warming of 5–6 degrees Celsius.
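Arrhenius's result is often summarized by his "greenhouse law": warming grows roughly with the logarithm of the CO2 concentration, so equal proportional changes produce equal temperature changes. The sketch below applies that relation with a sensitivity of 5.5 °C per doubling, an assumed midpoint of Arrhenius's 5–6 °C range.

```python
import math

# Arrhenius's "greenhouse law": warming scales with the logarithm of CO2,
# so every doubling (or halving) gives the same temperature change.
SENSITIVITY_PER_DOUBLING = 5.5   # °C, assumed midpoint of Arrhenius's 5-6 °C estimate

def warming(co2_ratio: float) -> float:
    """Temperature change for CO2 scaled by co2_ratio relative to its base level."""
    return SENSITIVITY_PER_DOUBLING * math.log2(co2_ratio)

print(f"Doubling CO2:  {warming(2.0):+.1f} °C")   # +5.5 °C
print(f"Halving CO2:   {warming(0.5):+.1f} °C")   # -5.5 °C, roughly ice-age cooling
print(f"Tripling CO2:  {warming(3.0):+.1f} °C")   # ~+8.7 °C, close to the 8-9 °C Chamberlin quotes below
```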

Further, Arrhenius' colleague Professor Arvid Högbom, who was quoted at length in Arrhenius' 1896 study On the Influence of Carbonic Acid in the Air upon the Temperature of the Earth, had been attempting to quantify natural sources of CO2 emissions for the purpose of understanding the global carbon cycle. Högbom found that estimated carbon production from industrial sources in the 1890s (mainly coal burning) was comparable with the natural sources. Arrhenius saw that this human emission of carbon would eventually lead to warming. However, because of the relatively low rate of CO2 production in 1896, Arrhenius thought the warming would take thousands of years, and he expected it would be beneficial to humanity.

In 1899 Thomas Chrowder Chamberlin developed at length the idea that changes in climate could result from changes in the concentration of atmospheric carbon dioxide. Chamberlin wrote in his 1899 book, An Attempt to Frame a Working Hypothesis of the Cause of Glacial Periods on an Atmospheric Basis:
Previous advocacy of an atmospheric hypothesis, — The general doctrine that the glacial periods may have been due to a change in the atmospheric content of carbon dioxide is not new. It was urged by Tyndall a half century ago and has been urged by others since. Recently it has been very effectively advocated by Dr. Arrhenius, who has taken a great step in advance of his predecessors in reducing his conclusions to definite quantitative terms deduced from observational data. [..] The functions of carbon dioxide. — By the investigations of Tyndall, Lecher and Pretner, Keller, Roentgen, and Arrhenius, it has been shown that the carbon dioxide and water vapor of the atmosphere have remarkable power of absorbing and temporarily retaining heat rays, while the oxygen, nitrogen, and argon of the atmosphere possess this power in a feeble degree only. It follows that the effect of the carbon dioxide and water vapor is to blanket the earth with a thermally absorbent envelope. [..] The general results assignable to a greatly increased or a greatly reduced quantity of atmospheric carbon dioxide and water may be summarized as follows:
  • a. An increase, by causing a larger absorption of the sun's radiant energy, raises the average temperature, while a reduction lowers it. The estimate of Dr. Arrhenius, based upon an elaborate mathematical discussion of the observations of Professor Langley, is that an increase of the carbon dioxide to the amount of two or three times the present content would elevate the average temperature 8° or 9° C. and would bring on a mild climate analogous to that which prevailed in the Middle Tertiary age. On the other hand, a reduction of the quantity of carbon dioxide in the atmosphere to an amount ranging from 55 to 62 per cent of the present content would reduce the average temperature 4° or 5° C, which would bring on a glaciation comparable to that of the Pleistocene period.
  • b. A second effect of increase and decrease in the amount of atmospheric carbon dioxide is the equalization, on the one hand, of surface temperatures, or their differentiation on the other. The temperature of the surface of the earth varies with latitude, altitude, the distribution of land and water, day and night, the seasons, and some other elements that may here be neglected. It is postulated that an increase in the thermal absorption of the atmosphere equalizes the temperature, and tends to eliminate the variations attendant on these contingencies. Conversely, a reduction of thermal atmospheric absorption tends to intensify all of these variations. A secondary effect of intensification of differences of temperature is an increase of atmospheric movements in the effort to restore equilibrium. Increased atmospheric movements, which are necessarily convectional, carry the warmer air to the surface of the atmosphere, and facilitate the discharge of the heat and thus intensify the primary effect. [..]
In the case of the outgoing rays, which are absorbed in much larger proportions than the incoming rays because they are more largely long-wave rays, the tables of Arrhenius show that the absorption is augmented by increase of carbonic acid in greater proportions in high latitudes than in low; for example, the increase of temperature for three times the present content of carbonic acid is 21.5 per cent greater between 60° and 70° N. latitude than at the equator.
It now becomes necessary to assign agencies capable of removing carbon dioxide from the atmosphere at a rate sufficiently above the normal rate of supply, at certain times, to produce glaciation; and on the other hand, capable of restoring it to the atmosphere at certain other times in sufficient amounts to produce mild climates.

When the temperature is rising after a glacial episode, dissociation is promoted, and the ocean gives forth its carbon dioxide at an increased rate, and thereby assists in accelerating the amelioration of climate.

A study of the life of the geological periods seems to indicate that there were very notable fluctuations in the total mass of living matter. To be sure there was a reciprocal relation between the life of the land and that of the sea, so that when the latter was extended upon the continental platforms and greatly augmented, the former was contracted, but notwithstanding this it seems clear that the sum of life activity fluctuated notably during the ages. It is believed that on the whole it was greatest at the periods of sea extension and mild climates, and least at the times of disruption and climatic intensification. This factor then acted antithetically to the carbonic acid freeing previously noted, and, so far as it went, tended to offset its effects.

In periods of sea extension and of land reduction (base-level periods in particular), the habitat of shallow water lime-secreting life is concurrently extended, giving to the agencies that set carbon dioxide free accelerated activity, which is further aided by the consequent rising temperature which reduces the absorptive power of the ocean and increases dissociation. At the same time, the area of the land being diminished, a low consumption of carbon dioxide both in original decomposition of the silicates and in the solution of the limestones and dolomites obtains.

Thus the reciprocating agencies again conjoin, but now to increase the carbon dioxide of the air. These are the great and essential factors. They are modified by several subordinate agencies already mentioned, but the quantitative effect of these is thought to be quite insufficient to prevent very notable fluctuations in the atmospheric constitution.
As a result, it is postulated that geological history has been accentuated by an alternation of climatic episodes embracing, on the one hand, periods of mild, equable, moist climate nearly uniform for the whole globe; and on the other, periods when there were extremes of aridity and precipitation, and of heat and cold; these last denoted by deposits of salt and gypsum, of subaerial conglomerates, of red sandstones and shales, of arkose deposits, and occasionally by glaciation in low latitudes.

Paleoclimates and sunspots, early 1900s to 1950s

Arrhenius's calculations were disputed and subsumed into a larger debate over whether atmospheric changes had caused the ice ages. Experimental attempts to measure infrared absorption in the laboratory seemed to show that little difference resulted from increasing CO2 levels, and also found significant overlap between absorption by CO2 and absorption by water vapor, all of which suggested that increasing carbon dioxide emissions would have little climatic effect. These early experiments were later found to be insufficiently accurate, given the instrumentation of the time. Many scientists also thought that the oceans would quickly absorb any excess carbon dioxide.

Other theories of the causes of climate change fared no better. The principal advances were in observational paleoclimatology, as scientists in various fields of geology worked out methods to reveal ancient climates. Wilmot H. Bradley found that annual varves of clay laid down in lake beds showed climate cycles. An Arizona astronomer, Andrew Ellicott Douglass, saw strong indications of climate change in tree rings. Noting that the rings were thinner in dry years, he reported climate effects from solar variations, particularly in connection with the 17th-century dearth of sunspots (the Maunder Minimum) noticed previously by William Herschel and others. Other scientists, however, found good reason to doubt that tree rings could reveal anything beyond random regional variations. The value of tree rings for climate study was not solidly established until the 1960s.

Through the 1930s the most persistent advocate of a solar-climate connection was astrophysicist Charles Greeley Abbot. By the early 1920s, he had concluded that the solar "constant" was misnamed: his observations showed large variations, which he connected with sunspots passing across the face of the Sun. He and a few others pursued the topic into the 1960s, convinced that sunspot variations were a main cause of climate change. Other scientists were skeptical. Nevertheless, attempts to connect the solar cycle with climate cycles were popular in the 1920s and 1930s. Respected scientists announced correlations that they insisted were reliable enough to make predictions. Sooner or later, every prediction failed, and the subject fell into disrepute.

Meanwhile, the Serbian engineer Milutin Milankovitch, building on James Croll's theory, improved the tedious calculations of the varying distances and angles of the Sun's radiation as the Sun and Moon gradually perturbed the Earth's orbit. Some observations of varves (layers seen in the mud covering the bottom of lakes) matched the prediction of a Milankovitch cycle lasting about 21,000 years. However, most geologists dismissed the astronomical theory, for they could not fit Milankovitch's timing to the accepted sequence, which had only four ice ages, all of them much longer than 21,000 years.

In 1938 a British engineer, Guy Stewart Callendar, attempted to revive Arrhenius's greenhouse-effect theory. Callendar presented evidence that both temperature and the CO2 level in the atmosphere had been rising over the past half-century, and he argued that newer spectroscopic measurements showed that the gas was effective in absorbing infrared in the atmosphere. Nevertheless, most scientific opinion continued to dispute or ignore the theory.

Increasing concern, 1950s – 1960s

Charles Keeling, receiving the National Medal of Science from George W. Bush, in 2001.

Better spectrography in the 1950s showed that CO2 and water vapor absorption lines did not overlap completely. Climatologists also realized that little water vapor was present in the upper atmosphere. Both developments showed that the CO2 greenhouse effect would not be overwhelmed by water vapor.

In 1955 Hans Suess's carbon-14 isotope analysis showed that CO2 released from fossil fuels was not immediately absorbed by the ocean. In 1957, a better understanding of ocean chemistry led Roger Revelle to realize that the ocean's surface layer had only a limited ability to absorb carbon dioxide; he predicted that atmospheric CO2 levels would therefore rise, a prediction later confirmed by the measurements of Charles David Keeling. By the late 1950s, more scientists were arguing that carbon dioxide emissions could be a problem, with some projecting in 1959 that CO2 would rise 25% by the year 2000, with potentially "radical" effects on climate. In 1960 Charles David Keeling demonstrated that the level of CO2 in the atmosphere was in fact rising. Concern mounted year by year along with the rise of the "Keeling Curve" of atmospheric CO2.

Another clue to the nature of climate change came in the mid-1960s from analysis of deep-sea cores by Cesare Emiliani and analysis of ancient corals by Wallace Broecker and collaborators. Rather than four long ice ages, they found a large number of shorter ones in a regular sequence. It appeared that the timing of ice ages was set by the small orbital shifts of the Milankovitch cycles. While the matter remained controversial, some began to suggest that the climate system is sensitive to small changes and can readily be flipped from a stable state into a different one.

Scientists meanwhile began using computers to develop more sophisticated versions of Arrhenius's calculations. In 1967, taking advantage of the ability of digital computers to integrate absorption curves numerically, Syukuro Manabe and Richard Wetherald made the first detailed calculation of the greenhouse effect incorporating convection (the "Manabe-Wetherald one-dimensional radiative-convective model"). They found that, in the absence of unknown feedbacks such as changes in clouds, a doubling of carbon dioxide from the current level would result in an increase of approximately 2 °C in global temperature.
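A rough modern back-of-envelope calculation shows why a figure near 2 °C emerges when water vapor is allowed to adjust but clouds are held fixed. The forcing and feedback numbers below are present-day textbook approximations chosen for illustration, not values taken from the 1967 paper.

```python
# Back-of-envelope decomposition consistent with the Manabe-Wetherald result.
# The forcing and feedback numbers are modern textbook approximations
# (assumptions here), not values from the 1967 paper itself.

FORCING_2XCO2 = 3.7      # W/m^2, approximate radiative forcing of doubled CO2
PLANCK_RESPONSE = 3.3    # W/m^2 per K of surface warming (blackbody-like response)
WATER_VAPOR_GAIN = 1.8   # assumed amplification from holding relative humidity fixed

no_feedback_warming = FORCING_2XCO2 / PLANCK_RESPONSE
with_water_vapor = no_feedback_warming * WATER_VAPOR_GAIN

print(f"No-feedback warming:        {no_feedback_warming:.1f} K")   # ~1.1 K
print(f"With water-vapor feedback:  {with_water_vapor:.1f} K")      # ~2 K
```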

By the 1960s, aerosol pollution ("smog") had become a serious local problem in many cities, and some scientists began to consider whether the cooling effect of particulate pollution could affect global temperatures. Scientists were unsure whether the cooling effect of particulate pollution or warming effect of greenhouse gas emissions would predominate, but regardless, began to suspect that human emissions could be disruptive to climate in the 21st century if not sooner. In his 1968 book The Population Bomb, Paul R. Ehrlich wrote, "the greenhouse effect is being enhanced now by the greatly increased level of carbon dioxide... [this] is being countered by low-level clouds generated by contrails, dust, and other contaminants... At the moment we cannot predict what the overall climatic results will be of our using the atmosphere as a garbage dump."

In 1965, the landmark report, "Restoring the Quality of Our Environment" by U.S. President Lyndon B. Johnson’s Science Advisory Committee warned of the harmful effects of fossil fuel emissions:
The part that remains in the atmosphere may have a significant effect on climate; carbon dioxide is nearly transparent to visible light, but it is a strong absorber and back radiator of infrared radiation, particularly in the wave lengths from 12 to 18 microns; consequently, an increase of atmospheric carbon dioxide could act, much like the glass in a greenhouse, to raise the temperature of the lower air.
A 1968 study by the Stanford Research Institute for the American Petroleum Institute noted:
If the earth's temperature increases significantly, a number of events might be expected to occur, including the melting of the Antarctic ice cap, a rise in sea levels, warming of the oceans, and an increase in photosynthesis. [..] Revelle makes the point that man is now engaged in a vast geophysical experiment with his environment, the earth. Significant temperature changes are almost certain to occur by the year 2000 and these could bring about climatic changes.
In 1969, NATO made the first attempt to deal with climate change at an international level. It was planned to establish a hub for the organization's research and initiatives in the civil area, dealing with environmental topics such as acid rain and the greenhouse effect. The suggestion of US President Richard Nixon was not very successful with the administration of German Chancellor Kurt Georg Kiesinger, but the topics and the preparatory work done on the NATO proposal by the German authorities gained international momentum (see, e.g., the 1972 Stockholm United Nations Conference on the Human Environment), as the government of Willy Brandt started to apply them in the civil sphere instead.

Also in 1969, Mikhail Budyko published a theory on the ice-albedo feedback, a centerpiece of what is today known as Arctic amplification. The same year a similar model was published by William D. Sellers. Both studies attracted significant attention, since they hinted at the possibility of a runaway positive feedback within the global climate system.
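The flavor of Budyko's argument can be captured in a zero-dimensional energy-balance model in which the planetary albedo rises as the planet cools; the parameter values below are illustrative textbook-style choices, not Budyko's own. Scanning for energy balance typically yields a warm state, a deeply frozen state, and an unstable branch between them, which is why a modest change in forcing can, in principle, tip the system.

```python
import numpy as np

# Minimal zero-dimensional Budyko-style energy balance model illustrating the
# ice-albedo feedback. Parameter values are illustrative, textbook-style choices.

S0 = 1361.0          # solar constant, W/m^2
A, B = 203.0, 2.09   # linearized outgoing longwave radiation: A + B * T_celsius

def albedo(t_c):
    """Planetary albedo rising smoothly from ~0.3 (ice-free) to ~0.6 (ice-covered)."""
    return 0.45 - 0.15 * np.tanh((t_c + 10.0) / 10.0)

def net_flux(t_c):
    """Absorbed solar minus emitted longwave at mean temperature t_c (°C)."""
    return S0 / 4.0 * (1.0 - albedo(t_c)) - (A + B * t_c)

# Scan for equilibria (sign changes of the net flux): a warm state, a cold
# "snowball" state, and an unstable branch in between.
t = np.linspace(-60.0, 40.0, 2001)
f = net_flux(t)
crossings = t[:-1][np.sign(f[:-1]) != np.sign(f[1:])]
print("Equilibrium temperatures (°C):", np.round(crossings, 1))
```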

Scientists increasingly predict warming, 1970s

Mean temperature anomalies during the period 1965 to 1975 with respect to the average temperatures from 1937 to 1946. This dataset was not available at the time.

In the early 1970s, evidence that aerosols were increasing worldwide encouraged Reid Bryson and some others to warn of the possibility of severe cooling. Meanwhile, the new evidence that the timing of ice ages was set by predictable orbital cycles suggested that the climate would gradually cool, over thousands of years. For the century ahead, however, a survey of the scientific literature from 1965 to 1979 found 7 articles predicting cooling and 44 predicting warming (many other articles on climate made no prediction); the warming articles were cited much more often in subsequent scientific literature. Several scientific panels from this time period concluded that more research was needed to determine whether warming or cooling was likely, indicating that the trend in the scientific literature had not yet become a consensus.

John Sawyer published the study Man-made Carbon Dioxide and the “Greenhouse” Effect in 1972. He summarized the scientific knowledge of the time: the attribution of the rise in carbon dioxide to human activity, its distribution, and its exponential growth, findings which still hold today. Additionally, he accurately predicted the rate of global warming for the period between 1972 and 2000.
The increase of 25% CO2 expected by the end of the century therefore corresponds to an increase of 0.6°C in the world temperature – an amount somewhat greater than the climatic variation of recent centuries. – John Sawyer, 1972
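Sawyer's figure follows from the logarithmic CO2-temperature relation combined with a climate sensitivity near 2 °C per doubling; the brief check below uses such a sensitivity as an assumption.

```python
import math

# Consistency check of Sawyer's 1972 estimate: a 25% rise in CO2 is
# log2(1.25) ~ 0.32 of a doubling. Assuming ~1.9 °C per doubling:
sensitivity = 1.9                       # °C per CO2 doubling (assumption)
fraction_of_doubling = math.log2(1.25)  # ~0.32
print(f"{sensitivity * fraction_of_doubling:.1f} °C")  # ~0.6 °C, as Sawyer stated
```
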
The mainstream news media at the time exaggerated the warnings of the minority who expected imminent cooling. For example, in 1975, Newsweek magazine published a story that warned of "ominous signs that the Earth's weather patterns have begun to change." The article continued by stating that evidence of global cooling was so strong that meteorologists were having "a hard time keeping up with it." On 23 October 2006, Newsweek issued an update stating that it had been "spectacularly wrong about the near-term future".

In the first two "Reports for the Club of Rome" in 1972 and 1974, the anthropogenic climate changes by CO2 increase as well as by Waste heat were mentioned. About the latter John Holdren wrote in a study cited in the 1st report, “… that global thermal pollution is hardly our most immediate environmental threat. It could prove to be the most inexorable, however, if we are fortunate enough to evade all the rest.” Simple global-scale estimates that recently have been actualized and confirmed by more refined model calculations show noticeable contributions from waste heat to global warming after the year 2100, if its growth rates are not strongly reduced (below the averaged 2% p.a. which occurred since 1973).

Evidence for warming accumulated. By 1975, Manabe and Wetherald had developed a three-dimensional global climate model that gave a roughly accurate representation of the current climate. Doubling CO2 in the model's atmosphere gave a roughly 2 °C rise in global temperature. Several other kinds of computer models gave similar results: it was impossible to make a model that gave something resembling the actual climate and not have the temperature rise when the CO2 concentration was increased.

In a separate development, an analysis of deep-sea cores published in 1976 by Nicholas Shackleton and colleagues showed that the dominating influence on ice age timing came from a 100,000-year Milankovitch orbital change. This was unexpected, since the change in sunlight in that cycle was slight. The result emphasized that the climate system is driven by feedbacks, and thus is strongly susceptible to small changes in conditions.

The 1979 World Climate Conference (12 to 23 February) of the World Meteorological Organization concluded "it appears plausible that an increased amount of carbon dioxide in the atmosphere can contribute to a gradual warming of the lower atmosphere, especially at higher latitudes....It is possible that some effects on a regional and global scale may be detectable before the end of this century and become significant before the middle of the next century."

In July 1979 the United States National Research Council published a report, concluding (in part):
When it is assumed that the CO2 content of the atmosphere is doubled and statistical thermal equilibrium is achieved, the more realistic of the modeling efforts predict a global surface warming of between 2°C and 3.5°C, with greater increases at high latitudes.

… we have tried but have been unable to find any overlooked or underestimated physical effects that could reduce the currently estimated global warmings due to a doubling of atmospheric CO2 to negligible proportions or reverse them altogether.

Consensus begins to form, 1980–1988

James Hansen during his 1988 testimony to Congress, which alerted the public to the dangers of global warming.

By the early 1980s, the slight cooling trend from 1945–1975 had stopped. Aerosol pollution had decreased in many areas due to environmental legislation and changes in fuel use, and it became clear that the cooling effect from aerosols was not going to increase substantially while carbon dioxide levels were progressively increasing.

Hansen et al. 1981, published the study Climate impact of increasing atmospheric carbon dioxide, and noted:
It is shown that the anthropogenic carbon dioxide warming should emerge from the noise level of natural climate variability by the end of the century, and there is a high probability of warming in the 1980s. Potential effects on climate in the 21st century include the creation of drought-prone regions in North America and central Asia as part of a shifting of climatic zones, erosion of the West Antarctic ice sheet with a consequent worldwide rise in sea level, and opening of the fabled Northwest Passage.
In 1982, Greenland ice cores drilled by Hans Oeschger, Willi Dansgaard, and collaborators revealed dramatic temperature oscillations in the space of a century in the distant past. The most prominent of the changes in their record corresponded to the violent Younger Dryas climate oscillation seen in shifts in types of pollen in lake beds all over Europe. Evidently drastic climate changes were possible within a human lifetime.

In 1973, British scientist James Lovelock speculated that chlorofluorocarbons (CFCs) could have a global warming effect. In 1975, V. Ramanathan found that a CFC molecule could be 10,000 times more effective in absorbing infrared radiation than a carbon dioxide molecule, making CFCs potentially important despite their very low concentrations in the atmosphere. While most early work on CFCs focused on their role in ozone depletion, by 1985 Ramanathan and others showed that CFCs together with methane and other trace gases could have nearly as important a climate effect as increases in CO2. In other words, global warming would arrive twice as fast as had been expected.

In 1985 a joint UNEP/WMO/ICSU Conference on the "Assessment of the Role of Carbon Dioxide and Other Greenhouse Gases in Climate Variations and Associated Impacts" concluded that greenhouse gases "are expected" to cause significant warming in the next century and that some warming is inevitable.

Meanwhile, ice cores drilled by a Franco-Soviet team at the Vostok Station in Antarctica showed that CO2 and temperature had gone up and down together in wide swings through past ice ages. This confirmed the CO2-temperature relationship in a manner entirely independent of computer climate models, strongly reinforcing the emerging scientific consensus. The findings also pointed to powerful biological and geochemical feedbacks.

In June 1988, James E. Hansen made one of the first assessments that human-caused warming had already measurably affected global climate. Shortly after, a "World Conference on the Changing Atmosphere: Implications for Global Security" gathered hundreds of scientists and others in Toronto. They concluded that the changes in the atmosphere due to human pollution "represent a major threat to international security and are already having harmful consequences over many parts of the globe," and declared that by 2005 the world should push its emissions some 20% below the 1988 level.

The 1980s saw important breakthroughs with regard to global environmental challenges. For example, ozone depletion was mitigated by the Vienna Convention (1985) and the Montreal Protocol (1987), while acid rain was mainly regulated at the national and regional level.

Modern period: 1988 to present

2015 – Warmest Global Year on Record (since 1880) – Colors indicate temperature anomalies (NASA/NOAA; 20 January 2016).

In 1988 the WMO established the Intergovernmental Panel on Climate Change with the support of the UNEP. The IPCC continues its work through the present day, and issues a series of Assessment Reports and supplemental reports that describe the state of scientific understanding at the time each report is prepared. Scientific developments during this period are summarized about once every five to six years in the IPCC Assessment Reports which were published in 1990 (First Assessment Report), 1995 (Second Assessment Report), 2001 (Third Assessment Report), 2007 (Fourth Assessment Report), and 2013/2014 (Fifth Assessment Report).

Since the 1990s, research on climate change has expanded and grown, linking many fields such as atmospheric science, numerical modeling, behavioral science, geology, economics, and security.

Discovery of other climate changing factors

Methane: In 1859, John Tyndall determined that coal gas, a mix of methane and other gases, strongly absorbed infrared radiation. Methane was subsequently detected in the atmosphere in 1948, and in the 1980s scientists realized that human emissions were having a substantial impact.

Chlorofluorocarbon: As described above, James Lovelock speculated in 1973 that chlorofluorocarbons (CFCs) could have a global warming effect, and by 1985 scientists had concluded that CFCs, together with methane and other trace gases, could have nearly as important a climate effect as increases in CO2.

Greenhouse effect

From Wikipedia, the free encyclopedia

A representation of the exchanges of energy between the source (the Sun), Earth's surface, the Earth's atmosphere, and the ultimate sink outer space. The ability of the atmosphere to capture and recycle energy emitted by Earth's surface is the defining characteristic of the greenhouse effect.
 
Energy flow between the sun, the atmosphere and earth's surface.
 
Earth's energy budget

The greenhouse effect is the process by which radiation from a planet's atmosphere warms the planet's surface to a temperature above what it would be without its atmosphere.

If a planet's atmosphere contains radiatively active gases (i.e., greenhouse gases) they will radiate energy in all directions. Part of this radiation is directed towards the surface, warming it. The intensity of the downward radiation – that is, the strength of the greenhouse effect – will depend on the atmosphere's temperature and on the amount of greenhouse gases that the atmosphere contains.
Earth’s natural greenhouse effect is critical to supporting life. Human activities, mainly the burning of fossil fuels and clearing of forests, have strengthened the greenhouse effect and caused global warming.

The term "greenhouse effect" arose from a faulty analogy with the effect of sunlight passing through glass and warming a greenhouse. The way a greenhouse retains heat is fundamentally different, as a greenhouse works mostly by reducing airflow so that warm air is kept inside.

History

The existence of the greenhouse effect was argued for by Joseph Fourier in 1824. The argument and the evidence were further strengthened by Claude Pouillet in 1827 and 1838 and reasoned from experimental observations by John Tyndall in 1859, who measured the radiative properties of specific greenhouse gases. The effect was more fully quantified by Svante Arrhenius in 1896, who made the first quantitative prediction of global warming due to a hypothetical doubling of atmospheric carbon dioxide. However, the term "greenhouse" was not used to refer to this effect by any of these scientists; the term was first used in this way by Nils Gustaf Ekholm in 1901.

Mechanism

Earth receives energy from the Sun in the form of ultraviolet, visible, and near-infrared radiation. About 26% of the incoming solar energy is reflected to space by the atmosphere and clouds, and 19% is absorbed by the atmosphere and clouds. Most of the remaining energy is absorbed at the surface of Earth. Because the Earth's surface is colder than the Sun, it radiates at wavelengths that are much longer than the wavelengths that were absorbed. Most of this thermal radiation is absorbed by the atmosphere and warms it. The atmosphere also gains heat by sensible and latent heat fluxes from the surface. The atmosphere radiates energy both upwards and downwards; the part radiated downwards is absorbed by the surface of Earth. This leads to a higher equilibrium temperature than if the atmosphere were absent.
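The difference between incoming and outgoing wavelengths follows from blackbody physics. A minimal sketch, assuming typical effective temperatures of about 5772 K for the Sun and 288 K for Earth's surface (values not quoted in the text above), illustrates it with Wien's displacement law:

# Wien's displacement law: the peak emission wavelength of a blackbody is
# inversely proportional to its temperature. The temperatures used here are
# assumed typical values, not figures taken from the article.
WIEN_B = 2.898e-3  # Wien's displacement constant, metre-kelvin

def peak_wavelength_um(temperature_k):
    # Wavelength of peak blackbody emission, in micrometres.
    return WIEN_B / temperature_k * 1e6

print(peak_wavelength_um(5772.0))  # Sun: ~0.5 micrometres (visible light)
print(peak_wavelength_um(288.0))   # Earth's surface: ~10 micrometres (thermal infrared)

The roughly twenty-fold gap between these two peaks is what allows greenhouse gases to be nearly transparent to incoming sunlight while strongly absorbing outgoing terrestrial radiation.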

The solar radiation spectrum for direct light at both the top of Earth's atmosphere and at sea level

An ideal thermally conductive blackbody at the same distance from the Sun as Earth would have a temperature of about 5.3 °C. However, because Earth reflects about 30% of the incoming sunlight, this idealized planet's effective temperature (the temperature of a blackbody that would emit the same amount of radiation) would be about −18 °C. The surface temperature of this hypothetical planet is 33 °C below Earth's actual surface temperature of approximately 14 °C.
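Both figures follow from a simple radiative-balance estimate. The sketch below assumes a solar constant of about 1361 W/m² and the standard Stefan–Boltzmann constant (values not stated in the text), which reproduce the quoted temperatures to within a fraction of a degree:

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0      # solar constant at Earth's orbit, W m^-2 (assumed value)

def effective_temperature_c(albedo):
    # Equilibrium temperature of a planet absorbing S0*(1 - albedo)/4 per unit
    # of surface area and radiating as a blackbody, converted to Celsius.
    t_kelvin = (S0 * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25
    return t_kelvin - 273.15

print(round(effective_temperature_c(0.0), 1))  # fully absorbing blackbody: roughly 5 degrees C
print(round(effective_temperature_c(0.3), 1))  # with ~30% of sunlight reflected: roughly -18 degrees C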

The basic mechanism can be qualified in a number of ways, none of which affect the fundamental process. The atmosphere near the surface is largely opaque to thermal radiation (with important exceptions for "window" bands), and most heat loss from the surface is by sensible heat and latent heat transport. Radiative energy losses become increasingly important higher in the atmosphere, largely because of the decreasing concentration of water vapor, an important greenhouse gas. It is more realistic to think of the greenhouse effect as applying to a "surface" in the mid-troposphere, which is effectively coupled to the surface by a lapse rate. The simple picture also assumes a steady state, but in the real world, there are variations due to the diurnal cycle as well as the seasonal cycle and weather disturbances. Solar heating only applies during daytime. During the night, the atmosphere cools somewhat, but not greatly, because its emissivity is low. Diurnal temperature changes decrease with height in the atmosphere.
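The "surface in the mid-troposphere" picture can be made concrete with a back-of-the-envelope estimate. In the sketch below, the lapse rate (~6.5 K/km) and effective emission altitude (~5 km) are assumed typical values rather than figures from the text:

# Temperature gained by descending from the effective emission level to the
# ground along a typical tropospheric lapse rate; both inputs are illustrative.
LAPSE_RATE = 6.5          # K per km (assumed typical value)
EMISSION_ALTITUDE = 5.0   # km (assumed rough emission height)
T_EMISSION = 255.0        # K, the emission temperature used elsewhere in the article

t_surface = T_EMISSION + LAPSE_RATE * EMISSION_ALTITUDE
print(t_surface)  # ~287.5 K, close to Earth's observed mean surface temperature

In other words, the roughly 33 K of greenhouse warming can be read as the temperature difference between the emission level and the ground implied by the lapse rate.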

Within the region where radiative effects are important, the description given by the idealized greenhouse model becomes realistic. Earth's surface, warmed to a temperature around 255 K, radiates long-wavelength, infrared heat in the range of 4–100 μm. At these wavelengths, greenhouse gases that were largely transparent to incoming solar radiation are more absorbent. Each layer of atmosphere with greenhouse gases absorbs some of the heat being radiated upwards from lower layers. It reradiates in all directions, both upwards and downwards; in equilibrium (by definition) the same amount as it has absorbed. This results in more warmth below. Increasing the concentration of the gases increases the amount of absorption and reradiation, and thereby further warms the layers and ultimately the surface below.
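A single-layer version of the idealized greenhouse model shows the effect of this absorption and re-emission. The sketch below is a deliberate simplification, assuming one layer that absorbs all surface infrared and re-emits half upward and half downward; the real atmosphere is only partially absorbing, which is why the observed surface temperature (~288 K) lies between the 255 K emission temperature and this upper bound:

# One fully absorbing layer in radiative equilibrium emits as much upward as
# downward, so the surface must radiate twice the flux that escapes to space:
# sigma*T_surface**4 = 2*sigma*T_effective**4, hence T_surface = 2**0.25 * T_effective.
T_EFFECTIVE = 255.0  # K, emission temperature quoted in the text

t_surface_upper_bound = 2 ** 0.25 * T_EFFECTIVE
print(round(t_surface_upper_bound, 1))  # ~303 K, an upper bound for a single opaque layer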

Greenhouse gases—including most diatomic gases with two different atoms (such as carbon monoxide, CO) and all gases with three or more atoms—are able to absorb and emit infrared radiation. Though more than 99% of the dry atmosphere is IR transparent (because the main constituents—N2, O2, and Ar—are not able to directly absorb or emit infrared radiation), intermolecular collisions cause the energy absorbed and emitted by the greenhouse gases to be shared with the other, non-IR-active, gases.

Greenhouse gases

By their percentage contribution to the greenhouse effect on Earth, the four major gases are:
  • water vapor, 36–70%
  • carbon dioxide, 9–26%
  • methane, 4–9%
  • ozone, 3–7%

Atmospheric gases only absorb some wavelengths of energy but are transparent to others. The absorption patterns of water vapor (blue peaks) and carbon dioxide (pink peaks) overlap in some wavelengths. Carbon dioxide is not as strong a greenhouse gas as water vapor, but it absorbs energy in longer wavelengths (12–15 micrometers) that water vapor does not, partially closing the "window" through which heat radiated by the surface would normally escape to space. (Illustration NASA, Robert Rohde)
It is not possible to assign a specific percentage to each gas because the absorption and emission bands of the gases overlap (hence the ranges given above). Clouds also absorb and emit infrared radiation and thus affect the radiative properties of the atmosphere.

Role in climate change

The Keeling Curve of atmospheric CO2 concentrations measured at Mauna Loa Observatory.

Strengthening of the greenhouse effect through human activities is known as the enhanced (or anthropogenic) greenhouse effect. This increase in radiative forcing from human activity is attributable mainly to increased atmospheric carbon dioxide levels. According to the latest Assessment Report from the Intergovernmental Panel on Climate Change, "atmospheric concentrations of carbon dioxide, methane and nitrous oxide are unprecedented in at least the last 800,000 years. Their effects, together with those of other anthropogenic drivers, have been detected throughout the climate system and are extremely likely to have been the dominant cause of the observed warming since the mid-20th century".

CO2 is produced by fossil fuel burning and other activities such as cement production and tropical deforestation. Measurements of CO2 from the Mauna Loa observatory show that concentrations increased from about 313 parts per million (ppm) in 1960 to about 389 ppm in 2010, and reached the 400 ppm milestone on May 9, 2013. The current observed amount of CO2 exceeds the geological record maxima (~300 ppm) inferred from ice core data. The effect of combustion-produced carbon dioxide on the global climate, a special case of the greenhouse effect first described in 1896 by Svante Arrhenius, has also been called the Callendar effect.
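From the Mauna Loa figures quoted above, a rough long-term average growth rate can be computed; the sketch below uses only those two data points, so it understates the faster growth of recent decades:

# Average annual CO2 increase between the two Mauna Loa values quoted in the text.
co2_1960, co2_2010 = 313.0, 389.0   # ppm
avg_growth = (co2_2010 - co2_1960) / (2010 - 1960)
print(round(avg_growth, 2))  # ~1.5 ppm per year averaged over 1960-2010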

Over the past 800,000 years, ice core data shows that carbon dioxide has varied from values as low as 180 ppm to the pre-industrial level of 270 ppm. Paleoclimatologists consider variations in carbon dioxide concentration to be a fundamental factor influencing climate variations over this time scale.

Real greenhouses

A modern Greenhouse in RHS Wisley

The "greenhouse effect" of the atmosphere is named by analogy to greenhouses which become warmer in sunlight. However, a greenhouse is not primarily warmed by the "greenhouse effect". "Greenhouse effect" is actually a misnomer since heating in the usual greenhouse is due to the reduction of convection, while the "greenhouse effect" works by preventing absorbed heat from leaving the structure through radiative transfer.

A greenhouse is built of any material that passes sunlight, usually glass or plastic. The sun warms the ground and the contents inside, just as it does outside, and these then warm the air. Outside, the warm air near the surface rises and mixes with cooler air aloft, keeping the temperature lower than inside, where the air continues to heat up because it is confined within the greenhouse. This can be demonstrated by opening a small window near the roof of a greenhouse: the temperature drops considerably. It was demonstrated experimentally (R. W. Wood, 1909) that an unheated "greenhouse" with a cover of rock salt (which is transparent to infrared) heats up similarly to one with a glass cover. Thus greenhouses work primarily by preventing convective cooling.

Heated greenhouses are another matter: because they have an internal source of heat, the aim is to keep that heat from leaking out, so it again makes sense to limit radiative cooling through the use of adequate glazing.

Related effects

Anti-greenhouse effect

The anti-greenhouse effect is a mechanism similar and symmetrical to the greenhouse effect: in the greenhouse effect, the atmosphere lets solar radiation in while restricting the escape of thermal radiation, which warms the surface; in the anti-greenhouse effect, the atmosphere keeps solar radiation out while letting thermal radiation escape, which lowers the equilibrium surface temperature. Such an effect has been proposed for Saturn's moon Titan.

Runaway greenhouse effect

A runaway greenhouse effect occurs if positive feedbacks lead to the evaporation of all greenhouse gases into the atmosphere. A runaway greenhouse effect involving carbon dioxide and water vapor has long been hypothesized to have occurred on Venus, and this idea is still largely accepted.

Bodies other than Earth

The greenhouse effect on Venus is particularly large because its dense atmosphere consists mainly of carbon dioxide. "Venus experienced a runaway greenhouse in the past, and we expect that Earth will in about 2 billion years as solar luminosity increases".

Titan has an anti-greenhouse effect, in that its atmosphere absorbs solar radiation but is relatively transparent to outgoing infrared radiation.

Pluto is also colder than would be expected because evaporation of nitrogen cools it.
