
Climate sensitivity

From Wikipedia, the free encyclopedia
Diagram of factors that determine climate sensitivity. After increasing CO2 levels, there is an initial warming. This warming gets amplified by the net effect of climate feedbacks.

Climate sensitivity is a key measure in climate science and describes how much Earth's surface will warm for a doubling in the atmospheric carbon dioxide (CO2) concentration. Its formal definition is: "The change in the surface temperature in response to a change in the atmospheric carbon dioxide (CO2) concentration or other radiative forcing." This concept helps scientists understand the extent and magnitude of the effects of climate change.

The Earth's surface warms as a direct consequence of increased atmospheric CO2, as well as increased concentrations of other greenhouse gases such as nitrous oxide and methane. The increasing temperatures have secondary effects on the climate system. These secondary effects are called climate feedbacks. Self-reinforcing feedbacks include for example the melting of sunlight-reflecting ice as well as higher evapotranspiration. The latter effect increases average atmospheric water vapour, which is itself a greenhouse gas.

Scientists do not know exactly how strong these climate feedbacks are. Therefore, it is difficult to predict the precise amount of warming that will result from a given increase in greenhouse gas concentrations. If climate sensitivity turns out to be on the high side of scientific estimates, the Paris Agreement goal of limiting global warming to below 2 °C (3.6 °F) will be even more difficult to achieve.

There are two main kinds of climate sensitivity: the transient climate response is the initial rise in global temperature when CO2 levels double, and the equilibrium climate sensitivity is the larger long-term temperature increase after the planet adjusts to the doubling. Climate sensitivity is estimated by several methods: looking directly at temperature and greenhouse gas concentrations since the Industrial Revolution began around the 1750s, using indirect measurements from the Earth's distant past, and simulating the climate.

Fundamentals

The rate at which energy reaches Earth (as sunlight) and leaves Earth (as heat radiation to space) must balance, or the planet will get warmer or cooler. An imbalance between incoming and outgoing radiation energy is called radiative forcing. A warmer planet radiates heat to space faster and so a new balance is eventually reached, with a higher temperature and stored energy content. However, the warming of the planet also has knock-on effects, which create further warming in an exacerbating feedback loop. Climate sensitivity is a measure of how much temperature change a given amount of radiative forcing will cause.

Radiative forcing

Radiative forcings are generally quantified as watts per square meter (W/m2) and averaged over Earth's uppermost surface, defined as the top of the atmosphere. The magnitude of a forcing is specific to the physical driver and is defined relative to an accompanying time span of interest for its application. In the context of a contribution to long-term climate sensitivity from 1750 to 2020, the 50% increase in atmospheric CO2 is characterized by a forcing of about +2.1 W/m2. In the context of shorter-term contributions to Earth's energy imbalance (i.e. its heating/cooling rate), time intervals of interest may be as short as the interval between measurement or simulation data samplings and are thus likely to be accompanied by smaller forcing values. Forcings from such investigations have also been analyzed and reported at decadal time scales.

Radiative forcing leads to long-term changes in global temperature. A number of factors contribute radiative forcing: increased downwelling radiation from the greenhouse effect, variability in solar radiation from changes in planetary orbit, changes in solar irradiance, direct and indirect effects caused by aerosols (for example changes in albedo from cloud cover), and changes in land use (deforestation or the loss of reflective ice cover). In contemporary research, radiative forcing by greenhouse gases is well understood. As of 2019, large uncertainties remain for aerosols.

Key numbers

Carbon dioxide (CO2) levels rose from 280 parts per million (ppm) in the 18th century, when humans in the Industrial Revolution started burning significant amounts of fossil fuel such as coal, to over 415 ppm by 2020. As CO2 is a greenhouse gas, it hinders heat energy from leaving the Earth's atmosphere. In 2016, atmospheric CO2 levels had increased by 45% over preindustrial levels, and radiative forcing caused by increased CO2 was already more than 50% higher than in pre-industrial times because of non-linear effects. Between the 18th-century start of the Industrial Revolution and the year 2020, the Earth's temperature rose by a little over one degree Celsius (about two degrees Fahrenheit).
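The non-linearity mentioned above can be checked with the widely used simplified logarithmic forcing expression ΔF ≈ 5.35 ln(C/C0) W/m2, an approximation from the literature that is not given in this article; a sketch:

```python
import math

# Simplified logarithmic CO2 forcing, dF = 5.35 * ln(C/C0) W/m^2.
# This coefficient is a standard literature approximation (an assumption here,
# not a value stated in the article).
def co2_forcing(c_ppm, c0_ppm=280.0):
    return 5.35 * math.log(c_ppm / c0_ppm)

f_doubling = co2_forcing(560)        # forcing for doubled CO2, ~3.7 W/m^2
f_45pct = co2_forcing(280 * 1.45)    # forcing for the 45% rise seen by 2016

print(f"Doubling forcing: {f_doubling:.2f} W/m^2")
print(f"45% increase:     {f_45pct:.2f} W/m^2")
print(f"Fraction of doubling forcing: {f_45pct / f_doubling:.0%}")
```

Because the forcing grows with the logarithm of concentration, a 45% rise in CO2 already delivers more than half of the doubling forcing, consistent with the text.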

Societal importance

Because the economics of climate change mitigation depend greatly on how quickly carbon neutrality needs to be achieved, climate sensitivity estimates can have important economic and policy-making implications. One study suggests that halving the uncertainty of the value for transient climate response (TCR) could save trillions of dollars. A higher climate sensitivity would mean more dramatic increases in temperature, which makes it more prudent to take significant climate action. If climate sensitivity turns out to be on the high end of what scientists estimate, the Paris Agreement goal of limiting global warming to well below 2 °C cannot be achieved, and temperature increases will exceed that limit, at least temporarily. One study estimated that emissions cannot be reduced fast enough to meet the 2 °C goal if equilibrium climate sensitivity (the long-term measure) is higher than 3.4 °C (6.1 °F). The more sensitive the climate system is to changes in greenhouse gas concentrations, the more likely it is to have decades when temperatures are much higher or much lower than the longer-term average.

Factors that determine sensitivity

The radiative forcing caused by a doubling of atmospheric CO2 levels (from the pre-industrial 280 ppm) is approximately 3.7 watts per square meter (W/m2). In the absence of feedbacks, the energy imbalance would eventually result in roughly 1 °C (1.8 °F) of global warming. That figure is straightforward to calculate by using the Stefan–Boltzmann law and is undisputed.
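The no-feedback figure follows from a short Stefan–Boltzmann calculation, assuming an effective radiating temperature for Earth of about 255 K (an assumed textbook value, not stated above):

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_EFF = 255.0     # Earth's effective radiating temperature in K (assumed)

# The Planck response is the derivative of sigma*T^4 with respect to T,
# i.e. 4*sigma*T^3: the extra W/m^2 radiated per kelvin of warming.
planck_response = 4 * SIGMA * T_EFF**3   # ~3.8 W/m^2 per K

forcing_2xco2 = 3.7                      # W/m^2, from the article
no_feedback_warming = forcing_2xco2 / planck_response
print(f"No-feedback warming: {no_feedback_warming:.2f} degC")
```

The result is roughly 1 °C of warming in the absence of feedbacks, matching the figure in the text.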

A further contribution arises from climate feedbacks, both self-reinforcing and balancing. The uncertainty in climate sensitivity estimates is mostly from the feedbacks in the climate system, including water vapour feedback, ice–albedo feedback, cloud feedback, and lapse rate feedback. Balancing feedbacks tend to counteract warming by increasing the rate at which energy is radiated to space from a warmer planet. Self-reinforcing feedbacks increase warming; for example, higher temperatures can cause ice to melt, which reduces the ice area and the amount of sunlight the ice reflects, which in turn results in less heat energy being radiated back into space. The reflectiveness of a surface is called albedo. Climate sensitivity depends on the balance between those feedbacks.

Types

Schematic of how different measures of climate sensitivity relate to one another

Depending on the time scale, there are two main ways to define climate sensitivity: the short-term transient climate response (TCR) and the long-term equilibrium climate sensitivity (ECS), both of which incorporate the warming from exacerbating feedback loops. They are not discrete categories, but they overlap. Sensitivity to atmospheric CO2 increases is measured as the amount of temperature change for a doubling of the atmospheric CO2 concentration.

Although the term "climate sensitivity" is usually used for the sensitivity to radiative forcing caused by rising atmospheric CO2, it is a general property of the climate system. Other agents can also cause a radiative imbalance. Climate sensitivity is the change in surface air temperature per unit change in radiative forcing, and the climate sensitivity parameter is therefore expressed in units of °C/(W/m2). Climate sensitivity is approximately the same whatever the reason for the radiative forcing (such as from greenhouse gases or solar variation). When climate sensitivity is expressed as the temperature change for a level of atmospheric CO2 double the pre-industrial level, its units are degrees Celsius (°C).

Transient climate response

The transient climate response (TCR) is defined as "the change in the global mean surface temperature, averaged over a 20-year period, centered at the time of atmospheric carbon dioxide doubling, in a climate model simulation" in which the atmospheric CO2 concentration increases at 1% per year. That estimate is generated by using shorter-term simulations. The transient response is lower than the equilibrium climate sensitivity because slower feedbacks, which exacerbate the temperature increase, take more time to respond in full to an increase in the atmospheric CO2 concentration. For instance, the deep ocean takes many centuries to reach a new steady state after a perturbation, during which it continues to serve as a heat sink, which cools the upper ocean. The IPCC literature assessment estimates that the TCR likely lies between 1 °C (1.8 °F) and 2.5 °C (4.5 °F).[30]
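As a small worked example, compound growth of 1% per year implies that CO2 doubles after about 70 years, which is why TCR simulations reach the doubling point on that timescale:

```python
import math

# Under the TCR protocol, CO2 rises by 1% per year, compounded.
# Doubling therefore takes ln(2) / ln(1.01) years.
years_to_double = math.log(2) / math.log(1.01)
print(f"CO2 doubles after about {years_to_double:.0f} years")
```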

A related measure is the transient climate response to cumulative carbon emissions (TCRE), which is the globally averaged surface temperature change after 1000 GtC of CO2 has been emitted. As such, it includes not only temperature feedbacks to forcing but also the carbon cycle and carbon cycle feedbacks.

Equilibrium climate sensitivity

The equilibrium climate sensitivity (ECS) is the long-term temperature rise (equilibrium global mean near-surface air temperature) that is expected to result from a doubling of the atmospheric CO2 concentration (ΔT). It is a prediction of the new global mean near-surface air temperature once the CO2 concentration has stopped increasing and most of the feedbacks have had time to have their full effect. Reaching an equilibrium temperature can take centuries or even millennia after CO2 has doubled. ECS is higher than TCR because of the oceans' short-term buffering effects. Computer models are used for estimating the ECS. A comprehensive estimate requires modelling the whole time span during which significant feedbacks continue to change global temperatures in the model, such as fully equilibrating ocean temperatures, and therefore requires running a computer model that covers thousands of years. There are, however, less computing-intensive methods.

The IPCC Sixth Assessment Report (AR6) stated that there is high confidence that ECS is within the range of 2.5 °C to 4 °C, with a best estimate of 3 °C.

The long time scales involved with ECS make it arguably a less relevant measure for policy decisions around climate change.

Effective climate sensitivity

A common approximation to ECS is the effective equilibrium climate sensitivity: an estimate of equilibrium climate sensitivity made using data from a climate system, in a model or in real-world observations, that is not yet in equilibrium. Estimates assume that the net amplification effect of feedbacks, as measured after some period of warming, will remain constant afterwards. That is not necessarily true, as feedbacks can change with time. In many climate models, feedbacks become stronger over time, and so the effective climate sensitivity is lower than the real ECS.

Earth system sensitivity

By definition, equilibrium climate sensitivity does not include feedbacks that take millennia to emerge, such as long-term changes in Earth's albedo because of changes in ice sheets and vegetation. It also does not include the slow response of the deep oceans' warming, which takes millennia. Earth system sensitivity (ESS) incorporates the effects of these slower feedback loops, such as the change in Earth's albedo from the melting of large continental ice sheets, which covered much of the Northern Hemisphere during the Last Glacial Maximum and still cover Greenland and Antarctica. Changes in albedo as a result of changes in vegetation, as well as changes in ocean circulation, are also included. The longer-term feedback loops make the ESS larger than the ECS, possibly twice as large. Data from the geological history of Earth is used in estimating ESS. Differences between modern and long-ago climatic conditions mean that estimates of the future ESS are highly uncertain. The carbon cycle is not included in the definition of the ESS, but all other elements of the climate system are included.

Sensitivity to nature of forcing

Different forcing agents, such as greenhouse gases and aerosols, can be compared using their radiative forcing, the initial radiative imbalance averaged over the entire globe. Climate sensitivity is the amount of warming per radiative forcing. To a first approximation, the cause of the radiative imbalance does not matter. However, radiative forcing from sources other than CO2 can cause slightly more or less surface warming than the same averaged forcing from CO2. The amount of feedback varies mainly because the forcings are not uniformly distributed over the globe. Forcings that initially warm the Northern Hemisphere, land, or polar regions generate more self-reinforcing feedbacks (such as the ice-albedo feedback) than an equivalent forcing from CO2, which is more uniformly distributed over the globe. This gives rise to more overall warming. Several studies indicate that human-emitted aerosols are more effective than CO2 at changing global temperatures, and volcanic forcing is less effective. When climate sensitivity to CO2 forcing is estimated using historical temperature and forcing (caused by a mix of aerosols and greenhouse gases), and that effect is not taken into account, climate sensitivity is underestimated.

State dependence

Artist's impression of a Snowball Earth.

Climate sensitivity has been defined as the short- or long-term temperature change resulting from any doubling of CO2, but there is evidence that the sensitivity of Earth's climate system is not constant. For instance, the planet has polar ice and high-altitude glaciers. Until the world's ice has completely melted, a self-reinforcing ice–albedo feedback loop makes the system more sensitive overall. Throughout Earth's history, multiple periods are thought to have had snow and ice covering almost the entire globe. In most models of "Snowball Earth", parts of the tropics were at least intermittently free of ice cover. As the ice advanced or retreated, climate sensitivity must have been very high, as the large changes in area of ice cover would have made for a very strong ice–albedo feedback. Volcanic atmospheric composition changes are thought to have provided the radiative forcing needed to escape the snowball state.

Equilibrium climate sensitivity can change with climate.

Throughout the Quaternary period (the most recent 2.58 million years), climate has oscillated between glacial periods, the most recent one being the Last Glacial Maximum, and interglacial periods, the most recent one being the current Holocene, but the period's climate sensitivity is difficult to determine. The Paleocene–Eocene Thermal Maximum, about 55.5 million years ago, was unusually warm and may have been characterized by above-average climate sensitivity.

Climate sensitivity may further change if tipping points are crossed. It is unlikely that tipping points will cause short-term changes in climate sensitivity. If a tipping point is crossed, climate sensitivity is expected to change at the time scale of the subsystem that hits its tipping point. Especially if there are multiple interacting tipping points, the transition of climate to a new state may be difficult to reverse.

The two most common definitions of climate sensitivity specify the climate state: the ECS and the TCR are defined for a doubling with respect to the CO2 levels in the pre-industrial era. Because of potential changes in climate sensitivity, the climate system may warm by a different amount after a second doubling of CO2 than after the first. The effect of any change in climate sensitivity is expected to be small or negligible in the first century after additional CO2 is released into the atmosphere.

Estimation

Using Industrial Age (1750–present) data

Climate sensitivity can be estimated using the observed temperature increase, the observed ocean heat uptake, and the modelled or observed radiative forcing. The data are linked through a simple energy-balance model to calculate climate sensitivity. Radiative forcing is often modelled because Earth-observation satellites capable of measuring it have existed for only part of the Industrial Age (only since the late 1950s). Estimates of climate sensitivity calculated by using these global energy constraints have consistently been lower than those calculated by using other methods, around 2 °C (3.6 °F) or lower.

Estimates of transient climate response (TCR) that have been calculated from models and observational data can be reconciled if it is taken into account that fewer temperature measurements are taken in the polar regions, which warm more quickly than the Earth as a whole. If only regions for which measurements are available are used in evaluating the model, the differences in TCR estimates are negligible.

A very simple climate model could estimate climate sensitivity from Industrial Age data by waiting for the climate system to reach equilibrium and then measuring the resulting warming, ΔTeq (°C). Computation of the equilibrium climate sensitivity, S (°C), using the radiative forcing ΔF (W/m2) and the measured temperature rise, would then be possible. The radiative forcing resulting from a doubling of CO2, F2×CO2, is relatively well known, at about 3.7 W/m2. Combining that information results in this equation:

S = F2×CO2 × ΔTeq / ΔF.

However, the climate system is not in equilibrium, since the actual warming lags the equilibrium warming, largely because the oceans take up heat and will take centuries or millennia to reach equilibrium. Estimating climate sensitivity from Industrial Age data requires an adjustment to the equation above. The actual forcing felt by the atmosphere is the radiative forcing minus the ocean's heat uptake, H (W/m2), and so climate sensitivity can be estimated as:

S = F2×CO2 × ΔT / (ΔF − H).

The global temperature increase between the beginning of the Industrial Period, which is taken as 1750, and 2011 was about 0.85 °C (1.53 °F). In 2011, the radiative forcing from CO2 and other long-lived greenhouse gases (mainly methane, nitrous oxide, and chlorofluorocarbons) that have been emitted since the 18th century was roughly 2.8 W/m2. The climate forcing, ΔF, also contains contributions from solar activity (+0.05 W/m2), aerosols (−0.9 W/m2), ozone (+0.35 W/m2), and other smaller influences, which brings the total forcing over the Industrial Period to 2.2 W/m2, according to the best estimate of the IPCC Fifth Assessment Report in 2014, with substantial uncertainty. The ocean heat uptake, estimated by the same report to be 0.42 W/m2, yields a value for S of 1.8 °C (3.2 °F).
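The numbers quoted above can be combined in a few lines to reproduce the estimate; this is a minimal sketch of the energy-balance calculation S = F2×CO2 × ΔT / (ΔF − H):

```python
# Energy-balance estimate of equilibrium climate sensitivity using the
# AR5-era figures quoted in the text above.
F_2XCO2 = 3.7   # W/m^2, forcing from a doubling of CO2
dT = 0.85       # degC, observed warming, 1750-2011
dF = 2.2        # W/m^2, total Industrial Period forcing (AR5 best estimate)
H = 0.42        # W/m^2, ocean heat uptake

S = F_2XCO2 * dT / (dF - H)
print(f"S = {S:.1f} degC")  # ~1.8 degC, as in the text
```

Subtracting the ocean heat uptake from the forcing is what corrects for the system not yet being in equilibrium.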

Other strategies

In theory, Industrial Age temperatures could also be used to determine a time scale for the temperature response of the climate system and thus climate sensitivity: if the effective heat capacity of the climate system is known, and the timescale is estimated using autocorrelation of the measured temperature, an estimate of climate sensitivity can be derived. In practice, however, the simultaneous determination of the time scale and heat capacity is difficult.
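A minimal sketch of the idea, assuming the climate behaves like a one-box energy-balance model with a known effective heat capacity; all parameter values below are illustrative assumptions, not observational values:

```python
import math
import random

# One-box model for unforced variability: C dT/dt = -lambda_ * T + noise.
# Discretized annually, T follows an AR(1) process with coefficient
# rho = exp(-dt/tau), where tau = C / lambda_ is the response timescale.
random.seed(0)
C = 8.0        # effective heat capacity, W yr m^-2 K^-1 (assumed)
lambda_ = 1.0  # feedback parameter, W m^-2 K^-1 (assumed, to be "recovered")
tau_true = C / lambda_
rho = math.exp(-1.0 / tau_true)

# Simulate a long synthetic annual temperature-anomaly series.
n, T = 100_000, [0.0]
for _ in range(n):
    T.append(rho * T[-1] + random.gauss(0.0, 0.1))

# Estimate the lag-1 autocorrelation, then the timescale and the
# sensitivity S = F_2xCO2 * tau / C (since tau / C = 1 / lambda_).
mean = sum(T) / len(T)
num = sum((a - mean) * (b - mean) for a, b in zip(T, T[1:]))
den = sum((x - mean) ** 2 for x in T)
rho_hat = num / den
tau_hat = -1.0 / math.log(rho_hat)
S_hat = 3.7 * tau_hat / C
print(f"tau ~ {tau_hat:.1f} yr, S ~ {S_hat:.1f} degC")
```

The sketch recovers a sensitivity near the value implied by the assumed feedback parameter; in practice, as the text notes, timescale and heat capacity cannot be cleanly determined at the same time from real observations.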

Attempts have been made to use the 11-year solar cycle to constrain the transient climate response. Solar irradiance is about 0.9 W/m2 higher during a solar maximum than during a solar minimum, and that effect can be observed in measured average global temperatures from 1959 to 2004. Unfortunately, the solar minima in the period coincided with volcanic eruptions, which have a cooling effect on the global temperature. Because the eruptions caused a larger and less well-quantified decrease in radiative forcing than the reduced solar irradiance, it is questionable whether useful quantitative conclusions can be derived from the observed temperature variations.

Observations of volcanic eruptions have also been used to try to estimate climate sensitivity, but as the aerosols from a single eruption last at most a couple of years in the atmosphere, the climate system can never come close to equilibrium, and there is less cooling than there would be if the aerosols stayed in the atmosphere for longer. Therefore, volcanic eruptions give information only about a lower bound on transient climate sensitivity.

Using data from Earth's past

Historical climate sensitivity can be estimated by using reconstructions of Earth's past temperatures and CO2 levels. Paleoclimatologists have studied different geological periods, such as the warm Pliocene (5.3 to 2.6 million years ago) and the colder Pleistocene (2.6 million to 11,700 years ago), and sought periods that are in some way analogous to or informative about current climate change. Climates further back in Earth's history are more difficult to study because fewer data are available about them. For instance, past CO2 concentrations can be derived from air trapped in ice cores, but as of 2020, the oldest continuous ice core is less than one million years old. Recent periods, such as the Last Glacial Maximum (LGM) (about 21,000 years ago) and the Mid-Holocene (about 6,000 years ago), are often studied, especially when more information about them becomes available.

A 2007 estimate of sensitivity made using data from the most recent 420 million years is consistent with sensitivities of current climate models and with other determinations. The Paleocene–Eocene Thermal Maximum (about 55.5 million years ago), a 20,000-year period during which a massive amount of carbon entered the atmosphere and average global temperatures increased by approximately 6 °C (11 °F), also provides a good opportunity to study the climate system when it was in a warm state. Studies of the last 800,000 years have concluded that climate sensitivity was greater in glacial periods than in interglacial periods.

As the name suggests, the Last Glacial Maximum was much colder than today, and good data on atmospheric CO2 concentrations and radiative forcing from that period are available. The period's orbital forcing was different from today's but had little effect on mean annual temperatures. Estimating climate sensitivity from the Last Glacial Maximum can be done in several different ways. One way is to use estimates of global radiative forcing and temperature directly. The set of feedback mechanisms active during the period, however, may be different from the feedbacks caused by a present doubling of CO2, which introduces additional uncertainty. In a different approach, a model of intermediate complexity is used to simulate conditions during the period. Several versions of this single model are run, with different values chosen for uncertain parameters, such that each version has a different ECS. Outcomes that best simulate the LGM's observed cooling probably produce the most realistic ECS values.

Using climate models

Histogram of equilibrium climate sensitivity as derived for different plausible assumptions
Frequency distribution of equilibrium climate sensitivity based on simulations of the doubling of CO2. Each model simulation has different estimates for processes which scientists do not sufficiently understand. Few of the simulations result in less than 2 °C (3.6 °F) of warming or significantly more than 4 °C (7.2 °F). However, the positive skew, which is also found in other studies, suggests that if carbon dioxide concentrations double, the probability of large or very large increases in temperature is greater than the probability of small increases.
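The positive skew can be illustrated with a toy calculation in the style of Roe and Baker (2007), which is an outside reference, not this article: if the total feedback fraction f is uncertain and roughly normally distributed, the sensitivity S = S0 / (1 − f) is strongly right-skewed. All numbers below are illustrative assumptions:

```python
import random

# Toy model of why ECS distributions are right-skewed: a symmetric
# uncertainty in the feedback fraction f maps to an asymmetric
# uncertainty in S = S0 / (1 - f).
random.seed(1)
S0 = 1.2  # degC, assumed no-feedback (Planck-only) warming per doubling

# Sample a normally distributed feedback fraction; discard the runaway
# tail near f = 1, where the linear-feedback model breaks down.
f_samples = [random.gauss(0.65, 0.13) for _ in range(100_000)]
S = sorted(S0 / (1 - f) for f in f_samples if f < 0.95)

median = S[len(S) // 2]
mean = sum(S) / len(S)
print(f"median {median:.1f} degC, mean {mean:.1f} degC")
```

Because the mean exceeds the median, large warming outcomes are disproportionately likely relative to small ones, mirroring the skew described in the caption.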

Climate models simulate the CO2-driven warming of the future as well as the past. They operate on principles similar to those underlying models that predict the weather, but they focus on longer-term processes. Climate models typically begin with a starting state and then apply physical laws and knowledge about biology to predict future states. As with weather modelling, no computer has the power to model the complexity of the entire planet and simplifications are used to reduce that complexity to something manageable. An important simplification divides Earth's atmosphere into model cells. For instance, the atmosphere might be divided into cubes of air ten or one hundred kilometers on each side. Each model cell is treated as if it were homogeneous (uniform). Calculations for model cells are much faster than trying to simulate each molecule of air separately.

A lower model resolution (large model cells and long time steps) takes less computing power but cannot simulate the atmosphere in as much detail. A model cannot simulate processes smaller than the model cells or shorter-term than a single time step. The effects of the smaller-scale and shorter-term processes must therefore be estimated by using other methods. Physical laws contained in the models may also be simplified to speed up calculations. The biosphere must be included in climate models. The effects of the biosphere are estimated by using data on the average behaviour of the average plant assemblage of an area under the modelled conditions. Climate sensitivity is therefore an emergent property of these models; it is not prescribed, but it follows from the interaction of all the modelled processes.

To estimate climate sensitivity, a model is run by using a variety of radiative forcings (doubling quickly, doubling gradually, or following historical emissions) and the temperature results are compared to the forcing applied. Different models give different estimates of climate sensitivity, but they tend to fall within a similar range, as described above.

Testing, comparisons, and climate ensembles

Modelling of the climate system can lead to a wide range of outcomes. Models are often run with different plausible parameters in their approximation of physical laws and the behaviour of the biosphere, forming a perturbed-physics ensemble that attempts to model the sensitivity of the climate to different types and amounts of change in each parameter. Alternatively, structurally different models developed at different institutions are put together, creating an ensemble. By selecting only the simulations that can simulate some part of the historical climate well, a constrained estimate of climate sensitivity can be made. One strategy for obtaining more accurate results is placing more emphasis on climate models that perform well in general.

A model is tested using observations, paleoclimate data, or both to see if it replicates them accurately. If it does not, inaccuracies in the physical model and parametrizations are sought, and the model is modified. For models used to estimate climate sensitivity, specific test metrics that are directly and physically linked to climate sensitivity are sought. Examples of such metrics are the global patterns of warming, the ability of a model to reproduce observed relative humidity in the tropics and subtropics, patterns of heat radiation, and the variability of temperature around long-term historical warming. Ensemble climate models developed at different institutions tend to produce constrained estimates of ECS that are slightly higher than 3 °C (5.4 °F). The models with ECS slightly above 3 °C (5.4 °F) simulate the above situations better than models with a lower climate sensitivity.

Many projects and groups exist to compare and to analyse the results of multiple models. For instance, the Coupled Model Intercomparison Project (CMIP) has been running since the 1990s.

Historical estimates

Svante Arrhenius in the 19th century was the first person to quantify global warming as a consequence of a doubling of the concentration of CO2. In his first paper on the matter, he estimated that global temperature would rise by around 5 to 6 °C (9.0 to 10.8 °F) if the quantity of CO2 was doubled. In later work, he revised that estimate to 4 °C (7.2 °F). Arrhenius used Samuel Pierpont Langley's observations of radiation emitted by the full moon to estimate the amount of radiation that was absorbed by water vapour and by CO2. To account for water vapour feedback, he assumed that relative humidity would stay the same under global warming.

The first calculation of climate sensitivity that used detailed measurements of absorption spectra, as well as the first calculation to use a computer for numerical integration of the radiative transfer through the atmosphere, was performed by Syukuro Manabe and Richard Wetherald in 1967. Assuming constant humidity, they computed an equilibrium climate sensitivity of 2.3 °C per doubling of CO2, which they rounded in the abstract of the paper to 2 °C, the value most often quoted from their work. The work has been called "arguably the greatest climate-science paper of all time" and "the most influential study of climate of all time."

A committee on anthropogenic global warming, convened in 1979 by the United States National Academy of Sciences and chaired by Jule Charney, estimated equilibrium climate sensitivity to be 3 °C (5.4 °F), plus or minus 1.5 °C (2.7 °F). The Manabe and Wetherald estimate (2 °C (3.6 °F)), James E. Hansen's estimate of 4 °C (7.2 °F), and Charney's model were the only models available in 1979. According to Manabe, speaking in 2004, "Charney chose 0.5 °C as a reasonable margin of error, subtracted it from Manabe's number, and added it to Hansen's, giving rise to the 1.5 to 4.5 °C (2.7 to 8.1 °F) range of likely climate sensitivity that has appeared in every greenhouse assessment since ...." In 2008, climatologist Stefan Rahmstorf said: "At that time [it was published], the [Charney report estimate's] range [of uncertainty] was on very shaky ground. Since then, many vastly improved models have been developed by a number of climate research centers around the world."

Assessment reports of IPCC

diagram showing five historical estimates of equilibrium climate sensitivity by the IPCC
Historical estimates of climate sensitivity from the IPCC assessments. The first three reports gave a qualitative likely range, and the fourth and the fifth assessment report formally quantified the uncertainty. The dark blue range is judged as being more than 66% likely.

Despite considerable progress in the understanding of Earth's climate system, assessments continued to report similar uncertainty ranges for climate sensitivity for some time after the 1979 Charney report. The First Assessment Report of the Intergovernmental Panel on Climate Change (IPCC), published in 1990, estimated that equilibrium climate sensitivity to a doubling of CO2 lay between 1.5 and 4.5 °C (2.7 and 8.1 °F), with a "best guess in the light of current knowledge" of 2.5 °C (4.5 °F). The report used models with simplified representations of ocean dynamics. The 1992 IPCC supplementary report, which used full ocean-circulation models, saw "no compelling reason to warrant changing" the 1990 estimate, and the IPCC Second Assessment Report stated, "No strong reasons have emerged to change [these estimates]." In the reports, much of the uncertainty around climate sensitivity was attributed to insufficient knowledge of cloud processes. The 2001 IPCC Third Assessment Report also retained this likely range.

Authors of the 2007 IPCC Fourth Assessment Report stated that confidence in estimates of equilibrium climate sensitivity had increased substantially since the Third Assessment Report. The IPCC authors concluded that ECS is very likely to be greater than 1.5 °C (2.7 °F) and likely to lie in the range 2 to 4.5 °C (3.6 to 8.1 °F), with a most likely value of about 3 °C (5.4 °F). The IPCC stated that fundamental physical reasons and data limitations prevent a climate sensitivity higher than 4.5 °C (8.1 °F) from being ruled out, but the climate sensitivity estimates in the likely range agreed better with observations and the proxy climate data.

The 2013 IPCC Fifth Assessment Report reverted to the earlier range of 1.5 to 4.5 °C (2.7 to 8.1 °F) (with high confidence), because some estimates using industrial-age data came out low. The report also stated that ECS is extremely unlikely to be less than 1 °C (1.8 °F) (high confidence), and it is very unlikely to be greater than 6 °C (11 °F) (medium confidence). Those values were estimated by combining the available data with expert judgement.

In preparation for the 2021 IPCC Sixth Assessment Report, a new generation of climate models was developed by scientific groups around the world. Across 27 of these global climate models, the estimates of climate sensitivity were higher than before: the values spanned 1.8 to 5.6 °C (3.2 to 10.1 °F) and exceeded 4.5 °C (8.1 °F) in 10 of the models. The average estimate of equilibrium climate sensitivity rose from 3.2 °C to 3.7 °C, and the average estimate of the transient climate response from 1.8 °C to 2.0 °C. The increase in ECS lies mainly in improved modelling of clouds: temperature rises are now believed to cause sharper decreases in the number of low clouds, and fewer low clouds means more sunlight is absorbed by the planet and less reflected back to space.

Remaining deficiencies in the simulation of clouds may have led to overestimates, as the models with the highest ECS values were not consistent with observed warming. A fifth of the models 'ran hot', predicting significantly more warming than is considered plausible. According to these so-called hot models, average global temperatures in the worst-case scenario would rise by more than 5 °C above preindustrial levels by 2100, with a "catastrophic" impact on human society. In comparison, empirical observations combined with physics-based models indicate that the "very likely" range is between 2.3 and 4.7 °C. Models with a very high climate sensitivity are also known to be poor at reproducing historical climate trends, such as warming over the 20th century or cooling during the last ice age. For these reasons the predictions of hot models are considered implausible and were given less weight by the IPCC in 2022.
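The numbers discussed above can be connected by the standard logarithmic relationship between CO2 concentration and warming. The sketch below is illustrative only: the 5.35 W/m² coefficient is the widely used Myhre et al. (1998) fit for CO2 forcing, and the concentrations and ECS value are assumptions chosen for the example.

```python
import math

def co2_forcing(c, c0):
    """Approximate radiative forcing (W/m^2) from a CO2 change,
    using the common fit F = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(c / c0)

def equilibrium_warming(c, c0, ecs):
    """Long-term warming (deg C) implied by an assumed equilibrium climate
    sensitivity (ECS, deg C per CO2 doubling); warming scales with the
    number of doublings, i.e. log2(C / C0)."""
    return ecs * math.log2(c / c0)

# A doubling from the preindustrial ~280 ppm to 560 ppm:
forcing = co2_forcing(560, 280)              # about 3.7 W/m^2
warming = equilibrium_warming(560, 280, 3.0) # 3.0 deg C if ECS = 3 deg C
```

At an assumed ECS of 4.5 °C, the same doubling implies 4.5 °C of eventual warming, which is why the upper end of the sensitivity range matters so much for emission targets.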

Self-fulfilling prophecy

https://en.wikipedia.org/wiki/Self-fulfilling_prophecy

A self-fulfilling prophecy is a prediction that comes true at least in part as a result of a person's belief or expectation that it would come true. In this phenomenon, people tend to act the way they are expected to, thereby making the expectation come true. Self-fulfilling prophecies are an example of the more general phenomenon of positive feedback loops, and can have either negative or positive outcomes. Merely applying a label to someone or something can affect how the person or thing is perceived and so create a self-fulfilling prophecy. Interpersonal communication plays a significant role both in establishing these phenomena and in the labeling process.

American sociologists W. I. Thomas and Dorothy Swaine Thomas were the first Western scholars to investigate this phenomenon. In 1928, they developed the Thomas theorem (also known as the Thomas dictum): "If men define situations as real, they are real in their consequences." Another American sociologist, Robert K. Merton, continued the research, and is credited with coining the term "self-fulfilling prophecy" and popularizing the idea that "a belief or expectation, correct or incorrect, could bring about a desired or expected outcome." The works of philosophers Karl Popper and Alan Gewirth also contributed to the idea.

History

An early precursor of the concept appears in Edward Gibbon's Decline and Fall of the Roman Empire: "During many ages, the prediction, as it is usual, contributed to its own accomplishment".

The phrase "self-fulfilling prophecy" was coined by Robert K. Merton, a sociologist who also developed the ideas of anomie, social structure, and the modes of individual adaptation. In his book Social Theory and Social Structure, he uses the example of a bank run to show how self-fulfilling thoughts can make unwanted situations happen. In his illustration, a rumor spreads through a town that the local bank is about to go bankrupt, leading many people to rush to the bank and close their accounts. Because banks do not keep their total assets in cash, the bank cannot meet all its customers' withdrawals, which eventually drives it into bankruptcy. Merton concludes with the analysis, "The prophecy of collapse led to its own fulfillment".
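Merton's bank run is a positive feedback loop, and the dynamic can be sketched in a few lines of code. The numbers below (the reserve fraction, the initial number of believers, and the doubling of panic each round) are entirely hypothetical, chosen only to show how a belief in collapse can produce the collapse:

```python
def bank_run(depositors=100, reserve_fraction=0.2, believers=5, rounds=10):
    """Toy model of a self-fulfilling bank run: each round, the sight of
    queues doubles the number of panicking depositors, and the bank fails
    once withdrawals exceed its cash reserves."""
    withdrawing = believers
    for _ in range(rounds):
        withdrawing = min(depositors, withdrawing * 2)  # panic spreads
        if withdrawing > depositors * reserve_fraction:  # reserves exhausted
            return "bank fails"
    return "bank survives"

bank_run()             # a handful of believers is enough: "bank fails"
bank_run(believers=0)  # with no initial belief, nothing happens: "bank survives"
```

The point of the toy model is that the outcome depends only on whether the rumor starts, not on the bank's actual solvency.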

While Merton's example focused on self-fulfilling prophecies within a community, self-fulfilling prophecies also apply to individuals, as individuals often conform to the expectations of others. This is also known as the Pygmalion effect, based on the experiments by Robert Rosenthal and Lenore Jacobson, where teachers were told that a random selection of students were expected to perform exceptionally well; those students showed a significant increase in test scores at the end of the year.

Philosopher Karl Popper called the self-fulfilling prophecy the Oedipus effect:

One of the ideas I had discussed in The Poverty of Historicism was the influence of a prediction upon the event predicted. I had called this the "Oedipus effect", because the oracle played a most important role in the sequence of events which led to the fulfilment of its prophecy. [...] For a time I thought that the existence of the Oedipus effect distinguished the social from the natural sciences. But in biology, too—even in molecular biology—expectations often play a role in bringing about what has been expected.

The idea is similar to that discussed by the philosopher William James as "The Will to Believe." But James viewed it positively, as the self-validation of a belief.

Applications

Examples abound in studies of cognitive dissonance theory and the related self-perception theory; people will often change their attitudes to come into line with what they profess publicly.

In the United States, the concept was broadly and consistently applied in the field of public education reform, following the "war on poverty", as teacher expectations have been shown to influence student academic performance. Theodore Brameld noted: "In simplest terms, education already projects and thereby reinforces whatever habits of personal and cultural life are considered to be acceptable and dominant." The effects of teacher attitudes, beliefs, and values on their expectations have been tested repeatedly, most notably in the Pygmalion in the Classroom study, in which teachers were arbitrarily told that certain randomly selected students were likely to show significant intellectual growth. Those students in fact ended the year with significantly greater improvement when given another IQ test. Though the changes may be subconscious, teachers who have higher expectations typically give "more time to answer questions, more specific feedback, and more approval". Likewise, students who have positive experiences with their teachers may study more. Academic self-fulfilling prophecies can be negative, however: one study indicated that female students may perform worse if they expect their male instructor to be sexist.

The phenomenon of the "inevitability of war" is a self-fulfilling prophecy that has received considerable study.

Fear of failure leads to deterioration of results, even if the person is objectively able to adequately cope with the problem. For example, fear of falling leads to more falls among older people.

Americans of Chinese and Japanese origin are more likely to die of a heart attack on the 4th of each month, due to the number four being considered unlucky and a portent of death.

Moore's law predicting that the number of transistors in an integrated circuit (IC) doubles about every two years is often considered as a self-fulfilling prophecy.
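Read as a plain doubling rule, Moore's law is simple compound growth; the sketch below is only an arithmetic illustration, with a starting transistor count and time span chosen for the example:

```python
def transistors(count0, years, doubling_period=2):
    """Projected transistor count after `years`, doubling every
    `doubling_period` years (Moore's law as a compound-growth rule)."""
    return count0 * 2 ** (years / doubling_period)

# A hypothetical 2,300-transistor chip, 20 years (10 doublings) later:
transistors(2300, 20)  # 2300 * 2**10 = 2,355,200
```

The self-fulfilling aspect is that the industry treated this projection as a roadmap, planning research and fabrication investment so that the predicted doubling was achieved.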

The belief that a bank is insolvent may help create the fact, while confidence in the bank's prospects may improve them. Similarly, stock-exchange panics and speculative bubbles can both be triggered by a widespread belief that a stock will go down (or up), thus starting a mass move of selling (or buying).

People adapt to the judgments and assessments made by society, regardless of whether those judgments were originally correct. Where prejudices exist against a socially marginalized group (e.g., homeless people, drug addicts, or other minorities), members of that group may actually begin to behave in accordance with the expectations.

Relationships

A leading study by Columbia University found that self-fulfilling prophecies play a part in relationships: the beliefs held by people in a relationship can affect the likelihood of a breakup and the overall health of the relationship. L. Alan Sroufe suggested that "rejection expectations can lead people to behave in ways that elicit rejection from others." The study examined the role of self-fulfilling prophecies in the romantic relationships of people high in rejection sensitivity, defined as "the disposition to anxiously expect, readily perceive, and overreact to rejection". It found that women were more likely than men to experience rejection sensitivity about the future of their relationships, and that rejection-sensitive women "may be more likely to behave in ways that exacerbate conflicts," which could "erode their partners' relationship satisfaction and commitment."


International relations

Self-fulfilling prophecies have been apparent throughout history with the 'Thucydides trap': the occurrence of a rising power threatening a ruling or dominant power. Thucydides was an Athenian historian and general who recorded the Peloponnesian war between Sparta and Athens. He wrote, "It was the rise of Athens and the fear that this instilled in Sparta that made war inevitable."

Another example of self-fulfilling prophecies is the United States' invasion of Iraq in 2003. The invasion was based on the assumption that Iraq posed a terrorist threat to the United States, though evidence shows that no threat was actually posed. The invasion and subsequent overthrowing of the Ba'athist regime created the conditions for an insurgency that resulted in Iraq becoming a stronghold for the terrorist organization Al Qaeda, thus fulfilling the initial belief of a potential threat.

Stereotype

Self-fulfilling prophecies are one of the main contributors to racial prejudice, and vice versa. According to the Dictionary of Race, Ethnicity & Culture, "Self-fulfilling prophecy makes it possible to highlight the tragic vicious circle which victimizes people twice: first, because the victim is stigmatized with an inherent negative quality; and secondly, because he or she is prevented from disproving this quality." An example is given where white workers expected that black people would be against the principles of trade unionism because they considered black workers to be "undisciplined in traditions of trade unionism and the art of collective bargaining." Because of this belief, black workers were not hired at white-owned businesses, which left them unable to learn the principles of trade unionism and thus prevented them from unionizing.

Teachers can encourage stereotype-based courses and can interact with students in a manner that encourages self-fulfilling thoughts: for example, female students may seem to be bad at math if teachers never encouraged them to improve their mathematical abilities.

The term "self-fulfilling prophecy" made its first appearance in educational literature in the 1960s, when African-American psychologist Kenneth B. Clark studied the responses of black children to black and white dolls. The responses from Clark's study ranged from some children calling the black doll ugly to one girl bursting into tears when prompted to pick the doll she identified with. The black children internalized the inferiority they learned and acted accordingly. Clark, whose work pushed the Supreme Court to desegregate schools, noted the influence of teachers on the achievement levels between Black and White students. This prompted Clark to begin a study in ten inner-city schools where he assessed the attitudes and behaviors of teachers. The belief held by teachers was that minority students were unintelligent, and therefore the teachers put no effort into teaching them. This led to a feedback loop of those students not being educated, and thus being perceived as unintelligent.

Literature, media, and the arts

In literature, self-fulfilling prophecies are often used as plot devices. They have been used in stories for millennia, but are especially popular in science fiction and fantasy. They are often used for dramatic irony, with the prophesied events coming to pass due to the attempts to prevent the prophecy. They are also sometimes used as comic relief.

Classical

Many myths, legends, and fairy-tales make use of this motif as a central element of narratives that are designed to illustrate inexorable fate, fundamental to the Hellenic world-view. In a common motif, a child, whether newborn or not yet conceived, is prophesied to cause something that those in power do not want to happen, but the prophesied events come about as a result of the actions taken to prevent them.

Greek

The word "prophet" is derived from the Greek prophētēs, meaning "one who speaks for another."

Oedipus in the arms of Phorbas

The best-known example from Greek legend is that of Oedipus. Warned that his child would one day kill him, Laius abandoned his newborn son Oedipus to die, but Oedipus was found and raised by others, and thus in ignorance of his true origins. When he grew up, Oedipus was warned that he would kill his father and marry his mother. He sought to avoid this, and, believing his foster parents to be his real parents, left his home and travelled to Greece, eventually reaching the city where his biological parents lived. There, he got into a fight with a stranger, killed him, and married his widow, only to discover that the stranger he had killed was his biological father, and his new wife was his biological mother.

Although the legend of Perseus opens with the prophecy that he will kill his grandfather Acrisius, the prophecy is only self-fulfilling in some variants. In some, he accidentally spears his grandfather at a competition—an act that could have happened regardless of Acrisius' response to the prophecy. In other variants, his presence at the games is due to his hearing of the prophecy. In still others, Acrisius is one of the wedding guests when Polydectes tries to force Danaë to marry him, and is accidentally killed when Perseus turns all the guests to stone with the Gorgon's head.

Greek historiography provides a famous variant: when the Lydian king Croesus asked the Delphic Oracle if he should invade Persia, the response came that if he did, he would destroy a great kingdom. Assuming this meant he would succeed, he attacked, only to fail—the kingdom he destroyed was his own.

When it was predicted that Cronus would be overthrown by his son, who would usurp his throne as King of the Gods, Cronus ate each of his children shortly after they were born, enraging his wife, Rhea. In revenge, when she bore Zeus, she gave Cronus a stone to eat instead and sent Zeus to be raised by Amalthea. Cronus' attempt to avoid the prophecy made Zeus his enemy, ultimately leading to its fulfilment.

Roman

Romulus and Remus nursed by a she-wolf

The story of Romulus and Remus is another example. According to legend, a man overthrew his brother, the king. Fearing that his two nephews, Romulus and Remus, would someday kill him as he had done to his brother, he ordered that they be drowned. The boys were placed in a basket and thrown into the Tiber River. A she-wolf found the babies and raised them. Later, a shepherd found the twins and named them Romulus and Remus. As teenagers, they discovered their heritage and killed their uncle in revenge, fulfilling the prophecy.

Arabic

A variation of the self-fulfilling prophecy is the self-fulfilling dream, which dates back to medieval Arabic literature. Several tales in the One Thousand and One Nights, also known as the Arabian Nights, use this device to foreshadow what is going to happen, as a special form of literary prolepsis. A notable example is "The Ruined Man Who Became Rich Again Through a Dream", in which a man is told in his dream to leave his native city of Baghdad and travel to Cairo, where he will discover the whereabouts of some hidden treasure. The man travels there and experiences misfortune after losing belief in the prophecy, ending up in jail, where he tells his dream to a police officer. The officer mocks the idea of foreboding dreams and tells the protagonist that he himself had a dream about a house with a courtyard and fountain in Baghdad where treasure is buried under the fountain. The man recognizes the place as his own house and, after he is released from jail, he returns home and digs up the treasure. In other words, the foreboding dream not only predicted the future, but the dream was the cause of its prediction coming true. A variant of this story later appears in English folklore as the "Pedlar of Swaffham".

Another variation of the self-fulfilling prophecy can be seen in "The Tale of Attaf", where Harun al-Rashid consults his library (the House of Wisdom), reads a random book, "falls to laughing and weeping and dismisses the faithful vizier" Ja'far ibn Yahya from sight. Ja'far, "disturbed and upset flees Baghdad and plunges into a series of adventures in Damascus, involving Attaf and the woman whom Attaf eventually marries." After returning to Baghdad, Ja'far reads the same book that caused Harun to laugh and weep, and discovers that it describes his own adventures with Attaf. In other words, it was Harun's reading of the book that provoked the adventures described in the book to take place. This is an early example of reverse causality. In the 12th century, this tale was translated into Latin by Petrus Alphonsi and included in his Disciplina Clericalis. In the 14th century, a version of this tale also appears in the Gesta Romanorum and Giovanni Boccaccio's The Decameron.

Hinduism

Krishna playing his flute with Radha

Self-fulfilling prophecies appear in classical Sanskrit literature. In the story of Krishna in the Indian epic Mahabharata, the ruler of the Mathura kingdom, Kamsa, afraid of a prophecy that predicted his death at the hands of his sister Devaki's son, had her cast into prison, where he planned to kill all of her children at birth. After the first six children were killed, and Devaki apparently miscarried the seventh, Krishna, the eighth son, was born. As his life was in danger, he was smuggled out to be raised by his foster parents Yashoda and Nanda in the village of Gokula. Years later, Kamsa learned of the child's escape and kept sending various demons to put an end to him; the demons were defeated at the hands of Krishna and his brother Balarama. As a young man, Krishna returned to Mathura and killed his uncle Kamsa. It was thus Kamsa's attempts to prevent the prophecy that caused it to come true.

Ruthenian

Oleg of Novgorod was a Varangian prince who ruled over the Rus people during the early tenth century. As old East Slavic chronicles say, it was prophesied by the pagan priests that Oleg's stallion would be the source of Oleg's death. To avoid this he sent the horse away. Many years later he asked where his horse was, and was told that it had died. He asked to see the remains and was taken to the place where the bones lay. When he touched the horse's skull with his boot a snake slithered from the skull and bit him. Oleg died, thus fulfilling the prophecy. In the Primary Chronicle, Oleg is known as the Prophet, ironically referring to the circumstances of his death. The story was romanticized by Alexander Pushkin in his celebrated ballad "The Song of the Wise Oleg". In Scandinavian traditions, this legend lived on in the saga of Orvar-Odd.

European fairy-tales

Many fairy-tales, such as The Devil With the Three Golden Hairs, The Fish and the Ring, The Story of Three Wonderful Beggars, or The King Who Would Be Stronger Than Fate, feature a prophecy that a poor boy will marry a rich girl (or, less frequently, a poor girl will marry a rich boy). This is story type 930 in the Aarne–Thompson classification scheme. The girl's father's efforts to prevent it are the reason why the boy ends up marrying her.

Another such motif occurs with older children. In The Language of the Birds, a father forces his son to tell him what the birds say: that the father will one day be the son's servant. In The Ram, a father forces his daughter to tell him her dream: that her father will hold a ewer for her to wash her hands in. In both, the father takes the child's response as evidence of malice and drives the child off; this allows the child to change so much that the father will not recognize his own offspring later, and so will offer to act as the child's servant.

In some variants of Sleeping Beauty, such as Sun, Moon, and Talia, the sleep is not brought about by a curse, but a prophecy that she will be endangered by flax (or hemp) results in the royal order to remove all the flax or hemp from the castle, resulting in her ignorance of the danger and her curiosity.

Shakespeare

Shakespeare's Macbeth is another classic example of a self-fulfilling prophecy. The three witches prophesy that Macbeth will eventually become king, but that the offspring of his best friend, rather than his own, will rule after him. Spurred by the prophecy, Macbeth kills the king and his own friend, something he arguably would not otherwise have done, leading to a revolt against him and his death. The later prophecy by the first apparition of the witches that Macbeth should "Beware Macduff" is also self-fulfilling. Had Macbeth not been told this, he might not have regarded Macduff as a threat; he would then not have killed Macduff's family, and Macduff would not have sought revenge and killed Macbeth.

Modern

New age religion

The law of attraction is a typical example of self-fulfilling prophecy. It is the name given to the belief that "like attracts like" and that by focusing on positive or negative thoughts, one can bring about positive or negative results. According to this law, all things are created first by imagination, which leads to thoughts, then to words and actions. The thoughts, words and actions held in mind affect someone's intentions which makes the expected result happen. Although there are some cases where positive or negative attitudes can produce corresponding results (principally the placebo and nocebo effects), there is no scientific basis to the law of attraction.

Sports

A 2008 study found that in basketball, head coaches gave more biased feedback while assistant coaches gave more critical feedback. The researchers attributed this to the coaches' external expectations of the athletes, which could produce a Pygmalion effect with either positive or negative results.

Researcher Helen Brown published findings of two experiments performed on athletes, investigating the effect that the media has on them, and concluded that the athlete's performance was impacted by and aligned with expectations of their performance. A follow-up experiment in London found that such expectations can impact their judgement and thought processes, and can even have a dangerous and destructive impact on some athletes.

Causal loop

A self-fulfilling prophecy may be a form of causality loop. Predestination does not necessarily involve a supernatural power and could be the result of other "infallible foreknowledge" mechanisms. Problems arising from infallibility and influencing the future are explored in Newcomb's paradox. A notable fictional example of a self-fulfilling prophecy occurs in the classical play Oedipus Rex, in which Oedipus becomes the king of Thebes while unwittingly fulfilling a prophecy that he would kill his father and marry his mother. The prophecy itself serves as the impetus for his actions, and is thus self-fulfilling. The movie 12 Monkeys deals heavily with themes of predestination and the Cassandra complex; its protagonist, who travels back in time, explains that he cannot change the past.

Unintended consequences

https://en.wikipedia.org/wiki/Unintended_consequences
Gully erosion in Australia caused by rabbits, an unintended consequence of their introduction as game animals

In the social sciences, unintended consequences (sometimes unanticipated consequences or unforeseen consequences, more colloquially called knock-on effects) are outcomes of a purposeful action that are not intended or foreseen. The term was popularized in the 20th century by American sociologist Robert K. Merton.

Unintended consequences can be grouped into three types:

  • Unexpected benefit: A positive unexpected benefit (also referred to as luck, serendipity, or a windfall).
  • Unexpected drawback: An unexpected detriment occurring in addition to the desired effect of the policy (e.g., while irrigation schemes provide people with water for agriculture, they can increase waterborne diseases that have devastating health effects, such as schistosomiasis).
  • Perverse result: A perverse effect contrary to what was originally intended (when an intended solution makes a problem worse).

History

John Locke

The idea of unintended consequences dates back at least to John Locke, who discussed the unintended consequences of interest-rate regulation in his letter to Sir John Somers, Member of Parliament.

Adam Smith

The idea was also discussed by Adam Smith and other thinkers of the Scottish Enlightenment, and is related to consequentialism (judging actions by their results).

The invisible hand theorem is an example of the unintended consequences of agents acting in their self-interest. As Andrew S. Skinner puts it:

"The individual undertaker (entrepreneur), seeking the most efficient allocation of resources, contributes to overall economic efficiency; the merchant's reaction to price signals helps to ensure that the allocation of resources accurately reflects the structure of consumer preferences; and the drive to better our condition contributes to economic growth."

Marx and Engels

Influenced by 19th-century positivism and Charles Darwin's theory of evolution, both Friedrich Engels and Karl Marx regarded uncertainty and chance in social dynamics (and thus unintended consequences beyond the results of well-defined laws) as only apparent, if not to be rejected outright, since social actions were directed and produced by deliberate human intention.

While discerning between the forces that generate changes in nature and those that generate changes in history in his discussion of Ludwig Feuerbach, Friedrich Engels touched on the idea of (apparent) unintended consequences:

In nature [...] there are only blind, unconscious agencies acting upon one another, [...] In the history of society, on the contrary, the actors are all endowed with consciousness, are men acting with deliberation or passion, working towards definite goals; nothing happens without a conscious purpose, without an intended aim. [...] For here, also, on the whole, in spite of the consciously desired aims of all individuals, accident apparently reigns on the surface. That which is willed happens but rarely; in the majority of instances the numerous desired ends cross and conflict with one another, or these ends themselves are from the outset incapable of realization, or the means of attaining them are insufficient. Thus the conflicts of innumerable individual wills and individual actions in the domain of history produce a state of affairs entirely analogous to [...] the realm of unconscious nature. The ends of the actions are intended, but the results which actually follow from these actions are not intended; or when they do seem to correspond to the end intended, they ultimately have consequences quite other than those intended. Historical events thus appear on the whole to be likewise governed by chance. But where on the surface accident holds sway, there actually it is always governed by inner, hidden laws, and it is only a matter of discovering these laws.

— Ludwig Feuerbach and the End of Classical German Philosophy (Ludwig Feuerbach und der Ausgang der klassischen deutschen Philosophie), 1886.

For Karl Marx, what can be understood as unintended consequences are actually consequences that should be expected but are brought about unconsciously. These consequences, which no one consciously sought, would be, as for Engels, the product of conflicts between the actions of countless individuals. The deviation between the originally intended goal and the product of those conflicts would be the Marxist equivalent of "unintended consequences".

These social conflicts would arise from a competitive society, and would also lead society to sabotage itself and prevent historical progress. Thus, historical progress (in Marxist terms) should eliminate these conflicts and make unintended consequences predictable.

Austrian School

Unintended consequences are a common topic of study and commentary for the Austrian school of economics, given its emphasis on methodological individualism, so much so that unintended consequences can be considered a distinctive part of Austrian tenets.

Carl Menger

In "Principles of Economics", Austrian school founder Carl Menger (1840 - 1921) noted that the relationships that occur in the economy are so intricate that a change in the condition of a single good can have ramifications beyond that good. Menger wrote:

If it is established that the existence of human needs capable of satisfaction is a prerequisite of goods-character [...] This principle is valid whether the goods can be placed in direct causal connection with the satisfaction of human needs, or derive their goods-character from a more or less indirect causal connection with the satisfaction of human needs. [...]
Thus quinine would cease to be a good if the diseases it serves to cure should disappear, since the only need with the satisfaction of which it is causally connected would no longer exist. But the disappearance of the usefulness of quinine would have the further consequence that a large part of the corresponding goods of higher order would also be deprived of their goods-character. The inhabitants of quinine-producing countries, who currently earn their livings by cutting and peeling cinchona trees, would suddenly find that not only their stocks of cinchona bark, but also, in consequence, their cinchona trees, the tools and appliances applicable only to the production of quinine, and above all the specialized labor services, by means of which they previously earned their livings, would at once lose their goods-character, since all these things would, under the changed circumstances, no longer have any causal relationship with the satisfaction of human needs.

— Principles of Economics (Grundsätze der Volkswirtschaftslehre), 1871.

Friedrich Hayek and Catallactics

Economist and philosopher Friedrich Hayek (1899 – 1992) is another key figure in the Austrian School of Economics who is notable for his comments on unintended consequences.

In "The Use of Knowledge in Society" (1945), Hayek argues that a centrally planned economy cannot reach the efficiency of a free-market economy because the information necessary and pertinent for decision-making is not concentrated but dispersed among a vast number of agents. For Hayek, the price system in the free market allows the members of a society to coordinate anonymously for the most efficient use of resources: when a raw material becomes scarce, for example, the resulting price increase coordinates the actions of countless individuals "in the right direction".

The development of this system of interactions would allow the progress of society, and individuals would carry it out without knowing all its implications, given the dispersion (or lack of concentration) of information.

The implication of this is that the social order (which derives from social progress, which in turn derives from the economy) is the result of spontaneous cooperation and is itself an unintended consequence, born from a process in which no individual or group had all the available information or could foresee every possible outcome.

In the Austrian school, this process of social adjustment, which generates a social order in an unintended way, is known as catallactics.

For Hayek and the Austrian School, the number of individuals involved in the process of creating a social order defines the type of unintended consequence:

  1. If the process involves the interactions and decision-making of as many individuals (members of a society) as possible (thus gathering the greatest amount of the knowledge dispersed among them), this process of "catallaxy" will lead to unexpected benefits (a social order and progress).
  2. On the other hand, attempts by individuals or limited groups (who lack all the necessary information) to achieve a new or better order will end in unexpected drawbacks.

Robert K. Merton

Sociologist Robert K. Merton popularised this concept in the twentieth century.

In "The Unanticipated Consequences of Purposive Social Action" (1936), Merton tried to apply a systematic analysis to the problem of unintended consequences of deliberate acts intended to cause social change. He emphasized that his term purposive action, "[was exclusively] concerned with 'conduct' as distinct from 'behavior.' That is, with action that involves motives and consequently a choice between various alternatives". Merton's usage included deviations from what Max Weber defined as rational social action: instrumentally rational and value rational. Merton also stated that "no blanket statement categorically affirming or denying the practical feasibility of all social planning is warranted."

Everyday usage

More recently, the law of unintended consequences has come to be used as an adage or idiomatic warning that an intervention in a complex system tends to create unanticipated and often undesirable outcomes.

Akin to Murphy's law, it is commonly used as a wry or humorous warning against the hubristic belief that humans can fully control the world around them, rather than to presuppose a belief in predestination or a denial of free will.

Causes

Possible causes of unintended consequences include the world's inherent complexity (parts of a system responding to changes in the environment), perverse incentives, human stupidity, self-deception, failure to account for human nature, or other cognitive or emotional biases. As a sub-component of complexity (in the scientific sense), the chaotic nature of the universe—and especially its quality of having small, apparently insignificant changes with far-reaching effects (e.g., the butterfly effect)—applies.

In 1936, Robert K. Merton listed five possible causes of unanticipated consequences:

  • Ignorance, making it impossible to anticipate everything, thereby leading to incomplete analysis.
  • Errors in analysis of the problem or following habits that worked in the past but may not apply to the current situation.
  • Immediate interests overriding long-term interests.
  • Basic values which may require or prohibit certain actions even if the long-term result might be unfavourable (these long-term consequences may eventually cause changes in basic values).
  • Self-defeating prophecy, or, the fear of some consequence which drives people to find solutions before the problem occurs, thus the non-occurrence of the problem is not anticipated.

In addition to Merton's causes, psychologist Stuart Vyse has noted that groupthink, described by Irving Janis, has been blamed for some decisions that result in unintended consequences.

Types

Unexpected benefits

The creation of "no-man's lands" during the Cold War, in places such as the border between Eastern and Western Europe, and the Korean Demilitarized Zone, has led to large natural habitats.

Sea life on the wreck of the sunken USS Oriskany

The sinking of ships in shallow waters during wartime has created many artificial coral reefs, which can be scientifically valuable and have become an attraction for recreational divers. This led to the deliberate sinking of retired ships for the purpose of replacing coral reefs lost to global warming and other factors.

In medicine, most drugs have unintended consequences ('side effects') associated with their use. However, some are beneficial. For instance, aspirin, a pain reliever, is also an anticoagulant that can help prevent heart attacks and reduce the severity and damage from thrombotic strokes. Beneficial side effects have also led to off-label use – the prescription or use of a drug for an unlicensed purpose. Famously, the drug Viagra was developed to lower blood pressure, with its use for treating erectile dysfunction being discovered as a side effect in clinical trials.

In papal conclave journalism, Cardinal Fridolin Ambongo Besungu of Kinshasa in the Democratic Republic of the Congo, the elected leader of all the bishops of Africa (including Madagascar), had by early 2024 come to be regarded as papabile for his adroit handling of the issue of blessing same-sex unions, to which he is staunchly opposed.

Unexpected drawbacks

The implementation of a profanity filter by AOL in 1996 had the unintended consequence of blocking residents of Scunthorpe, North Lincolnshire, England, from creating accounts because of a false positive. The accidental censorship of innocent language, known as the Scunthorpe problem, has been repeated and widely documented.
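The mechanism behind such false positives can be sketched in a few lines. The block list, function names, and word-boundary fix below are illustrative, not AOL's actual filter: a naive substring match blocks the town name, while whole-word matching avoids the false positive.

```python
import re

# Illustrative block list; a real filter would use a much longer list.
BLOCKLIST = ["cunt"]

def naive_filter(text: str) -> bool:
    """Block if any listed term appears anywhere as a substring (the buggy approach)."""
    lowered = text.lower()
    return any(bad in lowered for bad in BLOCKLIST)

def word_boundary_filter(text: str) -> bool:
    """Block only when a listed term appears as a whole word."""
    lowered = text.lower()
    return any(re.search(r"\b" + re.escape(bad) + r"\b", lowered) for bad in BLOCKLIST)

print(naive_filter("Scunthorpe"))          # True: false positive blocks the town name
print(word_boundary_filter("Scunthorpe"))  # False: whole-word matching lets it through
```

Whole-word matching is only a partial remedy: it misses deliberate obfuscations, which is one reason profanity filtering remains an open engineering trade-off between false positives and false negatives.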

In 1990, the Australian state of Victoria made safety helmets mandatory for all bicycle riders. While there was a reduction in the number of head injuries, there was also an unintended reduction in the number of juvenile cyclists; fewer cyclists, all else being equal, mean fewer injuries. The risk of death and serious injury per cyclist seems to have increased, possibly because of risk compensation. Research by Vulcan et al. found that the reduction in juvenile cyclists occurred because the youths considered wearing a bicycle helmet unfashionable. A health-benefit model developed at Macquarie University in Sydney suggests that, while helmet use reduces "the risk of head or brain injury by approximately two-thirds or more", the decrease in exercise caused by reduced cycling as a result of helmet laws is counterproductive in terms of net health.

Prohibition in the 1920s United States, originally enacted to suppress the alcohol trade, drove many small-time alcohol suppliers out of business and consolidated the hold of large-scale organized crime over the illegal alcohol industry. Since alcohol was still popular, criminal organisations producing alcohol were well-funded and hence also increased their other activities. Similarly, the war on drugs, intended to suppress the illegal drug trade, instead increased the power and profitability of drug cartels who became the primary source of the products.

In CIA jargon, "blowback" describes the unintended, undesirable consequences of covert operations, such as the funding of the Afghan Mujahideen and the destabilization of Afghanistan contributing to the rise of the Taliban and Al-Qaeda.

The introduction of exotic animals and plants for food, for decorative purposes, or to control unwanted species often leads to the introduced species doing more harm than good.

  • The introduction of rabbits in Australia and New Zealand for food was followed by an explosive growth in the rabbit population; rabbits have become a major feral pest in these countries.
  • Cane toads, introduced into Australia to control canefield pests, were unsuccessful and have become a major pest in their own right.
  • Kudzu, introduced to the US as an ornamental plant in 1876 and later used to prevent erosion in earthworks, has become a major problem in the Southeastern United States. Kudzu has displaced native plants and has effectively taken over significant portions of land.

The protection of the steel industry in the United States reduced production of steel in the United States, increased costs to users, and increased unemployment in associated industries.

Perverse results

The infamous photo of the Streisand Estate

In 2003, Barbra Streisand unsuccessfully sued Kenneth Adelman and Pictopia.com for posting a photograph of her home online. Before the lawsuit was filed, only six people had downloaded the file, two of them Streisand's attorneys. The lawsuit drew attention to the image, and 420,000 people subsequently visited the site. The Streisand effect was named after this incident; it describes how an attempt to censor or remove a piece of information instead draws attention to the material being suppressed, resulting in the material becoming widely known, reported on, and distributed.

Passenger-side airbags in motorcars were intended as a safety feature, but led to an increase in child fatalities in the mid-1990s because small children were being hit by airbags that deployed automatically during collisions. The supposed solution to this problem, moving the child seat to the back of the vehicle, led to an increase in the number of children forgotten in unattended vehicles, some of whom died under extreme temperature conditions.

Risk compensation, or the Peltzman effect, occurs after implementation of safety measures intended to reduce injury or death (e.g. bike helmets, seatbelts, etc.). People may feel safer than they really are and take additional risks which they would not have taken without the safety measures in place. This may result in no change, or even an increase, in morbidity or mortality, rather than a decrease as intended.

According to an anecdote, the British government, concerned about the number of venomous cobra snakes in Delhi, offered a bounty for every dead cobra. This was a successful strategy as large numbers of snakes were killed for the reward. Eventually, enterprising people began breeding cobras for the income. When the government became aware of this, they scrapped the reward program, causing the cobra breeders to set the now-worthless snakes free. As a result, the wild cobra population further increased. The apparent solution for the problem made the situation even worse, becoming known as the Cobra effect.

Theobald Mathew's temperance campaign in 19th-century Ireland resulted in thousands of people vowing never to drink alcohol again. This led to the consumption of diethyl ether, a much more dangerous intoxicant—owing to its high flammability—by those seeking to become intoxicated without breaking the letter of their pledge.

It was thought that adding south-facing conservatories to British houses would reduce energy consumption by providing extra insulation and warmth from the sun. However, people tended to use the conservatories as living areas, installing heating and ultimately increasing overall energy consumption.

A reward for lost nets found along the Normandy coast was offered by the French government between 1980 and 1981. This resulted in people vandalizing nets to collect the reward.

Beginning in the 1940s and continuing into the 1960s, the Canadian federal government gave Quebec $2.75 per day per psychiatric patient for their cost of care, but only $1.25 a day per orphan. The perverse result is that the orphan children were diagnosed as mentally ill so Quebec could receive the larger amount of money. This psychiatric misdiagnosis affected up to 20,000 people, and the children are known as the Duplessis Orphans in reference to the Premier of Quebec who oversaw the scheme, Maurice Duplessis.

There have been attempts to curb the consumption of sugary beverages by imposing a tax on them. However, a study found that the reduced consumption was only temporary, and that household consumption of beer increased.

The New Jersey Childproof Handgun Law, which was intended to protect children from accidental discharge of firearms by forcing all future firearms sold in New Jersey to contain "smart" safety features, has delayed, if not stopped entirely, the introduction of such firearms to New Jersey markets. The wording of the law caused significant public backlash, fuelled by gun rights lobbyists, and several shop owners offering such guns received death threats and stopped stocking them. In 2014, 12 years after the law was passed, it was suggested the law be repealed if gun rights lobbyists agree not to resist the introduction of "smart" firearms.

Drug prohibition can lead drug traffickers to prefer stronger, more dangerous substances that can be smuggled and distributed more easily than less concentrated ones.

Televised drug prevention advertisements may lead to increased drug use.

The increasing use of search engines, including recent image-search features, has made media easier to find and consume. This may have shifted preferences for pornographic film actors, as producers began using common search queries or tags to label the actors in new roles.

The passage of the Stop Enabling Sex Traffickers Act has led to a reported increase in risky behaviors by sex workers as a result of quashing their ability to seek and screen clients online, forcing them back onto the streets or into the dark web. The ads posted were previously an avenue for advocates to reach out to those wanting to escape the trade.

The use of precision-guided munitions meant to reduce the rate of civilian casualties encouraged armies to narrow their safety margins and increase the use of deadly force in densely populated areas. This in turn increased the danger to uninvolved civilians, who in the past would have been out of the line of fire because of armies' aversion to using higher-risk weaponry in densely populated areas. The perceived ability to operate precision weaponry from afar (where in the past heavy munitions or troop deployment would have been needed) also led to the expansion of the list of potential targets. As put by Michael Walzer: "Drones not only make it possible for us to get at our enemies, they may also lead us to broaden the list of enemies, to include presumptively hostile individuals and militant organizations simply because we can get at them—even if they aren't actually involved in attacks against us." This idea is also echoed by Grégoire Chamayou: "In a situation of moral hazard, military action is very likely to be deemed 'necessary' simply because it is possible, and possible at a lower cost."

After Dobbs v. Jackson Women's Health Organization (2022) overturned Roe v. Wade (1973), the number of abortions in the United States increased and the number of births fell, due to the right to travel between states.

Other

According to Lynn White, the invention of the horse stirrup enabled new patterns of warfare that eventually led to the development of feudalism (see Stirrup Thesis).

Perverse consequences of environmental intervention

Almost all environmental problems, from chemical pollution to global warming, are the unexpected consequences of the application of modern technologies. Traffic congestion, deaths and injuries from car accidents, air pollution, and global warming are unintended consequences of the invention and large-scale adoption of the automobile. Antibiotic-resistant hospital infections are an unexpected side effect of antibiotic use, and even human population growth leading to environmental degradation is a side effect of various technological (i.e., agricultural and industrial) revolutions.

Because of the complexity of ecosystems, deliberate changes to an ecosystem or other environmental interventions will often have (usually negative) unintended consequences. Sometimes, these effects cause permanent irreversible changes. Examples include:

Chinese poster promoting the Four Pests campaign; a boy with a red neckerchief aims a slingshot at an off-frame overhead target, and a girl next to him looks at the target as well. There is a village in the background. There is a Chinese slogan "大家都来打麻雀" in red letters at the footer.
Chinese poster encouraging children to attack sparrows
  • During the Four Pests campaign, Maoist China ordered the killing of sparrows, as well as rats, flies, and mosquitoes. The campaign was successful in reducing the sparrow population; however, in their absence, locust populations previously kept in check by sparrow predation grew out of control and came to infest crops. Rice yields were substantially decreased; the campaign was one of the causes of the Great Chinese Famine.
  • During the Great Plague of London, a killing of dogs and cats was ordered. Had they been left alone, they would have significantly reduced the population of rats carrying the fleas that transmitted the disease.
  • The installation of tall smokestacks to decrease pollution in local areas resulted in the spread of pollution at a higher altitude, and in acid rain on an international scale.
  • After about 1900, public demand led the US government to fight forest fires in the American West, and set aside land as national forests and parks to protect them from fires. This policy led to fewer fires, but also led to growth conditions such that, when fires did occur, they were much larger and more damaging. Modern research suggests that this policy was misguided, and that a certain level of wildfires is a natural and important part of forest ecology.
  • Side effects of climate engineering intended to counter global warming could include even further warming: afforestation can reduce surface reflectivity, while solar dimming measures can reduce crop yields and, through rebound effects, lead to even more accelerated warming.
