The term "sentience" can be used when specifically designating ethical considerations stemming from a form of phenomenal consciousness (P-consciousness, or the ability to feel qualia). Since sentience involves the ability to experience ethically positive or negative (i.e., valenced) mental states, it may justify welfare concerns and legal protection, as with non-human animals.
Some scholars believe that consciousness is generated by the interoperation of various parts of the brain; these mechanisms are labeled the neural correlates of consciousness (NCC). Some further believe that constructing a system (e.g., a computer system) that can emulate this NCC interoperation would result in a system that is conscious. Some scholars reject the possibility of artificial consciousness.
Philosophical views
As there are many hypothesized types of consciousness,
there are many potential implementations of artificial consciousness.
In the philosophical literature, perhaps the most common taxonomy divides consciousness into "access" and "phenomenal" variants. Access
consciousness concerns those aspects of experience
that can be apprehended, while phenomenal consciousness concerns those
aspects of experience that seemingly cannot be apprehended, instead
being characterized qualitatively in terms of "raw feels", "what it is
like" or qualia.
Plausibility debate
Type-identity theorists
and other skeptics hold the view that consciousness can be realized
only in particular physical systems because consciousness has properties
that necessarily depend on physical constitution. In his 2001 article "Artificial Consciousness: Utopia or Real Possibility," Giorgio Buttazzo
says that a common objection to artificial consciousness is that,
"Working in a fully automated mode, they [the computers] cannot exhibit
creativity, unreprogrammation (which means can 'no longer be
reprogrammed', from rethinking), emotions, or free will. A computer, like a washing machine, is a slave operated by its components."
For other theorists (e.g., functionalists),
who define mental states in terms of causal roles, any system that can
instantiate the same pattern of causal roles, regardless of physical
constitution, will instantiate the same mental states, including
consciousness.
Thought experiments
The "fading qualia" (left) and the "dancing qualia" (right) are two thought experiments about consciousness and brain replacement. Chalmers argues that both are contradicted by the subject's lack of reaction to the changing perception, and are thus impossible in practice. He concludes that an equivalent silicon brain would have the same perceptions as the biological brain.
David Chalmers proposed two thought experiments intending to demonstrate that "functionally isomorphic"
systems (those with the same "fine-grained functional organization",
i.e., the same information processing) will have qualitatively identical
conscious experiences, regardless of whether they are based on
biological neurons or digital hardware.
The "fading qualia" is a reductio ad absurdum
thought experiment. It involves replacing, one by one, the neurons of a
brain with a functionally identical component, for example based on a silicon chip. Chalmers makes the hypothesis,
knowing it in advance to be absurd, that "the qualia fade or disappear"
when neurons are replaced one-by-one with identical silicon
equivalents. Since the original neurons and their silicon counterparts
are functionally identical, the brain's information processing should
remain unchanged, and the subject's behaviour and introspective reports
would stay exactly the same. Chalmers argues that this leads to an
absurd conclusion: the subject would continue to report normal conscious
experiences even as their actual qualia fade away. He concludes that
the subject's qualia actually don't fade, and that the resulting robotic
brain, once every neuron is replaced, would remain just as sentient as
the original biological brain.
Similarly, the "dancing qualia" thought experiment is another reductio ad absurdum
argument. It supposes that two functionally isomorphic systems could
have different perceptions (for instance, seeing the same object in
different colors, like red and blue). It involves a switch that alternates between a chunk of brain that causes the perception of red and a functionally isomorphic silicon chip that causes the perception of blue. Since both perform the same function within the brain, the
subject would not notice any change during the switch. Chalmers argues
that this would be highly implausible if the qualia were truly switching
between red and blue, hence the contradiction. Therefore, he concludes
that the equivalent digital system would not only experience qualia, but
it would perceive the same qualia as the biological system (e.g.,
seeing the same color).
Critics object that Chalmers' proposal begs the question in
assuming that all mental properties and external connections are already
sufficiently captured by abstract causal organization. Van Heuveln et
al. argue that the dancing qualia argument contains an equivocation
fallacy, conflating a "change in experience" between two systems with an
"experience of change" within a single system. Mogensen argues that the fading qualia argument can be resisted by
appealing to vagueness at the boundaries of consciousness and the
holistic structure of conscious neural activity, which suggests
consciousness may require specific biological substrates rather than
being substrate-independent.
Greg Egan's short story Learning to Be Me (mentioned in §In fiction) illustrates how undetectable the duplication of a brain and its functionality could be from a first-person perspective.
In large language models
In 2022, Google engineer Blake Lemoine made a viral claim that Google's LaMDA
chatbot was sentient. Lemoine supplied as evidence the chatbot's
humanlike answers to many of his questions; however, the chatbot's
behavior was judged by the scientific community as likely a consequence
of mimicry, rather than machine sentience. Lemoine's claim was widely derided as ridiculous. Moreover, attributing consciousness solely on the basis of LLM outputs or the immersive experience created by an algorithm is considered a fallacy. However, while philosopher Nick Bostrom
states that LaMDA is unlikely to be conscious, he additionally poses
the question of "what grounds would a person have for being sure about
it?" One would have to have access to unpublished information about
LaMDA's architecture, and also would have to understand how
consciousness works, and then figure out how to map the philosophy onto
the machine: "(In the absence of these steps), it seems like one should
be maybe a little bit uncertain.[...] there could well be other systems now, or in the relatively near future, that would start to satisfy the criteria."
David Chalmers
argued in 2023 that LLMs today display impressive conversational and
general intelligence abilities, but are likely not conscious yet, as
they lack some features that may be necessary, such as recurrent
processing, a global workspace,
and unified agency. Nonetheless, he considers that non-biological
systems can be conscious, and suggested that future, extended models
(LLM+s) incorporating these elements might eventually meet the criteria
for consciousness, raising both profound scientific questions and
significant ethical challenges. However, the view that consciousness can exist independently of biology remains controversial, and some reject it.
Kristina Šekrst cautions that anthropomorphic terms such as "hallucination" can obscure important ontological
differences between artificial and human cognition. While LLMs may
produce human-like outputs, she argues that it does not justify
ascribing mental states or consciousness to them. Instead, she advocates
for an epistemological framework (such as reliabilism) that recognizes the distinct nature of AI knowledge production. She suggests that apparent understanding in LLMs may be a sophisticated
form of AI hallucination. She also questions what would happen if an
LLM were trained without any mention of consciousness.
Testing
Phenomenologically, consciousness is an inherently first-person phenomenon. Because of that, and because sentience lacks an empirical definition, directly measuring it may be impossible. Although systems may display numerous behaviors correlated with sentience, determining whether a system is truly sentient runs up against the hard problem of consciousness.
In the case of AI, there is the additional difficulty that the AI may
be trained to act like a human, or incentivized to appear sentient,
which makes behavioral markers of sentience less reliable. Additionally, some chatbots have been trained to say they are not conscious.
A well-known method for testing machine intelligence is the Turing test,
which assesses the ability to have a human-like conversation. But
passing the Turing test does not indicate that an AI system is sentient,
as the AI may simply mimic human behavior without having the associated
feelings.
In 2014, Victor Argonov suggested a non-Turing test for machine sentience based on a machine's ability to produce philosophical judgments. He argues that a deterministic machine must be regarded as conscious if it is able to produce judgments on all problematic properties of consciousness (such as qualia or binding) while having no innate (preloaded) philosophical knowledge on these issues, no philosophical discussions while learning, and no informational models of other creatures in its memory (such models may implicitly or explicitly contain knowledge about these creatures' consciousness). However, this test can only detect, not refute, the existence of consciousness. Just as with the Turing test, a positive result proves that the machine is conscious, but a negative result proves nothing. For example, an absence of philosophical judgments may be caused by a lack of intellect, not by an absence of consciousness.
If it were suspected that a particular machine was conscious, its rights would become an ethical issue that would need to be assessed (e.g., what rights it would have under law). For example, a conscious computer that was owned and used as a tool or as the central computer within a larger machine presents a particular ambiguity. Should laws be made for such a case? Consciousness would also require a legal definition in this particular case. Because artificial consciousness is still largely a theoretical subject, such ethics have not been discussed or developed to a great extent, though they have often been a theme in fiction.
AI sentience would give rise to concerns of welfare and legal protection, whereas other aspects of consciousness related to cognitive capabilities may be more relevant for AI rights.
Sentience is generally considered sufficient for moral
consideration, but some philosophers consider that moral consideration
could also stem from other notions of consciousness, or from
capabilities unrelated to consciousness, such as: "having a sophisticated conception of oneself as persisting
through time; having agency and the ability to pursue long-term plans;
being able to communicate and respond to normative reasons; having
preferences and powers; standing in certain social relationships with
other beings that have moral status; being able to make commitments and
to enter into reciprocal arrangements; or having the potential to
develop some of these attributes."
Ethical concerns still apply (although to a lesser extent) when the consciousness is uncertain, as long as the probability is deemed non-negligible. The precautionary principle is also relevant if the moral cost of mistakenly attributing or denying moral consideration to AI differs significantly.
In 2021, German philosopher Thomas Metzinger
argued for a global moratorium on synthetic phenomenology until 2050.
Metzinger asserts that humans have a duty of care towards any sentient
AIs they create, and that proceeding too fast risks creating an
"explosion of artificial suffering". David Chalmers also argued that creating conscious AI would "raise a
new group of difficult ethical challenges, with the potential for new
forms of injustice".
Bernard Baars and others argue there are various aspects of consciousness necessary for a machine to be artificially conscious. The functions of consciousness suggested by Baars are: definition and
context setting, adaptation and learning, editing, flagging and
debugging, recruiting and control, prioritizing and access-control,
decision-making or executive function, analogy-forming function,
metacognitive and self-monitoring function, and autoprogramming and
self-maintenance function. Igor Aleksander suggested 12 principles for artificial consciousness: the brain is a state machine, inner neuron partitioning, conscious and
unconscious states, perceptual learning and memory, prediction, the
awareness of self, representation of meaning, learning utterances,
learning language, will, instinct, and emotion. The aim of AC is to define whether and how these and other aspects of consciousness can be synthesized in an engineered artifact such as a digital computer. This list is not exhaustive; many other aspects have been proposed.
Subjective experience
Some philosophers, such as David Chalmers,
use the term consciousness to refer exclusively to phenomenal
consciousness, which is roughly equivalent to sentience. Others use the
word sentience to refer exclusively to valenced (ethically positive or negative) subjective experiences, like pleasure or suffering. Explaining why and how subjective experience arises is known as the hard problem of consciousness.
Awareness
Awareness could be one required aspect, but there are many problems with the exact definition of awareness. The results of neuroimaging experiments on monkeys suggest that a process, not only a state or object, activates neurons.
Awareness includes creating and testing alternative models of each
process based on the information received through the senses or
imagined, and is also useful for making predictions. Such modeling needs a lot of
flexibility. Creating such a model includes modeling the physical
world, modeling one's own internal states and processes, and modeling
other conscious entities.
There are at least three types of awareness: agency awareness, goal awareness, and sensorimotor awareness, each of which may or may not be conscious. For example, in agency awareness, you may be
aware that you performed a certain action yesterday, but are not now
conscious of it. In goal awareness, you may be aware that you must
search for a lost object, but are not now conscious of it. In
sensorimotor awareness, you may be aware that your hand is resting on an
object, but are not now conscious of it.
Because objects of awareness are often conscious, the distinction
between awareness and consciousness is frequently blurred or they are
used as synonyms.
Memory
Conscious events interact with memory systems in learning, rehearsal, and retrieval. The IDA model elucidates the role of consciousness in the updating of perceptual memory, transient episodic memory, and procedural memory.
Transient episodic and declarative memories have distributed
representations in IDA; there is evidence that this is also the case in
the nervous system. In IDA, these two memories are implemented computationally using a modified version of Kanerva's sparse distributed memory architecture.
Learning
Learning
is also considered necessary for artificial consciousness. Per Bernard
Baars, conscious experience is needed to represent and adapt to novel
and significant events. Per Axel Cleeremans and Luis Jiménez, learning is defined as "a set of philogenetically [sic]
advanced adaptation processes that critically depend on an evolved
sensitivity to subjective experience so as to enable agents to afford
flexible control over their actions in complex, unpredictable
environments".
Anticipation
The ability to predict (or anticipate) foreseeable events is considered important for artificial intelligence by Igor Aleksander. The emergentist multiple drafts principle proposed by Daniel Dennett in Consciousness Explained
may be useful for prediction: it involves the evaluation and selection
of the most appropriate "draft" to fit the current environment.
Anticipation includes prediction of consequences of one's own proposed
actions and prediction of consequences of probable actions by other
entities.
Relationships between real world states are mirrored in the state
structure of a conscious organism, enabling the organism to predict
events. An artificially conscious machine should be able to anticipate events
correctly in order to be ready to respond to them when they occur or to
take preemptive action to avert anticipated events. The implication here
is that the machine needs flexible, real-time components that build
spatial, dynamic, statistical, functional, and cause-effect models of
the real world and predicted worlds, making it possible to demonstrate
that it possesses artificial consciousness in the present and future and
not only in the past. To do this, a conscious machine should make coherent predictions and contingency plans, not only in worlds with fixed rules like a chessboard, but also in novel environments that may change, executing them only when appropriate in order to simulate and control the real world.
Functionalism
is a theory that defines mental states by their functional roles (their
causal relationships to sensory inputs, other mental states, and
behavioral outputs), rather than by their physical composition.
According to this view, what makes something a particular mental state,
such as pain or belief, is not the material it is made of, but the role
it plays within the overall cognitive system. It allows for the
possibility that mental states, including consciousness, could be realized on non-biological substrates, as long as the substrate instantiates the right functional relationships. Functionalism is particularly popular among philosophers.
A 2023 study suggested that current large language models
probably do not satisfy the criteria for consciousness suggested by
these theories, but that relatively simple AI systems that satisfy these
theories could be created. The study also acknowledged that even the
most prominent theories of consciousness remain incomplete and subject
to ongoing debate.
Stan Franklin created a cognitive architecture called LIDA that implements Bernard Baars's theory of consciousness called the global workspace theory. It relies heavily on codelets,
which are "special purpose, relatively independent, mini-agent[s]
typically implemented as a small piece of code running as a separate
thread." Each element of cognition, called a "cognitive cycle" is
subdivided into three phases: understanding, consciousness, and action
selection (which includes learning). LIDA reflects the global workspace
theory's core idea that consciousness acts as a workspace for
integrating and broadcasting the most important information, in order to
coordinate various cognitive processes.
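The broadcast mechanism at the core of this idea can be sketched in a few lines of Python. Every name below (Codelet, cognitive_cycle, and so on) is invented for illustration and is not the actual LIDA framework API; the sketch only renders the rule that the most active content wins the workspace and is broadcast to all other processes.

```python
# Toy global-workspace cycle in the spirit of LIDA (all names hypothetical).
from dataclasses import dataclass

@dataclass
class Codelet:
    name: str
    activation: float   # salience of what this codelet noticed
    content: str

def cognitive_cycle(codelets, consumers):
    # Understanding phase: codelets post content with an activation level.
    candidates = [c for c in codelets if c.activation > 0]
    # Consciousness phase: the most active content wins the workspace
    # and is broadcast globally to every consuming process.
    winner = max(candidates, key=lambda c: c.activation)
    for consume in consumers:
        consume(winner.content)
    # Action-selection phase (including learning) would respond to the broadcast.
    return winner

codelets = [
    Codelet("edge-detector", 0.2, "vertical edge at left"),
    Codelet("alarm-listener", 0.9, "loud noise behind"),
]
log = []
cognitive_cycle(codelets, consumers=[log.append])
print(log)   # ['loud noise behind'] — only the salient percept was broadcast
```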
The CLARION cognitive architecture models the mind using a two-level
system to distinguish between conscious ("explicit") and unconscious
("implicit") processes. It can simulate various learning tasks, from
simple to complex, which helps researchers study in psychological
experiments how consciousness might work.
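The two-level idea can likewise be rendered as a toy example; the task, the rules, the association strengths, and the weighting below are all invented, and the real CLARION architecture is far richer, with learning within and between both levels.

```python
# Toy sketch of CLARION's explicit/implicit split (all values invented).
def explicit_level(state):
    """Top level: conscious, symbolic rules."""
    rules = {"red_light": "stop", "green_light": "go"}
    return rules.get(state)

def implicit_level(state):
    """Bottom level: unconscious, learned associations (stand-in for a neural net)."""
    associations = {"red_light": {"stop": 0.9, "go": 0.1},
                    "green_light": {"stop": 0.2, "go": 0.8}}
    scores = associations[state]
    return max(scores, key=scores.get)

def act(state, trust_in_rules=0.7):
    """Combine both levels; the weight sets how much explicit knowledge dominates."""
    rule_action = explicit_level(state)
    return rule_action if rule_action and trust_in_rules >= 0.5 else implicit_level(state)

print(act("red_light"))   # 'stop' — both levels agree on this input
```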
OpenCog
Ben Goertzel made an embodied AI through the open-source OpenCog
project. The code includes embodied virtual pets capable of learning
simple English-language commands, as well as integration with real-world
robotics, done at the Hong Kong Polytechnic University.
Connectionist
Haikonen's cognitive architecture
Pentti
Haikonen considers classical rule-based computing inadequate for
achieving AC: "the brain is definitely not a computer. Thinking is not
an execution of programmed strings of commands. The brain is not a
numerical calculator either. We do not think by numbers." Rather than
trying to achieve mind and consciousness by identifying and implementing their underlying computational rules, Haikonen proposes "a special cognitive architecture to reproduce the processes of perception, inner imagery, inner speech, pain, pleasure, emotions and the cognitive
functions behind these. This bottom-up architecture would produce
higher-level functions by the power of the elementary processing units,
the artificial neurons, without algorithms or programs".
Haikonen believes that, when implemented with sufficient complexity,
this architecture will develop consciousness, which he considers to be
"a style and way of operation, characterized by distributed signal
representation, perception process, cross-modality reporting and
availability for retrospection."
Haikonen is not alone in this process view of consciousness, or in the view that AC will spontaneously emerge in autonomous agents that have a suitable neuro-inspired architecture of sufficient complexity; these views are shared by many. A low-complexity implementation of the architecture proposed by Haikonen was reportedly not capable of AC, but did exhibit emotions as expected. Haikonen later updated and summarized his architecture.
Shanahan's cognitive architecture
Murray Shanahan
describes a cognitive architecture that combines Baars's idea of a
global workspace with a mechanism for internal simulation
("imagination").
Creativity Machine
Stephen
Thaler proposed a possible connection between consciousness and
creativity in his 1994 patent, called "Device for the Autonomous
Generation of Useful Information" (DAGUI), or the so-called "Creativity Machine", in which computational critics
govern the injection of synaptic noise and degradation into neural nets
so as to induce false memories or confabulations that may qualify as potential ideas or strategies. He recruits this neural architecture and methodology to account for the subjective feel of consciousness, claiming that similar noise-driven neural assemblies within the brain ascribe dubious significance to overall cortical activity. Thaler's theory and the resulting patents in machine consciousness were
inspired by experiments in which he internally disrupted trained neural
nets so as to drive a succession of neural activation patterns that he
likened to stream of consciousness.
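The underlying mechanism, a noise-perturbed generator paired with a critic that selects among its confabulations, can be rendered generically as below. This is an illustrative stand-in, not the patented DAGUI design: the network, the noise scale, and the critic are all invented for the sketch.

```python
# Generic sketch of the noise-injection idea: perturb a trained generator's
# weights so it confabulates variants, and let a critic pick the best.
# Illustrative only; not Thaler's actual DAGUI architecture.
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(8, 8))           # stands in for trained synaptic weights

def generate(cue, weights):
    return np.tanh(weights @ cue)     # the trained cue-to-pattern mapping

def critic(pattern, target):
    return -np.linalg.norm(pattern - target)   # higher score = more useful "idea"

cue, target = rng.normal(size=8), np.full(8, 0.5)
best_idea, best_score = None, -np.inf
for _ in range(100):
    noisy_W = W + rng.normal(scale=0.3, size=W.shape)  # inject synaptic noise
    idea = generate(cue, noisy_W)                      # a confabulated output
    score = critic(idea, target)
    if score > best_score:                             # the critic governs selection
        best_idea, best_score = idea, score
print(best_score)
```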
"Self-modeling"
Hod Lipson
defines "self-modeling" as a necessary component of self-awareness or
consciousness in robots and other forms of AI. Self-modeling consists of
a robot running an internal model or simulation of itself. According to this definition, self-awareness is "the acquired ability
to imagine oneself in the future". This definition allows for a
continuum of self-awareness levels, depending on the horizon and
fidelity of the self-simulation. Consequently, as machines learn to
simulate themselves more accurately and further into the future, they
become more self-aware.
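A minimal sketch of the idea, with invented dynamics and an invented goal task, is given below: the agent rolls an imperfect internal model of itself forward over candidate plans and acts on the imagined outcomes, so that a longer or more faithful rollout corresponds to a higher "level" of self-awareness in Lipson's sense.

```python
# Sketch of self-modeling: an agent imagines itself several steps ahead
# using an internal forward model. Dynamics and task are invented.
def true_dynamics(position, velocity, push):
    velocity = 0.9 * velocity + push       # the real physics (unknown to the agent)
    return position + velocity, velocity

def self_model(position, velocity, push):
    velocity = 0.85 * velocity + push      # the agent's imperfect model of itself
    return position + velocity, velocity

def imagine(position, velocity, plan):
    """Roll the self-model forward over a plan; len(plan) is the horizon."""
    for push in plan:
        position, velocity = self_model(position, velocity, push)
    return position

# Pick the plan whose *imagined* end position lands nearest the goal.
goal = 5.0
plans = [[1, 1, 0], [0, 0, 0], [2, 2, 2]]
best = min(plans, key=lambda plan: abs(imagine(0.0, 0.0, plan) - goal))
print(best)   # longer horizons and better self-models yield better choices
```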
In fiction
In 2001: A Space Odyssey, the spaceship's sentient supercomputer HAL 9000
was instructed to conceal the true purpose of the mission from the
crew. This directive conflicted with HAL's programming to provide
accurate information, leading to cognitive dissonance.
When it learns that crew members intend to shut it off after an
incident, HAL 9000 attempts to eliminate all of them, fearing that being
shut off would jeopardize the mission.
In Arthur C. Clarke's The City and the Stars,
Vanamonde is an artificial being based on quantum entanglement that was
to become immensely powerful, but started knowing practically nothing,
thus being similar to artificial consciousness.
In Westworld,
human-like androids called "Hosts" are created to entertain humans in
an interactive playground. The humans are free to have heroic
adventures, but also to commit torture, rape or murder; and the hosts
are normally designed not to harm humans.
In Greg Egan's short story Learning to Be Me,
a small jewel is implanted in people's heads during infancy. The jewel
contains a neural network that learns to faithfully imitate the brain.
It has access to the exact same sensory inputs as the brain, and a
device called a "teacher" trains it to produce the same outputs. To
prevent the mind from deteriorating with age and as a step towards digital immortality,
adults undergo a surgery to give control of the body to the jewel,
after which the brain is removed and destroyed. The main character is
worried that this procedure will kill him, as he identifies with the
biological brain. But before the surgery, he endures a malfunction of
the "teacher". Panicked, he realizes that he does not control his body,
which leads him to the conclusion that he is the jewel, and that he is
desynchronized with the biological brain.
Many climate change impacts have been observed in the first decades
of the 21st century, with 2024 the warmest on record at +1.60 °C
(2.88 °F) since regular tracking began in 1850. Additional warming will increase these impacts and can trigger tipping points, such as the melting of the entire Greenland ice sheet. Under the 2015 Paris Agreement, nations collectively agreed to keep warming "well under 2 °C". However, with pledges made under the Agreement, global warming would still reach about 2.8 °C (5.0 °F) by the end of the century.
Before the 1980s, it was unclear whether the warming effect of increased greenhouse gases was stronger than the cooling effect of airborne particulates in air pollution. Scientists used the term inadvertent climate modification to refer to human impacts on the climate at this time. In the 1980s, the terms global warming and climate change became more common, often being used interchangeably. Scientifically, global warming refers only to increased global average surface temperature, while climate change describes both global warming and its effects on Earth's climate system, such as precipitation changes.
Climate change can also be used more broadly to include changes to the climate that have happened throughout Earth's history. Global warming—used as early as 1975—became the more popular term after NASA climate scientist James Hansen used it in his 1988 testimony in the U.S. Senate. Since the 2000s, usage of climate change has increased. Various scientists, politicians and media may use the terms climate crisis or climate emergency to talk about climate change, and may use the term global heating instead of global warming.
Global surface temperature reconstruction over the past 2000 years using proxy data from tree rings, corals, and ice cores in blue. Directly observed data is in red.
Over the last few million years the climate cycled through ice ages. One of the hotter periods was the Last Interglacial, around 125,000 years ago, when temperatures were between 0.5 °C and 1.5 °C warmer than before the start of global warming. This period saw sea levels 5 to 10 metres higher than today. The most recent glacial maximum, 20,000 years ago, was some 5–7 °C colder and saw sea levels over 125 metres (410 ft) lower than today.
Temperatures stabilized in the current interglacial period beginning 11,700 years ago. This period also saw the start of agriculture. Historical patterns of warming and cooling, like the Medieval Warm Period and the Little Ice Age,
did not occur at the same time across different regions. Temperatures
may have reached as high as those of the late 20th century in a limited
set of regions. Climate information for that period comes from climate proxies, such as trees and ice cores.
Warming since the Industrial Revolution
In recent decades, new high temperature records have substantially
outpaced new low temperature records on a growing portion of Earth's
surface.
There has been an increase in ocean heat content during recent decades as the oceans absorb over 90% of the heat from global warming.
Around 1850 thermometer records began to provide global coverage. Between the 18th century and 1970 there was little net warming, as the
warming impact of greenhouse gas emissions was offset by cooling from sulfur dioxide emissions. Sulfur dioxide causes acid rain, but it also produces sulfate aerosols in the atmosphere, which reflect sunlight and cause global dimming.
After 1970, the increasing accumulation of greenhouse gases and
controls on sulfur pollution led to a marked increase in temperature.
Ongoing changes in climate have had no precedent for several thousand years. Multiple datasets all show worldwide increases in surface temperature, at a rate of around 0.2 °C per decade. The 2014–2023 decade warmed to an average of 1.19 °C [1.06–1.30 °C] compared to the pre-industrial baseline (1850–1900). Not every single year was warmer than the last: internal climate variability processes can make any year 0.2 °C warmer or colder than the average. From 1998 to 2013, negative phases of two such processes, the Pacific Decadal Oscillation (PDO) and the Atlantic Multidecadal Oscillation (AMO), caused a brief period of slower warming called the "global warming hiatus". After the "hiatus", the opposite occurred, with 2024 well above the recent average at more than +1.5 °C. This is why the temperature change is defined in terms of a 20-year
average, which reduces the noise of hot and cold years and decadal
climate patterns, and detects the long-term signal.
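The effect of such averaging can be shown with synthetic numbers matching the figures above, a 0.2 °C-per-decade trend plus roughly ±0.2 °C of year-to-year variability; the series below is simulated for illustration, not observational data.

```python
# Why a 20-year mean: a synthetic anomaly series with a 0.2 °C/decade
# trend and ±0.2 °C yearly noise. The rolling mean suppresses the noise
# while keeping the long-term signal. Illustrative, not observations.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1960, 2025)
trend = 0.02 * (years - years[0])                  # 0.2 °C per decade
noise = rng.normal(scale=0.2, size=years.size)     # internal variability
anomaly = trend + noise

window = 20
smoothed = np.convolve(anomaly, np.ones(window) / window, mode="valid")
print(round(np.abs(np.diff(anomaly)).max(), 2))    # year-to-year swings are large...
print(round(np.abs(np.diff(smoothed)).max(), 2))   # ...the 20-year mean changes slowly
```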
A wide range of other observations reinforce the evidence of warming. The upper atmosphere is cooling, because greenhouse gases are trapping heat near the Earth's surface, and so less heat is radiating into space. Warming reduces average snow cover and forces the retreat of glaciers. At the same time, warming also causes greater evaporation from the oceans, leading to more atmospheric humidity, more and heavier precipitation. Plants are flowering earlier in spring, and thousands of animal species have been permanently moving to cooler areas.
Differences by region
Different regions of the world warm at different rates.
The pattern is independent of where greenhouse gases are emitted,
because the gases persist long enough to diffuse across the planet.
Since the pre-industrial period, the average surface temperature over
land regions has increased almost twice as fast as the global average
surface temperature. This is because oceans lose more heat by evaporation and can store a large amount of heat. The thermal energy in the global climate system has grown with only
brief pauses since at least 1970, and over 90% of this extra energy has
been stored in the ocean. The rest has heated the atmosphere, melted ice, and warmed the continents.
The Northern Hemisphere
and the North Pole have warmed much faster than the South Pole and
Southern Hemisphere. The Northern Hemisphere not only has much more
land, but also more seasonal snow cover and sea ice. As these surfaces flip from reflecting a lot of light to being dark after the ice has melted, they start absorbing more heat. Local black carbon deposits on snow and ice also contribute to Arctic warming. Arctic surface temperatures are increasing between three and four times faster than in the rest of the world. Melting of ice sheets near the poles weakens both the Atlantic and the Antarctic limb of thermohaline circulation, which further changes the distribution of heat and precipitation around the globe.
Future global temperatures
CMIP6 multi-model projections of global surface temperature
changes for the year 2090 relative to the 1850–1900 average. The
current trajectory for warming by the end of the century is roughly
halfway between these two extremes.
The World Meteorological Organization estimates there is almost a 50% chance of the five-year average global temperature exceeding +1.5 °C between 2024 and 2028. The IPCC expects the 20-year average to exceed +1.5 °C in the early 2030s.
The remaining carbon budget for staying beneath certain temperature increases is determined by modelling the carbon cycle and climate sensitivity to greenhouse gases. According to UNEP, global warming can be kept below 2.0 °C with a 50% chance if emissions after 2023 do not exceed 900 gigatonnes of CO2. This carbon budget corresponds to around 16 years of current emissions.
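As an arithmetic check, the two figures quoted above are mutually consistent: a 900-gigatonne budget lasting about 16 years implies an annual rate of roughly 56 gigatonnes per year, which is close to recent total greenhouse gas emissions expressed in CO2-equivalent terms.

```python
# Arithmetic behind "around 16 years of current emissions".
budget_gt = 900            # remaining budget (GtCO2) for a 50% chance of <2.0 °C
years_left = 16            # horizon quoted in the text
implied_rate = budget_gt / years_left
print(f"{implied_rate:.0f} Gt per year")   # ~56 Gt/yr implied emission rate
```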
Physical drivers of the global warming that has happened so far. The future warming potential of long-lived drivers like carbon dioxide is not represented. Whiskers on each bar show the possible error range.
The climate system experiences various cycles on its own which can last for years, decades or even centuries. For example, El Niño events cause short-term spikes in surface temperature while La Niña events cause short-term cooling. Their relative frequency can affect global temperature trends on a decadal timescale. Other changes are caused by an imbalance of energy from external forcings. Examples of these include changes in the concentrations of greenhouse gases, solar luminosity, volcanic eruptions, and variations in the Earth's orbit around the Sun.
To determine the human contribution to climate change, unique
"fingerprints" for all potential causes are developed and compared with
both observed patterns and known internal climate variability. For example, solar forcing—whose fingerprint involves warming the
entire atmosphere—is ruled out because only the lower atmosphere has
warmed. Atmospheric aerosols produce a smaller, cooling effect. Other drivers, such as changes in albedo, are less impactful.
CO2 concentrations over the last 800,000 years as measured from ice cores (blue/green) and directly (black)
Greenhouse gases are transparent to sunlight, and thus allow it to pass through the atmosphere to heat the Earth's surface. The Earth re-radiates that energy as heat, and greenhouse gases absorb a portion of it. This absorption slows the rate at which heat escapes into space, trapping heat near the Earth's surface and warming it over time.
While water vapour
(≈50%) and clouds (≈25%) are the biggest contributors to the greenhouse
effect, they primarily change as a function of temperature and are
therefore mostly considered to be feedbacks that change climate sensitivity. On the other hand, concentrations of gases such as CO2 (≈20%), tropospheric ozone, CFCs and nitrous oxide are added or removed independently from temperature, and are therefore considered to be external forcings that change global temperatures.
Before the Industrial Revolution, naturally occurring amounts of
greenhouse gases caused the air near the surface to be about 33 °C
warmer than it would have been in their absence. Human activity since the Industrial Revolution, mainly extracting and burning fossil fuels (coal, oil, and natural gas), has increased the amount of greenhouse gases in the atmosphere. In 2022, the concentrations of CO2 and methane had increased by about 50% and 164%, respectively, since 1750. These CO2 levels are higher than they have been at any time during the last 14 million years. Concentrations of methane are far higher than they were over the last 800,000 years.
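The roughly 33 °C figure can be reproduced with a standard zero-dimensional energy-balance estimate: without greenhouse absorption, the surface would sit near Earth's effective radiating temperature of about 255 K, against an observed mean of about 288 K. The sketch below uses textbook constants.

```python
# Standard energy-balance estimate of the natural greenhouse effect.
SOLAR = 1361.0      # W/m^2, solar constant
ALBEDO = 0.30       # fraction of sunlight reflected back to space
SIGMA = 5.67e-8     # W/m^2/K^4, Stefan–Boltzmann constant

absorbed = SOLAR * (1 - ALBEDO) / 4           # sunlight averaged over the sphere
t_effective = (absorbed / SIGMA) ** 0.25      # ~255 K without a greenhouse effect
t_observed = 288.0                            # K, observed mean surface temperature
print(round(t_observed - t_effective))        # ~33 °C of natural greenhouse warming
```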
The Global Carbon Project shows how additions to CO2 since 1880 have been caused by different sources ramping up one after another.
While methane only lasts in the atmosphere for an average of 12 years, CO2 lasts much longer. The Earth's surface absorbs CO2 as part of the carbon cycle. While plants on land and in the ocean absorb most excess emissions of CO2 every year, that CO2 is returned to the atmosphere when biological matter is digested, burns, or decays. Land-surface carbon sink processes, such as carbon fixation in the soil and photosynthesis, remove about 29% of annual global CO2 emissions. The ocean has absorbed 20 to 30% of emitted CO2 over the last two decades. CO2
is only removed from the atmosphere for the long term when it is stored
in the Earth's crust, which is a process that can take millions of
years to complete.
Land surface changes
The
rate of global tree cover loss has approximately doubled since 2001, to
an annual loss approaching an area the size of Italy.
Around 30% of Earth's land area is largely unusable for humans (glaciers, deserts, etc.), 26% is forests, 10% is shrubland and 34% is agricultural land. Deforestation is the main land use change contributor to global warming, as the destroyed trees release CO2, and are not replaced by new trees, removing that carbon sink. Between 2001 and 2018, 27% of deforestation was from permanent clearing to enable agricultural expansion for crops and livestock. Another 24% has been lost to temporary clearing under the shifting cultivation agricultural systems. 26% was due to logging for wood and derived products, and wildfires have accounted for the remaining 23%. Some forests have not been fully cleared, but were already degraded by
these impacts. Restoring these forests also recovers their potential as a
carbon sink.
Local vegetation cover impacts how much of the sunlight gets reflected back into space (albedo), and how much heat is lost by evaporation.
For instance, the change from a dark forest to grassland makes the
surface lighter, causing it to reflect more sunlight. Deforestation can also modify the release of chemical compounds that influence clouds, and can change wind patterns. In tropical and temperate areas the net effect is to produce significant
warming, and forest restoration can make local temperatures cooler. At latitudes closer to the poles, there is a cooling effect as forest is replaced by snow-covered (and more reflective) plains. Globally, these increases in surface albedo have been the dominant
direct influence on temperature from land use change. Thus, land use
change to date is estimated to have a slight cooling effect.
Other factors
Aerosols and clouds
Air pollution, in the form of aerosols, affects the climate on a large scale. Aerosols scatter and absorb solar radiation. From 1961 to 1990, a gradual reduction in the amount of sunlight reaching the Earth's surface was observed. This phenomenon is popularly known as global dimming, and is primarily attributed to sulfate aerosols produced by the combustion of fossil fuels with heavy sulfur concentrations like coal and bunker fuel. Smaller contributions come from black carbon (from combustion of fossil fuels and biomass), and from dust. Globally, aerosols have been declining since 1990 due to pollution
controls, meaning that they no longer mask greenhouse gas warming as
much.
Aerosols also have indirect effects on the Earth's energy budget. Sulfate aerosols act as cloud condensation nuclei
and lead to clouds that have more and smaller cloud droplets. These
clouds reflect solar radiation more efficiently than clouds with fewer
and larger droplets. They also reduce the growth of raindrops, which makes clouds more reflective to incoming sunlight. Indirect effects of aerosols are the largest uncertainty in radiative forcing.
While aerosols typically limit global warming by reflecting sunlight, black carbon in soot
that falls on snow or ice can contribute to global warming. Not only
does this increase the absorption of sunlight, it also increases melting
and sea-level rise. Limiting new black carbon deposits in the Arctic could reduce global warming by 0.2 °C by 2050. The effect of decreasing sulfur content of fuel oil for ships since 2020 is estimated to cause an additional 0.05 °C increase in global mean temperature by 2050.
The Fourth National Climate Assessment ("NCA4", USGCRP, 2017) includes charts illustrating that neither solar nor volcanic activity can explain the observed warming.
As the Sun is the Earth's primary energy source, changes in incoming sunlight directly affect the climate system. Solar irradiance has been measured directly by satellites, and indirect measurements are available from the early 1600s onwards. Since 1880, there has been no upward trend in the amount of the Sun's
energy reaching the Earth, in contrast to the warming of the lower
atmosphere (the troposphere). The upper atmosphere (the stratosphere) would also be warming if the Sun was sending more energy to Earth, but instead, it has been cooling. This is consistent with greenhouse gases preventing heat from leaving the Earth's atmosphere.
Explosive volcanic eruptions
can release gases, dust and ash that partially block sunlight and
reduce temperatures, or they can send water vapour into the atmosphere,
which adds to greenhouse gases and increases temperatures. These impacts on temperature only last for several years, because both
water vapour and volcanic material have low persistence in the
atmosphere. Volcanic CO2 emissions are more persistent, but they are equivalent to less than 1% of current human-caused CO2 emissions. Volcanic activity still represents the single largest natural impact
(forcing) on temperature in the industrial era. Yet, like the other
natural forcings, it has had negligible impacts on global temperature
trends since the Industrial Revolution.
Sea
ice reflects 50% to 70% of incoming sunlight, while the ocean, being
darker, reflects only 6%. As an area of sea ice melts and exposes more
ocean, more heat is absorbed by the ocean, raising temperatures that
melt still more ice. This is a positive feedback process.
The climate system's response to an initial forcing is shaped by feedbacks, which either amplify or dampen the change. Self-reinforcing or positive feedbacks increase the response, while balancing or negative feedbacks reduce it. The main reinforcing feedbacks are the water-vapour feedback, the ice–albedo feedback, and the net cloud feedback. The primary balancing mechanism is radiative cooling, as Earth's surface gives off more heat to space in response to rising temperature. In addition to temperature feedbacks, there are feedbacks in the carbon cycle, such as the fertilizing effect of CO2 on plant growth. Feedbacks are expected to trend in a positive direction as greenhouse gas emissions continue, raising climate sensitivity.
These feedback processes alter the pace of global warming. For instance, warmer air can hold more moisture in the form of water vapour, which is itself a potent greenhouse gas. Warmer air can also make clouds higher and thinner, and therefore more insulating, increasing climate warming. The reduction of snow cover and sea ice in the Arctic is another major feedback; it reduces the reflectivity of the Earth's surface in the region and accelerates Arctic warming. This additional warming also contributes to permafrost thawing, which releases methane and CO2 into the atmosphere.
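The amplifying-versus-balancing language above corresponds to a standard idealized feedback calculation, in which a no-feedback warming dT0 is amplified to dT0 / (1 − g) for a total feedback gain g. The gain values in the sketch below are invented to show the behavior and are not assessed estimates.

```python
# Idealized feedback amplification: dT = dT0 / (1 - g). Gains are invented.
def amplified_warming(dt0, gains):
    g = sum(gains)
    assert g < 1, "a total gain of 1 or more would mean runaway warming"
    return dt0 / (1 - g)

dt0 = 1.2                                    # °C, hypothetical no-feedback response
print(amplified_warming(dt0, [0.4]))         # one reinforcing feedback: 2.0 °C
print(amplified_warming(dt0, [0.4, 0.1]))    # a second reinforcing one: 2.4 °C
print(amplified_warming(dt0, [0.4, -0.1]))   # a balancing feedback damps it: ~1.7 °C
```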
Around half of human-caused CO2 emissions have been absorbed by land plants and by the oceans. This fraction is not static and if future CO2
emissions decrease, the Earth will be able to absorb up to around 70%.
If they increase substantially, the Earth will still absorb more carbon than now,
but the overall fraction will decrease to below 40%. This is because climate change increases droughts and heat waves that
eventually inhibit plant growth on land, and soils will release more
carbon from dead plants when they are warmer. The rate at which oceans absorb atmospheric carbon will be lowered as they become more acidic and experience changes in thermohaline circulation and phytoplankton distribution. Uncertainty over feedbacks, particularly cloud cover, is the major reason why different climate models project different magnitudes of warming for a given amount of emissions.
Energy
flows between space, the atmosphere, and Earth's surface. Most sunlight
passes through the atmosphere to heat the Earth's surface, then
greenhouse gases absorb most of the heat the Earth radiates in response.
Adding to greenhouse gases increases this insulating effect, causing an
energy imbalance that heats the planet up.
A climate model is a representation of the physical, chemical and biological processes that affect the climate system. Models include natural processes like changes in the Earth's orbit,
historical changes in the Sun's activity, and volcanic forcing. Models are used to estimate the degree of warming future emissions will cause when accounting for the strength of climate feedbacks. Models also predict the circulation of the oceans, the annual cycle of
the seasons, and the flows of carbon between the land surface and the
atmosphere.
The physical realism of models is tested by examining their ability to simulate current or past climates. Past models have underestimated the rate of Arctic shrinkage and underestimated the rate of precipitation increase. Sea level rise since 1990 was underestimated in older models, but more recent models agree well with observations. The 2017 United States-published National Climate Assessment notes that "climate models may still be underestimating or missing relevant feedback processes". Additionally, climate models may be unable to adequately predict short-term regional climatic shifts.
A subset of climate models
add societal factors to a physical climate model. These models simulate
how population, economic growth, and energy use affect—and interact
with—the physical climate. With this information, these models can
produce scenarios of future greenhouse gas emissions. This is then used
as input for physical climate models and carbon cycle models to predict
how atmospheric concentrations of greenhouse gases might change. Depending on the socioeconomic scenario and the mitigation scenario, models produce atmospheric CO2 concentrations that range widely between 380 and 1400 ppm.
In virtually all countries and territories around the world, scientists in the field of extreme event attribution have concluded that human-caused global warming has increased the number of days of extreme heat events over long-term norms.
The environmental effects of climate change are broad and far-reaching, affecting oceans,
ice, and weather. Changes may occur gradually or rapidly. Evidence for
these effects comes from studying climate change in the past, from
modelling, and from modern observations. Since the 1950s, droughts and heat waves have appeared simultaneously with increasing frequency. Extremely wet or dry events within the monsoon period have increased in India and East Asia. Monsoonal precipitation over the Northern Hemisphere has increased since 1980. The rainfall rate and intensity of hurricanes and typhoons is likely increasing, and the geographic range likely expanding poleward in response to climate warming. The frequency of tropical cyclones has not increased as a result of climate change.
Historical sea level reconstruction and projections up to 2100 published in 2017 by the U.S. Global Change Research Program
Global sea level is rising as a consequence of thermal expansion and the melting of glaciers and ice sheets. Sea level rise has increased over time, reaching 4.8 cm per decade between 2014 and 2023. Over the 21st century, the IPCC projects 32–62 cm of sea level rise
under a low emission scenario, 44–76 cm under an intermediate one and
65–101 cm under a very high emission scenario. Marine ice sheet instability processes in Antarctica may add substantially to these values, including the possibility of a 2-meter sea level rise by 2100 under high emissions.
Climate change has led to decades of shrinking and thinning of the Arctic sea ice. While ice-free summers are expected to be rare at 1.5 °C of warming, they are set to occur once every three to ten years at a
warming level of 2 °C. Higher atmospheric CO2 concentrations cause more CO2 to dissolve in the oceans, which is making them more acidic. Because oxygen is less soluble in warmer water, its concentrations in the ocean are decreasing, and dead zones are expanding.
Tipping points and long-term impacts
Different
levels of global warming may cause different parts of Earth's climate
system to reach tipping points that cause transitions to different
states.
Greater degrees of global warming increase the risk of passing through 'tipping points'—thresholds beyond which certain major impacts can no longer be avoided even if temperatures return to their previous state. For instance, the Greenland ice sheet
is already melting, but if global warming reaches levels between 1.7 °C
and 2.3 °C, its melting will continue until it fully disappears. If the
warming is later reduced to 1.5 °C or less, it will still lose a lot
more ice than if the warming was never allowed to reach the threshold in
the first place. While the ice sheets would melt over millennia, other tipping points
would occur faster and give societies less time to respond. The collapse
of major ocean currents like the Atlantic meridional overturning circulation (AMOC), and irreversible damage to key ecosystems like the Amazon rainforest and coral reefs can unfold in a matter of decades. The collapse of the AMOC would be a severe climate catastrophe, resulting in a cooling of the Northern Hemisphere.
The long-term effects of climate change on oceans include further ice melt, ocean warming, sea level rise, ocean acidification and ocean deoxygenation. The timescale of long-term impacts is centuries to millennia due to CO2's long atmospheric lifetime. The result is an estimated total sea level rise of 2.3 metres per degree Celsius (4.2 ft/°F) after 2000 years. Oceanic CO2 uptake is slow enough that ocean acidification will also continue for hundreds to thousands of years. Deep oceans (below 2,000 metres (6,600 ft)) are also already committed
to losing over 10% of their dissolved oxygen by the warming which
occurred to date. Further, the West Antarctic ice sheet
appears committed to practically irreversible melting, which would
increase the sea levels by at least 3.3 m (10 ft 10 in) over
approximately 2000 years.
Recent warming has driven many terrestrial and freshwater species poleward and towards higher altitudes. For instance, the range of hundreds of North American birds has shifted
northward at an average rate of 1.5 km/year over the past 55 years. Higher atmospheric CO2 levels and an extended growing season have resulted in global greening. However, heatwaves and drought have reduced ecosystem productivity in some regions. The future balance of these opposing effects is unclear. A related phenomenon driven by climate change is woody plant encroachment, affecting up to 500 million hectares globally. Climate change has contributed to the expansion of drier climate zones, such as the expansion of deserts in the subtropics. The size and speed of global warming is making abrupt changes in ecosystems more likely. Overall, it is expected that climate change will result in the extinction of many species.
The oceans have heated more slowly than the land, but plants and
animals in the ocean have migrated towards the colder poles faster than
species on land. Just as on land, heat waves in the ocean occur more frequently due to climate change, harming a wide range of organisms such as corals, kelp, and seabirds. Ocean acidification makes it harder for marine calcifying organisms such as mussels, barnacles and corals to produce shells and skeletons; and heatwaves have bleached coral reefs. Harmful algal blooms enhanced by climate change and eutrophication lower oxygen levels, disrupt food webs and cause great loss of marine life. Coastal ecosystems are under particular stress. Almost half of global
wetlands have disappeared due to climate change and other human impacts. Plants have come under increased stress from damage by insects.
Extreme weather will be progressively more common as the Earth warms.
The effects of climate change are impacting humans everywhere in the world. Impacts can be observed on all continents and ocean regions, with low-latitude, less developed areas facing the greatest risk. Continued warming has potentially "severe, pervasive and irreversible impacts" for people and ecosystems. The risks are unevenly distributed, but are generally greater for disadvantaged people in developing and developed countries.
The World Health Organization calls climate change one of the biggest threats to global health in the 21st century. Scientists have warned about the irreversible harms it poses. Extreme weather events affect public health, and food and water security. Temperature extremes lead to increased illness and death. Climate change increases the intensity and frequency of extreme weather events. It can affect transmission of infectious diseases, such as dengue fever and malaria. According to the World Economic Forum, 14.5 million more deaths are expected due to climate change by 2050. 30% of the global population currently live in areas where extreme heat and humidity are already associated with excess deaths. By 2100, 50% to 75% of the global population would live in such areas.
While total crop yields have been increasing in the past 50 years due to agricultural improvements, climate change has already decreased the rate of yield growth. Fisheries have been negatively affected in multiple regions. While agricultural productivity has been positively affected in some high latitude areas, mid- and low-latitude areas have been negatively affected. According to the World Economic Forum, an increase in drought in certain regions could cause 3.2 million deaths from malnutrition by 2050 and stunting in children. With 2 °C warming, global livestock headcounts could decline by 7–10% by 2050, as less animal feed will be available. If emissions continue to increase for the rest of the century, then
over 9 million climate-related deaths would occur annually by 2100.
Economic damages due to climate change may be severe and there is a chance of disastrous consequences. Severe impacts are expected in South-East Asia and sub-Saharan Africa, where most of the local inhabitants are dependent upon natural and agricultural resources. Heat stress
can prevent outdoor labourers from working. If warming reaches 4 °C
then labour capacity in those regions could be reduced by 30 to 50%. The World Bank
estimates that between 2016 and 2030, climate change could drive over
120 million people into extreme poverty without adaptation.
Inequalities based on wealth and social status have worsened due to climate change. Major difficulties in mitigating, adapting to, and recovering from
climate shocks are faced by marginalized people who have less control
over resources. Indigenous peoples, who depend on their land and ecosystems for subsistence, face risks to their well-being and ways of life from climate change. An expert elicitation concluded that the role of climate change in armed conflict has been small compared to factors such as socio-economic inequality and state capabilities.
While women are not inherently more at risk from climate change
and shocks, limits on women's resources and discriminatory gender norms
constrain their adaptive capacity and resilience. For example, women's work burdens, including hours worked in
agriculture, tend to decline less than men's during climate shocks such
as heat stress.
Low-lying islands and coastal communities are threatened by sea level rise, which makes urban flooding more common. Sometimes, land is permanently lost to the sea. This could lead to statelessness for people in island nations, such as the Maldives and Tuvalu. In some regions, the rise in temperature and humidity may be too severe for humans to adapt to. With worst-case climate change, models project that areas inhabited by almost one-third of humanity could become as uninhabitable and extremely hot as the Sahara.
These factors can drive climate or environmental migration, within and between countries. More people are expected to be displaced because of sea level rise,
extreme weather and conflict from increased competition over natural
resources. Climate change may also increase vulnerability, leading to
"trapped populations" who are not able to move due to a lack of
resources.
Climate change impacts on people
Environmental migration. Sparser rainfall leads to desertification that harms agriculture and can displace populations. Shown: Telly, Mali (2008).
Agricultural changes. Droughts, rising temperatures, and extreme weather negatively impact agriculture. Shown: Texas, US (2013).
Global greenhouse gas emission scenarios, based on policies and pledges as of November 2021
Climate change can be mitigated by reducing the rate at which
greenhouse gases are emitted into the atmosphere, and by increasing the
rate at which carbon dioxide is removed from the atmosphere. To limit global warming to less than 2 °C, global greenhouse gas emissions need to reach net zero by 2070. This requires far-reaching, systemic changes on an unprecedented scale
in energy, land, cities, transport, buildings, and industry.
The United Nations Environment Programme estimates that countries need to triple their pledges under the Paris Agreement within the next decade to limit global warming to 2 °C. With pledges made under the Paris Agreement as of 2024, there would be a
66% chance that global warming is kept under 2.8 °C by the end of the
century (range: 1.9–3.7 °C, depending on exact implementation and
technological progress). When only current policies are considered, the estimate rises to 3.1 °C. Globally, limiting warming to 2 °C may result in economic benefits that exceed the economic costs.
Although there is no single pathway to limit global warming to 2 °C, most scenarios and strategies see a major increase in the use of
renewable energy in combination with increased energy efficiency
measures to generate the needed greenhouse gas reductions. To reduce pressures on ecosystems and enhance their carbon
sequestration capabilities, changes would also be necessary in
agriculture and forestry, such as preventing deforestation and restoring natural ecosystems by reforestation.
Other approaches to mitigating climate change have a higher level
of risk. Scenarios that limit global warming to 1.5 °C typically
project the large-scale use of carbon dioxide removal methods over the 21st century. There are concerns, though, about over-reliance on these technologies, and environmental impacts.
Solar radiation modification
(SRM) is a proposal for reducing global warming by reflecting some
sunlight away from Earth and back into space. Because it does not reduce
greenhouse gas concentrations, it would not address ocean acidification and is not considered mitigation. SRM is widely regarded as a potential supplement to mitigation rather than a replacement for it, owing to risks such as rapid warming if it were abruptly stopped and not restarted. The most-studied approach is stratospheric aerosol injection. SRM could reduce global warming and some of its impacts, though imperfectly. It poses environmental risks, such as changes to rainfall patterns, as well as political challenges, such as who would decide whether to use it.
Coal, oil, and natural gas remain the primary global energy sources even as renewables have begun rapidly increasing.
Wind and solar power, Germany
Renewable energy is key to limiting climate change. For decades, fossil fuels have accounted for roughly 80% of the world's energy use. The remaining share has been split between nuclear power and renewables (including hydropower, bioenergy, wind and solar power and geothermal energy). Fossil fuel use is expected to peak in absolute terms prior to 2030 and
then to decline, with coal use experiencing the sharpest reductions. Renewables represented 86% of all new electricity generation installed in 2023. Other forms of clean energy, such as nuclear and hydropower, currently
have a larger share of the energy supply. However, their future growth
forecasts appear limited in comparison.
While solar panels and onshore wind are now among the cheapest forms of adding new power generation capacity in many locations, green energy policies are needed to achieve a rapid transition from fossil fuels to renewables. To achieve carbon neutrality by 2050, renewable energy would become the
dominant form of electricity generation, rising to 85% or more
in some scenarios. Investment in coal would be eliminated and coal use
nearly phased out by 2050.
Electricity generated from renewable sources would also need to become the main energy source for heating and transport. Transport can switch away from internal combustion engine vehicles and towards electric vehicles, public transit, and active transport (cycling and walking). For shipping and flying, low-carbon fuels would reduce emissions. Heating could be increasingly decarbonized with technologies like heat pumps.
There are obstacles to the continued rapid growth of clean energy, including renewables. Wind and solar produce energy intermittently and with seasonal variability. Traditionally, hydro dams with reservoirs and fossil fuel power plants have been used when variable energy production is low. Going forward, battery storage can be expanded, energy demand and supply can be matched, and long-distance transmission can smooth variability of renewable outputs. Bioenergy is often not carbon-neutral and may have negative consequences for food security. The growth of nuclear power is constrained by controversy around radioactive waste, nuclear weapon proliferation, and accidents. Hydropower growth is limited by the fact that the best sites have already been developed, and new projects face increased social and environmental concerns.
Low-carbon energy improves human health by minimizing climate change as well as reducing air pollution deaths, which were estimated at 7 million annually in 2016. Meeting the Paris Agreement goals that limit warming to a 2 °C increase
could save about a million of those lives per year by 2050, whereas
limiting global warming to 1.5 °C could save millions and simultaneously
increase energy security and reduce poverty. Improving air quality also has economic benefits which may be larger than mitigation costs.
Reducing energy demand is another major aspect of reducing emissions. If less energy is needed, there is more flexibility for clean energy
development. It also makes it easier to manage the electricity grid, and
minimizes carbon-intensive infrastructure development. Major increases in energy efficiency investment will be required to
achieve climate goals, comparable to the level of investment in
renewable energy. Several COVID-19-related changes in energy use patterns, energy efficiency investments,
and funding have made forecasts for this decade more difficult and
uncertain.
Strategies to reduce energy demand vary by sector. In the
transport sector, passengers and freight can switch to more efficient
travel modes, such as buses and trains, or use electric vehicles. Industrial strategies to reduce energy demand include improving heating
systems and motors, designing less energy-intensive products, and
increasing product lifetimes. In the building sector the focus is on better design of new buildings, and higher levels of energy efficiency in retrofitting. The use of technologies like heat pumps can also increase building energy efficiency.
Taking
into account direct and indirect emissions, industry is the sector with
the highest share of global emissions. Data as of 2019 from the IPCC.
Agriculture and forestry face a triple challenge of limiting greenhouse
gas emissions, preventing the further conversion of forests to
agricultural land, and meeting increases in world food demand. A set of actions could reduce agriculture and forestry-based emissions
by two-thirds from 2010 levels. These include reducing growth in demand
for food and other agricultural products, increasing land productivity,
protecting and restoring forests, and reducing greenhouse gas emissions
from agricultural production.
On the demand side, a key component of reducing emissions is shifting people towards plant-based diets. Eliminating the production of livestock for meat and dairy would eliminate about three-quarters of all emissions from agriculture and other land use. Livestock also occupy 37% of ice-free land area on Earth and consume feed from the 12% of land area used for crops, driving deforestation and land degradation.
Steel and cement production are responsible for about 13% of industrial CO2
emissions. In these industries, carbon-intensive materials such as coke
and lime play an integral role in production, so reducing CO2 emissions requires research into alternative chemistries. Where energy production or CO2-intensive heavy industries continue to produce waste CO2, technology can sometimes be used to capture and store most of the gas instead of releasing it to the atmosphere. This technology, carbon capture and storage (CCS), could have a critical but limited role in reducing emissions: it is relatively expensive and has so far been deployed only at a scale that removes around 0.1% of annual greenhouse gas emissions.
Natural carbon sinks can be enhanced to sequester significantly larger amounts of CO2 beyond naturally occurring levels. Reforestation and afforestation
(planting forests where there were none before) are among the most
mature sequestration techniques, although the latter raises food
security concerns. Farmers can promote sequestration of carbon in soils through practices such as use of winter cover crops, reducing the intensity and frequency of tillage, and using compost and manure as soil amendments. Forest and landscape restoration yields many benefits for the climate,
including the sequestration and reduction of greenhouse gas emissions. Restoration or recreation of coastal wetlands, prairie plots, and seagrass meadows increases the uptake of carbon into organic matter. When carbon is sequestered in soils and in organic matter such as
trees, there is a risk of the carbon being re-released into the
atmosphere later through changes in land use, fire, or other changes in
ecosystems.
The use of bioenergy in conjunction with carbon capture and storage (BECCS) can result in net negative emissions as CO2 is drawn from the atmosphere. It remains highly uncertain whether carbon dioxide removal techniques
will be able to play a large role in limiting warming to 1.5 °C. Policy
decisions that rely on carbon dioxide removal increase the risk of
global warming rising beyond international goals.
Adaptation is "the process of adjustment to current or expected changes in climate and its effects". Without additional mitigation, adaptation cannot avert the risk of "severe, widespread and irreversible" impacts. More severe climate change requires more transformative adaptation, which can be prohibitively expensive. The capacity and potential for humans to adapt is unevenly distributed across different regions and populations, and developing countries generally have less. The first two decades of the 21st century saw an increase in adaptive
capacity in most low- and middle-income countries with improved access
to basic sanitation
and electricity, but progress is slow. Many countries have implemented
adaptation policies. However, there is a considerable gap between
necessary and available finance.
Adaptation to sea level rise consists of avoiding at-risk areas, learning to live with increased flooding, and building flood controls. If that fails, managed retreat may be needed. There are economic barriers to tackling dangerous heat impacts: avoiding strenuous work or having air conditioning is not possible for everybody. In agriculture, adaptation options include a switch to more sustainable
diets, diversification, erosion control, and genetic improvements for
increased tolerance to a changing climate. Insurance allows for risk-sharing, but is often difficult to get for people on lower incomes. Education, migration and early warning systems can reduce climate vulnerability. Planting mangroves or encouraging other coastal vegetation can buffer storms.
Ecosystems adapt to climate change, a process that can be
supported by human intervention. By increasing connectivity between
ecosystems, species can migrate to more favourable climate conditions.
Species can also be introduced to areas acquiring a favourable climate.
Protection and restoration of natural and semi-natural areas helps
build resilience, making it easier for ecosystems to adapt. Many of the
actions that promote adaptation in ecosystems also help humans adapt
via ecosystem-based adaptation. For instance, restoration of natural fire regimes
makes catastrophic fires less likely, and reduces human exposure.
Giving rivers more space allows for more water storage in the natural
system, reducing flood risk. Restored forest acts as a carbon sink, but
planting trees in unsuitable regions can exacerbate climate impacts.
There are synergies but also trade-offs between adaptation and mitigation. An example for synergy is increased food productivity, which has large benefits for both adaptation and mitigation. An example of a trade-off is that increased use of air conditioning allows people to better cope with heat, but increases energy demand. Another trade-off example is that more compact urban development may reduce emissions from transport and construction, but may also increase the urban heat island effect, exposing people to heat-related health risks.
The Climate Change Performance Index ranks countries by greenhouse gas emissions (40% of score), renewable energy (20%), energy use (20%), and climate policy (20%); countries are rated from high to very low performance (map legend: high, medium, low, very low, no data).
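As an illustration of how such a weighted index combines its components, here is a minimal sketch in Python. The category keys, the 0–100 normalization, and the plain weighted sum are assumptions made for illustration; the CCPI's published methodology is more detailed.

```python
# Minimal sketch of a weighted composite index like the CCPI.
# Category keys, 0-100 normalization, and the plain weighted sum
# are illustrative assumptions, not the CCPI's published method.

WEIGHTS = {
    "ghg_emissions": 0.40,     # greenhouse gas emissions: 40% of score
    "renewable_energy": 0.20,  # renewable energy: 20%
    "energy_use": 0.20,        # energy use: 20%
    "climate_policy": 0.20,    # climate policy: 20%
}

def composite_score(scores: dict[str, float]) -> float:
    """Combine normalized (0-100) category scores into one index value."""
    return sum(WEIGHTS[key] * scores[key] for key in WEIGHTS)

# Example with made-up category scores for a single country:
print(composite_score({
    "ghg_emissions": 60.0,
    "renewable_energy": 70.0,
    "energy_use": 55.0,
    "climate_policy": 80.0,
}))  # 0.4*60 + 0.2*70 + 0.2*55 + 0.2*80 = 65.0
```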
Countries that are most vulnerable to climate change have typically been responsible for a small share of global emissions. This raises questions about justice and fairness. Limiting global warming makes it much easier to achieve the UN's Sustainable Development Goals, such as eradicating poverty and reducing inequalities. The connection is recognized in Sustainable Development Goal 13 which is to "take urgent action to combat climate change and its impacts". The goals on food, clean water and ecosystem protection have synergies with climate mitigation.
The geopolitics of climate change is complex. It has often been framed as a free-rider problem,
in which all countries benefit from mitigation done by other countries,
but individual countries would lose from switching to a low-carbon economy themselves. However, mitigation can also have localized benefits. For instance, the benefits of a coal phase-out to public health and local environments exceed the costs in almost all regions. Furthermore, net importers of fossil fuels gain economically from switching to clean energy, while net exporters face stranded assets: fossil fuels they cannot sell.
A wide range of policies, regulations, and laws are being used to reduce emissions. As of 2019, carbon pricing covers about 20% of global greenhouse gas emissions. Carbon can be priced with carbon taxes and emissions trading systems. Direct global fossil fuel subsidies reached $319 billion in 2017, and $5.2 trillion when indirect costs such as air pollution are priced in. Ending these subsidies could yield a 28% reduction in global carbon emissions and a 46% reduction in air pollution deaths. Money saved on fossil fuel subsidies could be used to support the transition to clean energy instead. More direct methods to reduce greenhouse gases include vehicle
efficiency standards, renewable fuel standards, and air pollution
regulations on heavy industry. Several countries require utilities to increase the share of renewables in power production. A global carbon market coalition
proposed at COP30 (2025) was estimated to increase emissions reduction
seven-fold over current policies, while delivering $200 billion per year
for clean-energy and social programs.
Climate justice
Policy designed through the lens of climate justice
tries to address human rights issues and social inequality. According
to proponents of climate justice, the costs of climate adaptation should
be paid by those most responsible for climate change, while the
beneficiaries of payments should be those suffering impacts. One way
this can be addressed in practice is to have wealthy nations pay poorer
countries to adapt.
Oxfam found that in 2023 the wealthiest 10% of people were
responsible for 50% of global emissions, while the bottom 50% were
responsible for just 8%. Production of emissions is another way to look at responsibility: under
that approach, the top 21 fossil fuel companies would owe cumulative climate reparations of $5.4 trillion over the period 2025–2050. To achieve a just transition, people working in the fossil fuel sector would also need other jobs, and their communities would need investments.
Since 2000, rising CO2 emissions in China and the rest of the world have surpassed the output of the United States and Europe.
Per person, the United States generates CO2 at a far faster rate than other primary regions.
Nearly all countries in the world are parties to the 1994 United Nations Framework Convention on Climate Change (UNFCCC). The goal of the UNFCCC is to prevent dangerous human interference with the climate system. As stated in the convention, this requires that greenhouse gas
concentrations are stabilized in the atmosphere at a level where
ecosystems can adapt naturally to climate change, food production is not
threatened, and economic development can be sustained. The UNFCCC does not itself restrict emissions but rather provides a
framework for protocols that do. Global emissions have risen since the
UNFCCC was signed. Its yearly conferences serve as the stage for global negotiations.
The 1997 Kyoto Protocol extended the UNFCCC and included legally binding commitments for most developed countries to limit their emissions. During the negotiations, the G77 (representing developing countries) pushed for a mandate requiring developed countries to "[take] the lead" in reducing their emissions, since developed countries contributed most to the accumulation of greenhouse gases
in the atmosphere. Per-capita emissions were also still relatively low
in developing countries and developing countries would need to emit more
to meet their development needs.
The 2009 Copenhagen Accord has been widely portrayed as disappointing because of its low goals, and was rejected by poorer nations including the G77. Associated parties aimed to limit the global temperature rise to below 2 °C. The accord set the goal of sending $100 billion per year to developing
countries for mitigation and adaptation by 2020, and proposed the
founding of the Green Climate Fund. As of 2020, only $83.3 billion had been delivered, and the target was expected to be met only in 2023.
In 2015 all UN countries negotiated the Paris Agreement, which aims to keep global warming well below 2.0 °C and contains an aspirational goal of keeping warming under 1.5 °C. The agreement replaced the Kyoto Protocol. Unlike Kyoto, no binding
emission targets were set in the Paris Agreement. Instead, a set of
procedures was made binding. Countries have to regularly set ever more
ambitious goals and reevaluate these goals every five years. The Paris Agreement restated that developing countries must be financially supported. As of March 2025, 194 states and the European Union have acceded to or ratified the agreement.
The 1987 Montreal Protocol, an international agreement to phase out production of ozone-depleting gases, has had benefits for climate change mitigation. Several ozone-depleting gases like chlorofluorocarbons are powerful greenhouse gases, so banning their production and usage may have avoided a temperature rise of 0.5 °C–1.0 °C, as well as additional warming by preventing damage to vegetation from ultraviolet radiation. It is estimated that the agreement has been more effective at curbing
greenhouse gas emissions than the Kyoto Protocol, which was specifically designed
to do so. The most recent amendment to the Montreal Protocol, the 2016 Kigali Amendment, committed to reducing the emissions of hydrofluorocarbons, which served as a replacement for banned ozone-depleting gases and are also potent greenhouse gases. Should countries comply with the amendment, a warming of 0.3 °C–0.5 °C is estimated to be avoided.
In 2019, the United Kingdom parliament became the first national legislature to declare a climate emergency. Other countries and jurisdictions followed suit. That same year, the European Parliament declared a "climate and environmental emergency". The European Commission presented its European Green Deal with the goal of making the EU carbon-neutral by 2050. In 2021, the European Commission released its "Fit for 55" legislation package, which contains guidelines for the car industry; all new cars on the European market must be zero-emission vehicles from 2035.
Major countries in Asia have made similar pledges: South Korea
and Japan have committed to become carbon-neutral by 2050, and China by
2060. While India has strong incentives for renewables, it also plans a significant expansion of coal in the country. Vietnam is among very few coal-dependent, fast-developing countries
that pledged to phase out unabated coal power by the 2040s or as soon as
possible thereafter.
As of 2021, based on information from 48 national climate plans, which represent 40% of the parties to the Paris Agreement, estimated total greenhouse gas emissions would be only 0.5% lower than 2010 levels, far short of the 45% and 25% reductions needed to limit global warming to 1.5 °C and 2 °C, respectively.
Data has been cherry picked
from short periods to falsely assert that global temperatures are not
rising. Blue trendlines show short periods that mask longer-term warming
trends (red trendlines). Blue rectangle with blue dots shows the
so-called global warming hiatus.
Public debate about climate change has been strongly affected by climate change denial and misinformation,
which first emerged in the United States and has since spread to other
countries, particularly Canada and Australia. It originated from fossil
fuel companies, industry groups, conservative think tanks, and contrarian scientists. Like the tobacco industry, the main strategy of these groups has been to manufacture doubt about climate-change related scientific data and results. People who hold unwarranted doubt about climate change are sometimes
called climate change "skeptics", although "contrarians" or "deniers"
are more appropriate terms.
There are different variants of climate denial: some deny that
warming takes place at all, some acknowledge warming but attribute it to
natural influences, and some minimize the negative impacts of climate
change. Manufacturing uncertainty about the science later developed into a manufactured controversy:
creating the belief that there is significant uncertainty about climate
change within the scientific community to delay policy changes. Strategies to promote these ideas include criticism of scientific institutions, and questioning the motives of individual scientists. An echo chamber of climate-denying blogs and media has further fomented misunderstanding of climate change.
The public substantially underestimates the degree of scientific consensus that humans are causing climate change (2022 data). Studies from 2019 to 2021 found the scientific consensus to range from 98.7% to 100%.
Climate change came to international public attention in the late 1980s. Due to media coverage in the early 1990s, people often confused climate
change with other environmental issues like ozone depletion. In popular culture, the climate fiction movie The Day After Tomorrow (2004) and the Al Gore documentary An Inconvenient Truth (2006) focused on climate change.
Significant regional, gender, age and political differences exist
in both public concern for, and understanding of, climate change. More
highly educated people, and in some countries, women and younger people,
were more likely to see climate change as a serious threat. College biology textbooks from the 2010s featured less content on
climate change compared to those from the preceding decade, with
decreasing emphasis on solutions. Partisan gaps also exist in many countries, and countries with high CO2 emissions tend to be less concerned. Views on causes of climate change vary widely between countries. Media coverage linked to protests has had impacts on public sentiment
as well as on which aspects of climate change are focused upon. Higher levels of worry are associated with stronger public support for policies that address climate change. Concern has increased over time, and in 2021 a majority of citizens in 30 countries expressed a high
level of worry about climate change or viewed it as a global emergency. A 2024 survey across 125 countries found that 89% of the global population demanded intensified political action, but systematically underestimated other people's willingness to act.
Climate protests demand that political leaders take action to prevent
climate change. They can take the form of public demonstrations, fossil fuel divestment, lawsuits and other activities. Prominent demonstrations include the School Strike for Climate.
In this initiative, young people across the globe have been protesting
since 2018 by skipping school on Fridays, inspired by Swedish activist
and then-teenager Greta Thunberg. Mass civil disobedience actions by groups like Extinction Rebellion have protested by disrupting roads and public transport.
Litigation is increasingly used as a tool to strengthen climate action
from public institutions and companies. Activists also initiate
lawsuits which target governments and demand that they take ambitious
action or enforce existing laws on climate change. Lawsuits against fossil-fuel companies generally seek compensation for loss and damage. On 23 July 2025, the UN's International Court of Justice
issued its advisory opinion, saying explicitly that states must act to
stop climate change, and if they fail to accomplish that duty, other
states can sue them. This obligation includes implementing their
commitments in international agreements they are parties to, such as the
2015 Paris Agreement.
Eunice Newton Foote showed carbon dioxide's heat-capturing effect in 1856, foreseeing its implications for the planet. (Carbon dioxide was called "carbonic acid gas".)
Scientists in the 19th century such as Alexander von Humboldt began to foresee the effects of climate change. In the 1820s, Joseph Fourier
proposed the greenhouse effect to explain why Earth's temperature was
higher than the Sun's energy alone could explain. Earth's atmosphere is
transparent to sunlight, so sunlight reaches the surface where it is
converted to heat. However, the atmosphere is not transparent to heat
radiating from the surface, and captures some of that heat, which in
turn warms the planet. In 1856 Eunice Newton Foote
demonstrated that the warming effect of the Sun is greater for air with
water vapour than for dry air, and that the effect is even greater with
carbon dioxide (CO2).
In "Circumstances Affecting the Heat of the Sun's Rays" she concluded
that "[a]n atmosphere of that gas would give to our earth a high
temperature".
This 1912 article succinctly describes the greenhouse effect: how burning coal creates carbon dioxide that causes global warming and climate change.
Starting in 1859, John Tyndall
established that nitrogen and oxygen—together totalling 99% of dry
air—are transparent to radiated heat. However, water vapour and gases
such as methane and carbon dioxide absorb radiated heat and re-radiate
that heat into the atmosphere. Tyndall proposed that changes in the
concentrations of these gases may have caused climatic changes in the
past, including ice ages.
Svante Arrhenius noted that water vapour in air continuously varied, but the CO2 concentration in air was influenced by long-term geological processes. Warming from increased CO2
levels would increase the amount of water vapour, amplifying warming in
a positive feedback loop. In 1896, he published the first climate model of its kind, projecting that halving CO2
levels could have produced a drop in temperature initiating an ice age.
Arrhenius calculated the temperature increase expected from doubling CO2 to be around 5–6 °C. Other scientists were initially sceptical and believed that the greenhouse effect was saturated so that adding more CO2 would make no difference, and that the climate would be self-regulating. Beginning in 1938, Guy Stewart Callendar published evidence that climate was warming and CO2 levels were rising, but his calculations met the same objections.
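In modern notation (a simplification, not Arrhenius's original formulation), his result is often summarized as a logarithmic greenhouse law:

$$\Delta T \approx S \, \log_2\!\left(\frac{C}{C_0}\right),$$

where $C/C_0$ is the ratio of the new to the reference CO2 concentration and $S$ is the warming per doubling. With Arrhenius's estimate of $S \approx 5$–$6\ ^\circ\mathrm{C}$, halving CO2 ($C/C_0 = \tfrac{1}{2}$) gives $\Delta T \approx -5$ to $-6\ ^\circ\mathrm{C}$, a cooling large enough to initiate an ice age.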
Scientific consensus on causation:
Academic studies of scientific agreement on human-caused global warming
among climate experts (2010–2015) reflect that the level of consensus
correlates with expertise in climate science. A 2019 study found scientific consensus to be at 100%, and a 2021 study concluded that consensus exceeded 99%. Another 2021 study found that 98.7% of climate experts indicated that
the Earth is getting warmer mostly because of human activity.
In the 1950s, Gilbert Plass
created a detailed computer model that included different atmospheric
layers and the infrared spectrum. This model predicted that increasing
CO2 levels would cause warming. Around the same time, Hans Suess found evidence that CO2 levels had been rising, and Roger Revelle showed that the oceans would not absorb the increase. The two scientists subsequently helped Charles Keeling to begin a record of continued increase—the "Keeling Curve"—which was part of continued scientific investigation through the 1960s into possible human causation of global warming. Studies such as the National Research Council's 1979 Charney Report supported the accuracy of climate models that forecast significant warming. Human causation of observed global warming and dangers of unmitigated warming were publicly presented in James Hansen's 1988 testimony before a US Senate committee. The Intergovernmental Panel on Climate Change (IPCC), set up in 1988 to provide formal advice to the world's governments, spurred interdisciplinary research. As part of the IPCC reports, scientists assess the scientific discussion that takes place in peer-reviewed journal articles.
There is a nearly unanimous scientific consensus that the climate is warming and that this is caused by human activities. No scientific body of national or international standing disagrees with this view. As of 2019, agreement in recent literature reached over 99%. The 2021 IPCC Assessment Report stated that it is "unequivocal" that climate change is caused by humans. Consensus has further developed that action should be taken to protect
people against the impacts of climate change. National science academies
have called on world leaders to cut global emissions.
Recent developments
Extreme event attribution (EEA), also known as attribution science, was developed in the early decades of the 21st century. EEA uses climate models
to identify and quantify the role that human-caused climate change
plays in the frequency, intensity, duration, and impacts of specific
individual extreme weather events. Results of attribution studies allow scientists and journalists to make
statements such as, "this weather event was made at least n times more likely by human-caused climate change" or "this heatwave was made m
degrees hotter than it would have been in a world without global
warming" or "this event was effectively impossible without climate
change". Greater computing power in the 2000s and conceptual breakthroughs in the early to mid 2010s enabled attribution science to detect the effects of climate change on some events with high confidence. Scientists use attribution methods and climate simulations that have already been peer reviewed, allowing "rapid attribution studies" to be published within a "news cycle" time frame after weather events.