Carbon pricing (or CO2 pricing) is a method for governments to mitigate climate change, in which a monetary cost is applied to greenhouse gas emissions in order to encourage polluters to reduce fossil fuel combustion, the main driver of climate change. A carbon price usually takes the form of a carbon tax, or an emissions trading scheme (ETS) that requires firms to purchase allowances to emit.
The method is widely agreed to be an efficient policy for reducing greenhouse gas emissions. Carbon pricing seeks to address the economic problem that emissions of CO2 and other greenhouse gases are a negative externality – a harmful by-product that no market charges for.
As of 2021, 21.7% of global GHG emissions were covered by carbon pricing, a major increase due to the introduction of the Chinese national carbon trading scheme. Regions with carbon pricing include most European countries and Canada. On the other hand, top emitters like India, Russia, the Gulf states and many US states have not introduced carbon pricing. Australia had a carbon pricing scheme from 2012 to 2014. In 2020, carbon pricing generated $53 billion in revenue.
According to the Intergovernmental Panel on Climate Change, a price level of $135–$5,500 per metric ton of CO2 in 2030, and $245–$13,000 in 2050, would be needed to keep warming within the 1.5°C limit. Recent models of the social cost of carbon calculate damages of more than $300 per ton of CO2 once economic feedbacks and falling global GDP growth rates are taken into account, while policy recommendations range from about $50 to $200. Many carbon pricing schemes, including the ETS in China, remain below $10 per ton of CO2.[3] One exception is the European Union Emissions Trading System (EU-ETS), which exceeded €100 ($108) per ton of CO2 in February 2023.
A carbon tax is generally favoured on economic grounds for its simplicity and stability, while cap-and-trade theoretically offers the possibility of limiting allowances to the remaining carbon budget. Current implementations are designed only to meet certain reduction targets.
Overview
Carbon pricing is considered by many economists to be the most economically efficient way to reduce emissions, taking into account both the cost of efficiency measures and the inconvenience of using less fossil fuel. By pricing the externalities of carbon emissions, it corrects the market failure of unpriced external costs at its source. It is regarded as more efficient than renewable energy subsidies given to individual firms, because the difficulty of determining the value of emissions to each firm makes command-and-control regulation less likely to be efficient.
In a carbon tax model, a tax is imposed on carbon emissions produced by a firm. In a cap-and-trade
design, the government establishes an emissions cap and allocates to
firms emission allowances, which can thereafter be privately traded.
Emitters without the required allowances face a penalty greater than the price of permits. Assuming all else is equal, the market for permits will automatically adjust the carbon price to a level that ensures the cap is met. The EU ETS uses this method. In practice, it produced a fairly strong carbon price from 2005 to 2009, but the price was later undermined by an oversupply of allowances and the Great Recession. Recent policy changes have led to a steep increase of the carbon price since 2018, exceeding €100 ($108) per ton of CO2 in February 2023.
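The clearing logic can be sketched in a few lines of code. The following is a minimal illustration only: the firms, their marginal abatement costs and the cap are hypothetical, not taken from any real ETS, and its single point is that the permit price settles where total demand for permits equals the cap.

```python
# Minimal sketch of permit-market clearing under a cap (hypothetical firms).
# Each firm abates until its marginal abatement cost equals the permit price,
# so demand for permits falls as the price rises.

def permit_demand(price, firms):
    """Total permits demanded at a given price.

    Each firm is (baseline_emissions, cost_slope): abating the q-th tonne
    costs cost_slope * q dollars, so the firm abates price / cost_slope tonnes.
    """
    demand = 0.0
    for baseline, cost_slope in firms:
        abatement = min(baseline, price / cost_slope)
        demand += baseline - abatement
    return demand

def clearing_price(cap, firms, lo=0.0, hi=1000.0):
    """Bisect for the price at which permit demand equals the cap."""
    for _ in range(60):
        mid = (lo + hi) / 2
        if permit_demand(mid, firms) > cap:
            lo = mid          # demand too high: price must rise
        else:
            hi = mid
    return (lo + hi) / 2

firms = [(100.0, 0.5), (200.0, 1.0), (50.0, 2.0)]  # (tonnes, $ per tonne per tonne abated)
cap = 250.0                                        # total tonnes allowed
print(f"clearing price: ${clearing_price(cap, firms):.2f} per tonne")  # about $28.57
```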
The exact monetary damage caused by a tonne of CO2 (its social cost) depends on climate and economic feedback effects and remains to some degree uncertain. Recent calculations show an increasing trend:
Source | Year | Carbon price per ton of CO2 | Remarks
Interagency Working Group (US government) | 2013 / 2016 | $42 | Central estimate for 3% discount rate in 2020
Interagency Working Group (US government) | 2013 / 2016 | $212 | High-impact value for 2050 / 3% discount / 95th percentile
Cap-and-trade systems can include price stability provisions with floor and ceiling limits. Such designs are often referred to as hybrid designs. To the extent the price is controlled by these limits, it can be considered a tax.
Carbon tax versus emissions trading
Carbon emissions trading
works by setting a quantitative limit on the emissions produced by
emitters. As a result, the price automatically adjusts to this target.
This is the main advantage compared to a fixed carbon tax.
A carbon tax is considered easier to enforce on a broad basis than cap-and-trade programs. The simplicity and immediacy of a carbon tax have been demonstrated in British Columbia, Canada, where the tax was enacted and implemented in five months.
A hybrid cap-and-trade program puts a limit on price increases and, in
some cases, sets a floor price as well. The upper limit is set by adding
more allowances to the market at a set price while the floor price is
maintained by not allowing sales into the market at a price below the
floor. The Regional Greenhouse Gas Initiative, for example, sets an upper limit on allowance prices through its cost containment provision.
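As a rough illustration of how such a collar works (the floor and ceiling values below are invented, not those of the RGGI or any other program), the effective price is simply the market price clamped to the band, with allowances added or withheld to keep it there:

```python
# Illustrative price collar for a hybrid cap-and-trade design (made-up values).

FLOOR = 10.0     # $/tonne: allowances are withheld rather than sold below this
CEILING = 60.0   # $/tonne: extra allowances are released at this price

def effective_price(market_price: float) -> float:
    """Clamp the market-clearing price to the collar."""
    return max(FLOOR, min(CEILING, market_price))

for p in (4.0, 35.0, 90.0):
    print(f"market ${p:5.2f}/t -> effective ${effective_price(p):5.2f}/t")
```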
However, industries may successfully lobby to exempt themselves from a carbon tax. It is therefore argued that with emissions trading polluters retain an incentive to cut emissions, whereas firms exempted from a carbon tax have no such incentive. On the other hand, freely distributing emission permits could potentially lead to corrupt behaviour.
Most cap-and-trade programs have a descending cap, usually a fixed percentage every year, which gives certainty to the market and guarantees that emissions will decline over time. With a tax, reductions in carbon emissions can only be estimated, and they may not be sufficient to change the course of climate change. A declining cap sets firm reduction targets and provides a system for measuring when targets are met. It also allows for flexibility, unlike rigid taxes.
Providing emission permits (also called allowances) under emissions trading is preferred in situations where greater certainty about the resulting level of emissions is needed.
Revenue policies
Standard proposals for using carbon revenues include
a return to the public on a per-capita basis
This can compensate for the risk of energy prices rising to high levels while cheap wind and solar power are not yet available. Rich people, who tend to have a larger carbon footprint, would pay more, while poorer people can even benefit from such a regulation.
subsidies accelerating the transition to renewable energy
research, public transport, car sharing and other policies that promote carbon neutrality
subsidies for negative emissions: Depending on the technology, such as PyCCS or BECCS, the cost of generating negative emissions is about $150–165 per ton of CO2. The removal of past emissions – 1,700 Gt in total – can theoretically be addressed by auctioning allowances starting at a price that exceeds the removal cost.
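For a sense of scale, the figures in this item imply a very large total bill. A back-of-the-envelope calculation using only the numbers quoted above ($150–165 per tonne, 1,700 Gt of past emissions):

```python
# Rough arithmetic for removing past emissions (figures taken from the text).

PAST_EMISSIONS_GT = 1_700            # Gt CO2 to be removed in total
COST_PER_TONNE = (150, 165)          # $ per tonne CO2 (PyCCS / BECCS estimate)

for cost in COST_PER_TONNE:
    total_usd = cost * PAST_EMISSIONS_GT * 1e9   # Gt -> tonnes
    print(f"at ${cost}/t: about ${total_usd / 1e12:.0f} trillion in total")
```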
Price levels
About one third of the systems stay below $10/tCO2, and the majority are below $40. One exception is the steep rise in the EU-ETS, which reached $60 in September 2021. Sweden and Switzerland are the only countries with more than $100/tCO2.
Market price surge in fossil fuels
Unexpected spikes in the prices of natural gas and commodities such as oil and coal in 2021 caused a debate about whether a carbon price increase should be postponed to avoid additional social burden. On the other hand, a redistribution on a per-capita basis would actually relieve poorer households, which tend to consume less energy than wealthier parts of the population. The higher the carbon price, the greater the relief. Looking at individual situations, though, the compensation would not help commuters in rural areas or people living in houses with poor insulation. They often lack the liquidity to invest in solutions that use less fossil fuel and would depend on credits or subsidies. On the other hand, a carbon price still provides an incentive to use more efficient fossil fuel technologies, such as combined-cycle gas turbines (CCGT), instead of high-emission coal.
Scope and coverage
In the relevant countries with an ETS or carbon taxes, about 40% to 80% of emissions are covered. The schemes differ greatly in detail. They include or exclude fuels, transport, heating, agriculture, or greenhouse gases other than CO2 such as methane or fluorinated gases. In many EU member states, such as France and Germany, two systems coexist: the EU-ETS covers power generation and large industrial emissions, while national ETSs or taxes put a different price on petrol, natural gas and oil for private consumption.
Chart: carbon pricing schemes with more than $2 billion in revenue.
The final consumer price for fuels and electric energy depends on individual tax regulations and conditions in each country. Though carbon pricing plays an increasing role, energy taxes, VAT, utility expenses and other components are still the main reason why price levels differ so widely between countries.
Impact on retail prices
The table gives examples for a carbon price of $100 per tonne of CO2 (or 100 units of any other currency). Food calculations are based on CO2 equivalents, including the high impact of methane emissions.
FUEL | Impact
1 L petrol | $0.24
1 L diesel | $0.27

TRANSPORT | Impact | Remarks
500 km car travel, 1 passenger | $8.40 | 7 L petrol per 100 km
500 km jet aircraft, per seat | $6.70 | 0.134 kg CO2/km; domestic flight NZ, A320, 173 seats, all occupied, with radiative forcing multiplier
500 km small aircraft, per seat | $32.95 | 0.659 kg CO2/km; domestic flight NZ, fewer than 50 seats, all occupied
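The arithmetic behind these values is simply the emission factor multiplied by the carbon price. The short sketch below reproduces the table rows; the per-litre emission factors (2.4 kg CO2 for petrol, 2.7 kg for diesel) are inferred from the table values themselves and should be treated as illustrative assumptions:

```python
# Reproducing the table: impact = kg CO2 emitted x carbon price per kg.

CARBON_PRICE = 100.0   # $ per tonne CO2, i.e. $0.10 per kg

examples = {
    "1 L petrol (2.4 kg CO2/L)":                  2.4,
    "1 L diesel (2.7 kg CO2/L)":                  2.7,
    "500 km car, 7 L petrol per 100 km":          500 / 100 * 7 * 2.4,
    "500 km jet aircraft seat (0.134 kg/km)":     500 * 0.134,
    "500 km small aircraft seat (0.659 kg/km)":   500 * 0.659,
}

for label, kg_co2 in examples.items():
    print(f"{label}: ${kg_co2 * CARBON_PRICE / 1000:.2f}")
```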
Many economic properties of carbon pricing hold regardless of whether
carbon is priced with a cap or a tax. However, there are a few
important differences. Cap-based prices are more volatile and so they
are riskier for investors, consumers and for governments that auction
permits. Also, caps tend to short-out the effect of non-price policies
such as renewables subsidies, while carbon taxes do not.
Carbon leakage
Carbon leakage
is the effect that regulation of emissions in one country/sector has on
the emissions in other countries/sectors that are not subject to the
same regulation. There is no consensus over the magnitude of long-term carbon leakage.
The leakage rate is defined as the increase in CO2
emissions outside the countries taking domestic mitigation action,
divided by the reduction in emissions of countries taking domestic
mitigation action. Accordingly, a leakage rate greater than 100% means
that actions to reduce emissions within countries had the effect of
increasing emissions in other countries to a greater extent, i.e.,
domestic mitigation action had actually led to an increase in global
emissions.
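Written as a formula, the definition above is simply a ratio; the numbers in the comment are hypothetical and serve only to illustrate it:

```latex
% Leakage rate as defined above.
\[
  \text{leakage rate} \;=\;
  \frac{\text{increase in emissions outside the acting countries}}
       {\text{reduction in emissions inside the acting countries}}
  \times 100\%
\]
% Example (hypothetical): a domestic reduction of 50 Mt CO2 accompanied by a
% 10 Mt rise abroad gives a leakage rate of 10/50 = 20%; a rate above 100%
% would mean that global emissions rose on balance.
```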
Estimates of leakage rates for action under the Kyoto Protocol
ranged from 5% to 20% as a result of a loss in price competitiveness,
but these leakage rates were considered very uncertain.
For energy-intensive industries, the beneficial effects of Annex I
actions through technological development were considered possibly
substantial. However, this beneficial effect had not been reliably
quantified. On the empirical evidence they assessed, Barker et al. (2007) concluded that the competitive losses of then-current mitigation actions, e.g., the EU-ETS, were not significant.
Under the EU ETS rules, the Carbon Leakage Exposure Factor is used to determine the volume of emission permits allocated for free to industrial installations.
A general perception among developing countries is that discussion of climate change in trade negotiations could lead to green protectionism by high-income countries. Eco-tariffs on imports ("virtual carbon") consistent with a carbon price of $50 per ton of CO2 could be significant for developing countries. In 2010, the World Bank
commented that introducing border tariffs could lead to a proliferation
of trade measures where the competitive playing field is viewed as being
uneven. Tariffs could also be a burden on low-income countries that
have contributed very little to the problem of climate change.
Interactions with renewable energy policies
Cap-and-trade and carbon taxes interact differently with non-price policies such as renewable energy subsidies. The IPCC explains this as follows:
A carbon tax can have an additive environmental effect to policies such as subsidies for the supply of RE.
By contrast, if a cap-and-trade system has a binding cap (sufficiently
stringent to affect emission-related decisions), then other policies
such as RE subsidies have no further impact on reducing emissions within the time period that the cap applies [emphasis added].
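The distinction can be made concrete with a stylized comparison. The numbers below are hypothetical and are not taken from the IPCC; they only illustrate why a renewable energy subsidy changes total emissions under a tax but not under a binding cap:

```python
# Stylized comparison: adding a renewable-energy subsidy under a binding cap
# versus under a carbon tax (all figures hypothetical, in Mt CO2).

UNABATED = 100.0     # emissions with no policy at all
SUBSIDY_CUT = 10.0   # emissions displaced by the RE subsidy on its own

# Binding cap: total emissions equal the cap as long as unconstrained
# emissions exceed it; the subsidy merely frees allowances for other emitters.
CAP = 80.0
with_cap = min(CAP, UNABATED - SUBSIDY_CUT)           # still 80

# Carbon tax: suppose the tax alone cuts 20 Mt; the subsidy's cut adds on top.
TAX_CUT = 20.0
with_tax = UNABATED - TAX_CUT - SUBSIDY_CUT           # 70

print(f"cap + subsidy: {with_cap:.0f} Mt (no additional reduction)")
print(f"tax + subsidy: {with_tax:.0f} Mt (subsidy adds to the reduction)")
```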
Carbon pricing and economic growth
According to a 2020 study, carbon prices have not harmed economic growth in wealthy industrialized democracies.
In order for such a business model to become attractive, the subsidies would therefore have to exceed this value. Here, technology openness could be the best choice, as a reduction in costs due to technical progress can be expected. Already today, the cost of generating negative emissions is below the social cost of CO2 of $220 per ton, which means that a state-subsidized business model for creating negative emissions already makes economic sense. In sum, while a carbon price has the potential to reduce future emissions, a carbon subsidy has the potential to reduce past emissions.
In late 2013, William Nordhaus, president of the American Economic Association, published The Climate Casino,
which culminates in a description of an international "carbon price
regime". Such a regime would require national commitments to a carbon
price, but not to a specific policy. Carbon taxes, caps, and hybrid schemes could all be used to satisfy such a commitment. At the same time Martin Weitzman,
a leading climate economist at Harvard, published a theoretical study
arguing that such a regime would make it far easier to reach an
international agreement, while a focus on national targets would
continue to make it nearly impossible. Nordhaus also makes this argument, but less formally.
Similar views had previously been discussed by Joseph Stiglitz and had appeared in a number of earlier papers. The price-commitment view appears to have gained major support from independent positions taken by the World Bank and the International Monetary Fund (IMF).
The "Economists' Statement on Climate Change"
was signed by over 2500 economists including nine Nobel Laureates in
1997. This statement summarizes the economic case for carbon pricing as
follows:
The most efficient approach to slowing
climate change is through market-based policies. In order for the world
to achieve its climatic objectives at minimum cost, a cooperative
approach among nations is required – such as an international emissions
trading agreement. The United States and other nations can most
efficiently implement their climate policies through market mechanisms,
such as carbon taxes or the auction of emissions permits.
This
statement argues that carbon pricing is a "market mechanism" in
contrast to renewable subsidies or direct regulation of individual
sources of carbon emissions and hence is the way that the "United States
and other nations can most efficiently implement their climate
policies."
A new quantity commitment approach, suggested by Mutsuyoshi Nishimura, is for all countries to commit to the same global emission target.
The "assembly of governments" would issue permits in the amount of the
global target and all upstream fossil-fuel providers would be forced to
buy these permits.
The economics of carbon pricing is much the same for taxes and cap-and-trade. Both prices are efficient; they have the same social cost and the same effect on profits if permits are auctioned. However, some economists argue that caps prevent non-price policies, such as renewable energy subsidies, from reducing carbon emissions,
while carbon taxes do not. Others argue that an enforced cap is the
only way to guarantee that carbon emissions will actually be reduced; a
carbon tax will not prevent those who can afford to do so from
continuing to generate emissions.
Besides cap and trade, emission trading can refer to project-based programs, also referred to as credit or offset programs.
Such programs can sell credits for emission reductions provided by
approved projects. Generally there is an additionality
requirement that states that they must reduce emissions more than is
required by pre-existing regulation. An example of such a program is the
Clean Development Mechanism
under the Kyoto Protocol. These credits can be traded to other
facilities where they can be used for compliance with a cap-and-trade
program.
Unfortunately the concept of additionality is difficult to define and
monitor, with the result that some companies purposefully increased
emissions in order to get paid to eliminate them.
Cap-and-trade programs often allow "banking" of permits. This means that permits can be saved and can be used in the future. This allows an entity to over-comply in early periods in anticipation of higher carbon prices in subsequent years. This helps to stabilize the price of permits.
Free-market environmentalists therefore argue that the best way
to protect the environment is to clarify and protect property rights.
This allows parties to negotiate improvements in environmental quality.
It also allows them to use torts to stop environmental harm. If
affected parties can compel polluters to compensate them they will reduce or eliminate the externality.
Market proponents advocate changes to the legal system that empower
affected parties to obtain such compensation. They further claim that
governments have limited affected parties' ability to do so by
complicating the tort system to benefit producers over others.
Tenets
While environmental problems may be viewed as market failures, free market environmentalists argue that environmental problems arise because:
The state encodes, provides and enforces laws which override or
obscure property rights and thus fail to protect them adequately.
Given the technological and legal context in which people operate,
transaction costs are too high to allow parties to negotiate to a
solution better for the environment.
Laws governing class or individual tort claims provide polluters
with immunity from tort claims, or interfere with those claims in such a
way as to make it difficult to legally sustain them.
Though many environmentalists blame markets for many of today's
environmental problems, free-market environmentalists blame many of
these problems on distortions of the market and the lack of markets.
Government actions are blamed for a number of environmental detriments.
A misunderstanding of the tragedy of the commons,
which is seen as a fundamental problem for the environment. When land
is held in common, anybody may use it. Since resources are consumable,
this creates the incentive for entrepreneurs to use common resources
before somebody else does. Many environmental resources are held by the
government or in common, such as air, water, forests. A claimed problem
with regulation is that it puts property into a political commons, where
individuals try to appropriate public resources for their own gain, a
phenomenon called rent-seeking.
Tenure – Renters do not benefit from value accrued during their
tenure and thus face an incentive to extract as much value as possible
without conservation.
Political allocation – Political processes do not have the incentives that markets do (profit and loss) to seek out superior information.
Though many participants provide input to governments, they can only
make one decision. This means that governments create rules that are not
well crafted for local situations.
The government's strategy is one of anticipation, hiding from danger through regulation. A healthier society would use resilience, facing and overcoming risks.
Perverse subsidies – Governments offer cross subsidies that distort price systems.
This means that underconsumers and overconsumers are paying the same
rates, so the underconsumer is overpaying and the overconsumer is
underpaying. The incentive leads to more overconsumers and fewer
underconsumers.
Increased transaction costs – Governments may create rules that make
it difficult to transfer rights in ways that benefit the environment.
For example, in the western United States, many states have laws over
water rights that make it difficult for environmental groups to purchase
in-stream flows from farmers.
Market tools
Markets are not perfect, and free-market environmentalists assert that market-based solutions will have their mistakes. Through strong feedback mechanisms such as risk, profit and loss, however, market-driven actors have strong incentives to learn from mistakes.
Individual choice – Consumers have the incentive to maximize
their satisfaction and try to find low cost, high value options. Markets
allocate resources to the highest bidder. Producers make purchases on
behalf of the consumer. Due to many actors in the market, there is no
one-size-fits-all solution and entrepreneurs will seek to fulfill many
values of society, including conservation.
Entrepreneurship – Entrepreneurs seek value, problem-solve, and coordinate resources.
Price system – When resources become scarce, prices rise. Rising
prices incentivize entrepreneurs to find substitutions for these
resources. These resources are often conserved. E.g. as prices for coal
rise, consumers will use less and higher prices will drive substitution
for different energy sources.
Property rights
– Owners face a strong incentive to take care of and protect their
property. They must decide how much to use today and how much to use
tomorrow. Everybody is trying to grow value. Corporate value and share
price is based on their anticipated future profits. Owners with the
possibility of transferring their property, either to an heir or through
sale want their property to grow in value. Property rights encourage
conservation and defend resources against depletion, since there is a
strong incentive to maximize the value of the resource for the future.
Common law
– In order to have working property rights, you need a good system to
defend them. When rights are weak, people will violate them. By creating
a strong system, where common resources can be homesteaded, transferred, and defended against harm, resources can be protected, managed, and allocated in ways that aggregate and balance humanity's needs and wants.
The market is a non-political allocation device. Many environmentalist proposals call for removing resources from markets and turning them into political problems.
Issues
Coasian bargaining
Some economists
argue that, if industries internalized the costs of negative
externalities, they would face an incentive to reduce them, perhaps even
becoming enthusiastic about taking advantage of opportunities to
improve profitability through lower costs.
Moreover, economists claim this would lead to the optimal balance
between the marginal benefits of pursuing an activity and the marginal
cost of its environmental consequences. One well-known means of
internalizing a negative consequence is to establish a property right over some phenomenon formerly in the public domain.
The Coase theorem
is one extreme version of this logic. If property rights are well
defined and if there are no transaction costs, then market participants
can negotiate to a solution that internalizes the externality.
Moreover, this solution will not depend on who is allocated the property
right. For example, a paper mill and a resort might be on the same
lake. Suppose the benefits to the resort of a clean lake outweigh the
benefits to the mill of being able to pollute. If the mill has the
right to pollute, the resort will pay it not to. If the resort has the
right to a pollution-free lake, it will keep that right, as the mill
will be unable to compensate it for its pollution. However, critics
have charged that the "theorem" attributed to Coase is of extremely
limited practicability because of its assumptions, including no
transaction costs, and is ill-suited to real world externalities which
have high bargaining costs due to many factors.
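A toy numeric version of the mill-and-resort example may help; the dollar figures are invented, and the point is only that, with zero transaction costs, the efficient outcome (a clean lake) is reached under either assignment of the right:

```python
# Toy Coasean bargaining example (hypothetical values).

MILL_GAIN_FROM_POLLUTING = 100_000     # $ the mill earns by being allowed to pollute
RESORT_GAIN_FROM_CLEAN_LAKE = 150_000  # $ the resort earns from a clean lake

def outcome(right_holder: str) -> str:
    if right_holder == "mill":
        # The resort can profitably pay any amount between 100k and 150k.
        return "clean lake (resort pays the mill to stop polluting)"
    # The mill would have to offer more than 150k but gains only 100k.
    return "clean lake (mill cannot profitably buy the right to pollute)"

for holder in ("mill", "resort"):
    print(f"right held by the {holder}: {outcome(holder)}")
```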
More generally, free-market environmentalists argue that
transaction costs "count" as real costs. If the cost of re-allocating
property rights exceeds the benefits of doing so, then it is actually
optimal to stay in the status quo. This means the initial allocation of
property rights is not neutral and also that it has important
implications for efficiency. Nevertheless, given the existing property
rights regime, costly changes to it are not necessarily efficient, even
if in hindsight an alternative regime would have been better. But if there are opportunities for property rights to evolve, entrepreneurs can find them to create new wealth.
Geolibertarianism
Libertarian Georgists (or Geolibertarians) maintain a strong essential commitment to free markets but reject the Coasian solution in favor of land value taxation, wherein the economic rent of land is collected by the community and either equally distributed to adult residents in the form of universal basic income, called the citizen's dividend, or used to fund necessary functions of a minimal government. Under the LVT system, only landholders are taxed, and only on the basis of the market value of the earth
in its unimproved state, that is to say, apart from the value of any
structures or products of human labor. Geolibertarians regard the LVT as
just compensation for a legal land title granting exclusive access to that which logically precedes and generates private capital, whose supply is inelastic,
which properly belongs to all, and to which all have an equal right
because it is vital to human existence and economic activity—the ground
itself—and thus consider land value capture both morally imperative and a natural source of revenue.
Rothbardian anarcho-capitalists also reject the proposed Coasian solution as making invalid assumptions about the purely subjective notion of costs being measurable in monetary terms, and as making unexamined and invalid value judgments (i.e., ethical judgments). The Rothbardians' solution is to recognize individuals' Lockean property rights, an arrangement that, the Rothbardians maintain, Wertfreiheit (i.e., value-free) economic analysis demonstrates necessarily maximizes social utility (see Rothbard's Toward a Reconstruction of Utility and Welfare Economics).
Murray Rothbard himself believed the term "free-market environmentalism" to be oxymoronic. In his view the unimproved natural environment, undeveloped and unowned, can in no sense be considered property until it is transformed via Lockean homesteading. Unlike geolibertarians and many classical liberals, however, Rothbard emphatically rejected Locke's proviso as inconsistent with his theory of property acquisition. Speaking against environmentalism,
Rothbard said: "The problem is that environmentalists are not
interested in efficiency or preserving private property....The
environmentalists are acolytes and prisoners of a monstrous literally
anti-human philosophy. They despise and condemn the human race, which by
its very nature and in contrast to other creatures, changes and
transforms the environment instead of being passively subjected to
it....I have come to the conclusion that a 'free-market
environmentalist' is an oxymoron. Scratch one and you get...an
environmentalist."
Markets and ecosystems as spontaneous orders
Recent arguments in the academic literature have used Friedrich Hayek's concept of a spontaneous order to defend a broadly non-interventionist environmental policy. Hayek originally used the concept of a spontaneous order to argue against government intervention in the market. Like the market, ecosystems contain complex networks of information,
involve an ongoing dynamic process, contain orders within orders, and
the entire system operates without being directed by a conscious mind.
On this analysis, species takes the place of price as a visible element
of the system formed by a complex set of largely unknowable elements.
Human ignorance about the countless interactions between the organisms
of an ecosystem limits our ability to manipulate nature. Since humans
rely on the ecosystem to sustain themselves, it is argued that we have
an obligation to not disrupt such systems. This analysis of ecosystems
as spontaneous orders does not rely on markets qualifying as spontaneous
orders. As such, one need not endorse Hayek's analysis of markets to
endorse ecosystems as spontaneous orders.
Others
Proponents of free-market environmentalism use the example of the recent destruction of the once prosperous Grand Banks fishery off Newfoundland.
Once one of the world's most abundant fisheries, it has been almost
completely depleted of fish. Those primarily responsible were large
"factory-fishing" enterprises driven by the imperative to realize
profits in a competitive global market.
It is contended that if the fishery had been owned by a single entity,
the owner would have had an interest in keeping a renewable supply of
fish to maintain profits over the long term. The owner would thus have
charged high fees to fish in the area, sharply reducing how many fish
were caught. The owner also would have closely enforced rules on not
catching young fish. Instead commercial ships from around the world
raced to get the fish out of the water before competitors could,
including catching fish that had not yet reproduced.
Another example is that in the 19th century, early gold miners in California developed a trade in rights to draw from water courses, based on the doctrine of prior appropriation. This was curtailed in 1902 by the Newlands Reclamation Act, which introduced subsidies for irrigation projects. This had the effect of sending a signal to farmers
that water was inexpensive and abundant, leading to uneconomic use of a
scarce resource. Increasing difficulties in meeting demand for water in
the western United States
have been blamed on the continuing establishment of governmental
control and a return to tradable property rights has been proposed.
The Medea hypothesis is a term coined by paleontologist Peter Ward for a hypothesis that contests the Gaian hypothesis and proposes that multicellular life, understood as a superorganism, may be self-destructive or suicidal.
The metaphor refers to the mythological Medea (representing the Earth), who kills her own children (multicellular life).
In this view, microbially triggered mass extinctions return the Earth to the microbial-dominated state in which it has existed for most of its history.
Examples
Possible examples of extinction events induced entirely or partially by biotic activities include:
The Great Oxidation Event, 2.45 billion years ago, believed to be responsible for the mass poisoning of anaerobic microbes to which oxygen was toxic, and for the Huronian glaciation
that resulted from the reaction of methane with oxygen to form carbon
dioxide (a less potent greenhouse gas than methane) and subsequent
depletion of atmospheric carbon dioxide by aerobic photosynthesisers
The Sturtian and Marinoan Snowball Earth glaciations, 715 to 680 and 650 to 632.3 million years ago, respectively, resulting from the sequestration of atmospheric carbon dioxide during the Neoproterozoic Oxygenation Event
The Late Ordovician Mass Extinction (LOME), 445.2 million years ago to 443.8 million years ago,
suggested by some studies to have been caused by glaciation resulting
from carbon dioxide depletion driven by the radiation of land plants
Peter Ward proposes that the current man-made climate change
and mass extinction event may be considered to be the most recent
Medean event. As these events are anthropogenic, he postulates that
Medean events are not necessarily caused by microbes, but by intelligent
life as well and that the final mass extinction of complex life,
roughly about 500–900 million years in the future, can also be
considered a Medean event: "Plant life that still exists then will be
forced to adapt to a warming and expanding Sun, causing them to remove
even more carbon dioxide from the atmosphere (which in turn will have
already been lowered due to the increasing heat from the Sun gradually
speeding up the weathering process that removes these molecules from the
atmosphere), and ultimately accelerating the complete extinction of
complex life by making carbon dioxide levels drop down to just 10 ppm,
below which plants can no longer survive." However, Ward simultaneously
argues that intelligent life such as humans may not necessarily just
trigger future Medean events, but may eventually prevent them from
occurring.
The Gaia hypothesis (/ˈɡaɪ.ə/), also known as the Gaia theory, Gaia paradigm, or the Gaia principle, proposes that living organisms interact with their inorganic surroundings on Earth to form a synergistic and self-regulating, complex system that helps to maintain and perpetuate the conditions for life on the planet.
The Gaia hypothesis was formulated by the chemist James Lovelock and co-developed by the microbiologist Lynn Margulis in the 1970s. Following the suggestion by his neighbour, novelist William Golding, Lovelock named the hypothesis after Gaia, the primordial deity who personified the Earth in Greek mythology. In 2006, the Geological Society of London awarded Lovelock the Wollaston Medal in part for his work on the Gaia hypothesis.
The Gaia hypothesis was initially criticized for being teleological and against the principles of natural selection, but later refinements aligned the Gaia hypothesis with ideas from fields such as Earth system science, biogeochemistry and systems ecology.
Even so, the Gaia hypothesis continues to attract criticism, and today
many scientists consider it to be only weakly supported by, or at odds
with, the available evidence.
Overview
Gaian hypotheses suggest that organisms co-evolve with their environment: that is, they "influence their abiotic environment, and that environment in turn influences the biota by Darwinian process". Lovelock (1995) gave evidence of this in his second book, Ages of Gaia, showing the evolution from the world of the early thermoacidophilic and methanogenic bacteria towards the oxygen-enriched atmosphere today that supports more complex life.
A reduced version of the hypothesis has been called "influential Gaia" in the 2002 paper "Directed Evolution of the Biosphere: Biogeochemical Selection or Gaia?" by Andrei G. Lapenis, which states that the biota influence certain aspects of the abiotic world, e.g. temperature and atmosphere. This was not the work of an individual but of a collective of Russian scientists whose research was combined into this peer-reviewed publication. It describes the coevolution of life and the environment through "micro-forces" and biogeochemical processes. An example is how the activity of photosynthetic bacteria during Precambrian times completely modified the Earth's atmosphere to turn it aerobic, thus supporting the evolution of life (in particular eukaryotic life).
Since barriers existed throughout the twentieth century between
Russia and the rest of the world, it is only relatively recently that
the early Russian scientists who introduced concepts overlapping the
Gaia paradigm have become better known to the Western scientific
community. These scientists include Piotr Alekseevich Kropotkin (1842–1921) (although he spent much of his professional life outside Russia), Rafail Vasil’evich Rizpolozhensky (1862 – c. 1922), Vladimir Ivanovich Vernadsky (1863–1945), and Vladimir Alexandrovich Kostitzin (1886–1963).
Biologists and Earth scientists usually view the factors that stabilize the characteristics of a period as an undirected emergent property or entelechy
of the system; as each individual species pursues its own
self-interest, for example, their combined actions may have
counterbalancing effects on environmental change. Opponents of this view
sometimes reference examples of events that resulted in dramatic change
rather than stable equilibrium, such as the conversion of the Earth's
atmosphere from a reducing environment to an oxygen-rich one at the end of the Archaean and the beginning of the Proterozoic periods.
Less accepted versions of the hypothesis claim that changes in the biosphere are brought about through the coordination of living organisms and maintain those conditions through homeostasis. In some versions of Gaia philosophy, all lifeforms are considered part of one single living planetary being called Gaia.
In this view, the atmosphere, the seas and the terrestrial crust would
be results of interventions carried out by Gaia through the coevolving diversity of living organisms.
The Gaia paradigm was an influence on the deep ecology movement.
Details
The Gaia hypothesis posits that the Earth is a self-regulating complex system involving the biosphere, the atmosphere, the hydrosphere and the pedosphere,
tightly coupled as an evolving system. The hypothesis contends that
this system as a whole, called Gaia, seeks a physical and chemical
environment optimal for contemporary life.
Gaia evolves through a cybernetic feedback system operated by the biota,
leading to broad stabilization of the conditions of habitability in a
full homeostasis. Many processes in the Earth's surface, essential for
the conditions of life, depend on the interaction of living forms,
especially microorganisms, with inorganic elements. These processes establish a global control system that regulates Earth's surface temperature, atmospheric composition and ocean salinity, powered by the global thermodynamic disequilibrium state of the Earth system.
The existence of a planetary homeostasis influenced by living forms had been observed previously in the field of biogeochemistry, and it is being investigated also in other fields like Earth system science.
The originality of the Gaia hypothesis relies on the assessment that
such homeostatic balance is actively pursued with the goal of keeping
the optimal conditions for life, even when terrestrial or external
events menace them.
Since life started on Earth, the energy provided by the Sun has increased by 25% to 30%;
however, the surface temperature of the planet has remained within the
levels of habitability, reaching quite regular low and high margins.
Lovelock has also hypothesised that methanogens produced elevated levels
of methane in the early atmosphere, giving a situation similar to that
found in petrochemical smog, similar in some respects to the atmosphere
on Titan.
This, he suggests, helped to screen out ultraviolet light until the
formation of the ozone layer, maintaining a degree of homeostasis.
However, the Snowball Earth research has suggested that "oxygen shocks" and reduced methane levels led, during the Huronian, Sturtian and Marinoan/Varanger Ice Ages, to a world that very nearly became a solid "snowball". These epochs are evidence against the ability of the pre-Phanerozoic biosphere to fully self-regulate.
Processing of the greenhouse gas CO2, explained below, plays a critical role in the maintenance of the Earth temperature within the limits of habitability.
Currently the increase in human population and the environmental impact of their activities, such as the multiplication of greenhouse gases, may cause negative feedbacks in the environment to become positive feedbacks. Lovelock has stated that this could bring an extremely accelerated global warming, but he has since stated that the effects will likely occur more slowly.
In response to the criticism that the Gaia hypothesis seemingly required unrealistic group selection and cooperation between organisms, James Lovelock and Andrew Watson developed a mathematical model, Daisyworld, in which ecological competition underpinned planetary temperature regulation.
Daisyworld examines the energy budget of a planet populated by two different types of plants, black daisies and white daisies, which are assumed to occupy a significant portion of the surface. The colour of the daisies influences the albedo
of the planet such that black daisies absorb more light and warm the
planet, while white daisies reflect more light and cool the planet. The
black daisies are assumed to grow and reproduce best at a lower
temperature, while the white daisies are assumed to thrive best at a
higher temperature. As the temperature rises closer to the value the
white daisies like, the white daisies outreproduce the black daisies,
leading to a larger percentage of white surface, and more sunlight is
reflected, reducing the heat input and eventually cooling the planet.
Conversely, as the temperature falls, the black daisies outreproduce the
white daisies, absorbing more sunlight and warming the planet. The
temperature will thus converge to the value at which the reproductive
rates of the plants are equal.
Lovelock and Watson showed that, over a limited range of conditions, this negative feedback
due to competition can stabilize the planet's temperature at a value
which supports life, if the energy output of the Sun changes, while a
planet without life would show wide temperature changes. The percentage
of white and black daisies will continually change to keep the
temperature at the value at which the plants' reproductive rates are
equal, allowing both life forms to thrive.
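This mechanism lends itself to a compact simulation. The sketch below is an illustrative re-implementation of the Daisyworld idea using commonly cited parameter values (growth optimum near 22.5 °C, death rate 0.3, albedos of 0.25, 0.5 and 0.75 for black daisies, bare ground and white daisies); it is not Lovelock and Watson's original code, and the exact constants should be read as assumptions:

```python
# Illustrative Daisyworld sketch: daisy cover adjusts to solar luminosity and,
# over a range of luminosities, holds the mean temperature near the growth optimum.

SIGMA = 5.67e-8                   # Stefan-Boltzmann constant, W m^-2 K^-4
FLUX = 917.0                      # solar flux constant, W m^-2
A_GROUND, A_WHITE, A_BLACK = 0.5, 0.75, 0.25
OPT_T, WIDTH = 295.5, 0.003265    # growth optimum (K); growth is zero near 5 and 40 deg C
DEATH = 0.3                       # daisy death rate
INSULATION = 20.0                 # K of local temperature offset per unit albedo gap

def growth(temp_k):
    """Parabolic growth rate, zero outside roughly 5-40 deg C."""
    return max(0.0, 1.0 - WIDTH * (OPT_T - temp_k) ** 2)

def steady_temperature(luminosity, steps=3000, dt=0.1):
    """Integrate daisy cover to (near) steady state; return mean temp in deg C."""
    a_white = a_black = 0.01
    for _ in range(steps):
        bare = max(0.0, 1.0 - a_white - a_black)
        albedo = bare * A_GROUND + a_white * A_WHITE + a_black * A_BLACK
        t_planet = (FLUX * luminosity * (1.0 - albedo) / SIGMA) ** 0.25
        t_white = INSULATION * (albedo - A_WHITE) + t_planet   # cooler than the mean
        t_black = INSULATION * (albedo - A_BLACK) + t_planet   # warmer than the mean
        a_white += dt * a_white * (bare * growth(t_white) - DEATH)
        a_black += dt * a_black * (bare * growth(t_black) - DEATH)
        a_white, a_black = max(a_white, 0.001), max(a_black, 0.001)  # keep a seed alive
    return t_planet - 273.15

for lum in (0.7, 0.8, 1.0, 1.2, 1.4):
    print(f"luminosity {lum:.1f}: mean temperature {steady_temperature(lum):5.1f} C")
```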
It has been suggested that the results were predictable because
Lovelock and Watson selected examples that produced the responses they
desired.
Regulation of oceanic salinity
Ocean salinity has been constant at about 3.5% for a very long time.
Salinity stability in oceanic environments is important as most cells
require a rather constant salinity and do not generally tolerate values
above 5%. The constant ocean salinity was a long-standing mystery,
because no process counterbalancing the salt influx from rivers was
known. Recently it was suggested that salinity may also be strongly influenced by seawater circulation through hot basaltic rocks, and emerging as hot water vents on mid-ocean ridges.
However, the composition of seawater is far from equilibrium, and it is
difficult to explain this fact without the influence of organic
processes. One suggested explanation lies in the formation of salt
plains throughout Earth's history. It is hypothesized that these are
created by bacterial colonies that fix ions and heavy metals during
their life processes.
In the biogeochemical processes of Earth, sources and sinks describe the movement of elements. The salt ions within our oceans and seas are sodium (Na+), chloride (Cl−), sulfate (SO4^2−), magnesium (Mg2+), calcium (Ca2+) and potassium (K+). The elements that comprise salinity do not readily change and are a conservative property of seawater. There are many mechanisms that change salinity from a particulate form to a dissolved form and back. The known sources of sodium, i.e. salts, are the weathering, erosion and dissolution of rocks, which are then transported into rivers and deposited into the oceans.
The Gaia theory states that the Earth's atmospheric composition is kept at a dynamically steady state by the presence of life.
The atmospheric composition provides the conditions that contemporary
life has adapted to. All the atmospheric gases other than noble gases present in the atmosphere are either made by organisms or processed by them.
The stability of the atmosphere on Earth is not a consequence of chemical equilibrium. Oxygen
is a reactive compound, and should eventually combine with gases and
minerals of the Earth's atmosphere and crust. Oxygen only began to
persist in the atmosphere in small quantities about 50 million years
before the start of the Great Oxygenation Event. Since the start of the Cambrian period, atmospheric oxygen concentrations have fluctuated between 15% and 35% of atmospheric volume. Traces of methane (at an amount of 100,000 tonnes produced per year) should not exist, as methane is combustible in an oxygen atmosphere.
Dry air in the atmosphere of Earth contains roughly (by volume) 78.09% nitrogen, 20.95% oxygen, 0.93% argon, 0.039% carbon dioxide, and small amounts of other gases including methane.
Lovelock originally speculated that concentrations of oxygen above
about 25% would increase the frequency of wildfires and conflagration of
forests. This mechanism, however, would not raise oxygen levels if they
became too low. If plants can be shown to robustly over-produce O2, then perhaps only the high-oxygen forest-fire regulator is necessary.
Recent work on the findings of fire-caused charcoal in Carboniferous and
Cretaceous coal measures, in geologic periods when O2 did exceed 25%, has supported Lovelock's contention.
Gaia scientists see the participation of living organisms in the carbon cycle as one of the complex processes that maintain conditions suitable for life. The only significant natural source of atmospheric carbon dioxide (CO2) is volcanic activity, while the only significant removal is through the precipitation of carbonate rocks. Carbon precipitation, solution and fixation are influenced by the bacteria
and plant roots in soils, where they improve gaseous circulation, or in
coral reefs, where calcium carbonate is deposited as a solid on the sea
floor. Calcium carbonate is used by living organisms to manufacture carbonaceous tests and shells. Once the organisms die, their shells sink. Some arrive at the bottom of shallow seas where the heat and pressure of burial, and/or the forces of plate tectonics, eventually convert them to deposits of chalk and limestone. Much of the falling shell material, however, redissolves in the ocean below the carbonate compensation depth.
One of these organisms is Emiliania huxleyi, an abundant coccolithophore alga which may have a role in the formation of clouds. Excess CO2 is compensated by an increase in coccolithophorid life, increasing the amount of CO2
locked in the ocean floor. Coccolithophorids, if the CLAW Hypothesis
turns out to be supported (see "Regulation of Global Surface
Temperature" above), could help increase the cloud cover, hence control
the surface temperature, help cool the whole planet and favor
precipitation necessary for terrestrial plants. Lately the atmospheric CO2 concentration has increased and there is some evidence that concentrations of ocean algal blooms are also increasing.
Lichen and other organisms accelerate the weathering of rocks at the surface, while the decomposition of rocks also happens faster in the soil, thanks to the activity of roots, fungi, bacteria and subterranean animals. The flow of carbon dioxide from the atmosphere to the soil is therefore regulated with the help of living organisms. When CO2 levels rise in the atmosphere the temperature increases and plants grow. This growth brings higher consumption of CO2 by the plants, which process it into the soil, removing it from the atmosphere.
History
Precedents
The idea of the Earth as an integrated whole, a living being, has a long tradition. The mythical Gaia was the primal Greek goddess personifying the Earth, the Greek version of "Mother Nature" (from Ge = Earth, and Aia = PIE grandmother), or the Earth Mother. James Lovelock gave this name to his hypothesis after a suggestion from the novelist William Golding, who was living in the same village as Lovelock at the time (Bowerchalke, Wiltshire,
UK). Golding's advice was based on Gea, an alternative spelling for the
name of the Greek goddess, which is used as prefix in geology,
geophysics and geochemistry. Golding later made reference to Gaia in his Nobel prize acceptance speech.
In the eighteenth century, as geology consolidated as a modern science, James Hutton maintained that geological and biological processes are interlinked. Later, the naturalist and explorer Alexander von Humboldt recognized the coevolution of living organisms, climate, and Earth's crust. In the twentieth century, Vladimir Vernadsky formulated a theory of Earth's development that is now one of the foundations of ecology. Vernadsky was a Ukrainian geochemist
and was one of the first scientists to recognize that the oxygen,
nitrogen, and carbon dioxide in the Earth's atmosphere result from
biological processes. During the 1920s he published works arguing that
living organisms could reshape the planet as surely as any physical
force. Vernadsky was a pioneer of the scientific bases for the
environmental sciences.
His visionary pronouncements were not widely accepted in the West, and
some decades later the Gaia hypothesis received the same type of initial
resistance from the scientific community.
Also around the turn of the 20th century, Aldo Leopold, a pioneer in the development of modern environmental ethics and in the movement for wilderness conservation, suggested a living Earth in his biocentric or holistic ethics regarding land.
It is at least not impossible to regard the earth's parts—soil, mountains, rivers, atmosphere, etc.—as organs or parts of organs of a coordinated whole, each part with its definite function. And if we could see this whole, as a whole, through a great period of time, we might perceive not only organs with coordinated functions, but possibly also that process of consumption and replacement which in biology we call metabolism, or growth. In such case we would have all the visible attributes of a living thing, which we do not realize to be such because it is too big, and its life processes too slow.
— Stephan Harding, Animate Earth
Another influence for the Gaia hypothesis and the environmental movement in general came as a side effect of the Space Race
between the Soviet Union and the United States of America. During the
1960s, the first humans in space could see how the Earth looked as a
whole. The photograph Earthrise, taken by astronaut William Anders in 1968 during the Apollo 8 mission, became, through the Overview Effect, an early symbol for the global ecology movement.
Formulation of the hypothesis
Lovelock started defining the idea of a self-regulating Earth
controlled by the community of living organisms in September 1965, while
working at the Jet Propulsion Laboratory in California on methods of detecting life on Mars. The first paper to mention it was Planetary Atmospheres: Compositional and other Changes Associated with the Presence of Life, co-authored with C.E. Giffin.
A main concept was that life could be detected on a planetary scale by the chemical composition of the atmosphere. According to the data gathered by the Pic du Midi observatory, planets like Mars or Venus had atmospheres in chemical equilibrium. This difference from the Earth's atmosphere was considered to be proof that there was no life on these planets.
Lovelock formulated the Gaia Hypothesis in journal articles in 1972 and 1974, followed by a popularizing 1979 book, Gaia: A New Look at Life on Earth. An article in the New Scientist of February 6, 1975, and a popular book-length version of the hypothesis, published in 1979 as The Quest for Gaia, began to attract scientific and critical attention.
Lovelock called it first the Earth feedback hypothesis, and it was a way to explain the fact that combinations of chemicals including oxygen and methane
persist in stable concentrations in the atmosphere of the Earth.
Lovelock suggested detecting such combinations in other planets'
atmospheres as a relatively reliable and cheap way to detect life.
Later, other relationships such as sea creatures producing sulfur and
iodine in approximately the same quantities as required by land
creatures emerged and helped bolster the hypothesis.
In 1971 microbiologist Dr. Lynn Margulis
joined Lovelock in the effort of fleshing out the initial hypothesis
into scientifically proven concepts, contributing her knowledge about
how microbes affect the atmosphere and the different layers in the
surface of the planet.
The American biologist had also drawn criticism from the scientific community with her advocacy of the theory on the origin of eukaryotic organelles and her contributions to the endosymbiotic theory, which is nowadays accepted. Margulis dedicated the last of eight chapters in her book, The Symbiotic Planet,
to Gaia. However, she objected to the widespread personification of
Gaia and stressed that Gaia is "not an organism", but "an emergent
property of interaction among organisms". She defined Gaia as "the
series of interacting ecosystems that compose a single huge ecosystem at
the Earth's surface. Period". The book's most memorable "slogan" was
actually quipped by a student of Margulis'.
James Lovelock called his first proposal the Gaia hypothesis but has also used the term Gaia theory.
Lovelock states that the initial formulation was based on observation,
but still lacked a scientific explanation. The Gaia hypothesis has since
been supported by a number of scientific experiments[45] and provided a number of useful predictions.
During the "philosophical foundations" session of the conference, David Abram
spoke on the influence of metaphor in science, and of the Gaia
hypothesis as offering a new and potentially game-changing metaphorics,
while James Kirchner
criticised the Gaia hypothesis for its imprecision. Kirchner claimed
that Lovelock and Margulis had not presented one Gaia hypothesis, but
four:
Coevolutionary Gaia: that life and the environment had evolved in a coupled way. Kirchner claimed that this was already accepted scientifically and was not new.
Homeostatic Gaia: that life maintained the stability of the natural environment, and that this stability enabled life to continue to exist.
Geophysical
Gaia: that the Gaia hypothesis generated interest in geophysical cycles
and therefore led to interesting new research in terrestrial
geophysical dynamics.
Optimising Gaia: that Gaia shaped the planet in a way that made it
an optimal environment for life as a whole. Kirchner claimed that this
was not testable and therefore was not scientific.
Of Homeostatic Gaia, Kirchner recognised two alternatives. "Weak Gaia" asserted that life tends to make the environment stable for the flourishing of all life. "Strong Gaia", according to Kirchner, asserted that life tends to make the environment stable, to enable the flourishing of all life. Strong Gaia, Kirchner claimed, was untestable and therefore not scientific.
Lovelock and other Gaia-supporting scientists, however, did
attempt to disprove the claim that the hypothesis is not scientific
because it is impossible to test it by controlled experiment. For
example, against the charge that Gaia was teleological, Lovelock and
Andrew Watson offered the Daisyworld Model (and its modifications, above) as evidence against most of these criticisms.
Lovelock said that the Daisyworld model "demonstrates that
self-regulation of the global environment can emerge from competition
amongst types of life altering their local environment in different
ways".
Lovelock was careful to present a version of the Gaia hypothesis
that had no claim that Gaia intentionally or consciously maintained the
complex balance in her environment that life needed to survive. It would
appear that the claim that Gaia acts "intentionally" was a statement in
his popular initial book and was not meant to be taken literally. This
new statement of the Gaia hypothesis was more acceptable to the
scientific community. Most accusations of teleologism ceased, following this conference.
Third Gaia conference
By the time of the 2nd Chapman Conference on the Gaia Hypothesis, held at Valencia, Spain, on 23 June 2000,
the situation had changed significantly. Rather than a discussion of
the Gaian teleological views, or "types" of Gaia hypotheses, the focus
was upon the specific mechanisms by which basic short term homeostasis
was maintained within a framework of significant evolutionary long term
structural change.
The major questions were:
"How has the global biogeochemical/climate system called Gaia
changed in time? What is its history? Can Gaia maintain stability of the
system at one time scale but still undergo vectorial change at longer
time scales? How can the geologic record be used to examine these
questions?"
"What is the structure of Gaia? Are the feedbacks sufficiently
strong to influence the evolution of climate? Are there parts of the
system determined pragmatically by whatever disciplinary study is being
undertaken at any given time or are there a set of parts that should be
taken as most true for understanding Gaia as containing evolving
organisms over time? What are the feedbacks among these different parts
of the Gaian system, and what does the near closure of matter mean for
the structure of Gaia as a global ecosystem and for the productivity of
life?"
"How do models of Gaian processes and phenomena relate to reality
and how do they help address and understand Gaia? How do results from
Daisyworld transfer to the real world? What are the main candidates for
"daisies"? Does it matter for Gaia theory whether we find daisies or
not? How should we be searching for daisies, and should we intensify the
search? How can Gaian mechanisms be investigated using process models or global models of the climate system that include the biota and allow for chemical cycling?"
In 1997, Tyler Volk
argued that a Gaian system is almost inevitably produced as a result of
an evolution towards far-from-equilibrium homeostatic states that
maximise entropy production,
and Axel Kleidon (2004) agreed stating: "...homeostatic behavior can
emerge from a state of MEP associated with the planetary albedo";
"...the resulting behavior of a symbiotic Earth at a state of MEP may
well lead to near-homeostatic behavior of the Earth system on long time
scales, as stated by the Gaia hypothesis". M. Staley (2002) has
similarly proposed "...an alternative form of Gaia theory based on more
traditional Darwinian principles... In [this] new approach,
environmental regulation is a consequence of population dynamics. The
role of selection is to favor organisms that are best adapted to
prevailing environmental conditions. However, the environment is not a
static backdrop for evolution, but is heavily influenced by the presence
of living organisms. The resulting co-evolving dynamical process
eventually leads to the convergence of equilibrium and optimal
conditions".
Fourth Gaia conference
A
fourth international conference on the Gaia hypothesis, sponsored by
the Northern Virginia Regional Park Authority and others, was held in
October 2006 at the Arlington, VA campus of George Mason University.
Martin Ogle, Chief Naturalist for the NVRPA and a long-time Gaia hypothesis proponent, organized the event. Lynn Margulis, Distinguished University Professor in the Department of Geosciences, University of Massachusetts-Amherst, and long-time advocate of the Gaia hypothesis, was a keynote speaker. Among the many other speakers were Tyler Volk, co-director of the Program in Earth and Environmental Science at New York University; Dr. Donald Aitken, Principal of Donald Aitken Associates; Dr. Thomas Lovejoy, President of the Heinz Center for Science, Economics and the Environment; Robert Corell, Senior Fellow, Atmospheric Policy Program, American Meteorological Society; and the noted environmental ethicist J. Baird Callicott.
Criticism
After
initially receiving little attention from scientists (from 1969 until
1977), thereafter for a period the initial Gaia hypothesis was
criticized by a number of scientists, including Ford Doolittle, Richard Dawkins and Stephen Jay Gould. Lovelock has said that because his hypothesis is named after a Greek goddess, and championed by many non-scientists, the Gaia hypothesis was interpreted as a neo-Pagan religion. Many scientists also criticized the approach taken in his popular book Gaia, a New Look at Life on Earth for being teleological—a
belief that things are purposeful and aimed towards a goal. Responding
to this critique in 1990, Lovelock stated, "Nowhere in our writings do
we express the idea that planetary self-regulation is purposeful, or
involves foresight or planning by the biota".
Stephen Jay Gould criticized Gaia as being "a metaphor, not a mechanism."
He wanted to know the actual mechanisms by which self-regulating
homeostasis was achieved. In his defense of Gaia, David Abram argues
that Gould overlooked the fact that "mechanism", itself, is a
metaphor—albeit an exceedingly common and often unrecognized
metaphor—one which leads us to consider natural and living systems as
though they were machines organized and built from outside (rather than
as autopoietic
or self-organizing phenomena). Mechanical metaphors, according to
Abram, lead us to overlook the active or agentic quality of living
entities, while the organismic metaphors of the Gaia hypothesis
accentuate the active agency of both the biota and the biosphere as a
whole.
With regard to causality in Gaia, Lovelock argues that no single
mechanism is responsible, that the connections between the various known
mechanisms may never be known, that this is accepted in other fields of
biology and ecology as a matter of course, and that specific hostility
is reserved for his own hypothesis for other reasons.
Aside from clarifying his language and understanding of what is
meant by a life form, Lovelock himself ascribes most of the criticism to
a lack of understanding of non-linear mathematics by his critics, and a
linearizing form of greedy reductionism
in which all events have to be immediately ascribed to specific causes
before the fact. He also states that most of his critics are biologists
but that his hypothesis includes experiments in fields outside biology,
and that some self-regulating phenomena may not be mathematically
explainable.
Natural selection and evolution
Lovelock has suggested that global biological feedback mechanisms could evolve by natural selection,
stating that organisms that improve their environment for their
survival do better than those that damage their environment. However, in
the early 1980s, W. Ford Doolittle and Richard Dawkins separately argued against this aspect of Gaia. Doolittle argued that nothing in the genome
of individual organisms could provide the feedback mechanisms proposed
by Lovelock, and therefore the Gaia hypothesis proposed no plausible
mechanism and was unscientific.
Dawkins meanwhile stated that for organisms to act in concert would
require foresight and planning, which is contrary to the current
scientific understanding of evolution. Like Doolittle, he also rejected the possibility that feedback loops could stabilize the system.
Margulis argued in 1999 that "Darwin's grand vision was not wrong, only incomplete.
In accentuating the direct competition between individuals for
resources as the primary selection mechanism, Darwin (and especially his
followers) created the impression that the environment was simply a
static arena". She wrote that the composition of the Earth's atmosphere,
hydrosphere, and lithosphere are regulated around "set points" as in homeostasis, but those set points change with time.
Evolutionary biologist W. D. Hamilton called the concept of Gaia Copernican, adding that it would take another Newton to explain how Gaian self-regulation takes place through Darwinian natural selection. More recently, Ford Doolittle, building on his and Inkpen's ITSNTS (It's The Song Not The Singer) proposal, suggested that differential persistence can play a role similar to that of differential reproduction in evolution by natural selection, thereby providing a possible reconciliation between the theory of natural selection and the Gaia hypothesis.
Criticism in the 21st century
The
Gaia hypothesis continues to be broadly skeptically received by the
scientific community. For instance, arguments both for and against it
were laid out in the journal Climatic Change in 2002 and 2003. A significant argument against it is that there are many examples where life has had a detrimental or destabilising effect on the environment rather than acting to regulate it.
Several recent books have criticised the Gaia hypothesis, expressing
views ranging from "... the Gaia hypothesis lacks unambiguous
observational support and has significant theoretical difficulties" to "Suspended uncomfortably between tainted metaphor, fact, and false science, I prefer to leave Gaia firmly in the background" to "The Gaia hypothesis is supported neither by evolutionary theory nor by the empirical evidence of the geological record". The CLAW hypothesis,
initially suggested as a potential example of direct Gaian feedback,
has subsequently been found to be less credible as understanding of cloud condensation nuclei has improved. In 2009 the Medea hypothesis
was proposed: that life has highly detrimental (biocidal) impacts on
planetary conditions, in direct opposition to the Gaia hypothesis.
In a 2013 book-length evaluation of the Gaia hypothesis
considering modern evidence from across the various relevant
disciplines, Toby Tyrrell concluded that: "I believe Gaia is a dead end.
Its study has, however, generated many new and thought provoking
questions. While rejecting Gaia, we can at the same time appreciate
Lovelock's originality and breadth of vision, and recognize that his
audacious concept has helped to stimulate many new ideas about the
Earth, and to champion a holistic approach to studying it". Elsewhere he presents his conclusion "The Gaia hypothesis is not an accurate picture of how our world works".
This statement needs to be understood as referring to the "strong" and
"moderate" forms of Gaia—that the biota obeys a principle that works to
make Earth optimal (strength 5) or favourable for life (strength 4) or
that it works as a homeostatic mechanism (strength 3). The latter is the
"weakest" form of Gaia that Lovelock has advocated. Tyrrell rejects it.
However, he finds that the two weaker forms of Gaia—Coevolutionary
Gaia and Influential Gaia, which assert that there are close links
between the evolution of life and the environment and that biology
affects the physical and chemical environment—are both credible, but
that it is not useful to use the term "Gaia" in this sense and that
those two forms were already accepted and explained by the processes of
natural selection and adaptation.
Anthropic principle
As
emphasized by multiple critics, no plausible mechanism exists that
would drive the evolution of negative feedback loops leading to
planetary self-regulation of the climate. Indeed, multiple incidents in Earth's history (see the Medea hypothesis)
have shown that the Earth and the biosphere can enter self-destructive
positive feedback loops that lead to mass extinction events.
For example, the Snowball Earth glaciations appear to have resulted from the development of photosynthesis during a period when the Sun was cooler than it is now. The removal of carbon dioxide from the atmosphere, along with the oxidation of atmospheric methane by the released oxygen, dramatically weakened the greenhouse effect.
The resulting expansion of the polar ice sheets decreased the overall
fraction of sunlight absorbed by the Earth, resulting in a runaway ice–albedo positive feedback loop ultimately resulting in glaciation over nearly the entire surface of the Earth.
The Earth's escape from the frozen condition appears to have been due primarily to the release of carbon dioxide and methane by volcanoes, although the release of methane by microbes trapped underneath the ice could also have played a part.
Lesser contributions to warming came from the fact that coverage of the Earth by ice sheets largely inhibited photosynthesis and lessened the removal of carbon dioxide from the atmosphere by the weathering of siliceous rocks. However, in the absence of tectonic activity, the snowball condition could have persisted indefinitely.
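The runaway ice-albedo feedback described above can be illustrated with a toy zero-dimensional energy-balance model. The sketch below is purely illustrative: the albedo transition, greenhouse parameters and faint-Sun factor are hypothetical round numbers chosen to show how drawing down greenhouse gases can tip a temperate planet into a frozen, high-albedo state; it is not a reconstruction of any actual Snowball Earth episode.

```python
# Zero-dimensional energy-balance sketch of the ice-albedo feedback described above.
# All parameter values are illustrative, textbook-style assumptions.
import numpy as np

S0 = 1361.0                        # present-day solar constant, W m^-2
ALPHA_WARM, ALPHA_ICE = 0.30, 0.62 # ice-free vs ice-covered planetary albedo
B_OLR = 2.1                        # outgoing-longwave sensitivity, W m^-2 K^-1

def albedo(T):
    """Smooth transition from ice-free to ice-covered albedo around 273 K."""
    return ALPHA_ICE + (ALPHA_WARM - ALPHA_ICE) / (1.0 + np.exp(-(T - 273.0) / 4.0))

def equilibrium(T0, a_olr, solar_frac=1.0, dt=0.5, steps=5000):
    """Relax the global-mean temperature (K) to a steady state.

    a_olr is the outgoing longwave radiation at 0 C: a larger value stands in
    for a weaker greenhouse effect (less CO2 and methane)."""
    T = T0
    for _ in range(steps):
        absorbed = solar_frac * S0 / 4.0 * (1.0 - albedo(T))
        emitted = a_olr + B_OLR * (T - 273.15)
        T += dt * (absorbed - emitted) / 10.0   # heat capacity folded into the step size
    return T

# A warm start with a strong greenhouse settles in a temperate state...
print(f"strong greenhouse: {equilibrium(290.0, a_olr=200.0) - 273.15:6.1f} C")
# ...but drawing down greenhouse gases under a slightly fainter Sun lets ice spread,
# raising the albedo and tipping the same warm start into a frozen state.
print(f"weak greenhouse, fainter Sun: "
      f"{equilibrium(290.0, a_olr=225.0, solar_frac=0.94) - 273.15:6.1f} C")
```

The first case settles near 18 C, while the second cools past the ice threshold and runs away to a deeply frozen equilibrium, which is the positive-feedback behaviour the Snowball Earth example is meant to highlight.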
Geologic events with amplifying positive feedbacks (along with
some possible biologic participation) led to the greatest mass
extinction event on record, the Permian–Triassic extinction event about 250 million years ago. The precipitating event appears to have been volcanic eruptions in the Siberian Traps, a hilly region of flood basalts in Siberia. These eruptions released high levels of carbon dioxide and sulfur dioxide which elevated world temperatures and acidified the oceans.
Estimates of the rise in carbon dioxide levels range widely, from as little as a two-fold increase to as much as a twenty-fold increase.
Amplifying feedbacks increased the warming well beyond what would be expected from the greenhouse effect of carbon dioxide alone: these included the ice-albedo feedback, the increased evaporation of water vapor (another greenhouse gas) into the atmosphere, the release of methane from the warming of methane hydrate deposits buried under the permafrost and beneath continental shelf sediments, and increased wildfires.
The rising carbon dioxide acidified the oceans, leading to widespread
die-off of creatures with calcium carbonate shells, killing mollusks and
crustaceans like crabs and lobsters and destroying coral reefs. Their demise disrupted the entire oceanic food chain. It has been argued that rising temperatures may have disrupted the chemocline separating sulfidic deep waters from oxygenated surface waters, leading to a massive release of toxic hydrogen sulfide (produced by anaerobic bacteria) into the surface ocean and even into the atmosphere, contributing to the (primarily methane-driven) collapse of the ozone layer, and helping to explain the die-off of terrestrial animal and plant life.
Despite the evidence from multiple mass extinction events that
the biosphere is not fully capable of self-regulation, the fact remains
that negative feedback loops do exist. As mentioned above, despite the
energy provided by the Sun having increased by 25% to 30% over the last
four billion years of life on Earth, the surface temperature of the
planet has remained within habitable limits. The participation of living
organisms in the oxygen and carbon cycles is well established. Yet given the lack of a plausible natural-selection mechanism by which life on Earth could have evolved to regulate its abiotic environment, how could such feedback loops have arisen?
According to the weak anthropic principle, our observation of such stabilizing feedback loops is an observer selection effect.
In all the universe, it is only planets with Gaian properties that
could have evolved intelligent, self-aware organisms capable of asking
such questions.
One can imagine innumerable worlds where life evolved with different biochemistries, or where different geophysical properties left those worlds presently dead from a runaway greenhouse effect, locked in a perpetual Snowball state, or, for one reason or another, unable to support the evolution of life beyond the microbial level.
If no means exists for natural selection to operate at the
biosphere level, then it would appear that the anthropic principle
provides the only explanation for the survival of Earth's biosphere over
geologic time. But in recent years, this strictly reductionistic view
has been modified by recognition that natural selection can operate at
multiple levels of the biological hierarchy — not just at the level of
individual organisms.
Traditional Darwinian natural selection requires reproducing entities
that display inheritable properties or abilities that result in their
having more offspring than their competitors. Successful biospheres
clearly cannot reproduce to spawn copies of themselves, and so
traditional Darwinian natural selection cannot operate. A mechanism for
biosphere-level selection was proposed by Ford Doolittle: Although he
had been a strong and early critic of the Gaia hypothesis,
he had by 2015 started to think of ways whereby Gaia might be
"Darwinised", seeking means whereby the planet could have evolved
biosphere-level adaptations. Doolittle has suggested that differential persistence
— mere survival — could be considered a legitimate mechanism for
natural selection. As the Earth passes through various challenges, the
phenomenon of differential persistence enables selected entities to
achieve fixation by surviving the death of their competitors. Although
Earth's biosphere is not competing against other biospheres on other
planets, there are many competitors for survival on this planet. Collectively, Gaia constitutes the single clade of all living survivors descended from life's last universal common ancestor (LUCA). Various other proposals for biosphere-level selection include sequential selection, entropic hierarchy, and considering Gaia as a holobiont-like system. Ultimately, differential persistence and sequential selection are variants of the anthropic principle, while entropic-hierarchy and holobiont arguments may allow the emergence of Gaia to be understood without anthropic reasoning.