
Wednesday, December 10, 2025

Tipping points in the climate system

From Wikipedia, the free encyclopedia
Map showing global and regional tipping elements: if the global temperature increases past a certain point (color-coded for temperature thresholds), this particular element would be tipped. The result would be a transition to a different state.

In climate science, a tipping point is a critical threshold that, when crossed, leads to large, accelerating and often irreversible changes in the climate system. If tipping points are crossed, they are likely to have severe impacts on human society and may accelerate global warming. Tipping behavior is found across the climate system, for example in ice sheets, mountain glaciers, ocean circulation patterns, ecosystems, and the atmosphere. Examples include thawing permafrost, which releases methane, a powerful greenhouse gas, and melting ice sheets and glaciers, which reduces Earth's albedo and so warms the planet faster. Thawing permafrost is a threat multiplier because permafrost holds roughly twice as much carbon as the amount currently circulating in the atmosphere.

Tipping points are often, but not necessarily, abrupt. For example, with average global warming somewhere between 0.8 °C (1.4 °F) and 3 °C (5.4 °F), the Greenland ice sheet passes a tipping point and is doomed, but its melt would take place over millennia. Tipping points are possible at today's global warming of just over 1 °C (1.8 °F) above preindustrial times, and highly probable above 2 °C (3.6 °F) of global warming. It is possible that some tipping points are close to being crossed or have already been crossed, like those of the West Antarctic and Greenland ice sheets, the Amazon rainforest and warm-water coral reefs. A 2022 study published in Science found that exceeding 1.5 °C of global warming could trigger multiple tipping points, including the collapse of major ice sheets, abrupt thawing of permafrost, and coral reef die-off, with potential for cascading system effects.

A danger is that if the tipping point in one system is crossed, this could cause a cascade of other tipping points, leading to severe, potentially catastrophic, impacts. Crossing a threshold in one part of the climate system may trigger another tipping element to tip into a new state. For example, ice loss in West Antarctica and Greenland will significantly alter ocean circulation. Sustained warming of the northern high latitudes as a result of this process could activate tipping elements in that region, such as permafrost degradation, and boreal forest dieback.

Scientists have identified many elements in the climate system which may have tipping points. As of September 2022, nine global core tipping elements and seven regional impact tipping elements are known. Out of those, one regional and three global climate elements will likely pass a tipping point if global warming reaches 1.5 °C (2.7 °F). They are the Greenland ice sheet collapse, West Antarctic ice sheet collapse, tropical coral reef die off, and boreal permafrost abrupt thaw.

Tipping points exist in a range of systems, for example in the cryosphere, within ocean currents, and in terrestrial systems. The tipping points in the cryosphere include: Greenland ice sheet disintegration, West Antarctic ice sheet disintegration, East Antarctic ice sheet disintegration, arctic sea ice decline, retreat of mountain glaciers, permafrost thaw. The tipping points for ocean current changes include the Atlantic Meridional Overturning Circulation (AMOC), the North Subpolar Gyre and the Southern Ocean overturning circulation. Lastly, the tipping points in terrestrial systems include Amazon rainforest dieback, boreal forest biome shift, Sahel greening, and vulnerable stores of tropical peat carbon.

Definition

A system going past a tipping point. The system starts (blue) in one of two alternative stable states, represented by the ball in the left hand valley. Under external forcing over time (left to right) this state loses stability (purple), represented by the valley getting shallower, lowering the hilltop. Past a tipping point the initial stable state disappears and the system undergoes an abrupt, self-propelling change into the alternative, remaining stable state (red).
Positive tipping point in society.

The IPCC Sixth Assessment Report defines a tipping point as a "critical threshold beyond which a system reorganizes, often abruptly and/or irreversibly". It can be brought about by a small disturbance causing a disproportionately large change in the system. It can also be associated with self-reinforcing feedbacks, which could lead to changes in the climate system irreversible on a human timescale. For any particular climate component, the shift from one state to a new stable state may take many decades or centuries.
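The ball-in-a-valley picture can be made concrete with a minimal bistable model. The sketch below is an illustrative assumption, not taken from the literature cited here: the toy system dx/dt = -(x³ - x - c) has two stable states under weak forcing c, and the initial state vanishes at a fold (the tipping point) near c ≈ 0.385, after which only the alternative state remains.

```python
import numpy as np

def stable_states(c):
    """Stable equilibria of the bistable toy system dx/dt = -(x**3 - x - c).

    The cubic x**3 - x - c = 0 has three real roots (two of them stable)
    for weak forcing c, but only one once c passes the fold at 2/(3*sqrt(3)).
    """
    roots = np.roots([1.0, 0.0, -1.0, -c])
    real = roots[np.abs(roots.imag) < 1e-9].real
    # An equilibrium x* is stable when the potential curves upward there,
    # i.e. V''(x*) = 3*x**2 - 1 > 0.
    return sorted(x for x in real if 3 * x**2 - 1 > 0)

c_crit = 2 / (3 * np.sqrt(3))   # the tipping point, ~0.385

print(len(stable_states(0.2)))  # 2 -> two alternative stable states coexist
print(len(stable_states(0.5)))  # 1 -> past the fold, only one state remains
```

Past the fold there is no nearby equilibrium left, so the state moves abruptly to the distant remaining one, matching the "abrupt, self-propelling change" of the valley diagram.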

The 2019 IPCC Special Report on the Ocean and Cryosphere in a Changing Climate defines a tipping point as: "A level of change in system properties beyond which a system reorganises, often in a non-linear manner, and does not return to the initial state even if the drivers of the change are abated. For the climate system, the term refers to a critical threshold at which global or regional climate changes from one stable state to another stable state."

In ecosystems and in social systems, a tipping point can trigger a regime shift, a major systems reorganisation into a new stable state. Such regime shifts need not be harmful. In the context of the climate crisis, the tipping point metaphor is sometimes used in a positive sense, such as to refer to shifts in public opinion in favor of action to mitigate climate change, or the potential for minor policy changes to rapidly accelerate the transition to a green economy.

Comparison of tipping points

Scientists have identified many elements in the climate system which may have tipping points. In the early 2000s the IPCC began considering the possibility of tipping points, originally referred to as large-scale discontinuities. At that time the IPCC concluded they would only be likely in the event of global warming of 4 °C (7.2 °F) or more above preindustrial times, and another early assessment placed most tipping point thresholds at 3–5 °C (5.4–9.0 °F) above the 1980–1999 average. Since then, estimates for global warming thresholds have generally fallen; by 2016, some tipping points were thought to be possible within the Paris Agreement range of 1.5–2 °C (2.7–3.6 °F). As of 2021 tipping points are considered to have significant probability at today's warming level of just over 1 °C (1.8 °F), with high probability above 2 °C (3.6 °F) of global warming. Some tipping points may be close to being crossed or have already been crossed, like those of the West Antarctic and Greenland ice sheets, warm-water coral reefs, and the Amazon rainforest.

As of September 2022, nine global core tipping elements and seven regional impact tipping elements have been identified. Out of those, one regional and three global climate elements are estimated to likely pass a tipping point if global warming reaches 1.5 °C (2.7 °F), namely Greenland ice sheet collapse, West Antarctic ice sheet collapse, tropical coral reef die off, and boreal permafrost abrupt thaw. Two further tipping points are forecast as likely if warming continues to approach 2 °C (3.6 °F): Barents sea ice abrupt loss, and the Labrador Sea subpolar gyre collapse.

Global core tipping elements
Proposed climate tipping element (tipping point) | Threshold °C: estimated (min–max) | Timescale, years: estimated (min–max) | Max. global impact °C | Max. regional impact °C
Greenland Ice Sheet (collapse) | 1.5 (0.8–3.0) | 10,000 (1,000–15,000) | 0.13 | 0.5 to 3.0
West Antarctic Ice Sheet (collapse) | 1.5 (1.0–3.0) | 2,000 (500–13,000) | 0.05 | 1.0
Labrador-Irminger Seas/SPG Convection (collapse) | 1.8 (1.1–3.8) | 10 (5–50) | -0.5 | -3.0
East Antarctic Subglacial Basins (collapse) | 3.0 (2.0–6.0) | 2,000 (500–10,000) | 0.05 | ?
Arctic Winter Sea Ice (collapse) | 6.3 (4.5–8.7) | 20 (10–100) | 0.6 | 0.6 to 1.2
East Antarctic Ice Sheet (collapse) | 7.5 (5.0–10.0) | ? (min 10,000) | 0.6 | 2.0
Amazon Rainforest (dieback) | 3.5 (2.0–6.0) | 100 (50–200) | 0.1 (partial), 0.2 (total) | 0.4 to 2.0
Boreal Permafrost (collapse) | 4.0 (3.0–6.0) | 50 (10–300) | 0.2–0.4 | ~
Atlantic Meridional Overturning Circulation (collapse) | 4.0 (1.4–8.0) | 50 (15–300) | -0.5 | -4 to -10

  1. The paper also provides the same estimate in terms of emissions: between 125 and 250 billion tonnes of carbon and between 175 and 350 billion tonnes of carbon equivalent.
Regional impact tipping elements
Proposed climate tipping element (tipping point) | Threshold °C: estimated (min–max) | Timescale, years: estimated (min–max) | Max. global impact °C | Max. regional impact °C
Low-latitude Coral Reefs (dieoff) | 1.5 (1.0–2.0) | 10 | ~ | ~
Boreal Permafrost (abrupt thaw) | 1.5 (1.0–2.3) | 200 (100–300) | 0.04 per °C by 2100; 0.11 per °C by 2300 | ~
Barents Sea Ice (abrupt loss) | 1.6 (1.5–1.7) | 25 | ~ | +
Mountain Glaciers (loss) | 2.0 (1.5–3.0) | 200 (50–1,000) | 0.08 | +
Sahel and W. African Monsoon (greening) | 2.8 (2.0–3.5) | 50 (10–500) | ~ | +
Boreal Forest (southern dieoff) | 4.0 (1.4–5.0) | 100 (min 50) | net -0.18 | -0.5 to -2
Boreal Forest (northern expansion) | 4.0 (1.5–7.2) | 100 (min 40) | net +0.14 | 0.5–1.0

  1. Extra forest growth here would absorb around 6 billion tons of carbon, but because this area receives a lot of sunlight, that uptake is very minor compared to the warming from reduced albedo, as the expanding vegetation absorbs more heat than the snow-covered ground it moves into.

Tipping points in the cryosphere

Greenland ice sheet disintegration

Changes in extent (colored lines) and thickness (black lines) of the Greenland ice sheet over time, showing its rapid, sustained melting since 2000.

The Greenland ice sheet is the second largest ice sheet in the world, and if it melted completely the water it holds would raise sea levels globally by 7.2 metres (24 ft). Due to global warming, the ice sheet is currently melting at an accelerating rate, adding almost 1 mm to global sea levels every year. Around half of the ice loss occurs via surface melting; the remainder occurs at the base of the ice sheet where it meets the sea, through the calving (breaking off) of icebergs from its margins.

The Greenland ice sheet has a tipping point because of the melt-elevation feedback: surface melting lowers the height of the ice sheet, and air at lower altitude is warmer, so the ice surface is exposed to warmer temperatures, which accelerates its melt. A 2021 analysis of sub-glacial sediment from the bottom of a 1.4-kilometre (0.87 mi) Greenland ice core found that the Greenland ice sheet melted away at least once during the last million years, strongly suggesting that its tipping point lies below the 2.5 °C (4.5 °F) maximum temperature increase over preindustrial conditions observed during that period. There is some evidence that the Greenland ice sheet is losing stability and getting close to a tipping point.

West Antarctic ice sheet disintegration

A topographic and bathymetric map of Antarctica without its ice sheets, assuming constant sea levels and no post-glacial rebound

The West Antarctic Ice Sheet (WAIS) is a large ice sheet in Antarctica, in places more than 4 kilometres (2.5 mi) thick. It sits on bedrock mostly below sea level, the weight of the ice having formed a deep subglacial basin over millions of years. As such, it is in contact with heat from the ocean, which makes it vulnerable to fast and irreversible ice loss. A tipping point could be reached once the WAIS's grounding lines (the points at which the ice no longer sits on rock and becomes a floating ice shelf) retreat behind the edge of the subglacial basin, resulting in self-sustaining retreat into the deeper basin, a process known as Marine Ice Sheet Instability (MISI). Thinning and collapse of the WAIS's ice shelves is helping to accelerate this grounding line retreat. If completely melted, the WAIS would contribute around 3.3 metres (11 ft) of sea level rise over thousands of years.

Ice loss from the WAIS is accelerating, and some outlet glaciers are estimated to be close to or possibly already beyond the point of self-sustaining retreat. The paleo record suggests that during the past few hundred thousand years, the WAIS largely disappeared in response to similar levels of warming and CO2 emission scenarios projected for the next few centuries.

As with the other ice sheets, there is a counteracting negative feedback: greater warming also intensifies the water cycle, increasing winter snowfall over the ice sheet, and this increase in surface mass balance (SMB) offsets some fraction of the ice loss. The IPCC Fifth Assessment Report suggested that this effect could potentially overpower increased ice loss at higher levels of warming and result in a small net ice gain, but by the time of the IPCC Sixth Assessment Report, improved modelling had shown that accelerating ice loss would consistently outweigh the gains in surface mass balance.

East Antarctic ice sheet disintegration

The East Antarctic ice sheet is the largest and thickest ice sheet on Earth, with a maximum thickness of 4,800 metres (3.0 mi). Its complete disintegration would raise global sea levels by 53.3 metres (175 ft), but this may not occur until global warming reaches 10 °C (18 °F), and the loss of even two-thirds of its volume may require at least 6 °C (11 °F) of warming to trigger. Its melt would also occur over a longer timescale than the loss of any other ice on the planet, taking no less than 10,000 years to complete. However, the subglacial basin portions of the East Antarctic ice sheet may be vulnerable to tipping at lower levels of warming. The Wilkes Basin is of particular concern, as it holds enough ice to raise sea levels by about 3–4 metres (10–13 ft).

Arctic sea ice decline

Average decadal extent and area of the Arctic Ocean sea ice since 1979, the start of satellite observations.
Annual trend in the Arctic sea ice extent and area for the 2011-2022 time period.

Arctic sea ice was once identified as a potential tipping element. The loss of sunlight-reflecting sea ice during summer exposes the (dark) ocean, which would warm. Arctic sea ice cover is likely to melt entirely under even relatively low levels of warming, and it was hypothesised that this could eventually transfer enough heat to the ocean to prevent sea ice recovery even if the global warming is reversed. Modelling now shows that this heat transfer during the Arctic summer does not overcome the cooling and the formation of new ice during the Arctic winter. As such, the loss of Arctic ice during the summer is not a tipping point for as long as the Arctic winter remains cool enough to enable the formation of new Arctic sea ice. However, if the higher levels of warming prevent the formation of new Arctic ice even during winter, then this change may become irreversible. Consequently, Arctic Winter Sea Ice is included as a potential tipping point in a 2022 assessment.

Additionally, the same assessment argued that while the rest of the ice in the Arctic Ocean may recover from a total summertime loss during the winter, ice cover in the Barents Sea may not reform during the winter even below 2 °C (3.6 °F) of warming. This is because the Barents Sea is already the fastest-warming part of the Arctic: in 2021-2022 it was found that while the warming within the Arctic Circle has already been nearly four times faster than the global average since 1979, Barents Sea warmed up to seven times faster than the global average. This tipping point matters because of the decade-long history of research into the connections between the state of Barents-Kara Sea ice and the weather patterns elsewhere in Eurasia.

Retreat of mountain glaciers

Projected loss of mountain glaciers over the 21st century, for different amounts of global warming.

Mountain glaciers are the largest repository of land-bound ice after the Greenland and Antarctic ice sheets, and they are also melting as a result of climate change. A glacier passes a tipping point when it enters disequilibrium with the climate and will melt away unless temperatures fall. Examples include the glaciers of the North Cascade Range, where already in 2005 67% of the glaciers observed were in disequilibrium and will not survive the continuation of the present climate, and the French Alps, where the Argentière and Mer de Glace glaciers are expected to disappear completely by the end of the 21st century if current climate trends persist. Altogether, it was estimated in 2023 that 49% of the world's glaciers would be lost by 2100 at 1.5 °C (2.7 °F) of global warming, and 83% at 4 °C (7.2 °F). This would amount to one quarter and nearly half of mountain glacier mass loss, respectively, as only the largest, most resilient glaciers would survive the century. This ice loss would contribute about 9 cm (3.5 in) and 15 cm (6 in) to sea level rise respectively, while the current likely trajectory of 2.7 °C (4.9 °F) of warming would result in a sea level rise contribution of about 11 cm (4.5 in) by 2100.

The largest amount of glacier ice is located in the Hindu Kush Himalaya region, colloquially known as the Earth's Third Pole as a result. One third of that ice is expected to be lost by 2100 even if warming is limited to 1.5 °C (2.7 °F), while the intermediate and severe climate change scenarios (Representative Concentration Pathways (RCP) 4.5 and 8.5) are likely to lead to the loss of 50% and more than 67% of the region's glaciers over the same timeframe. Glacier melt is projected to accelerate regional river flows until the amount of meltwater peaks around 2060, going into an irreversible decline afterwards. Since regional precipitation will continue to increase even as the glacier meltwater contribution declines, annual river flows are only expected to diminish in the western basins where the contribution from the monsoon is low; however, irrigation and hydropower generation would still have to adjust to greater interannual variability and lower pre-monsoon flows in all of the region's rivers.

Permafrost thaw

Ground collapse caused by abrupt permafrost thaw in Herschel Island, Canada, 2013
Feedback processes related to land and subsea permafrost

Perennially frozen ground, or permafrost, covers large fractions of land – mainly in Siberia, Alaska, northern Canada and the Tibetan plateau – and can be up to a kilometre thick. Subsea permafrost up to 100 metres thick also occurs on the sea floor under part of the Arctic Ocean. This frozen ground holds vast amounts of carbon from plants and animals that died and decomposed over thousands of years. Scientists estimate there is nearly twice as much carbon in permafrost as is present in Earth's atmosphere.

As the climate warms and the permafrost begins to thaw, carbon dioxide and methane are released into the atmosphere. With higher temperatures, microbes become active and decompose the biological material in the permafrost, some of which is irreversibly lost. While most thaw is gradual and will take centuries, abrupt thaw can occur in some places where permafrost is rich in large ice masses, which once melted cause the ground to slump or form 'thermokarst' lakes over years to decades. These processes can become self-sustaining, leading to localised tipping dynamics, and could increase greenhouse gas emissions by around 40%. Because CO2 and methane are both greenhouse gases, they act as a self-reinforcing feedback on permafrost thaw, but are unlikely to lead to a global tipping point or runaway warming process.

Atlantic meridional overturning circulation (AMOC)

The Northern part of the Atlantic meridional overturning circulation

The Atlantic meridional overturning circulation (AMOC), also known as the Gulf Stream System, is a large system of ocean currents. It is driven by differences in the density of water; colder and more salty water is heavier than warmer fresh water. The AMOC acts as a conveyor belt, sending warm surface water from the tropics north, and carrying cold fresh water back south. As warm water flows northwards, some evaporates which increases salinity. It also cools when it is exposed to cooler air. Cold, salty water is more dense and slowly begins to sink. Several kilometres below the surface, cold, dense water begins to move south. Increased rainfall and the melting of ice due to global warming dilutes the salty surface water, and warming further decreases its density. The lighter water is less able to sink, slowing down the circulation.

Theory, simplified models, and reconstructions of abrupt changes in the past suggest the AMOC has a tipping point. If freshwater input from melting glaciers reaches a certain threshold, it could collapse into a state of reduced flow. Even after melting stops, the AMOC may not return to its current state. It is unlikely that the AMOC will tip in the 21st century, but it may do so before 2300 if greenhouse gas emissions are very high. A weakening of 24% to 39% is expected depending on greenhouse emissions, even without tipping behaviour. If the AMOC does shut down, a new stable state could emerge that lasts for thousands of years, possibly triggering other tipping points.
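The behaviour described here (a freshwater threshold, collapse to a reduced-flow state, and no recovery once melting stops) is exactly what Stommel-style two-box models exhibit. The sketch below is a nondimensional toy illustration under assumed units, not any specific published configuration: a salinity difference y relaxes under dy/dt = F - y·|1 - y|, where F is the freshwater forcing and q = 1 - y the circulation strength.

```python
def run(forcing, y0=0.0, dt=0.01, steps=20000):
    """Relax the salinity difference y to a steady state under fixed
    freshwater forcing F in a Stommel-style box model:
        dy/dt = F - y * |1 - y|,  circulation strength q = 1 - y.
    (Nondimensional toy model for illustration only.)"""
    y = y0
    for _ in range(steps):
        y += dt * (forcing - y * abs(1.0 - y))
    return y

# Start on the 'on' branch under weak forcing, push F past the fold at 0.25,
# then reduce it again: the circulation does not recover (hysteresis).
y_on = run(0.20)                 # q = 1 - y ~ 0.72: strong circulation
y_off = run(0.30, y0=y_on)       # forcing past the threshold: q turns negative
y_back = run(0.20, y0=y_off)     # forcing lowered again: q stays negative

print(round(1 - y_on, 2), round(1 - y_off, 2), round(1 - y_back, 2))
```

In this toy model the "on" branch (q > 0) only exists while F < 0.25; once forcing crosses that fold, the system settles on the reversed branch, and lowering F afterwards leaves it there, mirroring the statement that the AMOC may not return to its current state even after melting stops.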

In 2021, a study using a primitive finite-difference ocean model estimated that AMOC collapse could be triggered by a sufficiently fast increase in ice melt, even if the forcing never reached the thresholds associated with slower change. This implies that AMOC collapse is more likely than is usually estimated by complex, large-scale climate models. Another 2021 study found early-warning signals in a set of AMOC indices, suggesting that the AMOC may be close to tipping. However, it was contradicted by a study published in the same journal the following year, which found a largely stable AMOC that had so far not been affected by climate change beyond its own natural variability. Two more studies published in 2022 also suggested that the modelling approaches commonly used to evaluate the AMOC appear to overestimate the risk of its collapse. In October 2024, 44 climate scientists published an open letter arguing that, according to studies from the past few years, the risk of AMOC collapse has been greatly underestimated and that collapse could occur within the next few decades, with devastating impacts especially for the Nordic countries. An August 2025 study concluded that the collapse of the AMOC could start as early as the 2060s.

North subpolar gyre

Modelled 21st century warming under the "intermediate" climate change scenario (top). The potential collapse of the subpolar gyre in this scenario (middle). The collapse of the entire AMOC (bottom).

Some climate models indicate that the deep convection in the Labrador-Irminger Seas could collapse under certain global warming scenarios, which would then collapse the entire circulation of the North subpolar gyre. The circulation is considered unlikely to recover even if temperatures return to a lower level, making this an example of a climate tipping point. Collapse would result in rapid regional cooling, with implications for economic sectors, agriculture, water resources and energy management in Western Europe and on the East Coast of the United States. Frajka-Williams et al. 2017 pointed out that recent changes – cooling of the subpolar gyre, warm temperatures in the subtropics and cool anomalies over the tropics – have steepened the meridional gradient in sea surface temperatures, which is not captured by the AMO Index.

A 2021 study found that this collapse occurs in only four of the 35 CMIP6 models analyzed. However, only 11 of the 35 models can simulate the North Atlantic Current with a high degree of accuracy, and these include all four models in which the subpolar gyre collapses. As a result, the study estimated the risk of an abrupt cooling event over Europe caused by the collapse of the current at 36.4%, lower than the 45.5% chance estimated by the previous generation of models. In 2022, a paper suggested that a previous disruption of the subpolar gyre was connected to the Little Ice Age.

Southern Ocean overturning circulation

Since the 1970s, the upper cell of the circulation has strengthened, while the lower cell weakened.

The Southern Ocean overturning circulation consists of two parts, the upper and the lower cell. The smaller upper cell is most strongly affected by winds due to its proximity to the surface, while the behaviour of the larger lower cell is set by the temperature and salinity of Antarctic bottom water. The strength of both halves has undergone substantial changes in recent decades: the flow of the upper cell has increased by 50–60% since the 1970s, while the lower cell has weakened by 10–20%. Part of this is due to the natural cycle of the Interdecadal Pacific Oscillation, but climate change has played a substantial role in both trends: it has altered the Southern Annular Mode weather pattern, while the massive growth of ocean heat content in the Southern Ocean has increased the melting of the Antarctic ice sheets, and this fresh meltwater dilutes the salty Antarctic bottom water.

Paleoclimate evidence shows that the entire circulation has strongly weakened or outright collapsed before. Some preliminary research suggests that such a collapse may become likely once global warming reaches levels between 1.7 °C (3.1 °F) and 3 °C (5.4 °F), although there is far less certainty here than for most other tipping points in the climate system. Even if the circulation's collapse starts in the near future, it is unlikely to be complete until close to 2300. Similarly, impacts such as the reduction in precipitation in the Southern Hemisphere, with a corresponding increase in the North, or a decline of fisheries in the Southern Ocean with a potential collapse of certain marine ecosystems, are also expected to unfold over multiple centuries.

Tipping points in terrestrial systems

As of 2022, 20% of the Amazon rainforest has been "transformed" (deforested) and another 6% has been "highly degraded", leading Amazon Watch to warn that Amazonia is in the midst of a tipping point crisis.

Amazon rainforest dieback

The Amazon rainforest is the largest tropical rainforest in the world. It is twice as big as India and spans nine countries in South America. It produces around half of its own rainfall by recycling moisture through evaporation and transpiration as air moves across the forest. This moisture recycling expands the area in which there is enough rainfall for rainforest to be maintained; without it, one model indicates around 40% of the current forest area would be too dry to sustain rainforest. However, when forest is lost to climate change (through droughts and wildfires) or deforestation, there is less rain in downwind regions, increasing tree stress and mortality there. Eventually, if enough forest is lost, a threshold can be reached beyond which large parts of the remaining rainforest may die off and transform into drier degraded forest or savanna landscapes, particularly in the drier south and east. In 2022, a study reported that the rainforest has been losing resilience since the early 2000s. Resilience is measured by the recovery time from short-term perturbations, with a delayed return to equilibrium termed critical slowing down. The observed loss of resilience reinforces the theory that the rainforest could be approaching a critical transition, although it cannot determine exactly when, or if, a tipping point will be reached.
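The critical-slowing-down indicator mentioned here is commonly estimated as rising lag-1 autocorrelation in a fluctuating time series: the slower a system recovers from small perturbations, the more each observation resembles the previous one. A minimal sketch with assumed, illustrative parameter values (not the study's actual data):

```python
import numpy as np

rng = np.random.default_rng(0)

def lag1_autocorr(x):
    """Lag-1 autocorrelation: a standard early-warning indicator.
    It rises toward 1 as recovery from perturbations slows down."""
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

def simulate(recovery_rate, n=20000, dt=0.1):
    """Linearised dynamics near a stable state: dx/dt = -r*x + noise.
    A weaker recovery rate r means slower return to equilibrium."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = x[t-1] - recovery_rate * x[t-1] * dt + rng.normal(0, 0.1)
    return x

far = lag1_autocorr(simulate(recovery_rate=1.0))   # resilient state
near = lag1_autocorr(simulate(recovery_rate=0.1))  # close to tipping

print(far < near)  # True: autocorrelation rises as resilience is lost
```

The same idea underlies the early-warning-signal analyses of the AMOC discussed earlier: the indicator flags weakening recovery without needing to know where the tipping point itself lies.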

Boreal forest biome shift

During the last quarter of the twentieth century, the zone of latitude occupied by taiga experienced some of the greatest temperature increases on Earth. Winter temperatures have increased more than summer temperatures, and in summer the daily low temperature has increased more than the daily high. It has been hypothesised that boreal environments have only a few states which are stable in the long term: a treeless tundra/steppe, a forest with more than 75% tree cover, and an open woodland with between roughly 20% and 45% tree cover. Continued climate change could therefore force at least some of the presently existing taiga forests into one of the two woodland states or even into a treeless steppe, but it could also shift tundra areas into woodland or forest states as they warm and become more suitable for tree growth.

The response of six tree species common in Quebec's forests to 2 °C (3.6 °F) and 4 °C (7.2 °F) warming under different precipitation levels.

These trends were first detected in the Canadian boreal forests in the early 2010s, and summer warming had also been shown to increase water stress and reduce tree growth in dry areas of the southern boreal forest in central Alaska and portions of far eastern Russia. In Siberia, the taiga is converting from predominantly needle-shedding larch trees to evergreen conifers in response to a warming climate.

Subsequent research in Canada found that even in the forests where biomass trends did not change, there was a substantial shift towards the deciduous broad-leaved trees with higher drought tolerance over the past 65 years. A Landsat analysis of 100,000 undisturbed sites found that the areas with low tree cover became greener in response to warming, but tree mortality (browning) became the dominant response as the proportion of existing tree cover increased. A 2018 study of the seven tree species dominant in the Eastern Canadian forests found that while 2 °C (3.6 °F) warming alone increases their growth by around 13% on average, water availability is much more important than temperature. Also, further warming of up to 4 °C (7.2 °F) would result in substantial declines unless matched by increases in precipitation.

A 2021 paper confirmed that boreal forests are much more strongly affected by climate change than the other forest types in Canada, and projected that most of the eastern Canadian boreal forests would reach a tipping point around 2080 under the RCP 8.5 scenario, which represents the largest potential increase in anthropogenic emissions. Another 2021 study projected that under the moderate SSP2-4.5 scenario, boreal forests would experience a 15% worldwide increase in biomass by the end of the century, but this would be more than offset by a 41% biomass decline in the tropics. In 2022, the results of a five-year warming experiment in North America showed that the juveniles of the tree species which currently dominate the southern margins of the boreal forests fare the worst in response to even 1.5 °C (2.7 °F) or 3.1 °C (5.6 °F) of warming and the associated reductions in precipitation. While temperate species which would benefit from such conditions are also present in the southern boreal forests, they are both rare and slower-growing.

Sahel greening

Greening of the Sahel between 1982 and 1999

The Special Report on Global Warming of 1.5 °C and the IPCC Fifth Assessment Report indicate that global warming will likely result in increased precipitation across most of East Africa, parts of Central Africa, and the principal wet season of West Africa. However, there is significant uncertainty in these projections, especially for West Africa. Currently, the Sahel is becoming greener, but precipitation has not fully recovered to the levels reached in the mid-20th century.

A study from 2022 concluded: "Clearly the existence of a future tipping threshold for the WAM (West African Monsoon) and Sahel remains uncertain as does its sign but given multiple past abrupt shifts, known weaknesses in current models, and huge regional impacts but modest global climate feedback, we retain the Sahel/WAM as a potential regional impact tipping element (low confidence)."

Some simulations of global warming and increased carbon dioxide concentrations have shown a substantial increase in precipitation in the Sahel/Sahara. This and the increased plant growth directly induced by carbon dioxide could lead to an expansion of vegetation into present-day desert, although it might be accompanied by a northward shift of the desert, i.e. a drying of northernmost Africa.

Vulnerable stores of tropical peat carbon: Cuvette Centrale peatland

Map of Cuvette Centrale location in the Congo Basin. Three graphs portray the evolution of its peatland carbon content over the past 20,000 years, as reconstructed from three peat cores.

In 2017, it was discovered that 40% of the Cuvette Centrale wetlands are underlain with a dense layer of peat, which contains around 30 petagrams (billions of tons) of carbon. This amounts to 28% of all tropical peat carbon, equivalent to the carbon contained in all the forests of the Congo Basin. In other words, while this peatland only covers 4% of the Congo Basin area, its carbon content is equal to that of all the trees in the other 96%. It was then estimated that if all of that peat burned, it would release the equivalent of 20 years of current United States carbon dioxide emissions, or three years of all anthropogenic CO2 emissions.
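The equivalence quoted above can be checked with simple back-of-envelope arithmetic. The sketch below is illustrative only; the annual-emission figures (roughly 5.3 Gt CO2 per year for the United States and 37 Gt CO2 per year globally, approximate late-2010s values) are assumptions for the check, not figures taken from the source.

```python
# Back-of-envelope check of the peat-carbon comparison above.
# Assumed figures (approximate, late 2010s):
#   US CO2 emissions     ~5.3 Gt CO2 per year
#   global CO2 emissions ~37 Gt CO2 per year
PEAT_CARBON_GT = 30.0        # carbon stored in Cuvette Centrale peat, Gt C
C_TO_CO2 = 44.0 / 12.0       # molar-mass ratio converting carbon to CO2

US_EMISSIONS_GT_CO2 = 5.3
GLOBAL_EMISSIONS_GT_CO2 = 37.0

peat_co2 = PEAT_CARBON_GT * C_TO_CO2           # ~110 Gt CO2 if all peat burned
years_us = peat_co2 / US_EMISSIONS_GT_CO2      # ~20 years of US emissions
years_global = peat_co2 / GLOBAL_EMISSIONS_GT_CO2  # ~3 years of global emissions

print(f"{peat_co2:.0f} Gt CO2: about {years_us:.0f} years of US emissions, "
      f"or {years_global:.1f} years of global emissions")
```

Both figures quoted in the text fall out directly from these assumed emission rates.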

This threat prompted the signing of the Brazzaville Declaration in March 2018: an agreement between the Democratic Republic of the Congo, the Republic of the Congo and Indonesia (a country with longer experience of managing its own tropical peatlands) aiming to promote better management and conservation of this region. However, 2022 research by the same team which had originally discovered this peatland not only revised its area (from the original estimate of 145,500 square kilometres (56,200 sq mi) to 167,600 square kilometres (64,700 sq mi)) and depth (from 2 m (6.6 ft) to 1.7 m (5.6 ft)), but also noted that only 8% of this peat carbon is currently covered by the existing protected areas. For comparison, 26% of its peat is located in areas open to logging, mining or palm oil plantations, and nearly all of this area is open for fossil fuel exploration.

Even in the absence of local disturbance from these activities, this area is the most vulnerable store of tropical peat carbon in the world, as its climate is already much drier than that of the other tropical peatlands in Southeast Asia and the Amazon rainforest. A 2022 study suggests that the geologically recent conditions between 7,500 years ago and 2,000 years ago were already dry enough to cause substantial peat release from this area, and that these conditions are likely to recur in the near future under continued climate change. In this case, Cuvette Centrale would act as one of the tipping points in the climate system at some yet unknown time.

Other tipping points

Coral reef die-off

Bleached coral with normal coral in the background

Around 500 million people around the world depend on coral reefs for food, income, tourism and coastal protection. Since the 1980s, reefs have been threatened by rising sea surface temperatures, which trigger mass bleaching of coral, especially in subtropical regions. A sustained ocean temperature spike of 1 °C (1.8 °F) above average is enough to cause bleaching. Under heat stress, corals expel the small colourful algae which live in their tissues, causing them to turn white. The algae, known as zooxanthellae, have a symbiotic relationship with coral, and without them the corals slowly die. After the zooxanthellae have disappeared, the corals are vulnerable to a transition towards a seaweed-dominated ecosystem, from which it is very difficult to shift back to a coral-dominated one. The IPCC estimates that by the time temperatures have risen to 1.5 °C (2.7 °F) above pre-industrial times, "Coral reefs... are projected to decline by a further 70–90%", and that if the world warms by 2 °C (3.6 °F), they will become extremely rare.

Break-up of equatorial stratocumulus clouds

In 2019, a study employed a large eddy simulation model to estimate that equatorial stratocumulus clouds could break up and scatter when CO2 levels rise above 1,200 ppm (almost three times higher than current levels, and more than four times the preindustrial level). The study estimated that this would cause a surface warming of about 8 °C (14 °F) globally and 10 °C (18 °F) in the subtropics, in addition to the at least 4 °C (7.2 °F) already caused by such CO2 concentrations. Moreover, stratocumulus clouds would not reform until CO2 concentrations dropped to a much lower level. It was suggested that this finding could help explain past episodes of unusually rapid warming such as the Paleocene–Eocene Thermal Maximum. In 2020, further work from the same authors revealed that in their large eddy simulation, this tipping point cannot be stopped with solar radiation modification: in a hypothetical scenario where very high CO2 emissions continue for a long time but are offset with extensive solar radiation modification, the break-up of stratocumulus clouds is simply delayed until CO2 concentrations hit 1,700 ppm, at which point it would still cause around 5 °C (9.0 °F) of unavoidable warming.

However, because large eddy simulation models are simpler and smaller-scale than the general circulation models used for climate projections, with limited representation of atmospheric processes like subsidence, this finding is currently considered speculative. Other scientists say that the model used in that study unrealistically extrapolates the behavior of small cloud areas onto all cloud decks, and that it is incapable of simulating anything other than a rapid transition, with some comparing it to "a knob with two settings". Additionally, CO2 concentrations would only reach 1,200 ppm if the world follows Representative Concentration Pathway 8.5, which represents the highest possible greenhouse gas emission scenario and involves a massive expansion of coal infrastructure. In that case, 1,200 ppm would be passed shortly after 2100.

Cascading tipping points

A proposed tipping cascade with four tipping elements.

Crossing a threshold in one part of the climate system may trigger another tipping element to tip into a new state. Such sequences of thresholds are called cascading tipping points, an example of a domino effect. Ice loss in West Antarctica and Greenland will significantly alter ocean circulation. Sustained warming of the northern high latitudes as a result of this process could activate tipping elements in that region, such as permafrost degradation, and boreal forest dieback. Thawing permafrost is a threat multiplier because it holds roughly twice as much carbon as the amount currently circulating in the atmosphere. Loss of ice in Greenland likely destabilises the West Antarctic ice sheet via sea level rise, and vice-versa, especially if Greenland were to melt first as West Antarctica is particularly vulnerable to contact with warm sea water.

A 2021 study based on three million computer simulations of a climate model showed that nearly one-third of those simulations resulted in domino effects, even when temperature increases were limited to 2 °C (3.6 °F) – the upper limit set by the Paris Agreement in 2015. The authors of the study said that the science of tipping points is so complex that there is great uncertainty as to how they might unfold, but nevertheless argued that the possibility of cascading tipping points represents "an existential threat to civilisation". A network model analysis suggested that temporary overshoots of climate change – increasing global temperature beyond Paris Agreement goals temporarily, as is often projected – can substantially increase the risks of climate tipping cascades ("by up to 72% compared with non-overshoot scenarios").

Formerly considered tipping elements

Earlier (2008) list of tipping elements in the climate system. When compared to later lists, the major differences are that in 2008 ENSO, Indian summer monsoon, Arctic ozone hole and all of Arctic sea ice were all listed as tipping points. Labrador-Irminger circulation, mountain glaciers and East Antarctic ice however were not included. This 2008 list also includes Antarctic bottom water (part of the Southern Ocean overturning circulation), which was left out of the 2022 list, but included in some subsequent ones.

The possibility that the El Niño–Southern Oscillation (ENSO) is a tipping element had attracted attention in the past. Normally strong winds blow west across the South Pacific Ocean from South America to Australia. Every two to seven years, the winds weaken due to pressure changes and the air and water in the middle of the Pacific warms up, causing changes in wind movement patterns around the globe. This is known as El Niño and typically leads to droughts in India, Indonesia and Brazil, and increased flooding in Peru. In 2015/2016, this caused food shortages affecting over 60 million people. El Niño-induced droughts may increase the likelihood of forest fires in the Amazon. The threshold for tipping was estimated to be between 3.5 °C (6.3 °F) and 7 °C (13 °F) of global warming in 2016. After tipping, the system would be in a more permanent El Niño state, rather than oscillating between different states. This has happened in Earth's past, in the Pliocene, but the layout of the ocean was significantly different from now. So far, there is no definitive evidence indicating changes in ENSO behaviour, and the IPCC Sixth Assessment Report concluded that it is "virtually certain that the ENSO will remain the dominant mode of interannual variability in a warmer world". Consequently, the 2022 assessment no longer includes it in the list of likely tipping elements.

The Indian summer monsoon is another part of the climate system which was considered susceptible to irreversible collapse in earlier research. However, more recent research has demonstrated that warming tends to strengthen the Indian monsoon, and it is projected to strengthen in the future.

Methane hydrate deposits in the Arctic were once thought to be vulnerable to a rapid dissociation which would have a large impact on global temperatures, in a dramatic scenario known as the clathrate gun hypothesis. Later research found that it takes millennia for methane hydrates to respond to warming, while methane emissions from the seafloor rarely transfer from the water column into the atmosphere. The IPCC Sixth Assessment Report states: "It is very unlikely that gas clathrates (mostly methane) in deeper terrestrial permafrost and subsea clathrates will lead to a detectable departure from the emissions trajectory during this century."

Mathematical theory

Illustration of three types of tipping point; (a), (b) noise-, (c), (d) bifurcation- and (e), (f) rate-induced. (a), (c), (e) example time-series (coloured lines) through the tipping point with black solid lines indicating stable climate states (e.g. low or high rainfall) and dashed lines represent the boundary between stable states. (b), (d), (f) stability landscapes provide an understanding for the different types of tipping point. The valleys represent different climate states the system can occupy with hill tops separating the stable states.

Tipping point behaviour in the climate can be described in mathematical terms. Three types of tipping points have been identified: bifurcation-induced, noise-induced and rate-induced.

Bifurcation-induced tipping

Bifurcation-induced tipping happens when a particular parameter in the climate (for instance, a change in environmental conditions or forcing) passes a critical level, at which point a bifurcation takes place and what was a stable state loses its stability or simply disappears. The Atlantic Meridional Overturning Circulation (AMOC) is an example of a tipping element that can show bifurcation-induced tipping: slow changes to the bifurcation parameters in this system – the salinity and temperature of the water – may push the circulation towards collapse.

Many types of bifurcations show hysteresis, which is the dependence of the state of a system on its history. For instance, depending on how warm it was in the past, there can be differing amounts of ice on the poles at the same concentration of greenhouse gases or temperature.
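Bifurcation-induced tipping and hysteresis can both be sketched with the standard double-well toy model dx/dt = μ + x − x³, where μ stands in for the forcing. This is an illustrative sketch only, not any published climate model; the model choice and all parameters are assumptions. Sweeping μ slowly up and then back down shows hysteresis: the state reached at μ = 0 depends on the direction of the sweep, because each branch persists until its fold (at μ ≈ ±0.385) disappears.

```python
import numpy as np

# Toy fold-bifurcation model dx/dt = mu + x - x**3 (illustrative only).
# A quasi-static sweep of the forcing mu up and then down demonstrates
# hysteresis: the jump between branches happens at different mu values
# in each direction.

def relax(x, mu, steps=300, dt=0.1):
    """Let the system settle toward a stable equilibrium at fixed mu."""
    for _ in range(steps):
        x += dt * (mu + x - x**3)
    return x

mus = np.linspace(-1.0, 1.0, 401)
x = -1.3                                   # start on the lower branch
up = []
for mu in mus:                             # quasi-static up-sweep
    x = relax(x, mu)
    up.append(x)
down = []
for mu in mus[::-1]:                       # quasi-static down-sweep
    x = relax(x, mu)
    down.append(x)
down = down[::-1]                          # re-order to match mus

i0 = int(np.argmin(np.abs(mus)))           # index of mu = 0
print(f"x at mu=0: up-sweep {up[i0]:+.2f}, down-sweep {down[i0]:+.2f}")
```

At the same forcing μ = 0 the up-sweep sits on the lower branch (x ≈ −1) while the down-sweep sits on the upper branch (x ≈ +1) – the history dependence described above.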

Early warning signals

For tipping points that occur because of a bifurcation, it may be possible to detect whether a system is getting closer to a tipping point, as it becomes less resilient to perturbations as the threshold approaches. Such systems display critical slowing down, with increased memory (rising autocorrelation) and variance. Depending on the nature of the tipping system, there may be other types of early warning signals. Abrupt change is not itself an early warning signal (EWS) of tipping, since abrupt changes can also occur in systems that respond reversibly to the control parameter.

These EWSs are often developed and tested using time series from the paleo record, such as sediments, ice cores, and tree rings, where past examples of tipping can be observed. It is not always possible to say whether increased variance and autocorrelation are a precursor to tipping or are caused by internal variability, for instance in the case of the collapse of the AMOC. Quality limitations of paleodata further complicate the development of EWSs. They have been developed for detecting tipping due to drought in forests in California, and melting of the Pine Island Glacier in West Antarctica, among other systems. Using early warning signals (increased autocorrelation and variance of the melt-rate time series), it has been suggested that the Greenland ice sheet is currently losing resilience, consistent with modelled early warning signals of the ice sheet.

Human-induced changes in the climate system may be too fast for early warning signals to become evident, especially in systems with inertia.
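The two classic early warning signals can be demonstrated on the same illustrative double-well toy model dx/dt = μ + x − x³ with additive noise (all parameters here are assumptions, not values from the literature). Close to the fold at μ ≈ 0.385 the restoring force weakens – critical slowing down – so both the variance and the lag-1 autocorrelation of the fluctuations rise relative to a state far from the bifurcation.

```python
import numpy as np

# Early-warning-signal sketch on the toy model dx/dt = mu + x - x**3 with
# additive noise (illustrative only, not a climate model). Near the fold
# the linear restoring rate shrinks, inflating variance and autocorrelation.

def simulate(mu, x0, n=20000, dt=0.05, sigma=0.05, seed=0):
    """Euler-Maruyama integration; the first half is discarded as transient."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = (x[i-1] + dt * (mu + x[i-1] - x[i-1]**3)
                + sigma * np.sqrt(dt) * rng.standard_normal())
    return x[n // 2:]

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a time series."""
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

far = simulate(mu=-1.0, x0=-1.32)          # far from the bifurcation
near = simulate(mu=0.3, x0=-0.78)          # close to the fold at mu ~ 0.385
print(f"variance:  far {far.var():.5f}  near {near.var():.5f}")
print(f"lag-1 AC:  far {lag1_autocorr(far):.2f}  near {lag1_autocorr(near):.2f}")
```

In sliding-window form, the same two statistics computed along a single forced time series are what EWS studies track as the system drifts toward a threshold.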

Noise-induced tipping

Noise-induced tipping is the transition from one state to another due to random fluctuations or internal variability of the system. Noise-induced transitions do not show any of the early warning signals which occur with bifurcations, because the underlying potential does not change; this makes them unpredictable, and such occurrences are often described as "one-in-x-year" events. An example is the Dansgaard–Oeschger events during the last ice age, with 25 occurrences of sudden climate fluctuations over a 500-year period.

Rate-induced tipping

Rate-induced tipping occurs when a change in the environment is faster than the force that restores the system to its stable state. In peatlands, for instance, after years of relative stability, rate-induced tipping can lead to an "explosive release of soil carbon from peatlands into the atmosphere" – sometimes known as "compost bomb instability". The AMOC may also show rate-induced tipping: if the rate of ice melt increases too fast, it may collapse, even before the ice melt reaches the critical value where the system would undergo a bifurcation.
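A standard mathematical prototype of rate-induced tipping is dx/dt = (x + λ)² − 1 with a forcing ramp λ(t) = rt (a textbook toy example, not a climate model; the parameter values below are assumptions). Substituting y = x + λ gives dy/dt = y² − 1 + r, which has a stable tracking state only for r ≤ 1: the same forcing change is survivable when applied slowly but causes tipping when applied quickly, even though the equilibrium never disappears at any fixed λ.

```python
# Rate-induced tipping prototype dx/dt = (x + lam)**2 - 1 with lam(t) = r*t.
# In the moving frame y = x + lam the dynamics are dy/dt = y**2 - 1 + r;
# a tracking state exists only for r <= 1, so fast ramps tip the system.

def track(r, t_max=20.0, dt=0.001):
    """Return the largest |y| reached; escape to large y means tipping."""
    y = -1.0                                # on the stable branch at t = 0
    worst = abs(y)
    for _ in range(int(t_max / dt)):
        y += dt * (y**2 - 1.0 + r)
        worst = max(worst, abs(y))
        if y > 10.0:                        # finite-time blow-up: tipped
            break
    return worst

slow = track(r=0.5)    # tracks the moving equilibrium and stays bounded
fast = track(r=1.5)    # ramp too fast: rate-induced tipping
print(f"max |y|: slow ramp {slow:.2f}, fast ramp {fast:.2f}")
```

The slow ramp settles near y = −√(1 − r) and never leaves the neighbourhood of the moving equilibrium, while the fast ramp escapes: the tipping here depends on the rate of forcing, not on crossing any fixed threshold.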

Potential impacts

Schematic of some possible interactions and cascading effects between the Earth's climate system and humanity's social system

Tipping points can have very severe impacts. They can exacerbate the current dangerous impacts of climate change, or give rise to new ones. Some potential tipping points would unfold abruptly, such as disruptions to the Indian monsoon, with severe impacts on food security for hundreds of millions. Other impacts would likely play out over longer timescales, such as the melting of the ice caps. The roughly 10 metres (33 ft) of sea level rise from the combined melt of Greenland and West Antarctica would require moving many cities inland over the course of centuries, but would also accelerate sea level rise this century, with Antarctic ice sheet instability projected to expose 120 million more people to annual floods in a mid-emissions scenario. A collapse of the Atlantic Meridional Overturning Circulation would cause over 10 °C of cooling in parts of Europe, cause drying in Europe, Central America, West Africa, and southern Asia, and lead to about 1 metre (3.3 ft) of sea level rise in the North Atlantic. The impacts of AMOC collapse would have serious implications for food security, with one projection showing reduced yields of key crops across most world regions and, for example, arable agriculture becoming economically infeasible in Britain. These impacts could happen simultaneously in the case of cascading tipping points. A review of abrupt changes over the last 30,000 years showed that tipping points can lead to a large set of cascading impacts in climate, ecological and social systems. For instance, the abrupt termination of the African humid period cascaded: desertification and regime shifts led to the retreat of pastoral societies in North Africa and a change of dynasty in Egypt.

Some scholars have proposed a threshold which, if crossed, could trigger multiple tipping points and self-reinforcing feedback loops that would prevent stabilisation of the climate, causing much greater warming and sea-level rise and leading to severe disruption of ecosystems, society, and economies. This scenario is sometimes called the Hothouse Earth scenario. The researchers proposed that it could unfold beyond a threshold of around 2 °C above pre-industrial levels. However, while this scenario is possible, the existence and value of such a threshold remain speculative, and doubts have been raised about whether tipping points would lock in much extra warming in the shorter term. Decisions taken over the next decade could influence the climate of the planet for tens to hundreds of thousands of years and potentially even lead to conditions which are inhospitable to current human societies. The study also states that there is a possibility of a cascade of tipping points being triggered even if the goal outlined in the Paris Agreement to limit warming to 1.5–2.0 °C (2.7–3.6 °F) is achieved.

Geological timescales

Meltwater pulse 1A was a period of abrupt sea level rise around 14,000 years ago. It may be an example of a tipping point.

The geological record shows many abrupt changes on geological timescales that suggest tipping points may have been crossed in prehistoric times. For instance, the Dansgaard–Oeschger events during the last ice age were periods of abrupt warming (within decades) in Greenland and Europe that may have involved abrupt changes in major ocean currents. During the deglaciation in the early Holocene, sea level did not rise smoothly, but rose abruptly during meltwater pulses. The monsoon in North Africa saw abrupt changes on decadal timescales during the African humid period. This period, spanning from 15,000 to 5,000 years ago, also ended suddenly in a drier state.

Runaway greenhouse effect

A runaway greenhouse effect is a tipping point so extreme that oceans evaporate and the water vapour escapes to space, an irreversible climate state that happened on Venus. A runaway greenhouse effect has virtually no chance of being caused by people. Venus-like conditions on the Earth require a large long-term forcing that is unlikely to occur until the Sun brightens by some tens of percent, which will take 600–700 million years.
  • The paper also provides the same estimate in terms of equivalent emissions: partial dieback would be equivalent to the emissions of 30 billion tonnes of carbon, while total dieback would be equivalent to 75 billion tonnes of carbon.
  • The paper clarifies that this represents a 50% increase of gradual permafrost thaw: it also provides the same estimate in terms of emissions per each degree of warming: 10 billion tonnes of carbon and 14 billion tonnes of carbon equivalent by 2100, and 25/35 billion tonnes of carbon/carbon equivalent by 2300.
  • The loss of these forests would be equivalent to the emissions of 52 billion tons of carbon, but this would be more than offset by the area's albedo effect increasing and reflecting more sunlight.
  • Cognitive science

    From Wikipedia, the free encyclopedia
    Figure illustrating the fields that contributed to the birth of cognitive science, including philosophy of mind, linguistics, neuroscience, artificial intelligence, anthropology, and psychology

    Cognitive science is the interdisciplinary, scientific study of the mind and its processes. It examines the nature, the tasks, and the functions of cognition (in a broad sense). Mental faculties of concern to cognitive scientists include perception, memory, attention, reasoning, language, and emotion. To understand these faculties, cognitive scientists borrow from fields such as psychology, philosophy, artificial intelligence, neuroscience, linguistics, and anthropology. The typical analysis of cognitive science spans many levels of organization, from learning and decision-making to logic and planning; from neural circuitry to modular brain organization. One of the fundamental concepts of cognitive science is that "thinking can best be understood in terms of representational structures in the mind and computational procedures that operate on those structures."

    History

    The cognitive sciences began as an intellectual movement in the 1950s, called the cognitive revolution. Cognitive science has a prehistory traceable back to ancient Greek philosophical texts (see Plato's Meno and Aristotle's De Anima).

    The modern culture of cognitive science can be traced back to the early cyberneticists in the 1930s and 1940s, such as Warren McCulloch and Walter Pitts, who sought to understand the organizing principles of the mind. McCulloch and Pitts developed the first variants of what are now known as artificial neural networks, models of computation inspired by the structure of biological neural networks.

    Another precursor was the early development of the theory of computation and the digital computer in the 1940s and 1950s. Kurt Gödel, Alonzo Church, Claude Shannon, Alan Turing, and John von Neumann were instrumental in these developments. The modern computer, or Von Neumann machine, would play a central role in cognitive science, both as a metaphor for the mind, and as a tool for investigation.

    The first instance of cognitive science experiments being carried out at an academic institution took place at the MIT Sloan School of Management, in a program established by J.C.R. Licklider, who worked within the psychology department and conducted experiments using computer memory as models for human cognition. In 1959, Noam Chomsky published a scathing review of B. F. Skinner's book Verbal Behavior. At the time, Skinner's behaviorist paradigm dominated psychology in the United States. Most psychologists focused on functional relations between stimulus and response, without positing internal representations. Chomsky argued that in order to explain language, we needed a theory like generative grammar, which not only attributed internal representations but characterized their underlying order.

    The term cognitive science was coined by Christopher Longuet-Higgins in his 1973 commentary on the Lighthill report, which concerned the then-current state of artificial intelligence research. In the same decade, the journal Cognitive Science and the Cognitive Science Society were founded. The founding meeting of the Cognitive Science Society was held at the University of California, San Diego in 1979, which resulted in cognitive science becoming an internationally visible enterprise. In 1972, Hampshire College started the first undergraduate education program in Cognitive Science, led by Neil Stillings. In 1982, with assistance from Professor Stillings, Vassar College became the first institution in the world to grant an undergraduate degree in Cognitive Science. In 1986, the first Cognitive Science Department in the world was founded at the University of California, San Diego.

    In the 1970s and early 1980s, as access to computers increased, artificial intelligence research expanded. Researchers such as Marvin Minsky would write computer programs in languages such as LISP to attempt to formally characterize the steps that human beings went through, for instance, in making decisions and solving problems, in the hope of better understanding human thought, and also in the hope of creating artificial minds. This approach is known as "symbolic AI".

    Eventually the limits of the symbolic AI research program became apparent. For instance, it seemed unrealistic to comprehensively list human knowledge in a form usable by a symbolic computer program. The late 1980s and 1990s saw the rise of neural networks and connectionism as a research paradigm. Under this view, often attributed to James McClelland and David Rumelhart, the mind could be characterized as a set of complex associations, represented as a layered network. Critics argue that some phenomena are better captured by symbolic models, and that connectionist models are often so complex as to have little explanatory power. Recently, symbolic and connectionist models have been combined, making it possible to take advantage of both forms of explanation. While both connectionist and symbolic approaches have proven useful for testing hypotheses and exploring approaches to understanding aspects of cognition and lower-level brain functions, neither is biologically realistic, and both therefore suffer from a lack of neuroscientific plausibility. Connectionism has proven useful for exploring computationally how cognition emerges in development and occurs in the human brain, and has provided alternatives to strictly domain-specific/domain-general approaches. For example, scientists such as Jeff Elman, Liz Bates, and Annette Karmiloff-Smith have posited that networks in the brain emerge from the dynamic interaction between them and environmental input.

    Recent developments in quantum computation, including the ability to run quantum circuits on quantum computers such as the IBM Quantum Platform, have accelerated work using elements from quantum mechanics in cognitive models.

    Principles

    Levels of analysis

    A central tenet of cognitive science is that a complete understanding of the mind/brain cannot be attained by studying only a single level. An example would be the problem of remembering a phone number and recalling it later. One approach to understanding this process would be to study behavior through direct observation, or naturalistic observation: a person could be presented with a phone number and asked to recall it after a delay, and the accuracy of the response could then be measured. Another approach would be to study the firing of individual neurons while a person is trying to remember the phone number. Neither of these experiments on its own would fully explain how the process of remembering a phone number works. Even if the technology to map out every neuron in the brain in real time were available, and it were known when each neuron fired, it would still be impossible to know how a particular pattern of firing translates into the observed behavior. Thus an understanding of how these two levels relate to each other is imperative. Francisco Varela, in The Embodied Mind: Cognitive Science and Human Experience, argues that "the new sciences of the mind need to enlarge their horizon to encompass both lived human experience and the possibilities for transformation inherent in human experience". On the classic cognitivist view, this can be provided by a functional-level account of the process. Studying a particular phenomenon from multiple levels creates a better understanding of the processes that occur in the brain to give rise to a particular behavior. Marr gave a famous description of three levels of analysis:

    1. The computational theory, specifying the goals of the computation;
    2. Representation and algorithms, giving a representation of the inputs and outputs and the algorithms which transform one into the other; and
    3. The hardware implementation, or how algorithm and representation may be physically realized.

    Interdisciplinary nature

    Cognitive science is an interdisciplinary field with contributors from various fields, including psychology, neuroscience, linguistics, philosophy of mind, computer science, anthropology and biology. Cognitive scientists work collectively in the hope of understanding the mind and its interactions with the surrounding world, much as other sciences do. The field regards itself as compatible with the physical sciences and uses the scientific method as well as simulation and modeling, often comparing the output of models with aspects of human cognition. As in psychology, there is some doubt whether there is a unified cognitive science, which has led some researchers to prefer 'cognitive sciences' in the plural.

    Many, but not all, who consider themselves cognitive scientists hold a functionalist view of the mind – the view that mental states and processes should be explained by their function, that is, by what they do. According to the multiple realizability account of functionalism, even non-human systems such as robots and computers can be ascribed cognition.

    Cognitive science, the term

    The term "cognitive" in "cognitive science" is used for "any kind of mental operation or structure that can be studied in precise terms" (Lakoff and Johnson, 1999). This conceptualization is very broad, and should not be confused with how "cognitive" is used in some traditions of analytic philosophy, where "cognitive" has to do only with formal rules and truth-conditional semantics.

    The earliest entries for the word "cognitive" in the OED take it to mean roughly "pertaining to the action or process of knowing". The first entry, from 1586, shows the word was at one time used in the context of discussions of Platonic theories of knowledge. Most in cognitive science, however, presumably do not believe their field is the study of anything as certain as the knowledge sought by Plato.

    Scope

    Cognitive science is a large field, and covers a wide array of topics on cognition. However, it should be recognized that cognitive science has not always been equally concerned with every topic that might bear on the nature and operation of minds. Classical cognitivists largely de-emphasized or avoided social and cultural factors, embodiment, emotion, consciousness, animal cognition, and comparative and evolutionary psychology. With the decline of behaviorism, however, internal states such as affects and emotions, as well as awareness and covert attention, became approachable again. For example, situated and embodied cognition theories take into account the current state of the environment as well as the role of the body in cognition. With the new emphasis on information processing, the hallmark of psychological theory was no longer observable behavior but the modeling or recording of mental states.

    Below are some of the main topics that cognitive science is concerned with; see List of cognitive science topics for a more exhaustive list.

    Artificial intelligence

    Artificial intelligence (AI) involves the study of cognitive phenomena in machines. One of the practical goals of AI is to implement aspects of human intelligence in computers. Computers are also widely used as a tool with which to study cognitive phenomena. Computational modeling uses simulations to study how human intelligence may be structured. (See § Computational modeling.)

    There is some debate in the field as to whether the mind is best viewed as a huge array of small but individually feeble elements (i.e. neurons), or as a collection of higher-level structures such as symbols, schemes, plans, and rules. The former view uses connectionism to study the mind, whereas the latter emphasizes symbolic artificial intelligence. One way to view the issue is whether it is possible to accurately simulate a human brain on a computer without accurately simulating the neurons that make up the human brain.

    Attention

    Attention is the selection of important information. The human mind is bombarded with millions of stimuli and must have a way of deciding which information to process. Attention is sometimes seen as a spotlight, meaning one can only shine the light on a particular set of information. Experiments that support this metaphor include the dichotic listening task (Cherry, 1957) and studies of inattentional blindness (Mack and Rock, 1998). In the dichotic listening task, subjects are presented with two different messages, one in each ear, and told to focus on only one of the messages. At the end of the experiment, when asked about the content of the unattended message, subjects cannot report it.

    The psychological construct of attention is sometimes confused with the concept of intentionality because of semantic ambiguity in their definitions. At the beginning of experimental research on attention, Wilhelm Wundt defined the term as "that psychical process, which is operative in the clear perception of the narrow region of the content of consciousness." His experiments showed the limits of attention in space and time: about 3–6 letters during an exposure of 1/10 s. Because the notion has developed over a century of research within the framework of this original meaning, a definition of attention should preserve the main features initially attributed to the term: it is a process of controlling thought that continues over time. Whereas intentionality is the power of minds to be about something, attention is the concentration of awareness on some phenomenon during a period of time, a concentration that is necessary for clear perception of the narrow region of the content of consciousness and that allows this focus to be controlled.

    The significance of knowledge about the scope of attention for studying cognition is that this scope underlies intellectual functions such as apprehension, judgment, reasoning, and working memory: as the scope of attention develops, it enlarges the set of details the mind can rely on when it perceives, remembers, considers, and evaluates in making decisions. The ground of this statement is that the more details (associated with an event) the mind can grasp for comparison, association, and categorization, the more closely its apprehension, judgment, and reasoning about the event accord with reality. According to Latvian professor Sandra Mihailova and professor Igor Val Danilov, the more elements of a phenomenon (or phenomena) the mind can keep in the scope of attention simultaneously, the more reasonable combinations within that event it can achieve, enhancing the probability of better understanding the features and particularity of the phenomenon. For example, three items in the focal point of consciousness yield six possible arrangements (3 factorial) and four items yield 24 (4 factorial); the number becomes significant with six items, which allow 720 possible arrangements (6 factorial).
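    The factorial counts cited above can be checked directly (note that n! counts ordered arrangements of n items):

```python
from math import factorial

# Number of ordered arrangements of n items held simultaneously
# in the focus of attention, as cited in the text.
for n in (3, 4, 6):
    print(f"{n} items -> {factorial(n)} arrangements")
# 3 items -> 6, 4 items -> 24, 6 items -> 720
```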

    Embodied cognition

    Embodied cognition approaches to cognitive science emphasize the role of the body and environment in cognition. This includes both neural and extra-neural bodily processes, and factors that range from affective and emotional processes, to posture, motor control, proprioception, and kinaesthesis, to autonomic processes that involve heartbeat and respiration, to the role of the enteric gut microbiome. It also includes accounts of how the body engages with or is coupled to social and physical environments. 4E cognition includes a broad range of views about brain-body-environment interaction, from causal embeddedness to stronger claims about how the mind extends to include tools and instruments, as well as the role of social interactions, action-oriented processes, and affordances. 4E theories range from those closer to classic cognitivism (so-called "weak" embodied cognition) to stronger extended and enactive versions that are sometimes referred to as radical embodied cognitive science.

    A hypothesis of pre-perceptual multimodal integration supports embodied cognition approaches and reconciles two competing viewpoints about cognition and the development of emotions, the naturalist and the constructivist. According to this hypothesis, which is supported by empirical data, cognition and emotion development are initiated by the association of affective cues with stimuli responsible for triggering the neuronal pathways of simple reflexes. This pre-perceptual multimodal integration can succeed owing to neuronal coherence in mother-child dyads beginning from pregnancy. These conjunctions of cognitive-reflex and emotion-reflex stimuli further form simple innate neuronal assemblies, shaping the cognitive and emotional neuronal patterns in statistical learning that are continuously connected with the neuronal pathways of reflexes.

    Knowledge and processing of language

    A well known example of a phrase structure tree. This is one way of representing human language that shows how different components are organized hierarchically.

    The ability to learn and understand language is an extremely complex process. Language is acquired within the first few years of life, and all humans under normal circumstances are able to acquire language proficiently. A major driving force in the theoretical linguistic field is discovering the nature that language must have in the abstract in order to be learned in such a fashion. Some of the driving research questions in studying how the brain itself processes language include: (1) To what extent is linguistic knowledge innate or learned?, (2) Why is it more difficult for adults to acquire a second language than it is for infants to acquire their first language?, and (3) How are humans able to understand novel sentences?

    The study of language processing ranges from the investigation of the sound patterns of speech to the meaning of words and whole sentences. Linguistics often divides language processing into orthography, phonetics, phonology, morphology, syntax, semantics, and pragmatics. Many aspects of language can be studied from each of these components and from their interaction.
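    A phrase structure tree like the one pictured above can be encoded compactly in code. The nested-tuple representation, the example sentence, and the category labels below are illustrative choices, not a standard prescribed by any particular theory.

```python
# A phrase structure tree as nested (label, children...) tuples:
# S -> NP VP, NP -> Det N, VP -> V NP.
tree = ("S",
        ("NP", ("Det", "the"), ("N", "dog")),
        ("VP", ("V", "chased"),
               ("NP", ("Det", "the"), ("N", "cat"))))

def leaves(node):
    """Collect the terminal words of the tree in left-to-right order."""
    if isinstance(node, str):
        return [node]
    _label, *children = node   # skip the category label in position 0
    words = []
    for child in children:
        words.extend(leaves(child))
    return words

print(" ".join(leaves(tree)))  # the dog chased the cat
```

    Walking the hierarchy this way recovers the linear sentence, while the nesting records how its components are organized.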

    The study of language processing in cognitive science is closely tied to the field of linguistics. Linguistics was traditionally studied as a part of the humanities, including studies of history, art and literature. In the last fifty years or so, more and more researchers have studied knowledge and use of language as a cognitive phenomenon, the main problems being how knowledge of language can be acquired and used, and what precisely it consists of. Linguists have found that, while humans form sentences in ways apparently governed by very complex systems, they are remarkably unaware of the rules that govern their own speech. Thus linguists must resort to indirect methods to determine what those rules might be, if indeed rules as such exist. In any event, if speech is indeed governed by rules, they appear to be opaque to any conscious consideration.

    Learning and development

    Learning and development are the processes by which we acquire knowledge and information over time. Infants are born with little or no knowledge (depending on how knowledge is defined), yet they rapidly acquire the ability to use language, walk, and recognize people and objects. Research in learning and development aims to explain the mechanisms by which these processes might take place.

    A major question in the study of cognitive development is the extent to which certain abilities are innate or learned. This is often framed in terms of the nature and nurture debate. The nativist view emphasizes that certain features are innate to an organism and are determined by its genetic endowment. The empiricist view, on the other hand, emphasizes that certain abilities are learned from the environment. Although clearly both genetic and environmental input is needed for a child to develop normally, considerable debate remains about how genetic information might guide cognitive development. In the area of language acquisition, for example, some (such as Steven Pinker) have argued that specific information containing universal grammatical rules must be contained in the genes, whereas others (such as Jeffrey Elman and colleagues in Rethinking Innateness) have argued that Pinker's claims are biologically unrealistic. They argue that genes determine the architecture of a learning system, but that specific "facts" about how grammar works can only be learned as a result of experience.

    Memory

    Memory allows us to store information for later retrieval. Memory is often thought of as consisting of both a long-term and short-term store. Long-term memory allows us to store information over prolonged periods (days, weeks, years). We do not yet know the practical limit of long-term memory capacity. Short-term memory allows us to store information over short time scales (seconds or minutes).

    Memory is also often grouped into declarative and procedural forms. Declarative memory—grouped into subsets of semantic and episodic forms of memory—refers to our memory for facts and specific knowledge, specific meanings, and specific experiences (e.g. "Are apples food?", or "What did I eat for breakfast four days ago?"). Procedural memory allows us to remember actions and motor sequences (e.g. how to ride a bicycle) and is often dubbed implicit knowledge or memory.

    Cognitive scientists study memory just as psychologists do, but tend to focus more on how memory bears on cognitive processes, and on the interrelationship between cognition and memory. For example, what mental processes does a person go through to retrieve a long-lost memory? And what differentiates the cognitive process of recognition (seeing hints of something before remembering it, or memory in context) from that of recall (retrieving a memory, as in "fill-in-the-blank")?
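    The recognition/recall contrast can be caricatured in a few lines. This is a deliberately simplistic sketch in which a dictionary stands in for a memory store; it is not a cognitive model, and the stored items are invented.

```python
# Toy memory store: cues mapped to stored traces.
memory = {"apple": "fruit eaten at breakfast",
          "bicycle": "rode to work on Tuesday"}

def recognize(cue):
    # Recognition: judge whether the cue matches any stored trace.
    return cue in memory

def recall(cue):
    # Recall: retrieve the stored content from the cue ("fill in the blank").
    return memory.get(cue)

print(recognize("apple"))   # True
print(recall("bicycle"))    # rode to work on Tuesday
```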

    Perception and action

    The Necker cube, an example of an optical illusion
    An optical illusion. The square A is exactly the same shade of gray as square B. See checker shadow illusion.

    Perception is the ability to take in information via the senses and process it in some way. Vision and hearing are two dominant senses that allow us to perceive the environment. Some questions in the study of visual perception, for example, include: (1) How are we able to recognize objects?, (2) Why do we perceive a continuous visual environment, even though we only see small bits of it at any one time? One tool for studying visual perception is to look at how people process optical illusions. The image on the right of a Necker cube is an example of a bistable percept, that is, the cube can be interpreted as being oriented in two different directions.

    The study of haptic (tactile), olfactory, and gustatory stimuli also falls into the domain of perception.

    Action is taken to refer to the output of a system. In humans, this is accomplished through motor responses. Spatial planning and movement, speech production, and complex motor movements are all aspects of action.

    Consciousness

    17th century representation of consciousness by Robert Fludd, an English Paracelsian physician

    Consciousness, at its simplest, is awareness of states or objects either internal to one's self or in one's external environment. However, its nature has led to millennia of explanations, analyses, and debate among philosophers, scientists, and theologians. Opinions differ about what exactly needs to be studied, or can even be considered consciousness. In some explanations, it is synonymous with mind, and at other times, an aspect of it.

    In the past, consciousness meant one's "inner life": the world of introspection, private thought, imagination, and volition. Today, it often includes any kind of cognition, experience, feeling, or perception. It may be awareness, awareness of awareness, metacognition, or self-awareness, either continuously changing or not. There is also a medical definition that helps, for example, to discern "coma" from other states. The disparate range of research, notions, and speculations raises some curiosity about whether the right questions are being asked.

    Examples of the range of descriptions, definitions and explanations are: ordered distinction between self and environment, simple wakefulness, one's sense of selfhood or soul explored by "looking within", being a metaphorical "stream" of contents, or being a mental state, mental event, or mental process of the brain.

    Research methods

    Many different methodologies are used to study cognitive science. As the field is highly interdisciplinary, research often cuts across multiple areas of study, drawing on research methods from psychology, neuroscience, computer science and systems theory.

    Behavioral experiments

    In order to have a description of what constitutes intelligent behavior, one must study behavior itself. This type of research is closely tied to that in cognitive psychology and psychophysics. By measuring behavioral responses to different stimuli, one can understand something about how those stimuli are processed. Lewandowski & Strohmetz (2009) reviewed a collection of innovative uses of behavioral measurement in psychology including behavioral traces, behavioral observations, and behavioral choice. Behavioral traces are pieces of evidence that indicate behavior occurred, but the actor is not present (e.g., litter in a parking lot or readings on an electric meter). Behavioral observations involve the direct witnessing of the actor engaging in the behavior (e.g., watching how close a person sits next to another person). Behavioral choices are when a person selects between two or more options (e.g., voting behavior, choice of a punishment for another participant).

    • Reaction time. The time between the presentation of a stimulus and an appropriate response can indicate differences between two cognitive processes, and can indicate some things about their nature. For example, if in a search task the reaction times vary proportionally with the number of elements, this suggests that the cognitive process of searching is serial rather than parallel.
    • Psychophysical responses. Psychophysical experiments are an old psychological technique, which has been adopted by cognitive psychology. They typically involve making judgments of some physical property, e.g. the loudness of a sound. Correlation of subjective scales between individuals can show cognitive or sensory biases as compared to actual physical measurements. Some examples include:
      • sameness judgments for colors, tones, textures, etc.
      • threshold differences for colors, tones, textures, etc.
    • Eye tracking. This methodology is used to study a variety of cognitive processes, most notably visual perception and language processing. The fixation point of the eyes is linked to an individual's focus of attention. Thus, by monitoring eye movements, we can study what information is being processed at a given time. Eye tracking allows us to study cognitive processes on extremely short time scales. Eye movements reflect online decision making during a task, and they provide us with some insight into the ways in which those decisions may be processed.
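    The reaction-time signature of serial search mentioned above can be simulated. The timing constants and noise level below are invented for illustration; the point is only that mean reaction time grows roughly linearly with set size under a serial-search assumption.

```python
import random

BASE_MS, PER_ITEM_MS = 300.0, 40.0  # illustrative constants, not real data

def simulated_rt(set_size, rng):
    # Serial self-terminating search: the target is found after checking
    # a uniformly random number of items, plus Gaussian response noise.
    checked = rng.randint(1, set_size)
    return BASE_MS + PER_ITEM_MS * checked + rng.gauss(0, 10)

rng = random.Random(0)
for n in (4, 8, 16):
    mean_rt = sum(simulated_rt(n, rng) for _ in range(2000)) / 2000
    print(f"set size {n:2d}: mean RT ~ {mean_rt:.0f} ms")
```

    A parallel process, by contrast, would predict mean reaction times roughly flat across set sizes.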

    Brain imaging

    Image of the human head with the brain. The arrow indicates the position of the hypothalamus.

    Brain imaging involves analyzing activity within the brain while performing various tasks. This allows us to link behavior and brain function to help understand how information is processed. Different types of imaging techniques vary in their temporal (time-based) and spatial (location-based) resolution. Brain imaging is often used in cognitive neuroscience.

    • Single-photon emission computed tomography and positron emission tomography. SPECT and PET use radioactive isotopes, which are injected into the subject's bloodstream and taken up by the brain. By observing which areas of the brain take up the radioactive isotope, we can see which areas of the brain are more active than other areas. PET has similar spatial resolution to fMRI, but it has extremely poor temporal resolution.
    • Electroencephalography. EEG measures the electrical fields generated by large populations of neurons in the cortex by placing a series of electrodes on the scalp of the subject. This technique has an extremely high temporal resolution, but a relatively poor spatial resolution.
    • Functional magnetic resonance imaging. fMRI measures the relative amount of oxygenated blood flowing to different parts of the brain. More oxygenated blood in a particular region is assumed to correlate with an increase in neural activity in that part of the brain. This allows us to localize particular functions within different brain regions. fMRI has moderate spatial and temporal resolution.
    • Optical imaging. This technique uses infrared transmitters and receivers to measure the amount of light reflected by blood near different areas of the brain. Since oxygenated and deoxygenated blood reflect light in different amounts, we can study which areas are more active (i.e., those that have more oxygenated blood). Optical imaging has moderate temporal resolution, but poor spatial resolution. It also has the advantage that it is extremely safe and can be used to study infants' brains.
    • Magnetoencephalography. MEG measures magnetic fields resulting from cortical activity. It is similar to EEG, except that it has improved spatial resolution since the magnetic fields it measures are not as blurred or attenuated by the scalp, meninges and so forth as the electrical activity measured in EEG is. MEG uses SQUID sensors to detect tiny magnetic fields.
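    The trade-offs described above can be collected into a small lookup table. The qualitative labels are taken from the descriptions in this section; the entry for MEG's temporal resolution (comparable to EEG's) is an added assumption.

```python
# Qualitative resolution trade-offs of the imaging methods above.
RESOLUTION = {
    "PET":     {"spatial": "similar to fMRI",  "temporal": "extremely poor"},
    "EEG":     {"spatial": "relatively poor",  "temporal": "extremely high"},
    "fMRI":    {"spatial": "moderate",         "temporal": "moderate"},
    "optical": {"spatial": "poor",             "temporal": "moderate"},
    "MEG":     {"spatial": "better than EEG",  "temporal": "high (like EEG)"},
}

for method, res in RESOLUTION.items():
    print(f"{method}: spatial={res['spatial']}, temporal={res['temporal']}")
```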

    Computational modeling

    An artificial neural network with two layers

    Computational models require a mathematically and logically formal representation of a problem. Computer models are used in the simulation and experimental verification of different specific and general properties of intelligence. Computational modeling can help us understand the functional organization of a particular cognitive phenomenon. Approaches to cognitive modeling can be categorized as: (1) symbolic, on abstract mental functions of an intelligent mind by means of symbols; (2) subsymbolic, on the neural and associative properties of the human brain; and (3) across the symbolic–subsymbolic border, including hybrid.

    • Symbolic modeling evolved from the computer science paradigms using the technologies of knowledge-based systems, as well as from a philosophical perspective (e.g. "Good Old-Fashioned Artificial Intelligence" (GOFAI)). Such models were developed by the first cognitive researchers and later used in information engineering for expert systems. Since the early 1990s, symbolic modeling has been generalized in systemics for the investigation of functional human-like intelligence models, such as personoids, and, in parallel, developed as the SOAR environment. Recently, especially in the context of cognitive decision-making, symbolic cognitive modeling has been extended to the socio-cognitive approach, including social and organizational cognition, interrelated with a sub-symbolic non-conscious layer.
    • Subsymbolic modeling includes connectionist/neural network models. Connectionism relies on the idea that the mind/brain is composed of simple nodes and that its problem-solving capacity derives from the connections between them. Neural nets are textbook implementations of this approach. Some critics of this approach feel that while such models come closer to biological reality as a representation of how the system works, they lack explanatory power: even in systems endowed with simple connection rules, the emerging complexity makes them less interpretable at the connection level than they apparently are at the macroscopic level.
    • Other approaches gaining in popularity include (1) dynamical systems theory, (2) mapping symbolic models onto connectionist models (neural-symbolic integration or hybrid intelligent systems), and (3) Bayesian models, which are often drawn from machine learning.
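    The subsymbolic style can be sketched with a minimal two-layer feedforward network of threshold units, like the one in the figure above. The weights below are hand-picked so that the network computes XOR, a classic task a single-layer network cannot solve; real connectionist models learn such weights from data rather than having them supplied.

```python
def step(x):
    # Threshold activation: the unit fires (1) if its net input is positive.
    return 1 if x > 0 else 0

def layer(inputs, weights, biases):
    # Each unit computes a thresholded weighted sum of its inputs.
    return [step(sum(w * i for w, i in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# Hidden layer: unit 0 fires for "x OR y", unit 1 fires for "x AND y".
HIDDEN_W, HIDDEN_B = [[1, 1], [1, 1]], [-0.5, -1.5]
# Output layer: fires for "OR and not AND", i.e. XOR.
OUT_W, OUT_B = [[1, -2]], [-0.5]

def xor_net(x, y):
    hidden = layer([x, y], HIDDEN_W, HIDDEN_B)
    return layer(hidden, OUT_W, OUT_B)[0]

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))
```

    Note how no single unit "represents" XOR; the behavior emerges from the pattern of connections, which is the connectionist point.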

    All the above approaches tend either to be generalized to the form of integrated computational models of a synthetic/abstract intelligence (i.e. cognitive architecture) in order to be applied to the explanation and improvement of individual and social/organizational decision-making and reasoning or to focus on single simulative programs (or microtheories/"middle-range" theories) modelling specific cognitive faculties (e.g. vision, language, categorization etc.).

    Neurobiological methods

    Research methods borrowed directly from neuroscience and neuropsychology can also help us to understand aspects of intelligence. These methods allow us to understand how intelligent behavior is implemented in a physical system.

    Key findings

    Cognitive science has given rise to models of human cognitive bias and risk perception, and has been influential in the development of behavioral finance, part of economics. It has also given rise to a new theory of the philosophy of mathematics (related to denotational mathematics), and many theories of artificial intelligence, persuasion and coercion. It has made its presence known in the philosophy of language and epistemology as well as constituting a substantial wing of modern linguistics. Fields of cognitive science have been influential in understanding the brain's particular functional systems (and functional deficits) ranging from speech production to auditory processing and visual perception. It has made progress in understanding how damage to particular areas of the brain affect cognition, and it has helped to uncover the root causes and results of specific dysfunction, such as dyslexia, anopsia, and hemispatial neglect.

    Notable researchers

    Name Year of birth Year of contribution Contribution(s)
    David Chalmers 1966 1995 Dualism, hard problem of consciousness
    Daniel Dennett 1942 1987 Offered a computational systems perspective (multiple drafts model)
    John Searle 1932 1980 Chinese room
    Douglas Hofstadter 1945 1979 Gödel, Escher, Bach
    Jerry Fodor 1935 1968, 1975 Functionalism
    Alan Baddeley 1934 1974 Baddeley's model of working memory
    Marvin Minsky 1927 1970s, early 1980s Wrote computer programs in languages such as LISP to attempt to formally characterize the steps that human beings go through, such as making decisions and solving problems
    Christopher Longuet-Higgins 1923 1973 Coined the term cognitive science
    Noam Chomsky 1928 1959 Published a review of B. F. Skinner's book Verbal Behavior, which helped launch cognitivism against the then-dominant behaviorism
    George Miller 1920 1956 Wrote about the capacities of human thinking through mental representations
    Herbert Simon 1916 1956 Co-created Logic Theory Machine and General Problem Solver with Allen Newell, EPAM (Elementary Perceiver and Memorizer) theory, organizational decision-making
    John McCarthy 1927 1955 Coined the term artificial intelligence and organized the famous Dartmouth conference in Summer 1956, which started AI as a field
    McCulloch and Pitts 1930s–1940s Developed early artificial neural networks
    Lila R. Gleitman 1929 1970s-2010s Wide-ranging contributions to understanding the cognition of language acquisition, including syntactic bootstrapping theory
    Eleanor Rosch 1938 1976 Development of the Prototype Theory of categorisation
    Philip N. Johnson-Laird 1936 1980 Introduced the idea of mental models in cognitive science
    Dedre Gentner 1944 1983 Development of the Structure-mapping Theory of analogical reasoning
    Allen Newell 1927 1990 Development of the field of Cognitive architecture in cognitive modelling and artificial intelligence
    Annette Karmiloff-Smith 1938 1992 Integrating neuroscience and computational modelling into theories of cognitive development
    David Marr 1945 1990 Proponent of the Three-Level Hypothesis of levels of analysis of computational systems
    Peter Gärdenfors 1949 2000 Creator of the conceptual space framework used in cognitive modelling and artificial intelligence.
    Linda B. Smith 1951 1993 Together with Esther Thelen, created a dynamical systems approach to understanding cognitive development

    Some of the more recognized names in cognitive science are usually either the most controversial or the most cited. Within philosophy, some familiar names include Daniel Dennett, who writes from a computational systems perspective, John Searle, known for his controversial Chinese room argument, and Jerry Fodor, who advocates functionalism.

    Others include David Chalmers, who advocates Dualism and is also known for articulating the hard problem of consciousness, and Douglas Hofstadter, famous for writing Gödel, Escher, Bach, which questions the nature of words and thought.

    In the realm of linguistics, Noam Chomsky and George Lakoff have been influential (both have also become notable as political commentators). In artificial intelligence, Marvin Minsky, Herbert A. Simon, and Allen Newell are prominent.

    Popular names in the discipline of psychology include George A. Miller, James McClelland, Philip Johnson-Laird, Lawrence Barsalou, Vittorio Guidano, Howard Gardner and Steven Pinker. Anthropologists Dan Sperber, Edwin Hutchins, Bradd Shore, James Wertsch and Scott Atran have been involved in collaborative projects with cognitive and social psychologists, political scientists and evolutionary biologists in attempts to develop general theories of culture formation, religion, and political association.

    Computational theories (with models and simulations) have also been developed, by David Rumelhart, James McClelland and Philip Johnson-Laird.

    Epistemics

    Epistemics is a term coined in 1969 by the University of Edinburgh with the foundation of its School of Epistemics. Epistemics is to be distinguished from epistemology in that epistemology is the philosophical theory of knowledge, whereas epistemics signifies the scientific study of knowledge.

    Christopher Longuet-Higgins has defined it as "the construction of formal models of the processes (perceptual, intellectual, and linguistic) by which knowledge and understanding are achieved and communicated." In his 1978 essay "Epistemics: The Regulative Theory of Cognition", Alvin I. Goldman claims to have coined the term "epistemics" to describe a reorientation of epistemology. Goldman maintains that his epistemics is continuous with traditional epistemology and the new term is only to avoid opposition. Epistemics, in Goldman's version, differs only slightly from traditional epistemology in its alliance with the psychology of cognition; epistemics stresses the detailed study of mental processes and information-processing mechanisms that lead to knowledge or beliefs.

    In the mid-1980s, the School of Epistemics was renamed as The Centre for Cognitive Science (CCS). In 1998, CCS was incorporated into the University of Edinburgh's School of Informatics.

    Binding problem in cognitive science

    One of the core aims of cognitive science is to achieve an integrated theory of cognition. This requires integrative mechanisms explaining how the information processing that occurs simultaneously in spatially segregated (sub-)cortical areas in the brain is coordinated and bound together to give rise to coherent perceptual and symbolic representations. One approach is to solve this "binding problem" (that is, the problem of dynamically representing conjunctions of informational elements, from the most basic perceptual representations ("feature binding") to the most complex cognitive representations, like symbol structures ("variable binding")) by means of integrative synchronization mechanisms. In other words, one of the coordinating mechanisms appears to be the temporal (phase) synchronization of neural activity, based on dynamical self-organizing processes in neural networks, as described by the binding-by-synchrony (BBS) hypothesis from neurophysiology. Connectionist cognitive neuroarchitectures have been developed that use integrative synchronization mechanisms to solve this binding problem in perceptual cognition and in language cognition. In perceptual cognition the problem is to explain how elementary object properties and object relations, like the object color or the object form, can be dynamically bound together or integrated into a representation of the perceptual object by means of a synchronization mechanism ("feature binding", "feature linking"). In language cognition the problem is to explain how semantic concepts and syntactic roles can be dynamically bound together or integrated into complex cognitive representations like systematic and compositional symbol structures and propositions by means of a synchronization mechanism ("variable binding") (see also the "symbolism vs. connectionism debate" in connectionism).
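    The phase synchronization that the binding-by-synchrony hypothesis appeals to can be sketched with two coupled oscillators (a Kuramoto-style model): with sufficient coupling, oscillators with different natural frequencies pull their phases together. All parameters below are illustrative assumptions, not neurophysiological values.

```python
import math

def simulate(coupling, steps=3000, dt=0.001):
    """Integrate two Kuramoto oscillators; return their final phase gap."""
    theta = [0.0, 2.0]       # initial phases (radians)
    omega = [10.0, 11.0]     # natural frequencies (rad/s)
    for _ in range(steps):
        d0 = omega[0] + coupling * math.sin(theta[1] - theta[0])
        d1 = omega[1] + coupling * math.sin(theta[0] - theta[1])
        theta[0] += d0 * dt
        theta[1] += d1 * dt
    # Wrap the phase difference into [-pi, pi] and report its magnitude.
    diff = (theta[1] - theta[0] + math.pi) % (2 * math.pi) - math.pi
    return abs(diff)

print("uncoupled phase gap:", round(simulate(0.0), 2))
print("coupled phase gap:  ", round(simulate(5.0), 2))
```

    With coupling, the two units phase-lock (small residual gap); without it, their phases drift apart. In BBS-style accounts, such locking is what marks two neural populations as representing features of the same object.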

    However, despite significant advances toward an integrated theory of cognition (specifically on the binding problem), the debate on the issue of beginning cognition is still in progress. From the different perspectives noted above, this problem can be reduced to the question of how organisms at the simple-reflexes stage of development overcome the threshold of environmental chaos in sensory stimuli: electromagnetic waves, chemical interactions, and pressure fluctuations. The so-called Primary Data Entry (PDE) thesis casts doubt on the ability of such an organism to overcome this cue threshold on its own. In terms of mathematical tools, the PDE thesis underlines the insuperably high threshold posed by the cacophony of environmental stimuli (stimulus noise) for young organisms at the onset of life. It argues that the temporal (phase) synchronization of neural activity based on dynamical self-organizing processes in neural networks, that is, any dynamic binding or integration into a representation of a perceptual object by means of a synchronization mechanism, cannot by itself help organisms distinguish the relevant cue (the informative stimulus) and overcome this noise threshold.
