Tuesday, April 7, 2015

Futures studies


From Wikipedia, the free encyclopedia


Moore's law is an example of futures studies; it is a statistical collection of past and present trends with the goal of accurately extrapolating future trends.
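
As a minimal illustration of this kind of extrapolation (a sketch only: the transistor counts and the choice of an exponential model are assumptions for the example, not measured data), one can fit a constant-doubling-time trend to past values and project it forward:

    # Sketch: extrapolating an exponential, Moore's-law-style trend.
    # The transistor counts below are illustrative placeholders, not real data.
    import math

    years = [1971, 1981, 1991, 2001, 2011]
    counts = [2.3e3, 2.9e4, 1.2e6, 4.2e7, 2.6e9]   # assumed example values

    # Least-squares fit of log2(count) = a + b*year, so b is doublings per year.
    n = len(years)
    xbar = sum(years) / n
    ybar = sum(math.log2(c) for c in counts) / n
    b = sum((x - xbar) * (math.log2(c) - ybar) for x, c in zip(years, counts)) \
        / sum((x - xbar) ** 2 for x in years)
    a = ybar - b * xbar

    print("approximate doubling time: %.1f years" % (1.0 / b))
    print("extrapolated 2021 count:   %.2e" % (2 ** (a + b * 2021)))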

Futures studies (also called futurology and futurism) is the study of possible, probable, and preferable futures, and of the worldviews and myths that underlie them. There is a debate as to whether this discipline is an art or a science. In general, it can be considered a branch of the social sciences and parallel to the field of history: history studies the past, while futures studies considers the future. Futures studies (colloquially called "futures" by many of the field's practitioners) seeks to understand what is likely to continue and what could plausibly change. Part of the discipline thus seeks a systematic and pattern-based understanding of past and present, and aims to determine the likelihood of future events and trends.[1] Unlike the physical sciences, where a narrower, more specified system is studied, futures studies concerns a much larger and more complex world system, and its methodology and knowledge base are much less established than those of the natural sciences or even of social sciences such as sociology, economics, and political science.

Overview

Futures studies is an interdisciplinary field, studying yesterday's and today's changes, and aggregating and analyzing both lay and professional strategies and opinions with respect to tomorrow. It includes analyzing the sources, patterns, and causes of change and stability in an attempt to develop foresight and to map possible futures. Around the world the field is variously referred to as futures studies, strategic foresight, futuristics, futures thinking, futuring, futurology, and futurism. Futures studies and strategic foresight are the academic field's most commonly used terms in the English-speaking world.

Foresight was the original term and was first used in this sense by H. G. Wells in 1932.[2] "Futurology" is a term common in encyclopedias, though it is used almost exclusively by nonpractitioners today, at least in the English-speaking world. "Futurology" is defined as the "study of the future."[3] The term was coined in the mid-1940s by the German professor Ossip K. Flechtheim,[citation needed] who proposed it as a new branch of knowledge that would include a new science of probability. The term may have fallen from favor in recent decades because modern practitioners stress the importance of alternative and plural futures rather than one monolithic future, and the limitations of prediction and probability versus the creation of possible and preferable futures.[citation needed]

Three factors usually distinguish futures studies from the research conducted by other disciplines (although all of these disciplines overlap, to differing degrees). First, futures studies often examines not only possible but also probable, preferable, and "wild card" futures. Second, futures studies typically attempts to gain a holistic or systemic view based on insights from a range of different disciplines. Third, futures studies challenges and unpacks the assumptions behind dominant and contending views of the future. The future thus is not empty but fraught with hidden assumptions. For example, many people expect the collapse of the Earth's ecosystem in the near future, while others believe the current ecosystem will survive indefinitely. A foresight approach would seek to analyse and so highlight the assumptions underpinning such views.

Futures studies does not generally focus on short-term predictions, such as interest rates over the next business cycle, or on the concerns of managers or investors with short-term time horizons. Most strategic planning, which develops operational plans for preferred futures with time horizons of one to three years, is also not considered futures. Plans and strategies with longer time horizons that specifically attempt to anticipate possible future events, however, are definitely part of the field.

The futures field also excludes those who make future predictions through professed supernatural means. At the same time, it does seek to understand the models such groups use and the interpretations they give to these models.

History

Johan Galtung and Sohail Inayatullah[4] argue in Macrohistory and Macrohistorians that the search for grand patterns of social change goes all the way back to Ssu-Ma Chien (145–90 BC) and his theory of the cycles of virtue, although the work of Ibn Khaldun (1332–1406), such as The Muqaddimah,[5] would be an example that is perhaps more intelligible to modern sociology. Some intellectual foundations of futures studies appeared in the mid-19th century; according to Wendell Bell, Comte's discussion of the metapatterns of social change presages futures studies as a scholarly dialogue.[6]

The first works attempting systematic predictions of the future were written in the 18th century. Memoirs of the Twentieth Century, written by Samuel Madden in 1733, takes the form of a series of diplomatic letters written in 1997 and 1998 from British representatives in the foreign cities of Constantinople, Rome, Paris, and Moscow.[7] However, the technology of the 20th century is identical to that of Madden's own era; the focus is instead on the political and religious state of the world in the future. Madden went on to write The Reign of George VI, 1900 to 1925, where (in the context of the boom in canal construction at the time) he envisioned a large network of waterways that would radically transform patterns of living: "Villages grew into towns and towns became cities".[8]

The genre of science fiction became established towards the end of the 19th century, with notable writers, including Jules Verne and H. G. Wells, setting their stories in an imagined future world.

Origins


H. G. Wells first advocated for 'future studies' in a lecture delivered in 1902.

According to W. Warren Wagar, the founder of future studies was H. G. Wells. His Anticipations of the Reaction of Mechanical and Scientific Progress Upon Human Life and Thought: An Experiment in Prophecy was first published serially in The Fortnightly Review in 1901.[9] Anticipating what the world would be like in the year 2000, the book is interesting both for its hits (trains and cars resulting in the dispersion of population from cities to suburbs; moral restrictions declining as men and women seek greater sexual freedom; the defeat of German militarism; and the existence of a European Union) and its misses (he did not expect successful aircraft before 1950, and averred that "my imagination refuses to see any sort of submarine doing anything but suffocate its crew and founder at sea").[10][11]

Moving from narrow technological predictions, Wells envisioned the eventual collapse of the capitalist world system after a series of destructive total wars. From this havoc would ultimately emerge a world of peace and plenty, controlled by competent technocrats.[9]

The work was a bestseller, and Wells was invited to deliver a lecture at the Royal Institution in 1902, entitled The Discovery of the Future. The lecture was well-received and was soon republished in book form. He advocated for the establishment of a new academic study of the future that would be grounded in scientific methodology rather than just speculation. He argued that a scientifically ordered vision of the future "will be just as certain, just as strictly science, and perhaps just as detailed as the picture that has been built up within the last hundred years to make the geological past." Although conscious of the difficulty in arriving at entirely accurate predictions, he thought that it would still be possible to arrive at a "working knowledge of things in the future".[9]

In his fictional works, Wells predicted the invention and use of the atomic bomb in The World Set Free (1914).[12] In The Shape of Things to Come (1933) he depicted the impending world war and cities destroyed by aerial bombardment.[13] He nevertheless continued to advocate for the establishment of a science of the future, and in a 1933 BBC broadcast he called for the establishment of "Departments and Professors of Foresight", foreshadowing the development of modern academic futures studies by approximately 40 years.[2]

Emergence

Futures studies emerged as an academic discipline in the mid-1960s. First-generation futurists included Herman Kahn, an American Cold War strategist who wrote On Thermonuclear War (1960), Thinking about the unthinkable (1962) and The Year 2000: a framework for speculation on the next thirty-three years (1967); Bertrand de Jouvenel, a French economist who founded Futuribles International in 1960; and Dennis Gabor, a Hungarian-British scientist who wrote Inventing the Future (1963) and The Mature Society. A View of the Future (1972).[6]

Futures studies had a parallel origin with the birth of systems science in academia and with the idea of national economic and political planning, most notably in France and the Soviet Union.[6][14] In the 1950s, France was still reconstructing its war-torn country. In the process, French scholars, philosophers, writers, and artists searched for what could constitute a more positive future for humanity. The Soviet Union similarly participated in postwar rebuilding, but did so in the context of an established national economic planning process, which also required a long-term, systemic statement of social goals. Futures studies was therefore primarily engaged in national planning and the construction of national symbols.

By contrast, in the United States of America, futures studies as a discipline emerged from the successful application of the tools and perspectives of systems analysis, especially with regard to quartermastering the war effort. These differing origins account for an initial schism between futures studies in America and futures studies in Europe: U.S. practitioners focused on applied projects, quantitative tools and systems analysis, whereas Europeans preferred to investigate the long-range future of humanity and the Earth, what might constitute that future, what symbols and semantics might express it, and who might articulate these.[15][16]

By the 1960s, academics, philosophers, writers and artists across the globe had begun to explore enough future scenarios so as to fashion a common dialogue. Inventors such as Buckminster Fuller also began highlighting the effect technology might have on global trends as time progressed. This discussion on the intersection of population growth, resource availability and use, economic growth, quality of life, and environmental sustainability – referred to as the "global problematique" – came to wide public attention with the publication of Limits to Growth, a study sponsored by the Club of Rome.[17]

Further development

International dialogue became institutionalized in the form of the World Futures Studies Federation (WFSF), founded in 1967, with the noted sociologist, Johan Galtung, serving as its first president. In the United States, the publisher Edward Cornish, concerned with these issues, started the World Future Society, an organization focused more on interested laypeople.

1975 saw the founding of the first graduate program in futures studies in the United States, the M.S. program in Studies of the Future at the University of Houston at Clear Lake City;[18] there followed a year later the M.A. Program in Public Policy in Alternative Futures at the University of Hawaii at Manoa.[19] The Hawaii program is of particular interest in light of the schism in perspective between European and U.S. futurists; it bridges that schism by locating futures studies within a pedagogical space defined by neo-Marxism, critical political economic theory, and literary criticism. In the years since the foundation of these two programs, single courses in futures studies at all levels of education have proliferated, but complete programs remain rare.

As a transdisciplinary field, futures studies attracts generalists. Its transdisciplinary nature can also cause problems, as the field sometimes falls between the cracks of disciplinary boundaries, and it has had some difficulty achieving recognition within the traditional curricula of the sciences and the humanities. In contrast to "Futures Studies" at the undergraduate level, some graduate programs in strategic leadership or management offer masters or doctorate programs in "strategic foresight" for mid-career professionals, some even online. Nevertheless, comparatively few new PhDs graduate in futures studies each year.

The field currently faces the great challenge of creating a coherent conceptual framework, codified into a well-documented curriculum (or curricula) featuring widely accepted and consistent concepts and theoretical paradigms linked to quantitative and qualitative methods, exemplars of those research methods, and guidelines for their ethical and appropriate application within society. As an indication that previously disparate intellectual dialogues have in fact started converging into a recognizable discipline,[20] at least six solidly-researched and well-accepted first attempts to synthesize a coherent framework for the field have appeared: Eleonora Masini's Why Futures Studies,[21] James Dator's Advancing Futures Studies,[22] Ziauddin Sardar's Rescuing all of our Futures,[23] Sohail Inayatullah's Questioning the future,[24] Richard A. Slaughter's The Knowledge Base of Futures Studies,[25] a collection of essays by senior practitioners, and Wendell Bell's two-volume work, The Foundations of Futures Studies.[26]

Probability and predictability

Some aspects of the future, such as celestial mechanics, are highly predictable, and may even be described by relatively simple mathematical models. At present, however, science has identified only a small minority of such "easy to predict" physical processes. Theories such as chaos theory, nonlinear science and standard evolutionary theory have allowed us to understand many complex systems as contingent (sensitively dependent on complex environmental conditions) and stochastic (random within constraints), making the vast majority of future events unpredictable in any specific case.
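
As a toy illustration of such sensitive dependence (the logistic map below is a standard textbook example, not a model of any real social or physical system), two trajectories that start almost identically diverge completely within a few dozen steps, even though the governing rule is known exactly:

    # Sketch: sensitive dependence on initial conditions in the logistic map.
    def logistic(x, r=4.0):
        return r * x * (1.0 - x)

    x, y = 0.400000, 0.400001   # two nearly identical starting points
    for step in range(1, 51):
        x, y = logistic(x), logistic(y)
        if step % 10 == 0:
            print("step %2d: |difference| = %.6f" % (step, abs(x - y)))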

Not surprisingly, the tension between predictability and unpredictability is a source of controversy and conflict among futures studies scholars and practitioners. Some argue that the future is essentially unpredictable, and that "the best way to predict the future is to create it." Others believe, as Flechtheim did, that advances in science, probability, modeling and statistics will allow us to continue improving our understanding of probable futures, although this area presently remains less well developed than methods for exploring possible and preferable futures.

As an example, consider the process of electing the president of the United States. At one level we observe that any U.S. citizen over 35 may run for president, so this process may appear too unconstrained for useful prediction. Yet further investigation demonstrates that only certain public individuals (current and former presidents and vice presidents, senators, state governors, popular military commanders, mayors of very large cities, etc.) receive the appropriate "social credentials" that are historical prerequisites for election. Thus with a minimum of effort at formulating the problem for statistical prediction, a much reduced pool of candidates can be described, improving our probabilistic foresight. Applying further statistical intelligence to this problem, we can observe that certain election prediction markets, such as the Iowa Electronic Markets, have generated reliable forecasts over long spans of time and conditions, with results superior to those of individual experts or polls. Such markets, which may be operated publicly or as an internal market, are just one of several promising frontiers in predictive futures research.
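
The narrowing-the-pool argument above is essentially conditional probability; a minimal sketch with entirely invented counts (none of these figures come from the article) makes the effect of conditioning on "social credentials" concrete:

    # Sketch: conditioning on prerequisites sharpens a baseline probability estimate.
    # All counts are invented placeholders for illustration only.
    eligible_citizens = 150_000_000   # rough placeholder for citizens over 35
    credentialed_pool = 200           # governors, senators, former VPs, etc. (assumed)
    serious_candidates = 20           # those who actually mount campaigns (assumed)

    for label, pool in [("eligible citizen", eligible_citizens),
                        ("credentialed figure", credentialed_pool),
                        ("serious candidate", serious_candidates)]:
        print("baseline chance for a random %-19s ~ 1 in %d" % (label, pool))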

From a complexity-theory viewpoint, however, such improvements in the predictability of individual events do not address the unpredictability inherent in dealing with entire systems, which emerge from the interaction between multiple individual events.

Methodologies

Futures practitioners use a wide range of models and methods (theory and practice), many of which come from other academic disciplines, including economics, sociology, geography, history, engineering, mathematics, psychology, technology, tourism, physics, biology, astronomy, and aspects of theology (specifically, the range of future beliefs).

One of the fundamental assumptions in futures studies is that the future is plural, not singular: it consists of alternative futures of varying likelihood, and it is impossible in principle to say with certainty which one will occur. The primary effort in futures studies, therefore, is to identify and describe alternative futures. This effort includes collecting quantitative and qualitative data about the possibility, probability, and desirability of change. The plural form "futures" in futures studies denotes the rich variety of alternative futures, including the subset of preferable futures (normative futures), that can be studied.

Practitioners of the discipline previously concentrated on extrapolating present technological, economic or social trends, or on attempting to predict future trends, but more recently they have started to examine social systems and uncertainties and to build scenarios, question the worldviews behind such scenarios via the causal layered analysis method (and others), create preferred visions of the future, and use backcasting to derive alternative implementation strategies. Apart from extrapolation and scenarios, many dozens of methods and techniques are used in futures research (see below).

Futures studies also includes normative or preferred futures, but a major contribution involves connecting both extrapolated (exploratory) and normative research to help individuals and organisations to build better social futures amid a (presumed) landscape of shifting social changes. Practitioners use varying proportions of inspiration and research. Futures studies only rarely uses the scientific method in the sense of controlled, repeatable and falsifiable experiments with highly standardized methodologies, given that environmental conditions for repeating a predictive scheme are usually quite hard to control. However, many futurists are informed by scientific techniques. Some historians project patterns observed in past civilizations upon present-day society to anticipate what will happen in the future. Oswald Spengler's "Decline of the West" argued, for instance, that western society, like imperial Rome, had reached a stage of cultural maturity that would inexorably lead to decline, in measurable ways.

Futures studies is often summarized as being concerned with "three Ps and a W", or possible, probable, and preferable futures, plus wildcards: low-probability but high-impact events (positive or negative), should they occur. Many futurists, however, do not use the wild card approach. Rather, they use a methodology called Emerging Issues Analysis, which searches for the seeds of change: issues that are likely to move from the unknown to the known, and from low impact to high impact.

Estimates of probability are involved in two of the four central concerns of foresight professionals (discerning and classifying both probable and wildcard events); other major areas of scholarship include considering the range of possible futures, recognizing the plurality of existing alternative futures, characterizing and attempting to resolve normative disagreements about the future, and envisioning and creating preferred futures. Most estimates of probability in futures studies are normative and qualitative, though significant progress on statistical and quantitative methods (technology and information growth curves, cliometrics, predictive psychology, prediction markets, etc.) has been made in recent decades.

Futures techniques

While forecasting – i.e., attempts to predict future states from current trends – is a common methodology, professional scenarios often rely on "backcasting": asking what changes in the present would be required to arrive at envisioned alternative future states. For example, the Policy Reform and Eco-Communalism scenarios developed by the Global Scenario Group rely on the backcasting method. Practitioners of futures studies classify themselves as futurists (or foresight practitioners).
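
A back-of-the-envelope sketch of the backcasting logic (the baseline, target and horizon below are assumed purely for illustration, not drawn from the Global Scenario Group's work): instead of extrapolating forward, start from the envisioned end state and compute the annual rate of change the present would have to sustain to reach it.

    # Sketch: backcasting from an assumed end state to a required annual change.
    baseline = 100.0       # arbitrary index value for today
    target = 20.0          # desired index value in the target year (assumed)
    years = 30             # assumed planning horizon

    # Constant annual factor r such that baseline * r**years == target.
    r = (target / baseline) ** (1.0 / years)
    print("required change: %.1f%% per year" % ((r - 1.0) * 100))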

Futurists use a diverse range of forecasting and foresight methods, some of which are outlined in the sections that follow.

Shaping alternative futures

Futurists use scenarios – alternative possible futures – as an important tool. To some extent, people can determine what they consider probable or desirable using qualitative and quantitative methods. By looking at a variety of possibilities one comes closer to shaping the future, rather than merely predicting it. Shaping alternative futures starts by establishing a number of scenarios. Setting up scenarios takes place as a process with many stages. One of those stages involves the study of trends. A trend persists long-term and long-range; it affects many societal groups, grows slowly and appears to have a profound basis. In contrast, a fad operates in the short term, shows the vagaries of fashion, affects particular societal groups, and spreads quickly but superficially.

Sample predicted futures range from predicted ecological catastrophes, through a utopian future where the poorest human being lives in what present-day observers would regard as wealth and comfort, through the transformation of humanity into a posthuman life-form, to the destruction of all life on Earth in, say, a nanotechnological disaster.

Futurists have a decidedly mixed reputation and a patchy track record at successful prediction. For reasons of convenience, they often extrapolate present technical and societal trends and assume they will develop at the same rate into the future; but technical progress and social upheavals, in reality, take place in fits and starts and in different areas at different rates.

Many 1950s futurists predicted commonplace space tourism by the year 2000, but ignored the possibilities of ubiquitous, cheap computers, while Marxist expectations have failed to materialise to date. On the other hand, many forecasts have portrayed the future with some degree of accuracy. Current futurists often present multiple scenarios that help their audience envision what "may" occur instead of merely "predicting the future". They claim that understanding potential scenarios helps individuals and organizations prepare with flexibility.

Many corporations use futurists as part of their risk management strategy, for horizon scanning and emerging issues analysis, and to identify wild cards – low probability, potentially high-impact risks.[27] Every successful and unsuccessful business engages in futuring to some degree – for example in research and development, innovation and market research, anticipating competitor behavior and so on.[28][29]

Weak signals, the future sign and wild cards

In futures research, "weak signals" may be understood as advance, noisy and socially situated indicators of change in trends and systems that constitute raw informational material for enabling anticipatory action. Researchers and consultants define the term in conflicting ways: sometimes it refers to future-oriented information, sometimes to something closer to emerging issues. With her concept of the future sign, Elina Hiltunen (2007) has tried to clarify this confusion by combining signal, issue and interpretation into a single construct that describes the change more holistically.[30]

"Wild cards" refer to low-probability and high-impact events, such as existential risks. This concept may be embedded in standard foresight projects and introduced into anticipatory decision-making activity in order to increase the ability of social groups adapt to surprises arising in turbulent business environments. Such sudden and unique incidents might constitute turning points in the evolution of a certain trend or system. Wild cards may or may not be announced by weak signals, which are incomplete and fragmented data from which relevant foresight information might be inferred. Sometimes, mistakenly, wild cards and weak signals are considered as synonyms, which they are not.[31]

Near-term predictions

A long-running tradition in various cultures, and especially in the media, involves various spokespersons making predictions for the upcoming year at the beginning of the year. These predictions sometimes base themselves on current trends in culture (music, movies, fashion, politics); sometimes they make hopeful guesses as to what major events might take place over the course of the next year.

Some of these predictions come true as the year unfolds, though many fail. When predicted events fail to take place, the authors of the predictions often state that misinterpretation of the "signs" and portents may explain the failure of the prediction.

Marketers have increasingly started to embrace futures studies, in an effort to gain an advantage in an increasingly competitive marketplace with fast production cycles, using such techniques as trendspotting as popularized by Faith Popcorn.[dubious]

Trend analysis and forecasting

Mega-trends

Trends come in different sizes. A mega-trend extends over many generations, and in cases of climate, mega-trends can cover periods prior to human existence. They describe complex interactions between many factors. The increase in population from the palaeolithic period to the present provides an example.

Potential trends

Possible new trends grow from innovations, projects, beliefs or actions that have the potential to grow and eventually go mainstream in the future. For example, only a few years ago alternative medicine remained an outcast from modern medicine. Now it has links with big business and has achieved a degree of respectability in some circles and even in the marketplace. This increasing level of acceptance illustrates a potential societal trend away from the sciences, even beyond the scope of medicine.

Branching trends

Very often, trends relate to one another the same way as a tree-trunk relates to branches and twigs. For example, a well-documented movement toward equality between men and women might represent a branch trend. The trend toward reducing differences in the salaries of men and women in the Western world could form a twig on that branch.

Life-cycle of a trend

When a potential trend gets enough confirmation in the various media, surveys or questionnaires to show that it has an increasingly accepted value, behavior or technology, it becomes accepted as a bona fide trend. Trends can also gain confirmation by the existence of other trends perceived as springing from the same branch. Some commentators claim that when 15% to 25% of a given population integrates an innovation, project, belief or action into their daily life then a trend becomes mainstream.
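
The 15% to 25% threshold mentioned above can be pictured with a simple logistic (S-curve) adoption model; the growth rate, midpoint and 20% cut-off in this sketch are assumptions chosen only to show the shape, not empirical values.

    # Sketch: logistic (S-curve) adoption with an assumed 20% "mainstream" threshold.
    import math

    def adoption_share(t, k=0.5, t_mid=10.0):
        # Share of the population having adopted by year t (assumed parameters).
        return 1.0 / (1.0 + math.exp(-k * (t - t_mid)))

    threshold = 0.20
    for year in range(0, 21, 4):
        share = adoption_share(year)
        status = "mainstream by this rule" if share >= threshold else "still niche"
        print("year %2d: %5.1f%% adoption (%s)" % (year, share * 100, status))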

Education

Education in the field of futures studies has taken place for some time. Beginning in the United States of America in the 1960s, it has since developed in many different countries. Futures education can encourage the use of concepts, tools and processes that allow students to think long-term, consequentially, and imaginatively. It generally helps students to:
  1. conceptualise more just and sustainable human and planetary futures.
  2. develop knowledge and skills in exploring probable and preferred futures.
  3. understand the dynamics and influence that human, social and ecological systems have on alternative futures.
  4. conscientize responsibility and action on the part of students toward creating better futures.
Thorough documentation of the history of futures education exists, for example in the work of Richard A. Slaughter (2004),[32] David Hicks, Ivana Milojević[33] and Jennifer Gidley[34][35][36] to name a few.

While futures studies remains a relatively new academic tradition, numerous tertiary institutions around the world teach it. These vary from small programs, or universities with just one or two classes, to programs that incorporate futures studies into other degrees (for example in planning, business, environmental studies, economics, development studies, science and technology studies). Various formal Masters-level programs exist on six continents. Finally, doctoral dissertations around the world have incorporated futures studies. A recent survey documented approximately 50 cases of futures studies at the tertiary level.[37]

The largest Futures Studies program in the world is at Tamkang University, Taiwan.[citation needed] Futures Studies is a required course at the undergraduate level, with between three and five thousand students taking classes on an annual basis. The Graduate Institute of Futures Studies houses an MA program, into which only ten students are accepted annually. Associated with the program is the Journal of Futures Studies.[38]

As of 2003, over 40 tertiary education establishments around the world were delivering one or more courses in futures studies. The World Futures Studies Federation[39] has a comprehensive survey of global futures programs and courses. The Acceleration Studies Foundation maintains an annotated list of primary and secondary graduate futures studies programs.[40]

Futurists

Several authors have become recognized as futurists. They research trends, particularly in technology, and write up their observations, conclusions, and predictions. In earlier eras, many futurists were based at academic institutions: John McHale, author of The Future of the Future, published a 'Futures Directory' and directed a think tank called The Centre For Integrative Studies at a university. More recently, futurists have started consulting groups or earned money as speakers, with examples including Alvin Toffler, John Naisbitt and Patrick Dixon. Frank Feather is a business speaker who presents himself as a pragmatic futurist. Some futurists have commonalities with science fiction, and some science-fiction writers, such as Arthur C. Clarke, are known as futurists.[citation needed] In the introduction to The Left Hand of Darkness, Ursula K. Le Guin distinguished futurists from novelists, describing prediction as the business of prophets, clairvoyants, and futurists; in her words, "a novelist's business is lying".
A survey of 108 futurists[41] found the following shared assumptions:
  1. We are in the midst of a historical transformation. Current times are not just part of normal history.
  2. Multiple perspectives are at the heart of futures studies, including unconventional thinking, internal critique, and cross-cultural comparison.
  3. Consideration of alternatives. Futurists do not see themselves as value-free forecasters, but are instead aware of multiple possibilities.
  4. Participatory futures. Futurists generally see their role as liberating the future in each person, and creating enhanced public ownership of the future. This is true worldwide.[clarification needed]
  5. Long term policy transformation. While some are more policy-oriented than others, almost all believe that the work of futurism is to shape public policy, so it consciously and explicitly takes into account the long term.
  6. Part of the process of creating alternative futures and of influencing public (corporate, or international) policy is internal transformation. At international meetings, structural and individual factors are considered equally important.
  7. Complexity. Futurists believe that a simple one-dimensional or single-discipline orientation is not satisfactory. Trans-disciplinary approaches that take complexity seriously are necessary. Systems thinking, particularly in its evolutionary dimension, is also crucial.
  8. Futurists are motivated by change. They are not content merely to describe or forecast. They desire an active role in world transformation.
  9. They are hopeful for a better future as a "strange attractor".
  10. Most believe they are pragmatists in this world, even as they imagine and work for another. Futurists have a long term perspective.
  11. Sustainable futures, understood as making decisions that do not reduce future options, including policies on nature, gender and other accepted paradigms. This applies to corporate futurists and to NGOs. Environmental sustainability is reconciled with technological, spiritual and post-structural ideals. Sustainability is not a "back to nature" ideal, but rather inclusive of technology and culture.

Applications of foresight and specific fields

General applicability and use of foresight products

Several corporations and government agencies utilize foresight products both to better understand potential risks and to prepare for potential opportunities. Several government agencies publish material for internal stakeholders as well as making that material available to the broader public. Examples include the US Congressional Budget Office's long-term budget projections,[42] the National Intelligence Council,[43] and the United Kingdom Government Office for Science.[44] Much of this material is used by policy makers to inform policy decisions, and by government agencies to develop long-term plans. Several corporations, particularly those with long product development lifecycles, utilize foresight and futures studies products and practitioners in developing their business strategies; the Shell Corporation is one such entity.[45] Foresight professionals and their tools are increasingly being used in both the private and public sectors to help leaders deal with an increasingly complex and interconnected world.

Fashion and design

Fashion is one area of trend forecasting. The industry typically works 18 months ahead of the current selling season.[citation needed] Large retailers look at everything from weather forecasts to runway fashion for clues to consumer tastes. Consumer behavior and statistics are also important for a long-range forecast.

Artists and conceptual designers, by contrast, may feel that consumer trends are a barrier to creativity. Many of these ‘startists’ start micro trends but do not follow trends themselves.[citation needed]

Design is another area of trend forecasting. Foresight and futures thinking are rapidly being adopted by the design industry to ensure more sustainable, robust and humanistic products. Design, much like futures studies, is an interdisciplinary field that considers global trends, challenges and opportunities to foster innovation. Designers are thus adopting futures methodologies including scenarios, trend forecasting, and futures research.

Holistic thinking that incorporates strategic, innovative and anticipatory solutions gives designers the tools necessary to navigate complex problems and develop novel, future-enhancing and visionary solutions.

The Association for Professional Futurists has also held meetings discussing the ways in which Design Thinking and Futures Thinking intersect.

Energy and alternative sources

While the price of oil will fluctuate, the basic price trajectory is sharply upward. Market forces will play an important role, but there are not enough new sources of oil in the Earth to make up for escalating demand from China, India, and the Middle East, and to replace declining fields. And while many alternative sources of energy exist in principle, none exists in fact in quality or quantity sufficient to make up for the shortfall of oil soon enough. A growing gap looms between the effective end of the Age of Oil and the possible emergence of new energy sources.[46]

Education

As foresight has expanded to include a broader range of social concerns, all levels and types of education have been addressed, including formal and informal education. Many countries are beginning to implement foresight in their education policy. A few programs are listed below:
  • Finland's FinnSight 2015[47] - Implementation began in 2006, and though it was not referred to as "foresight" at the time, the program displays the characteristics of a foresight program.
  • Singapore's Ministry of Education Master plan for Information Technology in Education[48] - This third Masterplan builds on the first and second plans to transform learning environments and equip students to compete in a knowledge economy.

Global warming


From Wikipedia, the free encyclopedia

Global mean surface temperature change from 1880 to 2014, relative to the 1951–1980 mean. The black line is the annual mean and the red line is the 5-year running mean. The green bars show uncertainty estimates. Source: NASA GISS.
World map showing surface temperature trends (°C per decade) between 1950 and 2014. Source: NASA GISS.[1]
Fossil fuel related carbon dioxide (CO2) emissions compared to five of the IPCC's "SRES" emissions scenarios. The dips are related to global recessions. Image source: Skeptical Science.

Global warming and climate change can both refer to the observed century-scale rise in the average temperature of the Earth's climate system and its related effects, although climate change can also refer to any historic change in climate. Multiple lines of scientific evidence show that the climate system is warming.[2][3] More than 90% of the additional energy stored in the climate system since 1970 has gone into ocean warming; the remainder has melted ice, and warmed the continents and atmosphere.[4][a] Many of the observed changes since the 1950s are unprecedented over decades to millennia.[5]

Scientific understanding of global warming has been increasing. In its fifth assessment (AR5) in 2014 the Intergovernmental Panel on Climate Change (IPCC) reported that scientists were more than 95% certain that most of global warming is caused by increasing concentrations of greenhouse gases and other human (anthropogenic) activities.[6][7][8] Climate model projections summarized in AR5 indicated that during the 21st century the global surface temperature is likely to rise a further 0.3 to 1.7 °C (0.5 to 3.1 °F) for their lowest emissions scenario using stringent mitigation and 2.6 to 4.8 °C (4.7 to 8.6 °F) for their highest.[9] These findings have been recognized by the national science academies of the major industrialized nations.[10][b]

Future climate change and associated impacts will be different from region to region around the globe.[12][13] The effects of an increase in global temperature include a rise in sea levels and a change in the amount and pattern of precipitation, as well as a probable expansion of subtropical deserts.[14] Warming is expected to be strongest in the Arctic, with the continuing retreat of glaciers, permafrost and sea ice. Other likely effects of the warming include more frequent extreme weather events including heat waves, droughts, heavy rainfall, and heavy snowfall;[15] ocean acidification; and species extinctions due to shifting temperature regimes. Effects significant to humans include the threat to food security from decreasing crop yields and the loss of habitat from inundation.[16][17]

Possible responses to global warming include mitigation by emissions reduction, adaptation to its effects, building systems resilient to its effects, and possible future climate engineering. Most countries are parties to the United Nations Framework Convention on Climate Change (UNFCCC),[18] whose ultimate objective is to prevent dangerous anthropogenic climate change.[19] Parties to the UNFCCC have adopted a range of policies designed to reduce greenhouse gas emissions[20][21][22][23] and to assist in adaptation to global warming,[20][23][24][25] and have agreed that deep cuts in emissions are required[26] and that future global warming should be limited to below 2.0 °C (3.6 °F) relative to the pre-industrial level.[26][c]

Observed temperature changes

Two millennia of mean surface temperatures according to different reconstructions from climate proxies, each smoothed on a decadal scale, with the instrumental temperature record overlaid in black.
NOAA graph of global annual temperature anomalies 1950–2012, showing the El Niño-Southern Oscillation.
Earth has been in radiative imbalance since at least the 1970s, with less energy leaving the atmosphere than entering it. Most of this extra energy has been absorbed by the oceans.[28] It is very likely that human activities substantially contributed to this increase in ocean heat content.[29]

The Earth's average surface temperature rose by 0.74±0.18 °C over the period 1906–2005. The rate of warming over the last half of that period was almost double that for the period as a whole (0.13±0.03 °C per decade, versus 0.07±0.02 °C per decade). The urban heat island effect is very small, estimated to account for less than 0.002 °C of warming per decade since 1900.[30] Temperatures in the lower troposphere have increased between 0.13 and 0.22 °C (0.22 and 0.4 °F) per decade since 1979, according to satellite temperature measurements. Climate proxies show the temperature to have been relatively stable over the one or two thousand years before 1850, with regionally varying fluctuations such as the Medieval Warm Period and the Little Ice Age.[31]
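
As a quick arithmetic check (using only the figures quoted in the paragraph above), the per-decade rates can be multiplied out against the lengths of the periods they describe:

    # Consistency check of the quoted warming rates (figures from the text above).
    total_1906_2005 = 0.74    # degrees C over the full century
    rate_full = 0.07          # degrees C per decade, 1906-2005
    rate_second_half = 0.13   # degrees C per decade over the later half-century

    print("full-period rate implies:  %.2f C over 10 decades" % (rate_full * 10))
    print("later-half rate implies:   %.2f C over 5 decades" % (rate_second_half * 5))
    print("reported century total:    %.2f C" % total_1906_2005)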

The warming that is evident in the instrumental temperature record is consistent with a wide range of observations, as documented by many independent scientific groups.[32] Examples include sea level rise (due to melting of snow and ice and because water above 3.98 °C expands as it warms),[33] widespread melting of snow and ice,[34] increased heat content of the oceans,[32] increased humidity,[32] and the earlier timing of spring events,[35] e.g., the flowering of plants.[36] The probability that these changes could have occurred by chance is virtually zero.[32]

Recent estimates by NASA's Goddard Institute for Space Studies (GISS) and the National Climatic Data Center show that 2005 and 2010 tied for the planet's warmest year since reliable, widespread instrumental measurements became available in the late 19th century, exceeding 1998 by a few hundredths of a degree.[37][38][39] Estimates by the Climatic Research Unit (CRU) show 2005 as the second warmest year, behind 1998, with 2003 and 2010 tied for third warmest year; however, "the error estimate for individual years ... is at least ten times larger than the differences between these three years."[40] The World Meteorological Organization's (WMO) statement on the status of the global climate in 2010 explains that "The 2010 nominal value of +0.53 °C ranks just ahead of those of 2005 (+0.52 °C) and 1998 (+0.51 °C), although the differences between the three years are not statistically significant..."[41] Every year from 1986 to 2013 has seen annual average global land and ocean surface temperatures above the 1961–1990 average.[42][43]

Surface temperatures in 1998 were unusually warm because global temperatures are affected by the El Niño-Southern Oscillation (ENSO), and the strongest El Niño in the past century occurred during that year.[44] Global temperature is subject to short-term fluctuations that overlay long-term trends and can temporarily mask them. The relative stability in surface temperature from 2002 to 2009, which has been dubbed the global warming hiatus by the media and some scientists,[45] is consistent with such an episode.[46][47] 2010 was also an El Niño year. On the low swing of the oscillation, 2011, a La Niña year, was cooler, but it was still the 11th warmest year since records began in 1880. Of the 13 warmest years since 1880, 11 were the years from 2001 to 2011. Over the more recent record, 2011 was the warmest La Niña year in the period from 1950 to 2011, and was close to 1997, which was not at the lowest point of the cycle.[48]

Temperature changes vary over the globe. Since 1979, land temperatures have increased about twice as fast as ocean temperatures (0.25 °C per decade against 0.13 °C per decade).[49] Ocean temperatures increase more slowly than land temperatures because of the larger effective heat capacity of the oceans and because the ocean loses more heat by evaporation.[50] The northern hemisphere is also naturally warmer than the southern hemisphere mainly because of meridional heat transport in the oceans, which has a differential of about 0.9 petawatts northwards,[51] with an additional contribution from the albedo differences between the polar regions. Since the beginning of industrialisation the temperature difference between the hemispheres has increased due to melting of sea ice and snow in the North.[52] Average arctic temperatures have been increasing at almost twice the rate of the rest of the world in the past 100 years; however arctic temperatures are also highly variable.[53] Although more greenhouse gases are emitted in the Northern than Southern Hemisphere this does not contribute to the difference in warming because the major greenhouse gases persist long enough to mix between hemispheres.[54]

The thermal inertia of the oceans and slow responses of other indirect effects mean that climate can take centuries or longer to adjust to changes in forcing. Climate commitment studies indicate that even if greenhouse gases were stabilized at year 2000 levels, a further warming of about 0.5 °C (0.9 °F) would still occur.[55]

Initial causes of temperature changes (external forcings)

Greenhouse effect schematic showing energy flows between space, the atmosphere, and Earth's surface. Energy exchanges are expressed in watts per square meter (W/m2).
This graph, known as the Keeling Curve, shows the increase of atmospheric carbon dioxide (CO2) concentrations from 1958–2013. Monthly CO2 measurements display seasonal oscillations in an upward trend; each year's maximum occurs during the Northern Hemisphere's late spring, and declines during its growing season as plants remove some atmospheric CO2.

The climate system can respond to changes in external forcings.[56][57] External forcings can "push" the climate in the direction of warming or cooling.[58] Examples of external forcings include changes in atmospheric composition (e.g., increased concentrations of greenhouse gases), solar luminosity, volcanic eruptions, and variations in Earth's orbit around the Sun.[59] Orbital cycles vary slowly over tens of thousands of years and at present are in an overall cooling trend, which would be expected to lead towards a glacial period within the current ice age; the 20th century instrumental temperature record, however, shows a sudden rise in global temperatures.[60]

Greenhouse gases

The greenhouse effect is the process by which absorption and emission of infrared radiation by gases in a planet's atmosphere warm its lower atmosphere and surface. It was proposed by Joseph Fourier in 1824, discovered in 1860 by John Tyndall,[61] first investigated quantitatively by Svante Arrhenius in 1896,[62] and developed in the 1930s through 1960s by Guy Stewart Callendar.[63]
Annual world greenhouse gas emissions, in 2010, by sector.
Percentage share of global cumulative energy-related CO2 emissions between 1751 and 2012 across different regions.

On Earth, naturally occurring amounts of greenhouse gases have a mean warming effect of about 33 °C (59 °F).[64][d] Without the Earth's atmosphere, the Earth's average temperature would be well below the freezing temperature of water.[65] The major greenhouse gases are water vapor, which causes about 36–70% of the greenhouse effect; carbon dioxide (CO2), which causes 9–26%; methane (CH4), which causes 4–9%; and ozone (O3), which causes 3–7%.[66][67][68] Clouds also affect the radiation balance through cloud forcings similar to greenhouse gases.
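
The "well below freezing" and roughly 33 °C figures above follow from a simple energy-balance argument. The sketch below is a zero-dimensional energy balance model, the simplest kind of climate calculation: it balances absorbed sunlight against emitted infrared radiation, with an "effective emissivity" standing in crudely for the greenhouse effect. The parameter values are round, textbook-style numbers assumed for illustration.

    # Sketch: zero-dimensional energy balance (absorbed solar = emitted infrared).
    SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
    S0 = 1361.0         # solar constant, W m^-2
    ALBEDO = 0.30       # planetary albedo (assumed round value)

    def equilibrium_temperature(effective_emissivity):
        # Temperature (K) at which emitted infrared balances absorbed sunlight.
        absorbed = S0 * (1.0 - ALBEDO) / 4.0   # averaged over the whole sphere
        return (absorbed / (effective_emissivity * SIGMA)) ** 0.25

    print("no greenhouse effect (emissivity 1.0): %.0f K" % equilibrium_temperature(1.0))
    print("crude effective emissivity of 0.61:    %.0f K" % equilibrium_temperature(0.61))

The roughly 33 K gap between the two printed values corresponds to the greenhouse warming described above.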

Human activity since the Industrial Revolution has increased the amount of greenhouse gases in the atmosphere, leading to increased radiative forcing from CO2, methane, tropospheric ozone, CFCs and nitrous oxide. According to work published in 2007, the concentrations of CO2 and methane had increased by 36% and 148% respectively since 1750.[69] These levels are much higher than at any time during the last 800,000 years, the period for which reliable data has been extracted from ice cores.[70][71][72][73] Less direct geological evidence indicates that CO2 values higher than this were last seen about 20 million years ago.[74] Fossil fuel burning has produced about three-quarters of the increase in CO2 from human activity over the past 20 years. The rest of this increase is caused mostly by changes in land-use, particularly deforestation.[75] Estimated global CO2 emissions in 2011 from fossil fuel combustion, including cement production and gas flaring, were 34.8 billion tonnes (9.5 ± 0.5 PgC), an increase of 54% above emissions in 1990. Coal burning was responsible for 43% of the total emissions, oil 34%, gas 18%, cement 4.9% and gas flaring 0.7%.[76] In May 2013, it was reported that readings for CO2 taken at the world's primary benchmark site in Mauna Loa had surpassed 400 ppm. According to professor Brian Hoskins, this is likely the first time CO2 levels have been this high for about 4.5 million years.[77][78]
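
The two ways the paragraph expresses the 2011 emissions figure, billions of tonnes of CO2 versus petagrams of carbon, are related by the CO2-to-carbon mass ratio of 44/12 (1 Pg equals 1 billion tonnes); a quick check:

    # Converting the quoted 2011 emissions between Gt CO2 and Pg of carbon.
    emissions_gt_co2 = 34.8
    emissions_pg_c = emissions_gt_co2 * 12.0 / 44.0
    print("%.1f Gt CO2 is about %.1f PgC" % (emissions_gt_co2, emissions_pg_c))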

Over the last three decades of the twentieth century, gross domestic product per capita and population growth were the main drivers of increases in greenhouse gas emissions.[79] CO2 emissions are continuing to rise due to the burning of fossil fuels and land-use change.[80][81]:71 Emissions can be attributed to different regions. Attribution of emissions due to land-use change is a controversial issue.[82][83]:289

Emissions scenarios, i.e., estimates of changes in future emission levels of greenhouse gases, depend upon uncertain economic, sociological, technological, and natural developments.[84] In most scenarios, emissions continue to rise over the century, while in a few, emissions are reduced.[85][86] Fossil fuel reserves are abundant, and will not limit carbon emissions in the 21st century.[87] Emission scenarios, combined with modelling of the carbon cycle, have been used to produce estimates of how atmospheric concentrations of greenhouse gases might change in the future. Using the six IPCC SRES "marker" scenarios, models suggest that by the year 2100, the atmospheric concentration of CO2 could range between 541 and 970 ppm.[88] This is 90–250% above the concentration in the year 1750.
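
The "90–250% above the concentration in the year 1750" statement follows from a pre-industrial level of roughly 280 ppm, a commonly cited value that is assumed here rather than taken from the article:

    # Relating the projected 2100 CO2 range to an assumed 280 ppm pre-industrial level.
    preindustrial_ppm = 280.0
    for projected in (541.0, 970.0):
        increase = (projected / preindustrial_ppm - 1.0) * 100
        print("%.0f ppm is about %.0f%% above the 1750 level" % (projected, increase))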

The popular media and the public often confuse global warming with ozone depletion, i.e., the destruction of stratospheric ozone (e.g., the ozone layer) by chlorofluorocarbons.[89][90] Although there are a few areas of linkage, the relationship between the two is not strong. Reduced stratospheric ozone has had a slight cooling influence on surface temperatures, while increased tropospheric ozone has had a somewhat larger warming effect.[91]
Atmospheric CO2 concentration from 650,000 years ago to near present, using ice core proxy data and direct measurements.

Particulates and soot

Ship tracks can be seen as lines in these clouds over the Atlantic Ocean on the east coast of the United States. The climatic impacts from particulate forcing could have a large effect on climate through the indirect effect.

Global dimming, a gradual reduction in the amount of global direct irradiance at the Earth's surface, was observed from 1961 until at least 1990.[92] The main cause of this dimming is particulates produced by volcanoes and human-made pollutants, which exert a cooling effect by increasing the reflection of incoming sunlight. The effects of the products of fossil fuel combustion – CO2 and aerosols – have partially offset one another in recent decades, so that net warming has been due to the increase in non-CO2 greenhouse gases such as methane.[93] Radiative forcing due to particulates is temporally limited because of wet deposition, which gives them an atmospheric lifetime of about one week. Carbon dioxide has a lifetime of a century or more, and as such, changes in particulate concentrations will only delay climate changes due to carbon dioxide.[94] Black carbon is second only to carbon dioxide for its contribution to global warming.[95]

In addition to their direct effect of scattering and absorbing solar radiation, particulates have indirect effects on the Earth's radiation budget. Sulfates act as cloud condensation nuclei and thus lead to clouds that have more and smaller cloud droplets. These clouds reflect solar radiation more efficiently than clouds with fewer and larger droplets, a phenomenon known as the Twomey effect.[96] This effect also causes droplets to be of more uniform size, which reduces the growth of raindrops and makes the cloud more reflective to incoming sunlight, an effect known as the Albrecht effect.[97] Indirect effects are most noticeable in marine stratiform clouds, and have very little radiative effect on convective clouds. Indirect effects of particulates represent the largest uncertainty in radiative forcing.[98]

Soot may either cool or warm Earth's climate system, depending on whether it is airborne or deposited. Atmospheric soot directly absorbs solar radiation, which heats the atmosphere and cools the surface. In isolated areas with high soot production, such as rural India, as much as 50% of surface warming due to greenhouse gases may be masked by atmospheric brown clouds.[99] When deposited, especially on glaciers or on ice in arctic regions, the lower surface albedo can also directly heat the surface.[100] The influences of particulates, including black carbon, are most pronounced in the tropics and sub-tropics, particularly in Asia, while the effects of greenhouse gases are dominant in the extratropics and southern hemisphere.[101]
Changes in Total Solar Irradiance (TSI) and monthly sunspot numbers since the mid-1970s.
Contribution of natural factors and human activities to radiative forcing of climate change.[102] Radiative forcing values are for the year 2005, relative to the pre-industrial era (1750).[102] The contribution of solar irradiance to radiative forcing is 5% of the value of the combined radiative forcing due to increases in the atmospheric concentrations of carbon dioxide, methane and nitrous oxide.[103]

Solar activity

Since 1978, solar irradiance has been measured by satellites.[104] These measurements indicate that the Sun's output has not increased since 1978, so the warming during the past 30 years cannot be attributed to an increase in solar energy reaching the Earth.

Climate models have been used to examine the role of the sun in recent climate change.[105] Models are unable to reproduce the rapid warming observed in recent decades when they only take into account variations in solar output and volcanic activity. Models are, however, able to simulate the observed 20th century changes in temperature when they include all of the most important external forcings, including human influences and natural forcings.

Another line of evidence against the sun having caused recent climate change comes from looking at how temperatures at different levels in the Earth's atmosphere have changed.[106] Models and observations show that greenhouse warming results in warming of the lower atmosphere (called the troposphere) but cooling of the upper atmosphere (called the stratosphere).[107][108] Depletion of the ozone layer by chemical refrigerants has also resulted in a strong cooling effect in the stratosphere. If the sun were responsible for observed warming, warming of both the troposphere and stratosphere would be expected.[109]

Feedback

Sea ice, shown here in Nunavut, in northern Canada, reflects more sunshine, while open ocean absorbs more, accelerating melting.

The climate system includes a range of feedbacks, which alter the response of the system to changes in external forcings. Positive feedbacks increase the response of the climate system to an initial forcing, while negative feedbacks reduce the response of the climate system to an initial forcing.[110]

There are a range of feedbacks in the climate system, including water vapor, changes in ice-albedo (snow and ice cover affect how much the Earth's surface absorbs or reflects incoming sunlight), clouds, and changes in the Earth's carbon cycle (e.g., the release of carbon from soil).[111] The main negative feedback is the energy the Earth's surface radiates into space as infrared radiation.[112] According to the Stefan-Boltzmann law, if the absolute temperature (as measured in kelvin) doubles,[e] radiated energy increases by a factor of 16 (2 to the 4th power).[113]
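To make the Stefan-Boltzmann scaling concrete, here is a minimal Python sketch; the constant and the 288 K surface temperature are standard physics values, not figures from the cited sources.

# Stefan-Boltzmann law: radiated flux grows with the 4th power of absolute temperature.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiated_flux(temp_kelvin):
    """Blackbody flux in W/m^2 at the given absolute temperature."""
    return SIGMA * temp_kelvin ** 4

t = 288.0  # roughly Earth's mean surface temperature, in kelvin
print(radiated_flux(2 * t) / radiated_flux(t))  # 16.0: doubling T radiates 2**4 = 16 times as much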

Feedbacks are an important factor in determining the sensitivity of the climate system to increased atmospheric greenhouse gas concentrations. Other factors being equal, a higher climate sensitivity means that more warming will occur for a given increase in greenhouse gas forcing.[114] Uncertainty over the effect of feedbacks is a major reason why different climate models project different magnitudes of warming for a given forcing scenario. More research is needed to understand the role of clouds[110] and carbon cycle feedbacks in climate projections.[115]
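How feedbacks translate into a spread of warming estimates can be illustrated with the standard gain relation, in which the equilibrium warming equals the no-feedback warming divided by (1 − f), where f is the net feedback fraction. The sketch below uses a commonly quoted no-feedback response of about 1.2 °C for a doubling of CO2; both that number and the feedback fractions are illustrative assumptions, not values from the IPCC projections cited here.

# Illustrative feedback amplification: equilibrium warming = no-feedback warming / (1 - f).
# Positive f amplifies the response, negative f damps it.
dT_no_feedback = 1.2  # assumed no-feedback (Planck-only) warming for doubled CO2, in degC

for f in (-0.2, 0.0, 0.3, 0.5, 0.65):
    dT = dT_no_feedback / (1.0 - f)
    print(f"net feedback fraction {f:+.2f} -> equilibrium warming ~ {dT:.1f} degC")

Small differences in the assumed feedback fraction produce large differences in the resulting warming, which is one way to see why feedback uncertainty dominates the spread between models.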

The IPCC projections given in the introduction span the "likely" range (greater than 66% probability, based on expert judgement)[6] for the selected emissions scenarios. However, the IPCC's projections do not reflect the full range of uncertainty.[116] The lower end of the "likely" range appears to be better constrained than the upper end of the "likely" range.[116]

Climate models

Calculations of global warming prepared in or before 2001 from a range of climate models under the SRES A2 emissions scenario, which assumes no action is taken to reduce emissions and regionally divided economic development.
Projected change in annual mean surface air temperature from the late 20th century to the middle 21st century, based on a medium emissions scenario (SRES A1B).[117] This scenario assumes that no future policies are adopted to limit greenhouse gas emissions. Image credit: NOAA GFDL.[118]

A climate model is a computerized representation of the five components of the climate system: atmosphere, hydrosphere, cryosphere, land surface, and biosphere.[119] Such models are based on scientific disciplines such as fluid dynamics and thermodynamics, as well as physical processes such as radiative transfer. The models take into account various components, such as local air movement, temperature, clouds, and other atmospheric properties; ocean temperature, salt content, and circulation; ice cover on land and sea; the transfer of heat and moisture from soil and vegetation to the atmosphere; chemical and biological processes; solar variability; and others.

Although researchers attempt to include as many processes as possible, simplifications of the actual climate system are inevitable because of the constraints of available computer power and limitations in knowledge of the climate system. Results from models can also vary due to different greenhouse gas inputs and the model's climate sensitivity. For example, the uncertainty in the IPCC's 2007 projections is caused by (1) the use of multiple models[116] with differing sensitivity to greenhouse gas concentrations,[120] (2) the use of differing estimates of humanity's future greenhouse gas emissions,[116] and (3) any additional emissions from climate feedbacks that were not included in the models the IPCC used to prepare its report, e.g., greenhouse gas releases from permafrost.[121]

The models do not assume the climate will warm due to increasing levels of greenhouse gases. Instead, the models predict how greenhouse gases will interact with radiative transfer and other physical processes. One of the mathematical results of these complex equations is a prediction of whether warming or cooling will occur.[122]
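The comprehensive models described above cannot be reduced to a short example, but the point that warming or cooling emerges from the energy equations rather than being assumed can be illustrated with a zero-dimensional energy balance model. All parameter values below are standard illustrative numbers chosen for this sketch, not values from the IPCC models.

# Zero-dimensional energy balance model: temperature adjusts until absorbed sunlight,
# an imposed forcing, and outgoing infrared radiation balance. The sign of the change
# is a result of the equations, not an assumption.
SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0            # solar constant, W/m^2
ALBEDO = 0.3           # fraction of sunlight reflected
EMISSIVITY = 0.612     # effective emissivity, tuned so the model sits near 288 K when unforced
HEAT_CAPACITY = 1.0e8  # effective heat capacity of the surface layer, J m^-2 K^-1

def final_temperature(forcing_wm2, years=200, dt=86400.0, t=288.0):
    """Step the energy balance forward one day at a time and return the final temperature in kelvin."""
    for _ in range(int(years * 365)):
        absorbed = S0 * (1.0 - ALBEDO) / 4.0   # globally averaged absorbed sunlight
        emitted = EMISSIVITY * SIGMA * t ** 4  # outgoing infrared radiation
        t += (absorbed + forcing_wm2 - emitted) * dt / HEAT_CAPACITY
    return t

baseline = final_temperature(0.0)
forced = final_temperature(3.7)  # roughly the forcing from doubled CO2 (assumed value)
print(f"change: {forced - baseline:+.1f} K")  # about +1.1 K, the feedback-free response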

Recent research has called special attention to the need to refine models with respect to the effect of clouds[123] and the carbon cycle.[124][125][126]

Models are also used to help investigate the causes of recent climate change by comparing the observed changes to those that the models project from various natural and human-derived causes. Although these models do not unambiguously attribute the warming that occurred from approximately 1910 to 1945 to either natural variation or human effects, they do indicate that the warming since 1970 is dominated by man-made greenhouse gas emissions.[59]

The physical realism of models is tested by examining their ability to simulate contemporary or past climates.[127] Climate models produce a good match to observations of global temperature changes over the last century, but do not simulate all aspects of climate.[128] Not all effects of global warming are accurately predicted by the climate models used by the IPCC. Observed Arctic sea ice shrinkage has been faster than predicted.[129] Precipitation has increased in proportion to atmospheric humidity, and hence significantly faster than global climate models predict.[130][131] Since 1990, sea level has also risen considerably faster than models predicted it would.[132]

Observed and expected environmental effects

Projections of global mean sea level rise by Parris and others.[133] Probabilities have not been assigned to these projections.[134] Therefore, none of these projections should be interpreted as a "best estimate" of future sea level rise. Image credit: NOAA.

"Detection" is the process of demonstrating that climate has changed in some defined statistical sense, without providing a reason for that change. Detection does not imply attribution of the detected change to a particular cause. "Attribution" of causes of climate change is the process of establishing the most likely causes for the detected change with some defined level of confidence.[135] Detection and attribution may also be applied to observed changes in physical, ecological and social systems.[136]

Natural systems

Global warming has been detected in a number of natural systems. Some of these changes are described in the section on observed temperature changes, e.g., sea level rise and widespread decreases in snow and ice extent.[137] Anthropogenic forcing has likely contributed to some of the observed changes, including sea level rise, changes in climate extremes (such as the number of warm and cold days), declines in Arctic sea ice extent, glacier retreat, and greening of the Sahara.[138][139]
Sparse records indicate that glaciers have been retreating since the early 1800s. In the 1950s measurements began that allow the monitoring of glacial mass balance, reported to the World Glacier Monitoring Service (WGMS) and the National Snow and Ice Data Center (NSIDC).

Over the 21st century,[140] the IPCC projects that global mean sea level could rise by 0.18–0.59 m.[141] The IPCC does not provide a best estimate of global mean sea level rise, and its upper estimate of 59 cm is not an upper bound, i.e., global mean sea level could rise by more than 59 cm by 2100.[141] The IPCC's projections are conservative, and may underestimate future sea level rise.[142] Over the 21st century, Parris and others suggest that global mean sea level could rise by 0.2 to 2.0 m (0.7–6.6 ft), relative to mean sea level in 1992.[133]

Widespread coastal flooding would be expected if several degrees of warming is sustained for millennia.[143] For example, sustained global warming of more than 2 °C (relative to pre-industrial levels) could lead to eventual sea level rise of around 1 to 4 m due to thermal expansion of sea water and the melting of glaciers and small ice caps.[143] Melting of the Greenland ice sheet could contribute an additional 4 to 7.5 m over many thousands of years.[143]
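For a rough sense of the order of magnitude of the thermal-expansion component alone, a back-of-envelope sketch follows; the mean ocean depth and expansion coefficient are assumed round numbers, and real projections integrate expansion over depth, temperature, and salinity rather than using a single coefficient.

# Back-of-envelope thermal expansion estimate (illustrative assumptions only).
mean_depth_m = 3700.0        # assumed mean ocean depth
expansion_per_degC = 2.0e-4  # assumed volumetric expansion coefficient of seawater
warming_degC = 2.0           # assumed eventual warming of the full ocean column

rise_m = mean_depth_m * expansion_per_degC * warming_degC
print(f"thermal expansion alone: roughly {rise_m:.1f} m")  # ~1.5 m; the 1-4 m figure above also includes glaciers and small ice caps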

Changes in regional climate are expected to include greater warming over land, with most warming at high northern latitudes, and least warming over the Southern Ocean and parts of the North Atlantic Ocean.[144] During the 21st century, glaciers[145] and snow cover[146] are projected to continue their widespread retreat. Projections of declines in Arctic sea ice vary.[147][148] Recent projections suggest that Arctic summers could be ice-free (defined as ice extent less than 1 million square km) as early as 2025-2030.[149]

Future changes in precipitation are expected to follow existing trends, with reduced precipitation over subtropical land areas, and increased precipitation at subpolar latitudes and some equatorial regions.[150] Projections suggest a probable increase in the frequency and severity of some extreme weather events, such as heat waves.[151]

Ecological systems

In terrestrial ecosystems, the earlier timing of spring events, and poleward and upward shifts in plant and animal ranges, have been linked with high confidence to recent warming.[137] Future climate change is expected to particularly affect certain ecosystems, including tundra, mangroves, and coral reefs.[144] It is expected that most ecosystems will be affected by higher atmospheric CO2 levels, combined with higher global temperatures.[152] Overall, it is expected that climate change will result in the extinction of many species and reduced diversity of ecosystems.[153]
Increases in atmospheric CO2 concentrations have led to an increase in ocean acidity.[154] Dissolved CO2 increases ocean acidity, which is measured as a decrease in pH.[154] Between 1750 and 2000, surface-ocean pH decreased by ≈0.1, from ≈8.2 to ≈8.1.[155] Surface-ocean pH has probably not been below ≈8.1 during the past 2 million years.[155] Projections suggest that surface-ocean pH could decrease by an additional 0.3–0.4 units by 2100.[156] Future ocean acidification could threaten coral reefs, fisheries, protected species, and other natural resources of value to society.[154][157]
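Because pH is a logarithmic scale, a drop of 0.1 units is a larger change than it may appear. The short calculation below converts the pH values quoted above into relative hydrogen-ion concentrations; it uses only the definition of pH and no additional data.

# pH = -log10([H+]), so a pH drop of 0.1 means [H+] rises by a factor of 10**0.1 (about 26%).
ph_1750, ph_2000 = 8.2, 8.1
increase = 10 ** (-ph_2000) / 10 ** (-ph_1750) - 1.0
print(f"increase in H+ concentration since 1750: {increase:.0%}")                   # ~26%
print(f"a further 0.3-0.4 unit drop: {10**0.3 - 1:.0%} to {10**0.4 - 1:.0%} more")   # ~100-150%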

Long-term effects

On the timescale of centuries to millennia, the magnitude of global warming will be determined primarily by anthropogenic CO2 emissions.[158] This is due to carbon dioxide's very long lifetime in the atmosphere.[158]
Stabilizing global average temperature would require reductions in anthropogenic CO2 emissions.[158] Reductions in emissions of non-CO2 anthropogenic greenhouse gases (GHGs) (e.g., methane and nitrous oxide) would also be necessary.[158][159] For CO2, anthropogenic emissions would need to be reduced by more than 80% relative to their peak level.[158] Even if this were achieved, global average temperatures would remain close to their highest level for many centuries.[158]

Large-scale and abrupt impacts

Climate change could result in global, large-scale changes in natural and social systems.[160] Two examples are ocean acidification caused by increased atmospheric concentrations of carbon dioxide, and the long-term melting of ice sheets, which contributes to sea level rise.[161]
Some large-scale changes could occur abruptly, i.e., over a short time period, and might also be irreversible. An example of abrupt climate change is the rapid release of methane and carbon dioxide from permafrost, which would lead to amplified global warming.[162][163] Scientific understanding of abrupt climate change is generally poor.[164] The probability of abrupt change for some climate related feedbacks may be low.[162][165] Factors that may increase the probability of abrupt climate change include higher magnitudes of global warming, warming that occurs more rapidly, and warming that is sustained over longer time periods.[165]

Observed and expected effects on social systems

The effects of climate change on human systems, mostly due to warming or shifts in precipitation patterns, or both, have been detected worldwide. Production of wheat and maize globally has been impacted by climate change. 
While crop production has increased in some mid-latitude regions such as the UK and Northeast China, economic losses due to extreme weather events have increased globally. There has been a shift from cold- to heat-related mortality in some regions as a result of warming. Livelihoods of indigenous peoples of the Arctic have been altered by climate change, and there is emerging evidence of climate change impacts on livelihoods of indigenous peoples in other regions. Regional impacts of climate change are now observable at more locations than before, on all continents and across ocean regions.[166]
The future social impacts of climate change will be uneven.[167] Many risks are expected to increase with higher magnitudes of global warming.[168] All regions are at risk of experiencing negative impacts.[169] Low-latitude, less developed areas face the greatest risk.[170] Examples of impacts include:
  • Food: Crop production will probably be negatively affected in low latitude countries, while effects at northern latitudes may be positive or negative.[171] Global warming of around 4.6 °C relative to pre-industrial levels could pose a large risk to global and regional food security.[172]
  • Health: Generally impacts will be more negative than positive.[173] Impacts include: the effects of extreme weather, leading to injury and loss of life;[174] and indirect effects, such as undernutrition brought on by crop failures.[175]

Habitat inundation


Map showing where natural disasters caused/aggravated by global warming may occur.

In small islands and mega deltas, inundation as a result of sea level rise is expected to threaten vital infrastructure and human settlements.[176][177] This could lead to issues of homelessness in countries with low-lying areas such as Bangladesh, as well as statelessness for populations in countries such as the Maldives and Tuvalu.[178]

Possible responses to global warming

Mitigation

The graph on the right shows three "pathways" to meet the UNFCCC's 2 °C target, labelled "global technology", "decentralised solutions", and "consumption change". Each pathway shows how various measures (e.g., improved energy efficiency, increased use of renewable energy) could contribute to emissions reductions. Image credit: PBL Netherlands Environmental Assessment Agency.[179]

Mitigation of climate change consists of actions to reduce greenhouse gas (GHG) emissions, or to enhance the capacity of carbon sinks to absorb GHGs from the atmosphere.[180] There is a large potential for future reductions in emissions through a combination of activities, including: energy conservation and increased energy efficiency; the use of low-carbon energy technologies, such as renewable energy, nuclear energy, and carbon capture and storage;[181][182] and enhancing carbon sinks through, for example, reforestation and preventing deforestation.[181][182]

Near- and long-term trends in the global energy system are inconsistent with limiting global warming to below 1.5 or 2 °C, relative to pre-industrial levels.[183][184] Pledges made as part of the Cancún agreements are broadly consistent with having a likely chance (66 to 100% probability) of limiting global warming (in the 21st century) to below 3 °C, relative to pre-industrial levels.[184]

To limit warming to below 2 °C, more stringent emission reductions in the near term would allow for less rapid reductions after 2030.[185] Many integrated models are unable to meet the 2 °C target if pessimistic assumptions are made about the availability of mitigation technologies.[186]

Adaptation

Other policy responses include adaptation to climate change. Adaptation to climate change may be planned, either in reaction to or anticipation of climate change, or spontaneous, i.e., without government intervention.[187] Planned adaptation is already occurring on a limited basis.[181] The barriers, limits, and costs of future adaptation are not fully understood.[181]
A concept related to adaptation is adaptive capacity, which is the ability of a system (human, natural or managed) to adjust to climate change (including climate variability and extremes) to moderate potential damages, to take advantage of opportunities, or to cope with consequences.[188] Unmitigated climate change (i.e., future climate change without efforts to limit greenhouse gas emissions) would, in the long term, be likely to exceed the capacity of natural, managed and human systems to adapt.[189]

Environmental organizations and public figures have emphasized changes in the climate and the risks they entail, while promoting adaptation to changes in infrastructural needs and emissions reductions.[190]

Climate engineering

Climate engineering (sometimes called by the more expansive term 'geoengineering') is the deliberate modification of the climate. It has been investigated as a possible response to global warming, e.g. by NASA[191] and the Royal Society.[192] Techniques under research fall generally into the categories of solar radiation management and carbon dioxide removal, although various other schemes have been suggested. A 2014 study investigated the most common climate engineering methods and concluded that they are either ineffective or have potentially severe side effects and cannot be stopped without causing rapid climate change.[193]

Discourse about global warming

Political discussion

Article 2 of the UN Framework Convention refers explicitly to "stabilization of greenhouse gas concentrations."[194] To stabilize the atmospheric concentration of CO2, emissions worldwide would need to be dramatically reduced from their present level.[195]

Most countries are Parties to the United Nations Framework Convention on Climate Change (UNFCCC).[196] The ultimate objective of the Convention is to prevent dangerous human interference with the climate system.[197] As stated in the Convention, this requires that GHG concentrations are stabilized in the atmosphere at a level where ecosystems can adapt naturally to climate change, food production is not threatened, and economic development can proceed in a sustainable fashion.[198] The Framework Convention was agreed in 1992, but since then, global emissions have risen.[199] During negotiations, the G77 (a lobbying group in the United Nations representing 133 developing nations)[200]:4 pushed for a mandate requiring developed countries to "[take] the lead" in reducing their emissions.[201] This was justified on the basis that the developed world's emissions had contributed most to the stock of GHGs in the atmosphere; per-capita emissions (i.e., emissions per head of population) were still relatively low in developing countries; and the emissions of developing countries would grow to meet their development needs.[83]:290 This mandate was sustained in the Kyoto Protocol to the Framework Convention,[83]:290 which entered into legal effect in 2005.[202]

In ratifying the Kyoto Protocol, most developed countries accepted legally binding commitments to limit their emissions. These first-round commitments expired in 2012.[202] United States President George W. Bush rejected the treaty on the basis that "it exempts 80% of the world, including major population centers such as China and India, from compliance, and would cause serious harm to the US economy."[200]:5

At the 15th UNFCCC Conference of the Parties, held in 2009 at Copenhagen, several UNFCCC Parties produced the Copenhagen Accord.[203] Parties associated with the Accord (140 countries, as of November 2010)[204]:9 aim to limit the future increase in global mean temperature to below 2 °C.[205] The 16th Conference of the Parties (COP16) was held at Cancún in 2010. It produced an agreement, not a binding treaty, that the Parties should take urgent action to reduce greenhouse gas emissions to meet a goal of limiting global warming to 2 °C above pre-industrial temperatures. It also recognized the need to consider strengthening the goal to a global average rise of 1.5 °C.[206]

Scientific discussion

Most scientists agree that humans are contributing to observed climate change.[80][207] A meta-study of academic papers concerning global warming, published between 1991 and 2011 and accessible from Web of Knowledge, found that among those whose abstracts expressed a position on the cause of global warming, 97.2% supported the consensus view that it is man-made.[208] In an October 2011 paper published in the International Journal of Public Opinion Research, researchers from George Mason University analyzed the results of a survey of 489 American scientists working in academia, government, and industry. Of those surveyed, 97% agreed that global temperatures have risen over the past century and 84% agreed that "human-induced greenhouse warming" is now occurring, with only 5% disagreeing that human activity is a significant cause of global warming.[209][210] National science academies have called on world leaders for policies to cut global emissions.[211]
In the scientific literature, there is a strong consensus that global surface temperatures have increased in recent decades and that the trend is caused mainly by human-induced emissions of greenhouse gases. No scientific body of national or international standing disagrees with this view.[212][213]

Discussion by the public and in popular media

The global warming controversy refers to a variety of disputes, substantially more pronounced in the popular media than in the scientific literature,[214][215] regarding the nature, causes, and consequences of global warming. The disputed issues include the causes of increased global average air temperature, especially since the mid-20th century, whether this warming trend is unprecedented or within normal climatic variations, whether humankind has contributed significantly to it, and whether the increase is wholly or partially an artifact of poor measurements. Additional disputes concern estimates of climate sensitivity, predictions of additional warming, and what the consequences of global warming will be.
From 1990 to 1997 in the United States, conservative think tanks mobilized to challenge the legitimacy of global warming as a social problem. They challenged the scientific evidence, argued that global warming would have benefits, and asserted that proposed solutions would do more harm than good.[216]

Some people dispute aspects of climate change science.[207][217] Organizations such as the libertarian Competitive Enterprise Institute, conservative commentators, and some companies such as ExxonMobil have challenged IPCC climate change scenarios, funded scientists who disagree with the scientific consensus, and provided their own projections of the economic cost of stricter controls.[218][219][220][221] Some fossil fuel companies have scaled back their efforts in recent years,[222] or even called for policies to reduce global warming.[223]

Surveys of public opinion

A 2010 poll by the Office for National Statistics found that 75% of UK respondents were at least "fairly convinced" that the world's climate is changing, compared to 87% in a similar survey in 2006.[224] A January 2011 ICM poll in the UK found 83% of respondents viewed climate change as a current or imminent threat, while 14% said it was no threat. Opinion was unchanged from an August 2009 poll asking the same question, though there had been a slight polarisation of opposing views.[225]
By 2010, with 111 countries surveyed, Gallup determined that there was a substantial decrease since 2007–08 in the number of Americans and Europeans who viewed global warming as a serious threat. In the US, just a little over half the population (53%) now viewed it as a serious concern for either themselves or their families; this was 10 points below the 2008 poll (63%). Latin America had the biggest rise in concern: 73% said global warming is a serious threat to their families.[226] This global poll also found that people are more likely to attribute global warming to human activities than to natural causes, except in the USA where nearly half (47%) of the population attributed global warming to natural causes.[227]

A March–May 2013 survey by Pew Research Center for the People & the Press polled 39 countries about global threats. Global warming topped the list of perceived threats, cited by 54% of those questioned.[228] In a January 2013 survey, Pew found that 69% of Americans say there is solid evidence that the Earth's average temperature has been getting warmer over the past few decades, up six points since November 2011 and 12 points since 2009.[229]

Etymology

According to Erik M. Conway, global warming became the dominant popular term after June 1988, when NASA climate scientist James Hansen used the term in a testimony to Congress[230] when he said: "global warming has reached a level such that we can ascribe with a high degree of confidence a cause and effect relationship between the greenhouse effect and the observed warming."[231] Conway said that this testimony was widely reported in the media and subsequently global warming became the commonly used term by both the press and in public discourse. However, in 2008 he also wrote that "global climate change" is the more scientifically accurate term, because changes in Earth systems are not limited to surface temperatures.[230]
