
Thursday, July 29, 2021

Nonlinear system

From Wikipedia, the free encyclopedia

In mathematics and science, a nonlinear system is a system in which the change of the output is not proportional to the change of the input. Nonlinear problems are of interest to engineers, biologists, physicists, mathematicians, and many other scientists because most systems are inherently nonlinear in nature. Nonlinear dynamical systems, describing changes in variables over time, may appear chaotic, unpredictable, or counterintuitive, contrasting with much simpler linear systems.

Typically, the behavior of a nonlinear system is described in mathematics by a nonlinear system of equations, which is a set of simultaneous equations in which the unknowns (or the unknown functions in the case of differential equations) appear as variables of a polynomial of degree higher than one or in the argument of a function which is not a polynomial of degree one. In other words, in a nonlinear system of equations, the equation(s) to be solved cannot be written as a linear combination of the unknown variables or functions that appear in them. Systems can be defined as nonlinear, regardless of whether known linear functions appear in the equations. In particular, a differential equation is linear if it is linear in terms of the unknown function and its derivatives, even if nonlinear in terms of the other variables appearing in it.

As nonlinear dynamical equations are difficult to solve, nonlinear systems are commonly approximated by linear equations (linearization). This works well up to some accuracy and some range for the input values, but some interesting phenomena such as solitons, chaos, and singularities are hidden by linearization. It follows that some aspects of the dynamic behavior of a nonlinear system can appear to be counterintuitive, unpredictable or even chaotic. Although such chaotic behavior may resemble random behavior, it is in fact not random. For example, some aspects of the weather are seen to be chaotic, where simple changes in one part of the system produce complex effects throughout. This nonlinearity is one of the reasons why accurate long-term forecasts are impossible with current technology.
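Linearization is easy to see numerically. Below is a minimal Python sketch (illustrative only, not from the article) comparing sin(x), a nonlinearity that reappears in the pendulum example later, with its linear approximation x near zero; the error is negligible over a small range of inputs and grows quickly outside it:

```python
import numpy as np

# Near x = 0 the tangent-line approximation sin(x) ~ x works well, but the
# error grows with |x| -- the "some accuracy and some range" caveat above.
for x in [0.01, 0.1, 0.5, 1.0, 2.0]:
    err = abs(np.sin(x) - x)
    print(f"x = {x:4}: sin(x) = {np.sin(x):+.4f}, linear = {x:+.4f}, error = {err:.4f}")
```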

Some authors use the term nonlinear science for the study of nonlinear systems. This term is disputed by others:

Using a term like nonlinear science is like referring to the bulk of zoology as the study of non-elephant animals.

Definition

In mathematics, a linear map (or linear function) is one which satisfies both of the following properties:

  • Additivity or superposition principle: f(x + y) = f(x) + f(y);
  • Homogeneity: f(αx) = αf(x).

Additivity implies homogeneity for any rational α, and, for continuous functions, for any real α. For a complex α, homogeneity does not follow from additivity. For example, an antilinear map is additive but not homogeneous. The conditions of additivity and homogeneity are often combined in the superposition principle f(αx + βy) = αf(x) + βf(y).

An equation written as

f(x) = C

is called linear if f is a linear map (as defined above) and nonlinear otherwise. The equation is called homogeneous if C = 0.

The definition is very general in that x can be any sensible mathematical object (number, vector, function, etc.), and the function f(x) can literally be any mapping, including integration or differentiation with associated constraints (such as boundary values). If f(x) contains differentiation with respect to x, the result will be a differential equation.
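To make the two defining properties concrete, here is a small Python sketch (an illustration of the definition above, not part of the original article) that numerically spot-checks additivity and homogeneity for a few one-dimensional maps:

```python
import numpy as np

def is_linear(f, trials=100, tol=1e-9):
    """Spot-check additivity f(x+y) = f(x)+f(y) and homogeneity
    f(a*x) = a*f(x) on random real inputs."""
    rng = np.random.default_rng(0)
    for _ in range(trials):
        x, y, a = rng.normal(size=3)
        if abs(f(x + y) - (f(x) + f(y))) > tol:
            return False
        if abs(f(a * x) - a * f(x)) > tol:
            return False
    return True

print(is_linear(lambda x: 3 * x))   # True: a linear map
print(is_linear(lambda x: x ** 2))  # False: fails both properties
print(is_linear(lambda x: x + 1))   # False: affine, not linear
```

A numerical check like this can only refute linearity, never prove it; it is just a quick way to see the definitions in action.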

Nonlinear algebraic equations

Nonlinear algebraic equations, which are also called polynomial equations, are defined by equating polynomials (of degree greater than one) to zero. For example,

x² + x − 1 = 0.

For a single polynomial equation, root-finding algorithms can be used to find solutions to the equation (i.e., sets of values for the variables that satisfy the equation). However, systems of algebraic equations are more complicated; their study is one motivation for the field of algebraic geometry, a difficult branch of modern mathematics. It is even difficult to decide whether a given algebraic system has complex solutions. Nevertheless, in the case of the systems with a finite number of complex solutions, these systems of polynomial equations are now well understood and efficient methods exist for solving them.
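For the single-equation case, a short Python sketch (illustrative; numpy's roots routine uses a companion-matrix eigenvalue method) finds the solutions of the example above, and Newton's method shows the iterative root-finding idea:

```python
import numpy as np

# All roots of x^2 + x - 1 = 0 (coefficients from highest to lowest degree).
print(np.roots([1, 1, -1]))            # approx [-1.618, 0.618]

# One root again by Newton's method: x_{n+1} = x_n - f(x_n)/f'(x_n).
x = 1.0
for _ in range(10):
    x -= (x**2 + x - 1) / (2 * x + 1)
print(x)                               # approx 0.618 (the golden ratio minus one)
```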

Nonlinear recurrence relations

A nonlinear recurrence relation defines successive terms of a sequence as a nonlinear function of preceding terms. Examples of nonlinear recurrence relations are the logistic map and the relations that define the various Hofstadter sequences. Nonlinear discrete models that represent a wide class of nonlinear recurrence relationships include the NARMAX (Nonlinear Autoregressive Moving Average with eXogenous inputs) model and the related nonlinear system identification and analysis procedures. These approaches can be used to study a wide class of complex nonlinear behaviors in the time, frequency, and spatio-temporal domains.
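The logistic map makes the point in a few lines. In this Python sketch (illustrative, using the standard chaotic parameter r = 4), two orbits started a millionth apart separate almost completely within a couple of dozen iterations:

```python
# Logistic map: x_{n+1} = r * x_n * (1 - x_n)
def logistic_orbit(x0, r=4.0, n=20):
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.200000)
b = logistic_orbit(0.200001)   # initial condition perturbed by 1e-6
for n in (0, 5, 10, 15, 20):
    print(f"n={n:2d}  x={a[n]:.6f}  x'={b[n]:.6f}  |diff|={abs(a[n]-b[n]):.6f}")
```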

Nonlinear differential equations

A system of differential equations is said to be nonlinear if it is not a linear system. Problems involving nonlinear differential equations are extremely diverse, and methods of solution or analysis are problem dependent. Examples of nonlinear differential equations are the Navier–Stokes equations in fluid dynamics and the Lotka–Volterra equations in biology.

One of the greatest difficulties of nonlinear problems is that it is not generally possible to combine known solutions into new solutions. In linear problems, for example, a family of linearly independent solutions can be used to construct general solutions through the superposition principle. A good example of this is one-dimensional heat transport with Dirichlet boundary conditions, the solution of which can be written as a time-dependent linear combination of sinusoids of differing frequencies; this makes solutions very flexible. It is often possible to find several very specific solutions to nonlinear equations; however, the lack of a superposition principle prevents the construction of new solutions.

Ordinary differential equations

First order ordinary differential equations are often exactly solvable by separation of variables, especially for autonomous equations. For example, the nonlinear equation

du/dx = −u²

has u = 1/(x + C) as a general solution (and also u = 0 as a particular solution, corresponding to the limit of the general solution when C tends to infinity). The equation is nonlinear because it may be written as

du/dx + u² = 0

and the left-hand side of the equation is not a linear function of u and its derivatives. Note that if the u² term were replaced with u, the problem would be linear (the exponential decay problem).
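A quick numerical cross-check of this solution is straightforward. The Python sketch below (illustrative; it assumes SciPy is available) integrates du/dx = −u² from u(0) = 1, for which the constant in the general solution is C = 1, and compares against u = 1/(x + 1):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Integrate du/dx = -u^2 with u(0) = 1 and compare to u = 1/(x + 1).
sol = solve_ivp(lambda x, u: -u**2, (0.0, 5.0), [1.0],
                dense_output=True, rtol=1e-8)
for x in np.linspace(0.0, 5.0, 6):
    print(f"x={x:.1f}  numeric={sol.sol(x)[0]:.6f}  exact={1/(x + 1):.6f}")
```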

Second and higher order ordinary differential equations (more generally, systems of nonlinear equations) rarely yield closed-form solutions, though implicit solutions and solutions involving nonelementary integrals are encountered.

Common methods for the qualitative analysis of nonlinear ordinary differential equations include:

  • Examination of any conserved quantities, especially in Hamiltonian systems
  • Examination of dissipative quantities (see Lyapunov function) analogous to conserved quantities
  • Linearization via Taylor expansion
  • Change of variables into something easier to study
  • Bifurcation theory
  • Perturbation methods (can be applied to algebraic equations too)

Partial differential equations

The most common basic approach to studying nonlinear partial differential equations is to change the variables (or otherwise transform the problem) so that the resulting problem is simpler (possibly even linear). Sometimes, the equation may be transformed into one or more ordinary differential equations, as seen in separation of variables, which is always useful whether or not the resulting ordinary differential equation(s) is solvable.

Another common (though less mathematical) tactic, often seen in fluid and heat mechanics, is to use scale analysis to simplify a general, natural equation in a certain specific boundary value problem. For example, the (very) nonlinear Navier–Stokes equations can be simplified into one linear partial differential equation in the case of transient, laminar, one-dimensional flow in a circular pipe; the scale analysis provides conditions under which the flow is laminar and one-dimensional and also yields the simplified equation.

Other methods include examining the characteristics and using the methods outlined above for ordinary differential equations.

Pendula

Illustration of a pendulum
Linearizations of a pendulum

A classic, extensively studied nonlinear problem is the dynamics of a pendulum under the influence of gravity. Using Lagrangian mechanics, it may be shown that the motion of a pendulum can be described by the dimensionless nonlinear equation

d²θ/dt² + sin θ = 0

where gravity points "downwards" and θ is the angle the pendulum forms with its rest position, as shown in the figure at right. One approach to "solving" this equation is to use dθ/dt as an integrating factor, which would eventually yield

∫ dθ/√(C₀ + 2 cos θ) = t + C₁

which is an implicit solution involving an elliptic integral. This "solution" generally does not have many uses because most of the nature of the solution is hidden in the nonelementary integral (nonelementary unless C₀ = 2).

Another way to approach the problem is to linearize any nonlinearities (the sine function term in this case) at the various points of interest through Taylor expansions. For example, the linearization at θ = 0, called the small angle approximation, is

d²θ/dt² + θ = 0

since sin θ ≈ θ for θ ≈ 0. This is a simple harmonic oscillator corresponding to oscillations of the pendulum near the bottom of its path. Another linearization would be at θ = π, corresponding to the pendulum being straight up:

d²θ/dt² + π − θ = 0

since sin θ ≈ π − θ for θ ≈ π. The solution to this problem involves hyperbolic sinusoids, and note that unlike the small angle approximation, this approximation is unstable, meaning that |θ| will usually grow without limit, though bounded solutions are possible. This corresponds to the difficulty of balancing a pendulum upright; it is literally an unstable state.

One more interesting linearization is possible around θ = π/2, around which sin θ ≈ 1:

d²θ/dt² + 1 = 0

This corresponds to a free fall problem. A very useful qualitative picture of the pendulum's dynamics may be obtained by piecing together such linearizations, as seen in the figure at right. Other techniques may be used to find (exact) phase portraits and approximate periods.
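The quality of the small angle approximation can also be checked by direct numerical integration. This Python sketch (illustrative, assuming SciPy is available) integrates the full nonlinear pendulum alongside its linearization from the same fairly large initial angle, where the two visibly drift apart:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Dimensionless pendulum theta'' + sin(theta) = 0 vs. its small-angle
# linearization theta'' + theta = 0; state vector y = [theta, dtheta/dt].
def pendulum(t, y):
    return [y[1], -np.sin(y[0])]

def linearized(t, y):
    return [y[1], -y[0]]

theta0 = 1.0   # about 57 degrees: large enough for the difference to show
t_eval = np.linspace(0, 10, 6)
full = solve_ivp(pendulum, (0, 10), [theta0, 0.0], t_eval=t_eval, rtol=1e-9)
lin = solve_ivp(linearized, (0, 10), [theta0, 0.0], t_eval=t_eval, rtol=1e-9)
for t, a, b in zip(t_eval, full.y[0], lin.y[0]):
    print(f"t={t:5.1f}  nonlinear={a:+.4f}  small-angle={b:+.4f}")
```

The small-angle solution keeps a period of exactly 2π, while the true pendulum swings with a longer, amplitude-dependent period, so the mismatch accumulates over time.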

Types of nonlinear dynamic behaviors

  • Amplitude death – any oscillations present in the system cease due to some kind of interaction with other system or feedback by the same system
  • Chaos – values of a system cannot be predicted indefinitely far into the future, and fluctuations are aperiodic
  • Multistability – the presence of two or more stable states
  • Solitons – self-reinforcing solitary waves
  • Limit cycles – asymptotic periodic orbits to which destabilized fixed points are attracted
  • Self-oscillations – feedback oscillations taking place in open dissipative physical systems

Examples of nonlinear equations

Internet bot

From Wikipedia, the free encyclopedia

An Internet bot, web robot, robot or simply bot, is a software application that runs automated tasks (scripts) over the Internet. Typically, bots perform tasks that are simple and repetitive much faster than a person could. The most extensive use of bots is for web crawling, in which an automated script fetches, analyzes and files information from web servers. More than half of all web traffic is generated by bots.

Efforts by web servers to restrict bots vary. Some servers have a robots.txt file that contains the rules governing bot behavior on that server. Any bot that does not follow the rules could, in theory, be denied access to, or removed from, the affected website. If the posted text file has no associated program/software/app, then adhering to the rules is entirely voluntary. There would be no way to enforce the rules or to ensure that a bot's creator or implementer reads or acknowledges the robots.txt file. Some bots are "good" – e.g. search engine spiders – while others are used to launch malicious attacks on, for example, political campaigns.
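Python's standard library includes a parser for exactly this voluntary protocol. Here is a minimal sketch (the URLs are placeholders, not from the article) of how a well-behaved bot might consult robots.txt before fetching a page:

```python
from urllib import robotparser

# Fetch and parse a site's robots.txt, then ask whether a given user
# agent may retrieve specific URLs. Compliance is voluntary: nothing
# stops a bot from skipping this check entirely.
rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

print(rp.can_fetch("MyCrawler/1.0", "https://example.com/private/page.html"))
print(rp.can_fetch("MyCrawler/1.0", "https://example.com/index.html"))
```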

IM and IRC

Some bots communicate with users of Internet-based services, via instant messaging (IM), Internet Relay Chat (IRC), or other web interfaces such as Facebook bots and Twitter bots. These chatbots may allow people to ask questions in plain English and then formulate a response. Such bots can often handle reporting weather, zip code information, sports scores, currency or other unit conversions, etc. Others are used for entertainment, such as SmarterChild on AOL Instant Messenger and MSN Messenger.

Bots are very commonly used on social media. A user may not be aware that they are interacting with a bot.

Additional roles of an IRC bot may be to listen on a conversation channel, and to comment on certain phrases uttered by the participants (based on pattern matching). This is sometimes used as a help service for new users or to censor profanity.

Social bots

Social networking bots are sets of algorithms that take on the duties of repetitive sets of instructions in order to establish a service or connection among social networking users. Among the various designs of networking bots, the most common are chat bots, algorithms designed to converse with a human user, and social bots, algorithms designed to mimic human behaviors to converse with patterns similar to those of a human user. The history of social botting can be traced back to Alan Turing in the 1950s and his vision of designing sets of instructional code approved by the Turing test. In the 1960s Joseph Weizenbaum created ELIZA, a natural language processing computer program considered an early indicator of artificial intelligence algorithms. ELIZA inspired computer programmers to design tasked programs that can match behavior patterns to their sets of instruction. As a result, natural language processing has become an influencing factor in the development of artificial intelligence and social bots. And as information and ideas spread en masse across social media websites, innovative technological advancements follow the same pattern.

Reports of political interference in recent elections, including the 2016 US and 2017 UK general elections, have raised the profile of bots, along with the ethical questions that separate a bot's design from its designer's intent. Emilio Ferrara, a computer scientist from the University of Southern California, writing in Communications of the ACM, said the lack of resources available for fact-checking and information verification results in large volumes of false reports and claims about these bots on social media platforms. In the case of Twitter, most of these bots are programmed with search filter capabilities that target keywords and phrases favoring political agendas and then retweet them. While such bots are programmed to spread unverified information throughout social media platforms, this is a challenge that programmers face in the wake of a hostile political climate. The "Bot Effect" is what Ferrara called the socialization of bots and human users, creating a vulnerability to the leaking of personal information and polarizing influences outside the ethics of the bot's code; this was confirmed by Guillory Kramer in his study, in which he observed the behavior of emotionally volatile users and the impact bots have on them, altering their perception of reality.

Commercial bots

There has been a great deal of controversy about the use of bots in an automated trading function. Auction website eBay took legal action in an attempt to suppress a third-party company from using bots to look for bargains on its site; this approach backfired on eBay and attracted the attention of further bots. The United Kingdom-based bet exchange, Betfair, saw such a large amount of traffic coming from bots that it launched a WebService API aimed at bot programmers, through which it can actively manage bot interactions.

Bot farms are known to be used in online app stores, like the Apple App Store and Google Play, to manipulate positions or increase positive ratings/reviews.

A rapidly growing, benign form of internet bot is the chatbot. From 2016, when Facebook Messenger allowed developers to place chatbots on their platform, there has been an exponential growth of their use on that app alone. 30,000 bots were created for Messenger in the first six months, rising to 100,000 by September 2017. Avi Ben Ezra, CTO of SnatchBot, told Forbes that evidence from the use of their chatbot building platform pointed to a near future saving of millions of hours of human labor as 'live chat' on websites was replaced with bots.

Companies use internet bots to increase online engagement and streamline communication. Companies often use bots to cut down on cost; instead of employing people to communicate with consumers, companies have developed new ways to be efficient. These chatbots are used to answer customers' questions: for example, Domino's developed a chatbot that can take orders via Facebook Messenger. Chatbots allow companies to allocate their employees' time to other tasks.

Malicious bots

One example of the malicious use of bots is the coordination and operation of an automated attack on networked computers, such as a denial-of-service attack by a botnet. Internet bots or web bots can also be used to commit click fraud and more recently have appeared around MMORPG games as computer game bots. Another category is represented by spambots, internet bots that attempt to spam large amounts of content on the Internet, usually adding advertising links. More than 94.2% of websites have experienced a bot attack.

There are malicious bots (and botnets) of the following types:

  1. Spambots that harvest email addresses from contact or guestbook pages
  2. Downloaded programs that suck bandwidth by downloading entire websites
  3. Website scrapers that grab the content of websites and re-use it without permission on automatically generated doorway pages
  4. Registration bots that sign up a specific email address to numerous services in order to have the confirmation messages flood the email inbox and distract from important messages indicating a security breach.
  5. Viruses and worms
  6. DDoS attacks
  7. Botnets, zombie computers, etc.
  8. Spambots that try to redirect people onto a malicious website, sometimes found in comment sections or forums of various websites
  9. Viewbots create fake views
  10. Bots that buy up higher-demand seats for concerts, particularly by ticket brokers who resell the tickets. These bots run through the purchase process of entertainment event-ticketing sites and obtain better seats by pulling as many seats back as they can.
  11. Bots that are used in massively multiplayer online role-playing games to farm for resources that would otherwise take significant time or effort to obtain, which can be a concern for online in-game economies.
  12. Bots that increase views for YouTube videos
  13. Bots that increase traffic counts on analytics reporting to extract money from advertisers. A study by Comscore found that over half of ads shown across thousands of campaigns between May 2012 and February 2013 were not served to human users.
  14. Bots used on internet forums to automatically post inflammatory or nonsensical posts to disrupt the forum and anger users.

In 2012, journalist Percy von Lipinski reported that he discovered millions of bot-generated ("botted" or "pinged") views at CNN iReport. CNN iReport quietly removed millions of views from the account of iReporter Chris Morrow. It is not known if the ad revenue received by CNN from the fake views was ever returned to the advertisers.

The most widely used anti-bot technique is the use of CAPTCHA. Examples of providers include Recaptcha, Minteye, Solve Media and NuCaptcha. However, captchas are not foolproof in preventing bots, as they can often be circumvented by computer character recognition, security holes, and outsourcing captcha solving to cheap laborers.

Human interaction with social bots

There are two main concerns with bots: clarity and face-to-face support. The cultural background of human beings affects the way they communicate with social bots. Many people believe that bots are vastly less intelligent than humans and so they are not worthy of our respect.

Min-Sun Kim proposed five concerns or issues that may arise when communicating with a social robot: avoiding damage to people's feelings, minimizing impositions, avoiding disapproval from others, clarity issues, and how effectively messages may come across.

Social robots also take away from the genuine creations of human relationships.

Global Consciousness Project

From Wikipedia, the free encyclopedia

The Global Consciousness Project (GCP, also called the EGG Project) is a parapsychology experiment begun in 1998 as an attempt to detect possible interactions of "global consciousness" with physical systems. The project monitors a geographically distributed network of hardware random number generators in a bid to identify anomalous outputs that correlate with widespread emotional responses to sets of world events, or periods of focused attention by large numbers of people. The GCP is privately funded through the Institute of Noetic Sciences and describes itself as an international collaboration of about 100 research scientists and engineers.

Skeptics such as Robert T. Carroll, Claus Larsen, and others have questioned the methodology of the Global Consciousness Project, particularly how the data are selected and interpreted, saying the data anomalies reported by the project are the result of "pattern matching" and selection bias which ultimately fail to support a belief in psi or global consciousness. Other critics have stated that the open access to the test data "is a testimony to the integrity and curiosity of those involved". But in analyzing the data for 11 September 2001, May et al. concluded that the statistically significant result given by the published GCP hypothesis was fortuitous, and found that as far as this particular event was concerned an alternative method of analysis gave only chance deviations throughout.

Background

Roger D. Nelson developed the project as an extrapolation of two decades of experiments from the controversial Princeton Engineering Anomalies Research Lab (PEAR).

In an extension of the laboratory research utilizing hardware random number generators called FieldREG, investigators examined the outputs of REGs in the field before, during and after highly focused or coherent group events. The group events studied included psychotherapy sessions, theater presentations, religious rituals, sports competitions such as the Football World Cup, and television broadcasts such as the Academy Awards.

FieldREG was extended to global dimensions in studies looking at data from 12 independent REGs in the US and Europe during a web-promoted "Gaiamind Meditation" in January 1997, and then again in September 1997 after the death of Diana, Princess of Wales. The project claimed the results suggested it would be worthwhile to build a permanent network of continuously-running REGs. This became the EGG project or Global Consciousness Project.

Comparing the GCP to PEAR, Nelson, referring to the "field" studies with REGs done by PEAR, said the GCP used "exactly the same procedure... applied on a broader scale."

Methodology

The GCP's methodology is based on the hypothesis that events which elicit widespread emotion or draw the simultaneous attention of large numbers of people may affect the output of hardware random number generators in a statistically significant way. The GCP maintains a network of hardware random number generators which are interfaced to computers at 70 locations around the world. Custom software reads the output of the random number generators and records a trial (sum of 200 bits) once every second. The data are sent to a server in Princeton, creating a database of synchronized parallel sequences of random numbers. The GCP is run as a replication experiment, essentially combining the results of many distinct tests of the hypothesis. The hypothesis is tested by calculating the extent of data fluctuations at the time of events. The procedure is specified by a three-step experimental protocol. In the first step, the event duration and the calculation algorithm are pre-specified and entered into a formal registry. In the second step, the event data are extracted from the database and a Z score, which indicates the degree of deviation from the null hypothesis, is calculated from the pre-specified algorithm. In the third step, the event Z-score is combined with the Z-scores from previous events to yield an overall result for the experiment.
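The third step, combining per-event results, can be illustrated with the textbook Stouffer method for pooling Z scores. This Python sketch is a simplification with hypothetical numbers; the article does not specify the GCP's exact weighting scheme:

```python
import math

def stouffer(z_scores):
    """Combine independent standard-normal Z scores into one overall Z."""
    return sum(z_scores) / math.sqrt(len(z_scores))

events = [0.8, -0.3, 1.2, 0.1, 0.9]   # hypothetical per-event Z scores
print(f"combined Z = {stouffer(events):.3f}")   # ~1.207, below the 1.96 threshold
```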

The remote devices have been dubbed Princeton Eggs, a reference to the coinage electrogaiagram, a portmanteau of electroencephalogram and Gaia. Supporters and skeptics have referred to the aim of the GCP as being analogous to detecting "a great disturbance in the Force."

Claims and criticism of effects from the September 11 terrorist attacks

The GCP has suggested that changes in the level of randomness may have occurred during the September 11, 2001 attacks when the planes first impacted, as well as in the two days following the attacks.

Independent scientists Edwin May and James Spottiswoode conducted an analysis of the data around the September 11 attacks and concluded there was no statistically significant change in the randomness of the GCP data during the attacks and the apparent significant deviation reported by Nelson and Radin existed only in their chosen time window. Spikes and fluctuations are to be expected in any random distribution of data, and there is no set time frame for how close a spike has to be to a given event for the GCP to say they have found a correlation. Wolcotte Smith said "A couple of additional statistical adjustments would have to be made to determine if there really was a spike in the numbers," referencing the data related to September 11, 2001. Similarly, Jeffrey D. Scargle believes unless both Bayesian and classical p-value analysis agree and both show the same anomalous effects, the kind of result GCP proposes will not be generally accepted.

In 2003, a New York Times article concluded "All things considered at this point, the stock market seems a more reliable gauge of the national—if not the global—emotional resonance."

In 2007 The Age reported that "[Nelson] concedes the data, so far, is not solid enough for global consciousness to be said to exist at all. It is not possible, for example, to look at the data and predict with any accuracy what (if anything) the eggs may be responding to."

Robert Matthews said that while it was "the most sophisticated attempt yet" to prove psychokinesis existed, the unreliability of significant events to cause statistically significant spikes meant that "the only conclusion to emerge from the Global Consciousness Project so far is that data without a theory is as meaningless as words without a narrative".

 

Noosphere

From Wikipedia, the free encyclopedia

The noosphere (alternate spelling noösphere) is a philosophical concept developed and popularized by the Russian-Ukrainian Soviet biogeochemist Vladimir Vernadsky, and the French philosopher and Jesuit priest Pierre Teilhard de Chardin. Vernadsky defined the noosphere as the new state of the biosphere and described it as the planetary "sphere of reason". The noosphere represents the highest stage of biospheric development, its defining factor being the development of humankind's rational activities.

The word is derived from the Greek νόος ("mind", "reason") and σφαῖρα ("sphere"), in lexical analogy to "atmosphere" and "biosphere". The concept, however, cannot be credited to a single author. The founding authors Vernadsky and Teilhard de Chardin developed two related but starkly different concepts, the former grounded in the geological sciences and the latter in theology. Both conceptions of the noosphere share the common thesis that human reason and scientific thought have created, and will continue to create, the next evolutionary geological layer. This geological layer is part of the evolutionary chain. Second-generation authors, predominantly of Russian origin, have further developed the Vernadskian concept, creating the related concepts noocenosis and noocenology.

Founding authors

The term noosphere was first used in the publications of Pierre Teilhard de Chardin in 1922, in his Cosmogenesis. Vernadsky was most likely introduced to the term by a common acquaintance, Édouard Le Roy, during a stay in Paris. Some sources claim Édouard Le Roy actually first proposed the term. Vernadsky himself wrote that he was first introduced to the concept by Le Roy in his 1927 lectures at the Collège de France, and that Le Roy had emphasized a mutual exploration of the concept with Teilhard de Chardin. According to Vernadsky's own letters, he took Le Roy's ideas on the noosphere from Le Roy's article "Les origines humaines et l'évolution de l'intelligence", part III: "La noosphère et l'hominisation", before reworking the concept within his own field, biogeochemistry. The historian Bailes concludes that Vernadsky and Teilhard de Chardin were mutual influences on each other, as Teilhard de Chardin also attended Vernadsky's lectures on biogeochemistry before creating the concept of the noosphere.

An account stated that Le Roy and Teilhard were not aware of the concept of the biosphere when forming their noosphere concept, and that it was Vernadsky who introduced them to this notion, giving their conceptualization a grounding in the natural sciences. Both Teilhard de Chardin and Vernadsky base their conceptions of the noosphere on the term "biosphere", developed by Eduard Suess in 1875. Despite the differing backgrounds, approaches and focuses of Teilhard and Vernadsky, they have a few fundamental themes in common. Both scientists overstepped the boundaries of natural science and attempted to create all-embracing theoretical constructions founded in philosophy, social sciences and authorized interpretations of the evolutionary theory. Moreover, both thinkers were convinced of the teleological character of evolution. They also argued that human activity becomes a geological power and that the manner in which it is directed can influence the environment. There are, however, fundamental differences in the two conceptions.

Concept

In the theory of Vernadsky, the noosphere is the third in a succession of phases of development of the Earth, after the geosphere (inanimate matter) and the biosphere (biological life). Just as the emergence of life fundamentally transformed the geosphere, the emergence of human cognition fundamentally transforms the biosphere. In contrast to the conceptions of the Gaia theorists, or the promoters of cyberspace, Vernadsky's noosphere emerges at the point where humankind, through the mastery of nuclear processes, begins to create resources through the transmutation of elements. It is also currently being researched as part of the Global Consciousness Project.

Teilhard perceived a directionality in evolution along an axis of increasing Complexity/Consciousness. For Teilhard, the noosphere is the sphere of thought encircling the earth that has emerged through evolution as a consequence of this growth in complexity/consciousness. The noosphere is therefore as much part of nature as the barysphere, lithosphere, hydrosphere, atmosphere, and biosphere. As a result, Teilhard sees the "social phenomenon [as] the culmination of and not the attenuation of the biological phenomenon." These social phenomena are part of the noosphere and include, for example, legal, educational, religious, research, industrial and technological systems. In this sense, the noosphere emerges through and is constituted by the interaction of human minds. The noosphere thus grows in step with the organization of the human mass in relation to itself as it populates the earth. Teilhard argued the noosphere evolves towards ever greater personalisation, individuation and unification of its elements. He saw the Christian notion of love as being the principal driver of "noogenesis", the evolution of mind. Evolution would culminate in the Omega Point—an apex of thought/consciousness—which he identified with the eschatological return of Christ.

One of the original aspects of the noosphere concept deals with evolution. Henri Bergson, with his L'évolution créatrice (1907), was one of the first to propose that evolution is "creative" and cannot necessarily be explained solely by Darwinian natural selection. L'évolution créatrice is upheld, according to Bergson, by a constant vital force which animates life and fundamentally connects mind and body, an idea opposing the dualism of René Descartes. In 1923, C. Lloyd Morgan took this work further, elaborating on an "emergent evolution" which could explain increasing complexity (including the evolution of mind). Morgan found many of the most interesting changes in living things have been largely discontinuous with past evolution. Therefore, these living things did not necessarily evolve through a gradual process of natural selection. Rather, he posited, the process of evolution experiences jumps in complexity (such as the emergence of a self-reflective universe, or noosphere), in a sort of qualitative punctuated equilibrium. Finally, the complexification of human cultures, particularly language, facilitated a quickening of evolution in which cultural evolution occurs more rapidly than biological evolution. Recent understanding of human ecosystems and of human impact on the biosphere has led to a link between the notion of sustainability and the "co-evolution" and harmonization of cultural and biological evolution.

Futures studies

From Wikipedia, the free encyclopedia

A rough approximation of Pangaea Proxima, a potential supercontinent that may exist in about 250 million years according to the early model on the Paleomap Project website
 
Moore's law is an example of futurology; it is a statistical collection of past and present trends with the goal of accurately extrapolating future trends.

Futures studies, futures research or futurology is the systematic, interdisciplinary and holistic study of social and technological advancement, and other environmental trends, often for the purpose of exploring how people will live and work in the future. Predictive techniques, such as forecasting, can be applied, but contemporary futures studies scholars emphasize the importance of systematically exploring alternatives. In general, it can be considered as a branch of the social sciences and parallel to the field of history. Futures studies (colloquially called "futures" by many of the field's practitioners) seeks to understand what is likely to continue and what could plausibly change. Part of the discipline thus seeks a systematic and pattern-based understanding of past and present, and to explore the possibility of future events and trends.

Unlike the physical sciences where a narrower, more specified system is studied, futurology concerns a much bigger and more complex world system. The methodology and knowledge are much less proven than in natural science and social sciences like sociology and economics. There is a debate as to whether this discipline is an art or science, and it is sometimes described as pseudoscience; nevertheless, the Association of Professional Futurists was formed in 2002, a Foresight Competency Model was developed in 2017, and it is now possible to academically study it, for example at the FU Berlin in their master's course Zukunftsforschung.

Overview

Futurology is an interdisciplinary field that aggregates and analyzes trends, with both lay and professional methods, to compose possible futures. It includes analyzing the sources, patterns, and causes of change and stability in an attempt to develop foresight. Around the world the field is variously referred to as futures studies, futures research, strategic foresight, futuristics, futures thinking, futuring, and futurology. Futures studies and strategic foresight are the academic field's most commonly used terms in the English-speaking world.

Foresight was the original term and was first used in this sense by H.G. Wells in 1932. "Futurology" is a term common in encyclopedias, though it is used almost exclusively by nonpractitioners today, at least in the English-speaking world. "Futurology" is defined as the "study of the future." The term was coined by German professor Ossip K. Flechtheim in the mid-1940s, who proposed it as a new branch of knowledge that would include a new science of probability. This term has fallen from favor in recent decades because modern practitioners stress the importance of alternative, plausible, preferable and plural futures, rather than one monolithic future, and the limitations of prediction and probability, versus the creation of possible and preferable futures.

Three factors usually distinguish futures studies from the research conducted by other disciplines (although all of these disciplines overlap, to differing degrees). First, futures studies often examines trends to compose possible, probable, and preferable futures along with the role "wild cards" can play in future scenarios. Second, futures studies typically attempts to gain a holistic or systemic view based on insights from a range of different disciplines, generally focusing on the STEEP categories of Social, Technological, Economic, Environmental and Political. Third, futures studies challenges and unpacks the assumptions behind dominant and contending views of the future. The future thus is not empty but fraught with hidden assumptions. For example, many people expect the collapse of the Earth's ecosystem in the near future, while others believe the current ecosystem will survive indefinitely. A foresight approach would seek to analyze and highlight the assumptions underpinning such views.

As a field, futures studies expands on the research component, by emphasizing the communication of a strategy and the actionable steps needed to implement the plan or plans leading to the preferable future. It is in this regard that futures studies evolves from an academic exercise to a more traditional business-like practice, looking to better prepare organizations for the future.

Futures studies does not generally focus on short-term predictions, such as interest rates over the next business cycle, or the concerns of managers or investors with short-term time horizons. Most strategic planning, which develops goals and objectives with time horizons of one to three years, is also not considered futures. Plans and strategies with longer time horizons that specifically attempt to anticipate possible future events are definitely part of the field. Learning about medium and long-term developments may at times be observed from their early signs. As a rule, futures studies is generally concerned with changes of transformative impact, rather than those of an incremental or narrow scope.

The futures field also excludes those who make future predictions through professed supernatural means.

To complete a futures study, a domain is selected for examination. The domain is the main idea of the project, or what the outcome of the project seeks to determine. Domains can have a strategic or exploratory focus and must narrow down the scope of the research, defining what will, and more importantly, will not be discussed in the research. Futures practitioners study trends focusing on STEEP (Social, Technological, Economic, Environmental and Political) baselines. Baseline exploration examines current STEEP environments to determine normal trends, called baselines. Next, practitioners use scenarios to explore different future outcomes. Scenarios examine how the future can be different (see the sketch after this list):

  1. Collapse scenarios seek to answer: What happens if the STEEP baselines fall into ruin and no longer exist? How will that impact STEEP categories?
  2. Transformation scenarios explore futures with the baseline of society transitioning to a "new" state. How are the STEEP categories affected if society has a whole new structure?
  3. New equilibrium scenarios examine an entire change to the structure of the domain. What happens if the baseline changes to a "new" baseline within the same structure of society?

(Hines, Andy; Bishop, Peter (2006). Thinking About the Future: Guidelines for Strategic Foresight.)
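As a rough illustration of how such a study might be organized as data, here is a short Python sketch (entirely hypothetical; the names and fields are illustrative, not from Hines and Bishop):

```python
from dataclasses import dataclass, field

@dataclass
class FuturesStudy:
    domain: str                                     # what the study examines
    baselines: dict = field(default_factory=dict)   # STEEP category -> trend
    scenarios: dict = field(default_factory=dict)   # archetype -> narrative

study = FuturesStudy(domain="Urban mobility, 2050")
study.baselines["Technological"] = "steady growth in vehicle automation"
study.scenarios["Collapse"] = "baselines fall into ruin and no longer exist"
study.scenarios["Transformation"] = "society transitions to a new state"
study.scenarios["New Equilibrium"] = "new baseline within the same structure"
print(study)
```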

History

Origins

Sir Thomas More, originator of the 'Utopian' ideal.

Johan Galtung and Sohail Inayatullah argue in Macrohistory and Macrohistorians that the search for grand patterns of social change goes all the way back to Sima Qian (145–90 BC) and his theory of the cycles of virtue, although the work of Ibn Khaldun (1332–1406), such as The Muqaddimah, would be an example that is perhaps more intelligible to modern sociology. Early western examples include Sir Thomas More's Utopia, published in 1516 and based upon Plato's Republic, in which a future society has overcome poverty and misery to create a perfect model for living. This work was so powerful that utopias, originally meaning "nowhere", have come to represent positive and fulfilling futures in which everyone's needs are met.

Some intellectual foundations of futures studies appeared in the mid-19th century. Auguste Comte, considered the father of scientific philosophy, was heavily influenced by the work of utopian socialist Henri Saint-Simon, and his discussion of the metapatterns of social change presages futures studies as a scholarly dialogue.

The first works that attempted to make systematic predictions of the future were written in the 18th century. Memoirs of the Twentieth Century, written by Samuel Madden in 1733, takes the form of a series of diplomatic letters written in 1997 and 1998 from British representatives in the foreign cities of Constantinople, Rome, Paris, and Moscow. However, the technology of the 20th century is identical to that of Madden's own era; the focus is instead on the political and religious state of the world in the future. Madden went on to write The Reign of George VI, 1900 to 1925, where (in the context of the boom in canal construction at the time) he envisioned a large network of waterways that would radically transform patterns of living: "Villages grew into towns and towns became cities".

In 1845, Scientific American, the oldest continuously published magazine in the U.S., began publishing articles about scientific and technological research, with a focus upon the future implications of such research. It would be followed in 1872 by the magazine Popular Science, which was aimed at a more general readership.

The genre of science fiction became established towards the end of the 19th century, with notable writers, including Jules Verne and H. G. Wells, setting their stories in an imagined future world.

Early 20th Century

H. G. Wells first advocated for 'future studies' in a lecture delivered in 1902.

According to W. Warren Wagar, the founder of future studies was H. G. Wells. His Anticipations of the Reaction of Mechanical and Scientific Progress Upon Human Life and Thought: An Experiment in Prophecy, was first serially published in The Fortnightly Review in 1901. Anticipating what the world would be like in the year 2000, the book is interesting both for its hits (trains and cars resulting in the dispersion of population from cities to suburbs; moral restrictions declining as men and women seek greater sexual freedom; the defeat of German militarism, the existence of a European Union, and a world order maintained by "English-speaking peoples" based on the urban core between Chicago and New York) and its misses (he did not expect successful aircraft before 1950, and averred that "my imagination refuses to see any sort of submarine doing anything but suffocate its crew and founder at sea").

Moving from narrow technological predictions, Wells envisioned the eventual collapse of the capitalist world system after a series of destructive total wars. From this havoc would ultimately emerge a world of peace and plenty, controlled by competent technocrats.

The work was a bestseller, and Wells was invited to deliver a lecture at the Royal Institution in 1902, entitled The Discovery of the Future. The lecture was well-received and was soon republished in book form. He advocated for the establishment of a new academic study of the future that would be grounded in scientific methodology rather than just speculation. He argued that a scientifically ordered vision of the future "will be just as certain, just as strictly science, and perhaps just as detailed as the picture that has been built up within the last hundred years to make the geological past." Although conscious of the difficulty in arriving at entirely accurate predictions, he thought that it would still be possible to arrive at a "working knowledge of things in the future".

In his fictional works, Wells predicted the invention and use of the atomic bomb in The World Set Free (1914). In The Shape of Things to Come (1933) he depicted the impending World War, with cities destroyed by aerial bombardment. He continued to advocate for the establishment of a futures science: in a 1933 BBC broadcast he called for the establishment of "Departments and Professors of Foresight", foreshadowing the development of modern academic futures studies by approximately 40 years.

At the beginning of the 20th century, future works were often shaped by political forces and turmoil. The WWI era led to the adoption of futures thinking in institutions throughout Europe. The Russian Revolution led to the 1921 establishment of the Soviet Union's Gosplan, or State Planning Committee, which was active until the dissolution of the Soviet Union. Gosplan was responsible for economic planning and created plans in five-year increments to govern the economy. One of the first Soviet dissidents, Yevgeny Zamyatin, published the first dystopian novel, We, in 1921. This work of science fiction and political satire featured a future police state and was the first work censored by the Soviet censorship board, leading to Zamyatin's political exile.

In the United States, President Hoover created the Research Committee on Social Trends, which produced a report in 1933. The head of the committee, William F. Ogburn, analyzed the past to chart trends and project those trends into the future, with a focus on technology. A similar technique was used during the Great Depression, with the addition of alternative futures and a set of likely outcomes, which resulted in the creation of Social Security and the Tennessee Valley development project.

The WWII era emphasized the growing need for foresight. The Nazis used strategic plans to unify and mobilize their society with a focus on creating a fascist utopia. This planning and the subsequent war forced global leaders to create their own strategic plans in response. The post-war era saw the creation of numerous nation states with complex political alliances and was further complicated by the introduction of nuclear power.

Project RAND was created in 1946 as a joint project between the United States Army Air Forces and the Douglas Aircraft Company, and was later incorporated as the non-profit RAND Corporation. Its objective was the study of the future of weapons, and long-range planning to meet future threats. Its work has formed the basis of US strategy and policy in regard to nuclear weapons, the Cold War, and the space race.

Mid-Century Emergence

Futures studies truly emerged as an academic discipline in the mid-1960s. First-generation futurists included Herman Kahn, an American Cold War strategist for the RAND Corporation who wrote On Thermonuclear War (1960), Thinking About the Unthinkable (1962) and The Year 2000: A Framework for Speculation on the Next Thirty-Three Years (1967); Bertrand de Jouvenel, a French economist who founded Futuribles International in 1960; and Dennis Gabor, a Hungarian-British scientist who wrote Inventing the Future (1963) and The Mature Society: A View of the Future (1972).

Future studies had a parallel origin with the birth of systems science in academia, and with the idea of national economic and political planning, most notably in France and the Soviet Union. In the 1950s, the people of France were continuing to reconstruct their war-torn country. In the process, French scholars, philosophers, writers, and artists searched for what could constitute a more positive future for humanity. The Soviet Union similarly participated in postwar rebuilding, but did so in the context of an established national economic planning process, which also required a long-term, systemic statement of social goals. Future studies was therefore primarily engaged in national planning, and the construction of national symbols.

Rachel Carson, author of Silent Spring, which helped launch the environmental movement and a new direction for futures research.

By contrast, in the United States, futures studies as a discipline emerged from the successful application of the tools and perspectives of systems analysis, especially with regard to quartermastering the war-effort. The Society for General Systems Research, founded in 1955, sought to understand cybernetics and the practical application of systems sciences, greatly influencing the U.S. foresight community. These differing origins account for an initial schism between futures studies in America and “futurology” in Europe: U.S. practitioners focused on applied projects, quantitative tools and systems analysis, whereas Europeans preferred to investigate the long-range future of humanity and the Earth, what might constitute that future, what symbols and semantics might express it, and who might articulate these.

By the 1960s, academics, philosophers, writers and artists across the globe had begun to explore enough future scenarios so as to fashion a common dialogue. Several of the most notable writers to emerge during this era include: sociologist Fred L. Polak, whose work Images of the Future (1961) discusses the importance of images to society's creation of the future; Marshall McLuhan, whose The Gutenberg Galaxy (1962) and Understanding Media: The Extensions of Man (1964) put forth his theories on how technologies change our cognitive understanding; and Rachel Carson's Silent Spring (1962), which was hugely influential not only on futures studies but also on the creation of the environmental movement.

Inventors such as Buckminster Fuller also began highlighting the effect technology might have on global trends as time progressed.

By the 1970s there was an obvious shift in the use and development of futures studies; its focus was no longer exclusive to governments and militaries. Instead, it embraced a wide array of technologies, social issues, and concerns. This discussion on the intersection of population growth, resource availability and use, economic growth, quality of life, and environmental sustainability – referred to as the "global problematique" – came to wide public attention with the publication of The Limits to Growth by Donella Meadows, a study sponsored by the Club of Rome which detailed the results of a computer simulation of the future based on economic and population growth. Public investment in the future was further enhanced by the publication of Alvin & Heidi Toffler's bestseller Future Shock (1970), and its exploration of how great amounts of change can overwhelm people and create social paralysis due to "information overload."

Further development

International dialogue became institutionalized in the form of the World Futures Studies Federation (WFSF), founded in 1967, with the noted sociologist Johan Galtung serving as its first president. In the United States, the publisher Edward Cornish, concerned with these issues, started the World Future Society, an organization focused more on interested laypeople. The Association of Professional Futurists was founded in 2002 and spans 40 countries with more than 400 members. Its mission is to promote professional excellence by "demonstrating the value of strategic foresight and futures studies."

The first doctoral program on the Study of the Future was founded in 1969 at the University of Massachusetts by Christopher Dede and Billy Rojas. The next graduate program (a master's degree) was also founded by Christopher Dede, in 1975 at the University of Houston–Clear Lake. Oliver Markley of SRI (now SRI International) was hired in 1978 to move the program in a more applied and professional direction. The program moved to the University of Houston in 2007 and renamed its degree Foresight. The program has remained focused on preparing professional futurists and providing high-quality foresight training for individuals and organizations in business, government, education, and non-profits. In 1976, the M.A. Program in Public Policy in Alternative Futures at the University of Hawaii at Manoa was established. The Hawaii program locates futures studies within a pedagogical space defined by neo-Marxism, critical political economic theory, and literary criticism. In the years following the foundation of these two programs, single courses in Futures Studies at all levels of education have proliferated, but complete programs occur only rarely.

In 2010, the Free University of Berlin initiated a master's degree programme in Futures Studies, the first in Germany. In 2012, the Finland Futures Research Centre started a master's degree programme in Futures Studies at Turku School of Economics, a business school which is part of the University of Turku in Turku, Finland.

Foresight and futures work cover any domain a company considers important; therefore, a futurist must be able to cross domains and industries in their work. There is continued discussion by people in the profession on how to advance it, with some preferring to keep the field open to anyone interested in the future and others arguing to make the credentialing more rigorous. There are approximately 23 graduate and PhD programs in foresight globally, and many other certification courses.

The field currently faces the challenge of creating a coherent conceptual framework, codified into a well-documented curriculum (or curricula) featuring widely accepted and consistent concepts and theoretical paradigms linked to quantitative and qualitative methods, exemplars of those research methods, and guidelines for their ethical and appropriate application within society. As an indication that previously disparate intellectual dialogues have in fact started converging into a recognizable discipline, at least seven solidly-researched and well-accepted attempts to synthesize a coherent framework for the field have appeared: Eleonora Masini's Why Futures Studies?, James Dator's Advancing Futures Studies, Ziauddin Sardar's Rescuing all of our Futures, Sohail Inayatullah's Questioning the future, Richard A. Slaughter's The Knowledge Base of Futures Studies (a collection of essays by senior practitioners), Wendell Bell's two-volume work The Foundations of Futures Studies, and Andy Hines and Peter Bishop's Thinking about the Future.

Probability and predictability

While understanding the difference between the concepts of probability and predictability is very important to understanding the future, the field of futures studies is generally more focused on long-term futures in which the concept of plausibility becomes the greater concern. The usefulness of probability and predictability to the field lies more in analyzing the quantifiable trends and drivers which influence future change than in predicting future events.

Some aspects of the future, such as celestial mechanics, are highly predictable, and may even be described by relatively simple mathematical models. At present however, science has yielded only a special minority of such "easy to predict" physical processes. Theories such as chaos theory, nonlinear science and standard evolutionary theory have allowed us to understand many complex systems as contingent (sensitively dependent on complex environmental conditions) and stochastic (random within constraints), making the vast majority of future events unpredictable, in any specific case.

Not surprisingly, the tension between predictability and unpredictability is a source of controversy and conflict among futures studies scholars and practitioners. Some argue that the future is essentially unpredictable, and that "the best way to predict the future is to create it." Others believe, as Flechtheim, that advances in science, probability, modeling and statistics will allow us to continue to improve our understanding of probable futures, as this area presently remains less well developed than methods for exploring possible and preferable futures.

As an example, consider the process of electing the president of the United States. At one level we observe that any U.S. citizen aged 35 or over may run for president, so this process may appear too unconstrained for useful prediction. Yet further investigation demonstrates that only certain public individuals (current and former presidents and vice presidents, senators, state governors, popular military commanders, mayors of very large cities, celebrities, etc.) receive the appropriate "social credentials" that are historical prerequisites for election. Thus, with a minimum of effort at formulating the problem for statistical prediction, a much-reduced pool of candidates can be described, improving our probabilistic foresight. Applying further statistical intelligence to this problem, we can observe that certain election prediction markets, such as the Iowa Electronic Markets, have generated reliable forecasts over long spans of time and conditions, with results superior to those of individual experts or polls. Such markets, which may be operated publicly or as internal markets, are just one of several promising frontiers in predictive futures research.
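
As a rough illustration of how such market prices translate into probabilistic forecasts, consider the following minimal sketch. The candidate names and prices are hypothetical, and real markets such as the Iowa Electronic Markets have their own contract designs:

    # Hypothetical last-trade prices for mutually exclusive winner-take-all
    # contracts that each pay $1 if the named candidate wins.
    prices = {"Candidate A": 0.58, "Candidate B": 0.39, "Candidate C": 0.05}

    # The price of a $1-payoff contract approximates the market's probability;
    # normalizing removes the small overround so the probabilities sum to 1.
    total = sum(prices.values())
    for candidate, price in prices.items():
        print(f"{candidate}: implied probability {price / total:.1%}")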

Such improvements in the predictability of individual events do not, however, from a complexity-theory viewpoint, address the unpredictability inherent in dealing with entire systems, which emerge from the interaction of multiple individual events.

Futurology is sometimes described by scientists as pseudoscience. Science exists in the realm of the certain and builds knowledge by attempting to falsify predictions. Futures studies exists in the realm of the uncertain, yet it also builds knowledge by attempting to falsify predictions and by exposing uncertainty. In that sense, science and futures studies share the same goal; the difference is that futures studies attempts to understand, mitigate, and utilize uncertainty.

Methodologies

In terms of methodology, futures practitioners employ a wide range of approaches, models and methods, in both theory and practice, many of which are derived from or informed by other academic or professional disciplines, including social sciences such as economics, psychology, sociology, religious studies, cultural studies, history, geography, and political science; physical and life sciences such as physics, chemistry, astronomy and biology; mathematics, including statistics, game theory and econometrics; and applied disciplines such as engineering, computer science, and business management (particularly strategy).

The largest internationally peer-reviewed collection of futures research methods (1,300 pages) is Futures Research Methodology 3.0. Each of its 37 chapters, covering a method or group of methods, contains an executive overview of the method's history, a description of the method, primary and alternative usages, strengths and weaknesses, uses in combination with other methods, and speculation about the method's future evolution. Some chapters also contain appendixes with applications, links to software, and sources for further information. More recent method books, such as "How Do We Explore Our Futures?", have also been published.

Given its unique objectives and material, the practice of futures studies only rarely features employment of the scientific method in the sense of controlled, repeatable and verifiable experiments with highly standardized methodologies. However, many futurists are informed by scientific techniques or work primarily within scientific domains. Borrowing from history, the futurist might project patterns observed in past civilizations upon present-day society to model what might happen in the future, or borrowing from technology, the futurist may model possible social and cultural responses to an emerging technology based on established principles of the diffusion of innovation. In short, the futures practitioner enjoys the synergies of an interdisciplinary laboratory.

As the plural term "futures" suggests, one of the fundamental assumptions in futures studies is that the future is plural, not singular. That is, the future consists not of one inevitable future that is to be "predicted", but rather of multiple alternative futures of varying likelihood which may be derived and described, and about which it is impossible to say with certainty which one will occur. The primary effort in futures studies, then, is to identify and describe alternative futures in order to better understand the driving forces of the present or the structural dynamics of a particular subject or subjects. The exercise of identifying alternative futures includes collecting quantitative and qualitative data about the possibility, probability, and desirability of change. The plural term "futures" thus denotes both the rich variety of alternative futures that can be studied, including the subset of preferable (normative) futures, and the tenet that there is no single predetermined future.

At present, the general futures studies model has been summarized as being concerned with "three Ps and a W": possible, probable, and preferable futures, plus wildcards, which are unexpected, seemingly low-probability but high-impact events (positive or negative). Many futurists, however, do not use the wild card approach; rather, they use a methodology called emerging issues analysis, which searches for the drivers of change and for issues likely to move from the unknown to the known and from low impact to high impact.

In terms of technique, futures practitioners originally concentrated on extrapolating present technological, economic or social trends, or on attempting to predict future trends. Over time, the discipline has come to put more and more focus on the examination of social systems and uncertainties, to the end of articulating scenarios. The practice of scenario development facilitates the examination of worldviews and assumptions through the causal layered analysis method (and others), the creation of preferred visions of the future, and the use of exercises such as backcasting to connect the present with alternative futures. Apart from extrapolation and scenarios, many dozens of methods and techniques are used in futures research (see below).

Therefore, the general practice of futures studies also sometimes includes the articulation of normative or preferred futures, and a major thread of practice involves connecting both extrapolated (exploratory) and normative research to assist individuals and organizations in modeling preferred futures amid shifting social change. For instance, despite the many wicked global challenges in today's world, from climate change to extreme poverty, the aspect of preferability, or "what should happen", can at times be overlooked. Practitioners use varying proportions of collaboration, creativity and research to derive and define alternative futures, and, to the degree that a "preferred" future is sought, especially in an organizational context, techniques may also be deployed to develop plans or strategies for directed future shaping or for implementation of a preferred future.

While some futurists are not concerned with assigning probability to future scenarios, others find probabilities useful in certain situations, such as when probabilities stimulate thinking about scenarios within organizations. In the three Ps and a W model, estimates of probability are involved in two of the four central concerns (discerning and classifying both probable and wildcard events); considering the range of possible futures, recognizing the plurality of existing alternative futures, characterizing and attempting to resolve normative disagreements about the future, and envisioning and creating preferred futures are the other major areas of scholarship. Most estimates of probability in futures studies are normative and qualitative, though significant progress on statistical and quantitative methods (technology and information growth curves, cliometrics, predictive psychology, prediction markets, crowd-voting forecasts, etc.) has been made in recent decades.

Futures techniques

Futures techniques or methodologies may be viewed as "frameworks for making sense of data generated by structured processes to think about the future". There is no single set of methods appropriate for all futures research. Different futures researchers intentionally or unintentionally promote the use of favored techniques over a more structured approach. Selection of methods for futures research projects has so far been dominated by the intuition and insight of practitioners, but a more balanced selection of techniques can be achieved by acknowledging foresight as a process and by familiarity with the fundamental attributes of the most commonly used methods.

Scenarios are a central technique in futures studies and are often confused with other techniques. A flowchart can be used to classify a phenomenon as a scenario in the intuitive logics tradition.


Futurists use a diverse range of forecasting and foresight methods.

Shaping alternative futures

Futurists use scenarios – alternative possible futures – as an important tool. To some extent, people can determine what they consider probable or desirable using qualitative and quantitative methods. By looking at a variety of possibilities one comes closer to shaping the future, rather than merely predicting it. Shaping alternative futures starts with establishing a number of scenarios; setting up scenarios is a multi-stage process that can be conducted in an evidence-based manner. Scenarios can also examine unlikely and improbable developments that would otherwise be ignored, though for credibility they should not be entirely utopian or dystopian. One of those stages involves the study of emerging issues, such as megatrends, trends and weak signals. Megatrends are major, long-term phenomena that change slowly, are often interlinked and cannot be transformed in an instant. Trends express an increase or a decrease in a phenomenon, and there are many ways to spot them. Some argue that a trend persists long-term and long-range; affects many societal groups; grows slowly; and appears to have a profound basis. A fad, by contrast, operates in the short term, shows the vagaries of fashion, affects particular societal groups, and spreads quickly but superficially.

Futurists have a decidedly mixed reputation and a patchy track record of successful prediction. Many 1950s futurists predicted commonplace space tourism by the year 2000, but ignored the possibilities of ubiquitous, cheap computers. On the other hand, many forecasts have portrayed the future with some degree of accuracy. Sample predicted futures range from ecological catastrophes, through a utopian future in which the poorest human being lives in what present-day observers would regard as wealth and comfort, through the transformation of humanity into a posthuman life-form, to the destruction of all life on Earth in, say, a nanotechnological disaster. For reasons of convenience, futurists have often extrapolated present technical and societal trends and assumed they will develop at the same rate into the future; in reality, technical progress and social upheavals take place in fits and starts and in different areas at different rates.

Therefore, to some degree, the field has aimed to move away from prediction. Current futurists often present multiple scenarios that help their audience envision what "may" occur instead of merely "predicting the future". They claim that understanding potential scenarios helps individuals and organizations prepare with flexibility.

Many corporations use futurists as part of their risk management strategy, for horizon scanning and emerging issues analysis, and to identify wild cards – low probability, potentially high-impact risks. Understanding a range of possibilities can enhance the recognition of opportunities and threats. Every successful and unsuccessful business engages in futuring to some degree – for example in research and development, innovation and market research, anticipating competitor behavior and so on.

Weak signals, the future sign and wild cards

In futures research, "weak signals" may be understood as advanced, noisy and socially situated indicators of change in trends and systems that constitute raw informational material for enabling anticipatory action. There is some confusion about the definition of weak signals among various researchers and consultants: sometimes they are referred to as future-oriented information, sometimes more like emerging issues. The confusion has been partly clarified by the concept of "the future sign", which separates the signal, the issue, and the interpretation of the future sign.

A weak signal can be an early indicator of coming change, and an example might also help clarify the confusion. On May 27, 2012, hundreds of people gathered for a "Take the Flour Back" demonstration at Rothamsted Research in Harpenden, UK, to oppose a publicly funded trial of genetically modified wheat. This was a weak signal for a broader shift in consumer sentiment against genetically modified foods. When Whole Foods mandated the labeling of GMOs in 2013, this non-GMO idea had already become a trend and was about to become a topic of mainstream awareness.

"Wild cards" refer to low-probability and high-impact events "that happen quickly" and "have huge sweeping consequences," and materialize too quickly for social systems to effectively respond. Elina Hultunen notes that wild cards are not new, though they have become more prevalent. One reason for this may be the increasingly fast pace of change. Oliver Markley proposed four types of wild cards:

  • Type I Wild Card: low probability, high impact, high credibility
  • Type II Wild Card: high probability, high impact, low credibility
  • Type III Wild Card: high probability, high impact, disputed credibility
  • Type IV Wild Card: high probability, high impact, high credibility

He posits that it is important to track the emergence of Type II wild cards, which have a high probability of occurring but low credibility among observers. This focus is especially important because it is often difficult to persuade people to accept something they do not believe is happening until they see the wild card. An example is climate change, which has gone from Type I (high impact and high credibility, but low probability, when the science was accepted but the event was thought unlikely to happen), to Type II (high probability, high impact, but low credibility, as policy makers and lobbyists pushed back against the science), to Type III (high probability, high impact, disputed credibility), at least for most people. There are still some who probably will not accept the science until the Greenland ice sheet has completely melted and sea level has risen the estimated seven meters.
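
Read as a classification scheme, Markley's typology can be expressed as a simple lookup table. The sketch below is an illustrative encoding only, not something proposed in the source:

    # Markley's wild-card types as a lookup on (probability, impact, credibility).
    WILD_CARD_TYPES = {
        ("low",  "high", "high"):     "Type I",
        ("high", "high", "low"):      "Type II",
        ("high", "high", "disputed"): "Type III",
        ("high", "high", "high"):     "Type IV",
    }

    def classify(probability, impact, credibility):
        """Return the wild-card type for a (probability, impact, credibility) triple."""
        return WILD_CARD_TYPES.get((probability, impact, credibility), "not a wild card")

    # Climate change as described above, at the stage where credibility is disputed:
    print(classify("high", "high", "disputed"))  # -> Type III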

The wild card concept may be embedded in standard foresight projects and introduced into anticipatory decision-making activity in order to increase the ability of social groups to adapt to surprises arising in turbulent business environments. Such sudden and unique incidents might constitute turning points in the evolution of a certain trend or system. Wild cards may or may not be announced by weak signals, which are incomplete and fragmented data from which relevant foresight information might be inferred; the two are sometimes mistakenly treated as synonyms, which they are not. One of the most often cited examples of a wild card event in recent history is 9/11: nothing in the past pointed to such a possibility, and yet it had a huge impact on everyday life in the United States, from simple tasks like air travel to deeper cultural values. Wild card events might also be natural disasters, such as Hurricane Katrina, which can force the relocation of huge populations and wipe out entire crops, or completely disrupt the supply chains of many businesses. Although wild card events cannot be predicted, after they occur it is often easy to look back and convincingly explain why they happened.

Near-term predictions

A long-running tradition in various cultures, and especially in the media, involves spokespersons making predictions for the upcoming year at the beginning of the year. These predictions are thought-provokers, sometimes based on current trends in culture (music, movies, fashion, politics) and sometimes hopeful guesses as to what major events might take place over the course of the next year. Some of these predictions come true as the year unfolds, though many fail. When predicted events fail to take place, the authors of the predictions may state that misinterpretation of the "signs" and portents explains the failure of the prediction.

Marketers have increasingly embraced futures studies in an effort to benefit from an increasingly competitive marketplace with fast production cycles, using techniques such as trendspotting, as popularized by Faith Popcorn.

Trend analysis and forecasting

Megatrends

Trends come in different sizes. A megatrend extends over many generations and, in the case of climate, can cover periods prior to human existence. Megatrends describe complex interactions among many factors; the increase in population from the Palaeolithic period to the present provides an example. Each megatrend is likely to produce greater change than any previous one, because technology is causing trends to unfold at an accelerating pace. The concept was popularized by the 1982 book Megatrends by futurist John Naisbitt.

Potential trends

Possible new trends grow out of innovations, projects, beliefs, or actions and activism that have the potential to spread and eventually go mainstream in the future.

Branching trends

Very often, trends relate to one another the same way as a tree-trunk relates to branches and twigs. For example, a well-documented movement toward equality between men and women might represent a branch trend. The trend toward reducing differences in the salaries of men and women in the Western world could form a twig on that branch.

Life cycle of a trend

Understanding the technology adoption cycle helps futurists monitor trend development. Trends start as weak signals: small mentions in fringe media outlets, discussion forums, or blog posts, often by innovators. As these ideas, projects, beliefs or technologies gain acceptance, they move into the early-adopter phase. At the beginning of a trend's development, it is difficult to tell whether it will become a significant trend that creates change or merely a trendy fad that fades into forgotten history. Trends emerge as initially unconnected dots that eventually coalesce into persistent change.

A trend gains confirmation when enough evidence accumulates in the media, surveys, or questionnaires to show that it reflects an increasingly accepted value, behavior, or technology; at that point it becomes accepted as a bona fide trend. Trends can also gain confirmation from the existence of other trends perceived as springing from the same branch. Some commentators claim that a trend becomes mainstream when 15% to 25% of a given population has integrated the innovation, project, belief, or action into its daily life.
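
The mainstream-threshold idea can be pictured with a simple adoption curve. The sketch below assumes a logistic curve and a 20% threshold; the growth parameters are hypothetical and serve only to illustrate the claim above:

    import math

    def adoption(t, k=0.9, t_mid=10.0):
        """Logistic adoption curve: fraction of the population adopting by time t."""
        return 1.0 / (1.0 + math.exp(-k * (t - t_mid)))

    THRESHOLD = 0.20  # a value inside the claimed 15%-25% mainstream band
    for year in range(25):
        share = adoption(year)
        if share >= THRESHOLD:
            print(f"Adoption reaches {share:.0%} in year {year}: the trend goes mainstream.")
            break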

General Hype Cycle used to visualize technological life stages of maturity, adoption, and social application.

Life cycle of technologies

Gartner created its Hype Cycle to illustrate the phases a technology moves through as it grows from research and development to mainstream adoption. The unrealistic expectations, and the subsequent disillusionment, that virtual reality experienced in the 1990s and early 2000s exemplify the middle phases a technology encounters before it can begin to be integrated into society.

Education

Education in the field of futures studies has taken place for some time. Beginning in the United States in the 1960s, it has since developed in many different countries. Futures education encourages the use of concepts, tools and processes that allow students to think long-term, consequentially, and imaginatively. It generally helps students to:

  1. conceptualize more just and sustainable human and planetary futures.
  2. develop knowledge and skills of methods and tools used to help people understand, map, and influence the future by exploring probable and preferred futures.
  3. understand the dynamics and influence that human, social and ecological systems have on alternative futures.
  4. conscientize responsibility and action on the part of students toward creating better futures.

Thorough documentation of the history of futures education exists, for example in the work of Richard A. Slaughter (2004), David Hicks, and Ivana Milojević, to name a few.

While futures studies remains a relatively new academic tradition, numerous tertiary institutions around the world teach it. These vary from small programs, or universities with just one or two classes, to programs that offer certificates and incorporate futures studies into other degrees (for example in planning, business, environmental studies, economics, development studies, and science and technology studies). Various formal master's-level programs exist on six continents. Finally, doctoral dissertations around the world have incorporated futures studies (see e.g. Rohrbeck, 2010; von der Gracht, 2008; Hines, 2012). A recent survey documented approximately 50 cases of futures studies at the tertiary level.

A futures studies program is offered at Tamkang University, Taiwan. Futures studies is a required course at the undergraduate level, with between three and five thousand students taking classes annually. The Graduate Institute of Futures Studies houses an MA program that accepts only ten students annually. Associated with the program is the Journal of Futures Studies.

The longest-running futures studies program in North America was established in 1975 at the University of Houston–Clear Lake. It moved to the University of Houston in 2007, where the degree was renamed Foresight. The program was established on the belief that if history is studied and taught in an academic setting, then so should the future be. Its mission is to prepare professional futurists. The curriculum incorporates a blend of essential theory, a framework and methods for doing the work, and a focus on application for clients in business, government, nonprofits, and society in general.

As of 2003, over 40 tertiary education establishments around the world were delivering one or more courses in futures studies. The World Futures Studies Federation has a comprehensive survey of global futures programs and courses. The Acceleration Studies Foundation maintains an annotated list of primary and secondary graduate futures studies programs.

An MA program in Futures Studies has been offered at the Free University of Berlin since 2010.

An MSocSc and a PhD program in Futures Studies are offered at the University of Turku, Finland.

Applications of foresight and specific fields

General applicability and use of foresight products

Several corporations and government agencies utilize foresight products both to better understand potential risks and to prepare for potential opportunities, as an anticipatory approach. Several government agencies publish material for internal stakeholders and also make that material available to the broader public. Examples include the US Congressional Budget Office's long-term budget projections, the National Intelligence Council, and the United Kingdom Government Office for Science. Much of this material is used by policy makers to inform policy decisions and by government agencies to develop long-term plans. Several corporations, particularly those with long product-development life cycles, use foresight and futures studies products and practitioners in the development of their business strategies; the Shell Corporation is one such entity. Foresight professionals and their tools are increasingly used in both the private and public sectors to help leaders deal with an increasingly complex and interconnected world.

Imperial cycles and world order

Imperial cycles represent an "expanding pulsation" of a "mathematically describable" macro-historic trend.

Chinese philosopher K'ang Yu-wei and French demographer Georges Vacher de Lapouge stressed in the late 19th century that the trend cannot proceed indefinitely on the finite surface of the globe, and that it is bound to culminate in a world empire. K'ang Yu-wei predicted that the matter would be decided in a contest between Washington and Berlin; Vacher de Lapouge foresaw this contest as being between the United States and Russia, and wagered that the odds were in the United States' favour. Both published their futures studies before H. G. Wells introduced the science of the future in his Anticipations (1901).

Four later anthropologists—Hornell Hart, Raoul Naroll, Louis Morano, and Robert Carneiro—researched the expanding imperial cycles. They reached the same conclusion, that a world empire is not only pre-determined but close at hand, and attempted to estimate the time of its appearance.

Education

As foresight has expanded to include a broader range of social concerns, all levels and types of education have been addressed, including formal and informal education. Many countries are beginning to implement foresight in their education policy. A few programs are listed below:

  • Finland's FinnSight 2015: implementation began in 2006, and though the effort was not referred to as "foresight" at the time, it displays the characteristics of a foresight program.
  • Singapore's Ministry of Education Masterplan for Information Technology in Education: the third Masterplan builds on the first and second plans to transform learning environments so as to equip students to compete in a knowledge economy.
  • The World Future Society, founded in 1966, is the largest and longest-running community of futurists in the world. WFS established and built futurism from the ground up—through publications, global summits, and advisory roles to world leaders in business and government.

By the early 2000s, educators began to independently institute futures studies (sometimes referred to as futures thinking) lessons in K-12 classroom environments. To meet the need, non-profit futures organizations designed curriculum plans to supply educators with materials on the topic. Many of the curriculum plans were developed to meet common core standards. Futures studies education methods for youth typically include age-appropriate collaborative activities, games, systems thinking and scenario building exercises.

There are several organizations devoted to furthering the advancement of foresight and futures studies worldwide. Teach the Future emphasizes foresight educational practices appropriate for K-12 schools. The University of Houston has a Master's (MS) level graduate program through the College of Technology as well as a certificate program for those interested in advanced studies. The Department of Political Science at the University of Hawaii at Manoa houses the Hawaii Research Center for Futures Studies, which offers a Master's (MA) as well as a Doctorate (PhD).

Science fiction

Wendell Bell and Ed Cornish acknowledge science fiction as a catalyst to futures studies, conjuring up visions of tomorrow. Science fiction's potential to provide an "imaginative social vision" is its contribution to futures studies and public perspective. Productive sci-fi presents plausible, normative scenarios. Jim Dator attributes the foundational concepts of "images of the future" to Wendell Bell, for clarifying Fred Polak's concept in Images of the Future as it applies to futures studies. As in futures studies' scenario thinking, empirically supported visions of the future are a window into what the future could be. However, unlike futures studies, most science fiction works present a single alternative, unless the narrative deals with multiple timelines or alternative realities, as in the works of Philip K. Dick and a multitude of small- and big-screen works. Pamela Sargent states, "Science fiction reflects attitudes typical of this century." She gives a brief history of impactful sci-fi publications, such as The Foundation Trilogy by Isaac Asimov and Starship Troopers by Robert A. Heinlein. Alternate perspectives validate sci-fi as part of the fuzzy "images of the future."

Brian David Johnson is a futurist and author who uses science fiction to help build the future. He has been a futurist at Intel, and is now the resident futurist at Arizona State University. “His work is called ‘future casting’—using ethnographic field studies, technology research, trend data, and even science fiction to create a pragmatic vision of consumers and computing.” Brian David Johnson has developed a practical guide to utilizing science fiction as a tool for futures studies. Science Fiction Prototyping combines the past with the present, including interviews with notable science fiction authors to provide the tools needed to “design the future with science fiction.”

Science Fiction Prototyping has five parts:

1.     Pick your science concept and build an imaginative world

2.     The scientific inflection point

3.     The consequences, for better, or worse, or both, of the science or technology on the people and your world

4.     The human inflection point

5.     Reflection, what did we learn?

“A full Science Fiction Prototyping (SFP) is 6-12 pages long, with a popular structure being: an introduction, background work, the fictional story (the bulk of the SFP), a short summary and a summary (reflection). Most often science fiction prototypes extrapolate current science forward and, therefore, include a set of references at the end.”

Ian Miles reviews The New Encyclopedia of Science Fiction, identifying ways science fiction and futures studies "cross-fertilize, as well as the ways in which they differ distinctly." Science fiction cannot simply be considered fictionalized futures studies. It may have aims other than foresight or "prediction, and be no more concerned with shaping the future than any other genre of literature." It is not to be understood as an explicit pillar of futures studies, owing to its inconsistent use of integrated futures research. Additionally, Dennis Livingston, a literature and Futures journal critic, says, "The depiction of truly alternative societies has not been one of science fiction's strong points, especially" preferred, normative visions. The strengths of the genre as a form of futurist thinking are discussed by Tom Lombardo, who argues that select science fiction "combines a highly detailed and concrete level of realism with theoretical speculation on the future", "addresses all the main dimensions of the future and synthesizes all these dimensions into integrative visions of the future", and "reflects contemporary and futurist thinking"; therefore it "can be viewed as the mythology of the future."

It is notable that although there are no hard limits on horizons in futures studies and foresight efforts, typical horizons explored are within the realm of the practical and do not span more than a few decades. Nevertheless, there are hard science fiction works that can serve as visioning exercises spanning longer periods when the topic is of a significant time scale, as in the case of Kim Stanley Robinson's Mars Trilogy, which deals with the terraforming of Mars and extends roughly two centuries forward, into the early 23rd century. In fact, there is some overlap between science fiction writers and professional futurists, as in the case of David Brin. Arguably, the work of science fiction authors has seeded many ideas, technological or social in nature, that have later been developed, from the early works of Jules Verne and H. G. Wells to the later Arthur C. Clarke and William Gibson. Beyond literary works, futures studies and futurists have influenced film and TV works. The 2002 movie adaptation of Philip K. Dick's short story Minority Report employed a group of consultants, including futurist Peter Schwartz, to build a realistic vision of the future. TV shows such as HBO's Westworld and Channel 4/Netflix's Black Mirror follow many of the rules of futures studies in building their worlds, scenery and storytelling, much as futurists would in experiential scenarios and works.

Science fiction novels for futurists:

  • William Gibson, Neuromancer, Ace Books, 1984. (Pioneering cyberpunk novel)
  • Kim Stanley Robinson, Red Mars, Spectra, 1993. (Story about the founding of a colony on Mars)
  • Bruce Sterling, Heavy Weather, Bantam, 1994. (Story about a world with drastically altered climate and weather)
  • Iain Banks’ Culture novels (Space operas set in the distant future, with thoughtful treatments of advanced AI)

Government agencies

Several governments have formalized strategic foresight agencies to encourage long-range strategic societal planning; the most notable are the governments of Singapore, Finland, and the United Arab Emirates. Other governments with strategic foresight agencies include Canada, with Policy Horizons Canada, and Malaysia, with the Malaysian Foresight Institute.

The Singapore government's Centre for Strategic Futures (CSF) is part of the Strategy Group within the Prime Minister's Office. Its mission is to position the Singapore government to navigate emerging strategic challenges and harness potential opportunities. Singapore's early formal efforts in strategic foresight began in 1991 with the establishment of the Risk Detection and Scenario Planning Office in the Ministry of Defence. In addition to the CSF, the Singapore government has established the Strategic Futures Network, which brings together deputy secretary-level officers and foresight units across the government to discuss emerging trends that may have implications for Singapore.

Since the 1990s, Finland has integrated strategic foresight within the parliament and Prime Minister's Office. The government is required to present a "Report of the Future" each parliamentary term for review by the parliamentary Committee for the Future. Led by the Prime Minister's Office, the Government Foresight Group coordinates the government's foresight efforts. Futures research is supported by the Finnish Society for Futures Studies (established in 1980), the Finland Futures Research Centre (established in 1992), and the Finland Futures Academy (established in 1998) in coordination with foresight units in various government agencies.

In the United Arab Emirates, Sheikh Mohammed bin Rashid, Vice President and Ruler of Dubai, announced in September 2016 that all government ministries were to appoint Directors of Future Planning. Sheikh Mohammed described the UAE Strategy for the Future as an "integrated strategy to forecast our nation's future, aiming to anticipate challenges and seize opportunities". The Ministry of Cabinet Affairs and the Future (MOCAF) is mandated with crafting the UAE Strategy for the Future and is responsible for the portfolio of the future of the UAE.

In 2018, the United States Government Accountability Office (GAO) created the Center for Strategic Foresight to enhance its ability to "serve as the agency's principal hub for identifying, monitoring, and analyzing emerging issues facing policymakers." The Center is composed of non-resident fellows who are considered leading experts in foresight, planning and futures thinking. In September 2019 it hosted a conference on space policy and on "deep fake" synthetic media used to manipulate online and real-world interactions.

Risk analysis and management

Foresight is a framework or lens that can be used in risk analysis and management over a medium- to long-term time range. A typical formal foresight project identifies key drivers and uncertainties relevant to the scope of analysis. It then analyzes how the drivers and uncertainties could interact to create the most probable scenarios of interest, and what risks those scenarios might contain. An additional step is identifying actions to avoid or minimize these risks.
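
As a minimal sketch of the combinatorial step, the following example enumerates candidate scenario skeletons from two critical uncertainties. The drivers, outcomes, and two-axis framing here are hypothetical illustrations of common scenario-planning practice, not a procedure prescribed by the source:

    from itertools import product

    # Hypothetical critical uncertainties for an illustrative energy-sector scan,
    # each with two plausible outcomes (the classic "two-axis" framing).
    uncertainties = {
        "carbon regulation": ["strict", "lax"],
        "battery cost": ["falls sharply", "plateaus"],
    }

    # Each combination of outcomes is a candidate scenario skeleton to be
    # named, fleshed out, and assessed for risks and early indicators.
    for i, outcomes in enumerate(product(*uncertainties.values()), start=1):
        description = ", ".join(f"{axis}: {value}"
                                for axis, value in zip(uncertainties, outcomes))
        print(f"Scenario {i}: {description}")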

One classic example of such work is how foresight work at the Royal Dutch Shell international oil company led the company to envision the turbulent oil prices of the 1970s as a possibility and to embed that possibility in its planning. Yet the practice at Shell focuses on stretching the company's thinking rather than on making predictions. Its planning is meant to link and embed scenarios in "organizational processes such as strategy making, innovation, risk management, public affairs, and leadership development."

Foresight studies can also consider the possibility of "wild card" events, events that many consider would be impossible to envision, although such events can often be imagined as remote possibilities as part of foresight work. One of many possible areas of focus for a foresight lens is identifying the conditions for potential scenarios of high-level risks to society.

These risks may arise from the development and adoption of emerging technologies and/or from social change. Special interest lies in hypothetical future events that have the potential to damage human well-being on a global scale: global catastrophic risks. Such events may cripple or destroy modern civilization or, in the case of existential risks, even cause human extinction. Potential global catastrophic risks include, but are not limited to, climate change, hostile artificial intelligence, nanotechnology weapons, nuclear warfare, total war, and pandemics. The aim of a professional futurist would be to identify conditions that could lead to these events in order to create "pragmatically feasible roads to alternative futures."

Academic programs and research centers

Futurists

Futurists are practitioners of the foresight profession, which seeks to provide organizations and individuals with images of the future to help them prepare for contingencies and maximize opportunities. A foresight project begins with a question that ponders the future of a given subject area, such as technology, medicine, government or business. Futurists engage in environmental scanning to search for drivers of change and emerging trends that may affect the focus topic. The scanning process includes reviewing social media platforms, researching already-prepared reports, engaging in Delphi studies, reading articles and other sources of relevant information, and preparing and analyzing data extrapolations. Then, through one of a number of highly structured methods, futurists organize this information and use it to create multiple future scenarios for the topic, also known as a domain. The value of preparing many different versions of the future, rather than a single prediction, is that they give a client the ability to prepare long-range plans that will weather, and make the most of, a variety of contexts.

Books

APF's list of most significant futures works

The Association for Professional Futurists recognizes the Most Significant Futures Works for the purpose of identifying and rewarding the work of foresight professionals and others whose work illuminates aspects of the future.

Author, title, and year of recognition:

  • Bertrand de Jouvenel, L’Art de la conjecture (The Art of Conjecture), 2008
  • Donella Meadows, The Limits to Growth, 2008
  • Peter Schwartz, The Art of the Long View, 2008
  • Ray Kurzweil, The Age of Spiritual Machines: When Computers Exceed Human Intelligence, 2008
  • Jerome C. Glenn & Theodore J. Gordon, Futures Research Methodology Version 2.0, 2008
  • Jerome C. Glenn & Theodore J. Gordon, The State of the Future, 2008
  • Jared Diamond, Collapse: How Societies Choose to Fail or Succeed, 2008
  • Richard Slaughter, The Biggest Wake up Call in History, 2012
  • Richard Slaughter, The Knowledge Base of Futures Studies, 2008
  • Worldwatch Institute, State of the World (book series), 2008
  • Nassim Nicholas Taleb, The Black Swan: The Impact of the Highly Improbable, 2012
  • Tim Jackson, Prosperity Without Growth, 2012
  • Jørgen Randers, 2052: A Global Forecast for the Next Forty Years, 2013
  • Stroom den Haag, Food for the City, 2013
  • Andy Hines & Peter C. Bishop, Teaching About the Future, 2014
  • James A. Dator, Advancing Futures – Futures Studies in Higher Education
  • Ziauddin Sardar, Future: All that Matters, 2014
  • Emma Marris, Rambunctious Garden: Saving Nature in a Post-Wild World, 2014
  • Sohail Inayatullah, What Works: Case Studies in the Practice of Foresight, 2016
  • Dougal Dixon, After Man: A Zoology of the Future

Other notable foresight books

For further suggestions, please visit A Resource Bibliography by Dr. Peter Bishop

Periodicals and journals

Organizations

Foresight professional networks

Public-sector foresight organizations

Non-governmental foresight organizations
