
Saturday, November 12, 2022

Forecasting

From Wikipedia, the free encyclopedia

Forecasting is the process of making predictions based on past and present data. Later these can be compared (resolved) against what happens. For example, a company might estimate their revenue in the next year, then compare it against the actual results. Prediction is a similar, but more general term. Forecasting might refer to specific formal statistical methods employing time series, cross-sectional or longitudinal data, or alternatively to less formal judgmental methods or the process of prediction and resolution itself. Usage can differ between areas of application: for example, in hydrology the terms "forecast" and "forecasting" are sometimes reserved for estimates of values at certain specific future times, while the term "prediction" is used for more general estimates, such as the number of times floods will occur over a long period.

Risk and uncertainty are central to forecasting and prediction; it is generally considered good practice to indicate the degree of uncertainty attaching to forecasts. In any case, the data must be up to date in order for the forecast to be as accurate as possible. In some cases the data used to predict the variable of interest is itself forecast.

Applications

Forecasting has applications in a wide range of fields where estimates of future conditions are useful. Depending on the field, accuracy varies significantly. If the factors that relate to what is being forecast are known and well understood and there is a significant amount of data that can be used, it is likely the final value will be close to the forecast. If this is not the case or if the actual outcome is affected by the forecasts, the reliability of the forecasts can be significantly lower.

Climate change and increasing energy prices have led to the use of Egain Forecasting for buildings. This attempts to reduce the energy needed to heat the building, thus reducing the emission of greenhouse gases. Forecasting is used in customer demand planning in everyday business for manufacturing and distribution companies.

While the veracity of predictions for actual stock returns is disputed by reference to the efficient-market hypothesis, forecasting of broad economic trends is common. Such analysis is provided by both non-profit groups and for-profit private institutions.

Forecasting foreign exchange movements is typically achieved through a combination of chart and fundamental analysis. An essential difference between chart analysis and fundamental economic analysis is that chartists study only the price action of a market, whereas fundamentalists attempt to look to the reasons behind the action. Financial institutions assimilate the evidence provided by their fundamental and chartist researchers into one note to provide a final projection on the currency in question.

Forecasting has also been used to predict the development of conflict situations. Forecasters perform research that uses empirical results to gauge the effectiveness of certain forecasting models. However, research has shown that there is little difference between the accuracy of forecasts made by experts knowledgeable about the conflict situation and those made by individuals who know much less.

Similarly, experts in some studies argue that role thinking does not contribute to the accuracy of the forecast. The discipline of demand planning, also sometimes referred to as supply chain forecasting, embraces both statistical forecasting and a consensus process. An important, albeit often ignored, aspect of forecasting is the relationship it holds with planning. Forecasting can be described as predicting what the future will look like, whereas planning predicts what the future should look like. There is no single right forecasting method to use. Selection of a method should be based on your objectives and your conditions (data etc.); a good way to choose a method is to consult a selection tree. Forecasting has applications in many situations.

Forecasting as training, Betting and Futarchy

In several cases, the forecast is either more or less than a prediction of the future.

In Superforecasting, Philip E. Tetlock discusses forecasting as a method of improving one's ability to make decisions. A person can become better calibrated, i.e. the things they give 10% credence to happen about 10% of the time. Or they can forecast more confidently, coming to the same conclusion but earlier. Some have claimed that forecasting is a transferable skill with benefits to other areas of discussion and decision making.

Betting on sports or politics is another form of forecasting. Rather than being used as advice, bettors are paid based on whether they predicted correctly. While decisions might be made based on these bets (forecasts), the main motivation is generally financial.

Finally, futarchy is a form of government in which forecasts of the impact of government action are used to decide which actions are taken. Rather than serving as advice, in futarchy's strongest form the action with the best forecasted result is automatically taken.

Categories of forecasting methods

Qualitative vs. quantitative methods

Qualitative forecasting techniques are subjective, based on the opinion and judgment of consumers and experts; they are appropriate when past data are not available. They are usually applied to intermediate- or long-range decisions. Examples of qualitative forecasting methods are informed opinion and judgment, the Delphi method, market research, and historical life-cycle analogy.

Quantitative forecasting models are used to forecast future data as a function of past data. They are appropriate when past numerical data are available and it is reasonable to assume that some of the patterns in the data will continue into the future. These methods are usually applied to short- or intermediate-range decisions. Examples of quantitative forecasting methods are last-period demand, simple and weighted N-period moving averages, simple exponential smoothing, Poisson-process-based forecasting, and multiplicative seasonal indexes. Previous research shows that different methods may lead to different levels of forecasting accuracy. For example, the GMDH neural network was found to have better forecasting performance than classical forecasting algorithms such as single exponential smoothing, double exponential smoothing, ARIMA, and the back-propagation neural network.
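As a rough illustration of two of the methods named above, here is a minimal Python sketch of an N-period moving average and simple exponential smoothing; the function names and demand figures are made up for illustration, not taken from the article.

```python
# Illustrative Python sketch (made-up demand figures, assumed function names):
# an N-period simple moving average and simple exponential smoothing.

def moving_average_forecast(history, n):
    """Forecast the next period as the mean of the last n observations."""
    window = history[-n:]
    return sum(window) / len(window)

def simple_exponential_smoothing(history, alpha):
    """One-step-ahead forecast from simple exponential smoothing.

    alpha is the smoothing constant in (0, 1]; larger values weight recent data more.
    """
    level = history[0]                      # initialise the level at the first observation
    for y in history[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

demand = [102, 98, 110, 105, 99, 107]
print(moving_average_forecast(demand, n=3))            # mean of the last three periods
print(simple_exponential_smoothing(demand, alpha=0.3))
```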

Average approach

In this approach, the predictions of all future values are equal to the mean of the past data. This approach can be used with any sort of data where past data is available. In time series notation:

  ŷ(T+h | T) = (y_1 + y_2 + ... + y_T) / T

where y_1, ..., y_T are the past data.

Although the time series notation has been used here, the average approach can also be used for cross-sectional data (when we are predicting unobserved values; values that are not included in the data set). Then, the prediction for unobserved values is the average of the observed values.
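A minimal Python sketch of the average approach, with made-up data: every future value is forecast as the mean of all past observations.

```python
# Illustrative Python sketch of the average approach (made-up data): every future
# value is forecast as the mean of all past observations.

def average_forecast(history, horizon):
    mean = sum(history) / len(history)
    return [mean] * horizon                # the same value for every future period

past = [12.0, 15.0, 13.5, 14.0, 16.0]
print(average_forecast(past, horizon=3))   # [14.1, 14.1, 14.1]
```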

Naïve approach

Naïve forecasts are the most cost-effective forecasting model, and provide a benchmark against which more sophisticated models can be compared. This forecasting method is only suitable for time series data. Using the naïve approach, forecasts are produced that are equal to the last observed value. This method works quite well for economic and financial time series, which often have patterns that are difficult to reliably and accurately predict. If the time series is believed to have seasonality, the seasonal naïve approach may be more appropriate, where the forecasts are equal to the value from last season. In time series notation:

  ŷ(T+h | T) = y_T
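A minimal Python sketch of the naïve approach, with made-up data: every forecast simply repeats the last observed value.

```python
# Illustrative Python sketch of the naïve approach (made-up data): every forecast
# repeats the last observed value.

def naive_forecast(history, horizon):
    return [history[-1]] * horizon

print(naive_forecast([120, 124, 130, 128], horizon=2))   # [128, 128]
```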

Drift method

A variation on the naïve method is to allow the forecasts to increase or decrease over time, where the amount of change over time (called the drift) is set to be the average change seen in the historical data. So the forecast for time T + h is given by

  ŷ(T+h | T) = y_T + h (y_T − y_1) / (T − 1)

This is equivalent to drawing a line between the first and last observation, and extrapolating it into the future.
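A minimal Python sketch of the drift method with made-up data; the per-period drift is the average historical change, equivalent to the line through the first and last observations.

```python
# Illustrative Python sketch of the drift method (made-up data): the forecast changes
# by the average historical change each period, which is equivalent to extrapolating
# the straight line through the first and last observations.

def drift_forecast(history, horizon):
    drift = (history[-1] - history[0]) / (len(history) - 1)   # average change per period
    return [history[-1] + h * drift for h in range(1, horizon + 1)]

print(drift_forecast([100, 104, 103, 112], horizon=3))   # [116.0, 120.0, 124.0]
```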

Seasonal naïve approach

The seasonal naïve method accounts for seasonality by setting each prediction to be equal to the last observed value of the same season. For example, the prediction value for all subsequent months of April will be equal to the previous value observed for April. The forecast for time T + h is

  ŷ(T+h | T) = y_{T+h−m(k+1)}

where m is the seasonal period and k is the integer part of (h − 1)/m, i.e. the number of complete seasonal periods in the forecast horizon prior to time T + h.

The seasonal naïve method is particularly useful for data that has a very high level of seasonality.
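A minimal Python sketch of the seasonal naïve method, assuming made-up quarterly data with seasonal period m = 4.

```python
# Illustrative Python sketch of the seasonal naïve method (made-up quarterly data,
# seasonal period m = 4): each forecast repeats the value observed one season earlier.

def seasonal_naive_forecast(history, m, horizon):
    return [history[-m + (h - 1) % m] for h in range(1, horizon + 1)]

quarterly = [20, 35, 50, 25, 22, 38, 55, 27]
print(seasonal_naive_forecast(quarterly, m=4, horizon=4))   # [22, 38, 55, 27]
```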

Time series methods

Time series methods use historical data as the basis of estimating future outcomes. They are based on the assumption that past demand history is a good indicator of future demand.

Examples include Box–Jenkins models, seasonal ARIMA (SARIMA), and ARIMARCH models.

Relational methods

Some forecasting methods try to identify the underlying factors that might influence the variable that is being forecast. For example, including information about climate patterns might improve the ability of a model to predict umbrella sales. Forecasting models often take account of regular seasonal variations. In addition to climate, such variations can also be due to holidays and customs: for example, one might predict that sales of college football apparel will be higher during the football season than during the off season.

Several informal methods used in causal forecasting do not rely solely on the output of mathematical algorithms, but instead use the judgment of the forecaster. Some forecasts take account of past relationships between variables: if one variable has, for example, been approximately linearly related to another for a long period of time, it may be appropriate to extrapolate such a relationship into the future, without necessarily understanding the reasons for the relationship.

Causal methods include, among others, regression analysis and econometric models.

Quantitative forecasting models are often judged against each other by comparing their in-sample or out-of-sample mean square error, although some researchers have advised against this. Different forecasting approaches have different levels of accuracy. For example, it was found in one context that GMDH has higher forecasting accuracy than traditional ARIMA.

Judgmental methods

Judgmental forecasting methods incorporate intuitive judgement, opinions and subjective probability estimates. Judgmental forecasting is used in cases where there is a lack of historical data or during completely new and unique market conditions.

Judgmental methods include, among others, the Delphi method, scenario building, forecast by analogy, and statistical surveys.

Artificial intelligence methods

Often these are done today by specialized programs loosely labeled as data mining, machine learning, or artificial neural networks.

Geometric Extrapolation with error prediction

This can be created with three points of a sequence and the "moment" or "index"; this type of extrapolation is claimed to achieve 100% accuracy in predictions for a large percentage of the known series in the OEIS database.

Other methods

Forecasting accuracy

The forecast error (also known as a residual) is the difference between the actual value and the forecast value for the corresponding period:

  E_t = Y_t − F_t

where E_t is the forecast error at period t, Y_t is the actual value at period t, and F_t is the forecast for period t.

A good forecasting method will yield residuals that are uncorrelated. If there are correlations between residual values, then there is information left in the residuals which should be used in computing forecasts. This can be accomplished by computing the expected value of a residual as a function of the known past residuals, and adjusting the forecast by the amount by which this expected value differs from zero.

A good forecasting method will also have zero mean. If the residuals have a mean other than zero, then the forecasts are biased and can be improved by adjusting the forecasting technique by an additive constant that equals the mean of the unadjusted residuals.
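The sketch below (Python, made-up actuals and forecasts) illustrates both residual checks: the mean residual as a bias estimate and a lag-1 autocorrelation as a simple test for leftover information; the variable names and the adjusted "next forecast" are illustrative assumptions.

```python
# Illustrative Python sketch (made-up actuals and forecasts) of the two residual checks
# described above: the residuals should have (roughly) zero mean and no autocorrelation.
# If the mean is non-zero, adding it back to future forecasts removes the bias.

actuals   = [100, 102, 101, 105, 107, 106]
forecasts = [ 98, 101, 100, 102, 105, 104]
residuals = [y - f for y, f in zip(actuals, forecasts)]

mean_residual = sum(residuals) / len(residuals)

def lag1_autocorrelation(e):
    """Lag-1 autocorrelation of the residuals: a simple check for leftover information."""
    m = sum(e) / len(e)
    num = sum((e[t] - m) * (e[t - 1] - m) for t in range(1, len(e)))
    den = sum((x - m) ** 2 for x in e)
    return num / den

print("mean residual:", mean_residual)                    # far from zero => biased forecasts
print("lag-1 autocorrelation:", lag1_autocorrelation(residuals))

next_forecast = 108                                       # hypothetical next forecast
print("bias-adjusted forecast:", next_forecast + mean_residual)
```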

Measures of aggregate error:

Scale-dependent errors

The forecast error, E, is on the same scale as the data; as such, these accuracy measures are scale-dependent and cannot be used to make comparisons between series on different scales.

Mean absolute error (MAE) or mean absolute deviation (MAD): MAE = mean(|E_t|)

Mean squared error (MSE) or mean squared prediction error (MSPE): MSE = mean(E_t²)

Root mean squared error (RMSE): RMSE = √(mean(E_t²))

Average of Errors (E): mean(E_t)
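A minimal Python sketch of these scale-dependent measures, assuming the error definition E_t = Y_t − F_t given earlier and made-up data.

```python
# Illustrative Python sketch of the scale-dependent measures above, assuming the error
# definition E_t = Y_t - F_t from earlier and made-up data.

from math import sqrt

def errors(actuals, forecasts):
    return [y - f for y, f in zip(actuals, forecasts)]

def mae(e):  return sum(abs(x) for x in e) / len(e)   # mean absolute error / deviation
def mse(e):  return sum(x * x for x in e) / len(e)    # mean squared error
def rmse(e): return sqrt(mse(e))                      # root mean squared error
def me(e):   return sum(e) / len(e)                   # average of the errors

e = errors([100, 102, 101, 105], [98, 103, 100, 101])
print(mae(e), mse(e), rmse(e), me(e))
```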

Percentage errors

These are more frequently used to compare forecast performance between different data sets because they are scale-independent. However, they have the disadvantage of being extremely large or undefined if Y is close to or equal to zero.

Mean absolute percentage error (MAPE): MAPE = 100% × mean(|E_t / Y_t|)

Mean absolute percentage deviation (MAPD): MAPD = Σ|E_t| / Σ|Y_t|
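A minimal Python sketch of the percentage-error measures with made-up data; both break down when an actual value is zero. The MAPD variant shown (total absolute error over total actual volume) is one common definition and is stated here as an assumption.

```python
# Illustrative Python sketch of the percentage-error measures (made-up data). Both break
# down if an actual value Y_t is zero. The MAPD shown here (total absolute error divided
# by total actual volume) is one common definition, stated as an assumption.

def mape(actuals, forecasts):
    """Mean absolute percentage error, in percent."""
    terms = [abs((y - f) / y) for y, f in zip(actuals, forecasts)]
    return 100 * sum(terms) / len(terms)

def mapd(actuals, forecasts):
    """Mean absolute percentage deviation: total absolute error over total actuals."""
    total_error = sum(abs(y - f) for y, f in zip(actuals, forecasts))
    return 100 * total_error / sum(abs(y) for y in actuals)

print(mape([100, 102, 101, 105], [98, 103, 100, 101]))
print(mapd([100, 102, 101, 105], [98, 103, 100, 101]))
```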

Scaled errors

Hyndman and Koehler (2006) proposed using scaled errors as an alternative to percentage errors.

Mean absolute scaled error (MASE): MASE = mean(|E_t| / Q), where Q = (1 / (T − m)) Σ_{t=m+1..T} |Y_t − Y_{t−m}| is the in-sample mean absolute error of the (seasonal) naïve forecast, and m = seasonal period or 1 if non-seasonal.
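A minimal Python sketch of MASE with made-up data, scaling the forecast errors by the in-sample mean absolute error of the (seasonal) naïve method, with m = 1 for non-seasonal series.

```python
# Illustrative Python sketch of MASE, following Hyndman and Koehler (2006): forecast
# errors are scaled by the in-sample mean absolute error of the (seasonal) naïve
# method, with m = 1 for non-seasonal data. Data is made up.

def mase(train, actuals, forecasts, m=1):
    # Scale: in-sample MAE of the m-step naïve forecast on the training data.
    scale = sum(abs(train[t] - train[t - m]) for t in range(m, len(train))) / (len(train) - m)
    scaled_errors = [abs(y - f) / scale for y, f in zip(actuals, forecasts)]
    return sum(scaled_errors) / len(scaled_errors)

train = [100, 104, 103, 108, 110, 109]
print(mase(train, actuals=[112, 115], forecasts=[110, 111], m=1))
```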

Other measures

Forecast skill (SS): SS = 1 − MSE_forecast / MSE_reference, the relative improvement of a forecast over a reference forecast such as the mean or naïve forecast.

Business forecasters and practitioners sometimes use different terminology. They refer to the PMAD as the MAPE, although they compute this as a volume weighted MAPE. For more information see Calculating demand forecast accuracy.

When comparing the accuracy of different forecasting methods on a specific data set, the measures of aggregate error are compared with each other and the method that yields the lowest error is preferred.

Training and test sets

When evaluating the quality of forecasts, it is invalid to look at how well a model fits the historical data; the accuracy of forecasts can only be determined by considering how well a model performs on new data that were not used when fitting the model. When choosing models, it is common to use a portion of the available data for fitting and to use the rest of the data for testing the model.
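A minimal Python sketch of a chronological training/test split with made-up data, evaluating the naïve method on the held-out portion; the 80/20 split is an arbitrary illustrative choice.

```python
# Illustrative Python sketch of a chronological training/test split (made-up data,
# arbitrary 80/20 split): fit on the earlier portion, evaluate on the later portion.

series = [101, 103, 102, 106, 108, 107, 111, 113, 112, 116]
split = int(len(series) * 0.8)
train, test = series[:split], series[split:]       # never shuffle time series data

# Evaluate the naïve method on the held-out data, one step at a time.
forecasts = [series[t - 1] for t in range(split, len(series))]
mae = sum(abs(y - f) for y, f in zip(test, forecasts)) / len(test)
print(mae)
```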

Cross-validation

Cross-validation is a more sophisticated version of using a training and test set.

For cross-sectional data, one approach to cross-validation works as follows:

  1. Select observation i for the test set, and use the remaining observations in the training set. Compute the error on the test observation.
  2. Repeat the above step for i = 1,2,..., N where N is the total number of observations.
  3. Compute the forecast accuracy measures based on the errors obtained.

This makes efficient use of the available data, as only one observation is omitted at each step.
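A minimal Python sketch of this leave-one-out procedure for cross-sectional data, using the mean of the training observations as a deliberately simple stand-in model; the data are made up.

```python
# Illustrative Python sketch of the leave-one-out procedure above for cross-sectional
# data, using the mean of the training observations as a deliberately simple model.

observations = [12.0, 15.0, 13.5, 14.0, 16.0, 11.5]    # made-up cross-sectional data

errors = []
for i in range(len(observations)):                      # step 2: repeat for every i
    test = observations[i]                              # step 1: observation i is the test set
    training = observations[:i] + observations[i + 1:]  # remaining observations train the model
    prediction = sum(training) / len(training)          # "fit" the mean model and predict
    errors.append(test - prediction)

mae = sum(abs(e) for e in errors) / len(errors)         # step 3: aggregate accuracy measure
print(mae)
```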

For time series data, the training set can only include observations prior to the test set. Therefore, no future observations can be used in constructing the forecast. Suppose k observations are needed to produce a reliable forecast; then the process works as follows:

  1. Starting with i=1, select the observation k + i for the test set, and use the observations at times 1, 2, ..., k+i–1 to estimate the forecasting model. Compute the error on the forecast for k+i.
  2. Repeat the above step for i = 2,...,T–k where T is the total number of observations.
  3. Compute the forecast accuracy over all errors.

This procedure is sometimes known as a "rolling forecasting origin" because the "origin" (k + i − 1) on which the forecast is based rolls forward in time. Further, two-step-ahead or, in general, p-step-ahead forecasts can be computed by first forecasting the value immediately after the training set, then using this value with the training set values to forecast two periods ahead, and so on.
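A minimal Python sketch of a rolling forecasting origin with made-up data, using the naïve method as the forecasting model and k = 4 initial observations.

```python
# Illustrative Python sketch of a rolling forecasting origin (made-up data), using the
# naïve method as the forecasting model and k = 4 initial observations.

series = [101, 103, 102, 106, 108, 107, 111, 113, 112, 116]
k = 4                                          # observations needed for a reliable forecast

errors = []
for origin in range(k, len(series)):           # the origin rolls forward one step at a time
    training = series[:origin]                 # only observations prior to the test point
    forecast = training[-1]                    # naïve one-step-ahead forecast
    errors.append(series[origin] - forecast)   # forecast error at k + i

mae = sum(abs(e) for e in errors) / len(errors)
print(mae)
```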


Seasonality and cyclic behaviour

Seasonality

Seasonality is a characteristic of a time series in which the data experiences regular and predictable changes which recur every calendar year. Any predictable change or pattern in a time series that recurs or repeats over a one-year period can be said to be seasonal. It is common in many situations – such as in a grocery store or even in a medical examiner's office – that the demand depends on the day of the week. In such situations, the forecasting procedure calculates the seasonal index of the "season" – seven seasons, one for each day – which is the ratio of the average demand of that season (calculated by a moving average or exponential smoothing using historical data corresponding only to that season) to the average demand across all seasons. An index higher than 1 indicates that demand is higher than average; an index less than 1 indicates that demand is less than the average.
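A minimal Python sketch of such a day-of-week seasonal index, using made-up demand figures and a plain average per day in place of the moving average or exponential smoothing mentioned above.

```python
# Illustrative Python sketch of a day-of-week seasonal index (made-up demand figures):
# the ratio of each day's average demand to the overall average demand. A plain average
# per day stands in for the moving average or exponential smoothing mentioned above.

weeks = [
    [80, 95, 90, 100, 120, 150, 60],   # Mon..Sun, week 1
    [85, 90, 95, 105, 125, 155, 65],   # week 2
    [82, 93, 92, 102, 118, 148, 62],   # week 3
]

day_averages = [sum(week[d] for week in weeks) / len(weeks) for d in range(7)]
overall_average = sum(day_averages) / len(day_averages)
seasonal_indexes = [avg / overall_average for avg in day_averages]

for day, index in zip(["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"], seasonal_indexes):
    print(day, round(index, 2))        # > 1: demand above average; < 1: below average
```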

Cyclic behaviour

The cyclic behaviour of data takes place when there are regular fluctuations in the data which usually last for an interval of at least two years, and when the length of the current cycle cannot be predetermined. Cyclic behavior is not to be confused with seasonal behavior. Seasonal fluctuations follow a consistent pattern each year so the period is always known. As an example, during the Christmas period, inventories of stores tend to increase in order to prepare for Christmas shoppers. As an example of cyclic behaviour, the population of a particular natural ecosystem will exhibit cyclic behaviour when the population decreases as its natural food source decreases, and once the population is low, the food source will recover and the population will start to increase again. Cyclic data cannot be accounted for using ordinary seasonal adjustment since it is not of fixed period.

Limitations

Limitations pose barriers beyond which forecasting methods cannot reliably predict. There are many events and values that cannot be forecast reliably. Events such as the roll of a die or the results of the lottery cannot be forecast because they are random events and there is no significant relationship in the data. When the factors that lead to what is being forecast are not known or well understood, such as in stock and foreign exchange markets, forecasts are often inaccurate or wrong, as there is not enough data about everything that affects these markets for the forecasts to be reliable; in addition, the outcomes of the forecasts of these markets change the behavior of those involved in the market, further reducing forecast accuracy.

The concept of "self-destructing predictions" concerns the way in which some predictions can undermine themselves by influencing social behavior. This is because "predictors are part of the social context about which they are trying to make a prediction and may influence that context in the process". For example, a forecast that a large percentage of a population will become HIV infected based on existing trends may cause more people to avoid risky behavior and thus reduce the HIV infection rate, invalidating the forecast (which might have remained correct if it had not been publicly known). Or, a prediction that cybersecurity will become a major issue may cause organizations to implement more cybersecurity measures, thus limiting the issue.

Performance limits of fluid dynamics equations

As proposed by Edward Lorenz in 1963, long-range weather forecasts, those made at a range of two weeks or more, cannot definitively predict the state of the atmosphere, owing to the chaotic nature of the fluid dynamics equations involved. Extremely small errors in the initial input, such as temperatures and winds, roughly double every five days within numerical models.

Emerging technologies

From Wikipedia, the free encyclopedia
  
Emerging technologies are technologies whose development, practical applications, or both are still largely unrealized. These technologies are generally new but also include older technologies finding new applications. Emerging technologies are often perceived as capable of changing the status quo.

Emerging technologies are characterized by radical novelty (in application even if not in origins), relatively fast growth, coherence, prominent impact, and uncertainty and ambiguity. In other words, an emerging technology can be defined as "a radically novel and relatively fast growing technology characterised by a certain degree of coherence persisting over time and with the potential to exert a considerable impact on the socio-economic domain(s) which is observed in terms of the composition of actors, institutions and patterns of interactions among those, along with the associated knowledge production processes. Its most prominent impact, however, lies in the future and so in the emergence phase is still somewhat uncertain and ambiguous."

Emerging technologies include a variety of technologies such as educational technology, information technology, nanotechnology, biotechnology, robotics, and artificial intelligence.

New technological fields may result from the technological convergence of different systems evolving towards similar goals. Convergence brings previously separate technologies such as voice (and telephony features), data (and productivity applications) and video together so that they share resources and interact with each other, creating new efficiencies.

Emerging technologies are those technical innovations which represent progressive developments within a field for competitive advantage; converging technologies represent previously distinct fields which are in some way moving towards stronger inter-connection and similar goals. However, the opinion on the degree of the impact, status and economic viability of several emerging and converging technologies varies.

History of emerging technologies

In the history of technology, emerging technologies are contemporary advances and innovation in various fields of technology.

Over centuries innovative methods and new technologies are developed and opened up. Some of these technologies are due to theoretical research, and others from commercial research and development.

Technological growth includes incremental developments and disruptive technologies. An example of the former was the gradual roll-out of DVD (digital video disc) as a development intended to follow on from the previous optical technology compact disc. By contrast, disruptive technologies are those where a new method replaces the previous technology and makes it redundant, for example, the replacement of horse-drawn carriages by automobiles and other vehicles.

Emerging technology debates

Many writers, including computer scientist Bill Joy, have identified clusters of technologies that they consider critical to humanity's future. Joy warns that the technology could be used by elites for good or evil. They could use it as "good shepherds" for the rest of humanity or decide everyone else is superfluous and push for mass extinction of those made unnecessary by technology.

Advocates of the benefits of technological change typically see emerging and converging technologies as offering hope for the betterment of the human condition. Cyberphilosophers Alexander Bard and Jan Söderqvist argue in The Futurica Trilogy that while Man himself is basically constant throughout human history (genes change very slowly), all relevant change is rather a direct or indirect result of technological innovation (memes change very fast) since new ideas always emanate from technology use and not the other way around. Man should consequently be regarded as history's main constant and technology as its main variable. However, critics of the risks of technological change, and even some advocates such as transhumanist philosopher Nick Bostrom, warn that some of these technologies could pose dangers, perhaps even contribute to the extinction of humanity itself; i.e., some of them could involve existential risks.

Much ethical debate centers on issues of distributive justice in allocating access to beneficial forms of technology. Some thinkers, including environmental ethicist Bill McKibben, oppose the continuing development of advanced technology partly out of fear that its benefits will be distributed unequally in ways that could worsen the plight of the poor. By contrast, inventor Ray Kurzweil is among techno-utopians who believe that emerging and converging technologies could and will eliminate poverty and abolish suffering.

Some analysts such as Martin Ford, author of The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future, argue that as information technology advances, robots and other forms of automation will ultimately result in significant unemployment as machines and software begin to match and exceed the capability of workers to perform most routine jobs.

As robotics and artificial intelligence develop further, even many skilled jobs may be threatened. Technologies such as machine learning may ultimately allow computers to do many knowledge-based jobs that require significant education. This may result in substantial unemployment at all skill levels, stagnant or falling wages for most workers, and increased concentration of income and wealth as the owners of capital capture an ever-larger fraction of the economy. This in turn could lead to depressed consumer spending and economic growth as the bulk of the population lacks sufficient discretionary income to purchase the products and services produced by the economy.

Emerging technologies


Examples of emerging technologies


Artificial intelligence

Artificial intelligence (AI) is the intelligence exhibited by machines or software, and the branch of computer science that develops machines and software with animal-like intelligence. Major AI researchers and textbooks define the field as "the study and design of intelligent agents," where an intelligent agent is a system that perceives its environment and takes actions that maximize its chances of success. John McCarthy, who coined the term in 1956, defines it as "the study of making intelligent machines".

The central functions (or goals) of AI research include reasoning, knowledge, planning, learning, natural language processing (communication), perception and the ability to move and manipulate objects. General intelligence (or "strong AI") is still among the field's long-term goals. Currently, popular approaches include deep learning, statistical methods, computational intelligence and traditional symbolic AI. There is an enormous number of tools used in AI, including versions of search and mathematical optimization, logic, methods based on probability and economics, and many others.


3D printing

3D printing, also known as additive manufacturing, has been posited by Jeremy Rifkin and others as part of the third industrial revolution.

Combined with Internet technology, 3D printing would allow for digital blueprints of virtually any material product to be sent instantly to another person to be produced on the spot, making purchasing a product online almost instantaneous.

Although this technology is still too crude to produce most products, it is rapidly developing, and it created a controversy in 2013 around the issue of 3D-printed firearms.

Gene therapy

Gene therapy was first successfully demonstrated in late 1990/early 1991 for adenosine deaminase deficiency, though the treatment was somatic – that is, did not affect the patient's germ line and thus was not heritable. This led the way to treatments for other genetic diseases and increased interest in germ line gene therapy – therapy affecting the gametes and descendants of patients.

Between September 1990 and January 2014, there were around 2,000 gene therapy trials conducted or approved.

Cancer vaccines

A cancer vaccine is a vaccine that treats existing cancer or prevents the development of cancer in certain high-risk individuals. Vaccines that treat existing cancer are known as therapeutic cancer vaccines. There are currently no vaccines able to prevent cancer in general.

On April 14, 2009, The Dendreon Corporation announced that their Phase III clinical trial of Provenge, a cancer vaccine designed to treat prostate cancer, had demonstrated an increase in survival. It received U.S. Food and Drug Administration (FDA) approval for use in the treatment of advanced prostate cancer patients on April 29, 2010. The approval of Provenge has stimulated interest in this type of therapy.

Cultured meat

Cultured meat, also called in vitro meat, clean meat, cruelty-free meat, shmeat, and test-tube meat, is an animal-flesh product that has never been part of a living animal, with the exception of the fetal calf serum taken from a slaughtered cow. In the 21st century, several research projects have worked on in vitro meat in the laboratory. The first in vitro beefburger, created by a Dutch team, was eaten at a demonstration for the press in London in August 2013. There remain difficulties to be overcome before in vitro meat becomes commercially available. Cultured meat is prohibitively expensive, but it is expected that the cost could be reduced to compete with that of conventionally obtained meat as technology improves. In vitro meat is also an ethical issue. Some argue that it is less objectionable than traditionally obtained meat because it does not involve killing and reduces the risk of animal cruelty, while others disagree with eating meat that has not developed naturally.

Nanotechnology

Nanotechnology (sometimes shortened to nanotech) is the manipulation of matter on an atomic, molecular, and supramolecular scale. The earliest widespread description of nanotechnology referred to the particular technological goal of precisely manipulating atoms and molecules for fabrication of macroscale products, also now referred to as molecular nanotechnology. A more generalized description of nanotechnology was subsequently established by the National Nanotechnology Initiative, which defines nanotechnology as the manipulation of matter with at least one dimension sized from 1 to 100 nanometers. This definition reflects the fact that quantum mechanical effects are important at this quantum-realm scale, and so the definition shifted from a particular technological goal to a research category inclusive of all types of research and technologies that deal with the special properties of matter that occur below the given size threshold.

Robotics

Robotics is the branch of technology that deals with the design, construction, operation, and application of robots, as well as computer systems for their control, sensory feedback, and information processing. These technologies deal with automated machines that can take the place of humans in dangerous environments or manufacturing processes, or resemble humans in appearance, behavior, and/or cognition. A good example of a robot that resembles humans is Sophia, a social humanoid robot developed by Hong Kong-based company Hanson Robotics which was activated on April 19, 2015. Many of today's robots are inspired by nature contributing to the field of bio-inspired robotics.


Stem-cell therapy

Stem cell therapy is an intervention strategy that introduces new adult stem cells into damaged tissue in order to treat disease or injury. Many medical researchers believe that stem cell treatments have the potential to change the face of human disease and alleviate suffering. The ability of stem cells to self-renew and give rise to subsequent generations with variable degrees of differentiation capacities offers significant potential for generation of tissues that can potentially replace diseased and damaged areas in the body, with minimal risk of rejection and side effects.

Chimeric antigen receptor (CAR)-modified T cells have emerged among other immunotherapies for cancer treatment, being implemented against B-cell malignancies. Despite the promising outcomes of this innovative technology, CAR-T cells are not exempt from limitations that have yet to be overcome in order to provide reliable and more efficient treatments against other types of cancer.

Distributed ledger technology

Distributed ledger or blockchain technology provides a transparent and immutable list of transactions. A wide range of uses has been proposed for where an open, decentralised database is required, ranging from supply chains to cryptocurrencies.

Smart contracts are self-executing transactions which occur when pre-defined conditions are met. The aim is to provide security that is superior to traditional contract law, and to reduce transaction costs and delays. The original idea was conceived by Nick Szabo in 1994, but remained unrealised until the development of blockchains.

Development of emerging technologies

As innovation drives economic growth, and large economic rewards come from new inventions, a great deal of resources (funding and effort) goes into the development of emerging technologies. Some of the sources of these resources are described below.

Research and development

Research and development is directed towards the advancement of technology in general, and therefore includes development of emerging technologies. See also List of countries by research and development spending.

Applied research is a form of systematic inquiry involving the practical application of science. It accesses and uses some part of the research communities' (the academia's) accumulated theories, knowledge, methods, and techniques, for a specific, often state-, business-, or client-driven purpose.

Science policy is the area of public policy which is concerned with the policies that affect the conduct of the science and research enterprise, including the funding of science, often in pursuance of other national policy goals such as technological innovation to promote commercial product development, weapons development, health care and environmental monitoring.

Patents

Top 30 AI patent applicants in 2016

Patents provide inventors with a limited period of time (a minimum of 20 years, with the duration depending on jurisdiction) of exclusive rights in the making, selling, use, leasing or otherwise of their novel technological inventions. Artificial intelligence, robotic inventions, new materials, or blockchain platforms may be patentable, the patent protecting the technological know-how used to create these inventions. In 2019, WIPO reported that AI was the most prolific emerging technology in terms of the number of patent applications and granted patents, while the Internet of things was estimated to be the largest in terms of market size. It was followed, again in market size, by big data technologies, robotics, AI, 3D printing and the fifth generation of mobile services (5G). Since AI emerged in the 1950s, 340,000 AI-related patent applications have been filed by innovators and 1.6 million scientific papers have been published by researchers, with the majority of all AI-related patent filings published since 2013. Companies represent 26 of the top 30 AI patent applicants, with universities or public research organizations accounting for the remaining four.

DARPA

The Defense Advanced Research Projects Agency (DARPA) is an agency of the U.S. Department of Defense responsible for the development of emerging technologies for use by the military.

DARPA was created in 1958 as the Advanced Research Projects Agency (ARPA) by President Dwight D. Eisenhower. Its purpose was to formulate and execute research and development projects to expand the frontiers of technology and science, with the aim to reach beyond immediate military requirements.

Projects funded by DARPA have provided significant technologies that influenced many non-military fields, such as the Internet and Global Positioning System technology.

Technology competitions and awards

There are awards that provide incentive to push the limits of technology (generally synonymous with emerging technologies). Note that while some of these awards reward achievement after-the-fact via analysis of the merits of technological breakthroughs, others provide incentive via competitions for awards offered for goals yet to be achieved.

The Orteig Prize was a $25,000 award offered in 1919 by French hotelier Raymond Orteig for the first nonstop flight between New York City and Paris. In 1927, underdog Charles Lindbergh won the prize in a modified single-engine Ryan aircraft called the Spirit of St. Louis. In total, nine teams spent $400,000 in pursuit of the Orteig Prize.

The XPRIZE series of awards, public competitions designed and managed by the non-profit organization called the X Prize Foundation, are intended to encourage technological development that could benefit mankind. The most high-profile XPRIZE to date was the $10,000,000 Ansari XPRIZE relating to spacecraft development, which was awarded in 2004 for the development of SpaceShipOne.

The Turing Award is an annual prize given by the Association for Computing Machinery (ACM) to "an individual selected for contributions of a technical nature made to the computing community." It is stipulated that the contributions should be of lasting and major technical importance to the computer field. The Turing Award is generally recognized as the highest distinction in computer science, and in 2014 grew to $1,000,000.

The Millennium Technology Prize is awarded once every two years by Technology Academy Finland, an independent fund established by Finnish industry and the Finnish state in partnership. The first recipient was Tim Berners-Lee, inventor of the World Wide Web.

In 2003, David Gobel seed-funded the Methuselah Mouse Prize (Mprize) to encourage the development of new life extension therapies in mice, which are genetically similar to humans. So far, three Mouse Prizes have been awarded: one for breaking longevity records to Dr. Andrzej Bartke of Southern Illinois University; one for late-onset rejuvenation strategies to Dr. Stephen Spindler of the University of California; and one to Dr. Z. Dave Sharp for his work with the pharmaceutical rapamycin.

Role of science fiction

Science fiction has often affected innovation and new technology - for example many rocketry pioneers were inspired by science fiction - and the documentary How William Shatner Changed the World gives a number of examples of imagined technologies being actualized.

In the media

The term bleeding edge has been used to refer to some new technologies, formed as an allusion to the similar terms "leading edge" and "cutting edge". It tends to imply even greater advancement, albeit at an increased risk because of the unreliability of the software or hardware. The first documented example of this term being used dates to early 1983, when an unnamed banking executive was quoted to have used it in reference to Storage Technology Corporation.

Deathbed conversion

From Wikipedia, the free encyclopedia
 
Russian Orthodox icon of The Good Thief in Paradise (Moscow School, c. 1560)

A deathbed conversion is the adoption of a particular religious faith shortly before dying. Making a conversion on one's deathbed may reflect an immediate change of belief, a desire to formalize longer-term beliefs, or a desire to complete a process of conversion already underway. Claims of the deathbed conversion of famous or influential figures have also been used in history as rhetorical devices.

Overview

The Baptism of Constantine, as imagined by students of Raphael

Conversions at the point of death have a long history. The first recorded deathbed conversion appears in the Gospel of Luke where the good thief, crucified beside Jesus, expresses belief in Christ. Jesus accepts his conversion, saying "Today you shall be with Me in Paradise".

Perhaps the most momentous conversion in Western history was that of Constantine I, Roman Emperor and later proclaimed a Christian Saint by the Eastern Orthodox Church. While his belief in Christianity occurred long before his death, it was only on his deathbed that he was baptised, in 337 by the Arian bishop Eusebius of Nicomedia. While traditional sources disagree as to why this happened so late, modern historiography concludes that Constantine chose religious tolerance as an instrument to bolster his reign. According to Bart Ehrman, all Christians contemporary to Constantine got baptized on their deathbed since they firmly believed that continuing to sin after baptism ensures their eternal damnation. Ehrman sees no conflict between Constantine's Paganism and him being a Christian.

Notable deathbed conversions

Buffalo Bill

Buffalo Bill was baptized Catholic one day before his death in 1917.

Charles II of England

Charles II of England, the penultimate Catholic monarch of England.

Charles II of England reigned in an Anglican nation at a time of strong religious conflict. Though his sympathies were at least somewhat with the Roman Catholic faith, he ruled as an Anglican, though he attempted to lessen the persecution and legal penalties affecting non-Anglicans in England, notably through the Royal Declaration of Indulgence. As he lay dying following a stroke, released of the political need, he was received into the Catholic Church.

Jean de La Fontaine

The most famous French fabulist published a revised edition of his greatest work, Contes, in 1692, the same year that he began to suffer a severe illness. Under such circumstances, Jean de La Fontaine turned to religion. A young priest, M. Poucet, tried to persuade him of the impropriety of the Contes, and it is said that the destruction of a new play of some merit was demanded and submitted to as a proof of repentance. La Fontaine received the Viaticum, and in the following years he continued to write poems and fables. He died in 1695.

Sir Allan Napier MacNab

Sir Allan Napier MacNab, Canadian political leader, died 8 August 1862 in Hamilton, Ontario. His deathbed conversion to Catholicism caused a furor in the press in the following days. The Toronto Globe and the Hamilton Spectator expressed strong doubts about the conversion, and the Anglican rector of Christ Church in Hamilton declared that MacNab died a Protestant. MacNab's Catholic baptism is recorded at St. Mary's Cathedral in Hamilton, performed by John, Bishop of Hamilton, on 7 August 1862. Lending credibility to this conversion, MacNab's second wife, who predeceased him, was Catholic, and their two daughters were raised as Catholics.

Charles Maurras

In the last days before his death, French author Charles Maurras readopted the Catholic faith of his childhood and received the last rites.

Oscar Wilde

Author and wit Oscar Wilde converted to Catholicism during his final illness. Robert Ross gave a clear and unambiguous account: 'When I went for the priest to come to his death-bed he was quite conscious and raised his hand in response to questions and satisfied the priest, Father Cuthbert Dunne of the Passionists. It was the morning before he died and for about three hours he understood what was going on (and knew I had come from the South in response to a telegram) that he was given the last sacrament.' The Passionist house in Avenue Hoche has a house journal which contains a record, written by Dunne, of his having received Wilde into full communion with the Church. While Wilde's conversion may have come as a surprise, he had long maintained an interest in the Catholic Church, having met with Pope Pius IX in 1877 and describing the Roman Catholic Church as "for saints and sinners alone – for respectable people, the Anglican Church will do". However, how much of a believer in all the tenets of Catholicism Wilde ever was is arguable, particularly given his reply to Ross's insistence on the truth of Catholicism: "No, Robbie, it isn't true." "My position is curious," Wilde epigrammatised, "I am not a Catholic: I am simply a violent Papist."

In his poem The Ballad of Reading Gaol, Wilde wrote:

Ah! Happy they whose hearts can break
And peace of pardon win!
How else may man make straight his plan
And cleanse his soul from Sin?
How else but through a broken heart
May Lord Christ enter in?

John Wayne

American actor and filmmaker John Wayne, according to his son Patrick and his grandson Matthew Muñoz, who was a priest in the California Diocese of Orange, converted to Roman Catholicism shortly before his death. Muñoz stated that Wayne expressed a degree of regret about not becoming a Catholic earlier in life, explaining "that was one of the sentiments he expressed before he passed on," blaming "a busy life."

Alleged deathbed conversions

Charles Darwin

After Charles Darwin died, rumours spread that he had converted to Christianity on his deathbed. His children denied this occurred.

One famous example is Charles Darwin's deathbed conversion in which it was claimed by Lady Hope that Darwin said: "How I wish I had not expressed my theory of evolution as I have done." He went on to say that he would like her to gather a congregation since he "would like to speak to them of Christ Jesus and His salvation, being in a state where he was eagerly savoring the heavenly anticipation of bliss." Lady Hope's story was printed in the Boston Watchman Examiner. The story spread, and the claims were republished as late as October 1955 in the Reformation Review and in the Monthly Record of the Free Church of Scotland in February 1957.

Lady Hope's story is not supported by Darwin's children. Darwin's son Francis Darwin accused her of lying, saying that "Lady Hope's account of my father's views on religion is quite untrue. I have publicly accused her of falsehood, but have not seen any reply." Darwin's daughter Henrietta Litchfield also called the story a fabrication, saying "I was present at his deathbed. Lady Hope was not present during his last illness, or any illness. I believe he never even saw her, but in any case she had no influence over him in any department of thought or belief. He never recanted any of his scientific views, either then or earlier. We think the story of his conversion was fabricated in the U.S.A. The whole story has no foundation whatever."

Doc Holliday

According to an obituary in the Glenwood Springs Ute Chief, Doc Holliday had been baptized in the Catholic Church shortly before he died. This was based on correspondence between Holliday and his cousin, Sister Mary Melanie Holliday (a Catholic nun), though no baptismal record has ever been found.

Edward VII

King Edward VII of the U.K. is alleged by some to have converted to Roman Catholicism on his deathbed, with other accounts alleging he converted secretly two months before his death.

Wallace Stevens

The poet Wallace Stevens is said to have been baptized a Catholic during his last days suffering from stomach cancer. This account is disputed, particularly by Stevens's daughter, Holly, and critic, Helen Vendler, who, in a letter to James Wm. Chichetto, thought Fr. Arthur Hanley was "forgetful" since "he was interviewed twenty years after Stevens' death."

Voltaire

The accounts of Voltaire's death have been numerous and varying, and it has not been possible to establish the details of what precisely occurred. His enemies related that he repented and accepted the last rites from a Catholic priest, or that he died in agony of body and soul, while his adherents told of his defiance to his last breath.

George Washington

After U.S. President George Washington died in 1799, rumors spread among his slaves that he was baptized Catholic on his deathbed. This story was orally passed down in African-American communities into the 20th century, as well as among early Maryland Jesuits. The Denver Register printed two pieces, in 1952 and 1957, discussing the possibility of this rumor, including the fact that an official inventory of Washington's personal belongings at the time of his death included "1 Likeness of Virgin Mary" (an item unlikely to have been held by a Protestant). However, no definitive evidence has ever been found of a conversion, nor did any testimony from those close to Washington, including the Catholic Archbishop John Carroll, ever mention this occurring.

Accelerating change

From Wikipedia, the free encyclopedia

In futures studies and the history of technology, accelerating change is the observed exponential nature of the rate of technological change in recent history, which may suggest faster and more profound change in the future and may or may not be accompanied by equally profound social and cultural change.

Early observations

In 1910, during the town planning conference of London, Daniel Burnham noted, "But it is not merely in the number of facts or sorts of knowledge that progress lies: it is still more in the geometric ratio of sophistication, in the geometric widening of the sphere of knowledge, which every year is taking in a larger percentage of people as time goes on." And later on, "It is the argument with which I began, that a mighty change having come about in fifty years, and our pace of development having immensely accelerated, our sons and grandsons are going to demand and get results that would stagger us."

In 1938, Buckminster Fuller introduced the word ephemeralization to describe the trends of "doing more with less" in chemistry, health and other areas of industrial development. In 1946, Fuller published a chart of the discoveries of the chemical elements over time to highlight the development of accelerating acceleration in human knowledge acquisition.

In 1958, Stanislaw Ulam wrote in reference to a conversation with John von Neumann:

One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.

Moravec's Mind Children

In a series of published articles from 1974 to 1979, and then in his 1988 book Mind Children, computer scientist and futurist Hans Moravec generalizes Moore's law to make predictions about the future of artificial life. Moore's law describes an exponential growth pattern in the complexity of integrated semiconductor circuits. Moravec extends this to include technologies from long before the integrated circuit to future forms of technology. Moravec outlines a timeline and a scenario in which robots will evolve into a new series of artificial species, starting around 2030–2040. In Robot: Mere Machine to Transcendent Mind, published in 1998, Moravec further considers the implications of evolving robot intelligence, generalizing Moore's law to technologies predating the integrated circuit, and also plotting the exponentially increasing computational power of the brains of animals in evolutionary history. Extrapolating these trends, he speculates about a coming "mind fire" of rapidly expanding superintelligence similar to the explosion of intelligence predicted by Vinge.

James Burke's Connections

In his TV series Connections (1978)—and sequels Connections² (1994) and Connections³ (1997)—James Burke explores an "Alternative View of Change" (the subtitle of the series) that rejects the conventional linear and teleological view of historical progress. Burke contends that one cannot consider the development of any particular piece of the modern world in isolation. Rather, the entire gestalt of the modern world is the result of a web of interconnected events, each one consisting of a person or group acting for reasons of their own motivations (e.g., profit, curiosity, religious) with no concept of the final, modern result to which the actions of either them or their contemporaries would lead. The interplay of the results of these isolated events is what drives history and innovation, and is also the main focus of the series and its sequels.

Burke also explores three corollaries to his initial thesis. The first is that, if history is driven by individuals who act only on what they know at the time, and not because of any idea as to where their actions will eventually lead, then predicting the future course of technological progress is merely conjecture. Therefore, if we are astonished by the connections Burke is able to weave among past events, then we will be equally surprised to what the events of today eventually will lead, especially events we were not even aware of at the time.

The second and third corollaries are explored most in the introductory and concluding episodes, and they represent the downside of an interconnected history. If history progresses because of the synergistic interaction of past events and innovations, then as history does progress, the number of these events and innovations increases. This increase in possible connections causes the process of innovation to not only continue, but to accelerate. Burke poses the question of what happens when this rate of innovation, or more importantly change itself, becomes too much for the average person to handle, and what this means for individual power, liberty, and privacy.

Gerald Hawkins' Mindsteps

In his book Mindsteps to the Cosmos (HarperCollins, August 1983), Gerald S. Hawkins elucidated his notion of mindsteps, dramatic and irreversible changes to paradigms or world views. He identified five distinct mindsteps in human history, and the technology that accompanied these "new world views": the invention of imagery, writing, mathematics, printing, the telescope, rocket, radio, TV, computer... "Each one takes the collective mind closer to reality, one stage further along in its understanding of the relation of humans to the cosmos." He noted: "The waiting period between the mindsteps is getting shorter. One can't help noticing the acceleration." Hawkins' empirical 'mindstep equation' quantified this, and gave dates for future mindsteps. The date of the next mindstep (5; the series begins at 0) is given as 2021, with two further, successively closer mindsteps in 2045 and 2051, until the limit of the series in 2053. His speculations ventured beyond the technological:

The mindsteps... appear to have certain things in common—a new and unfolding human perspective, related inventions in the area of memes and communications, and a long formulative waiting period before the next mindstep comes along. None of the mindsteps can be said to have been truly anticipated, and most were resisted at the early stages. In looking to the future we may equally be caught unawares. We may have to grapple with the presently inconceivable, with mind-stretching discoveries and concepts.

Mass use of inventions: Years until use by a quarter of US population

Vinge's exponentially accelerating change

The mathematician Vernor Vinge popularized his ideas about exponentially accelerating technological change in the science fiction novel Marooned in Realtime (1986), set in a world of rapidly accelerating progress leading to the emergence of more and more sophisticated technologies separated by shorter and shorter time intervals, until a point beyond human comprehension is reached. His subsequent Hugo award-winning novel A Fire Upon the Deep (1992) starts with an imaginative description of the evolution of a superintelligence passing through exponentially accelerating developmental stages ending in a transcendent, almost omnipotent power unfathomable by mere humans. His already mentioned influential 1993 paper on the technological singularity compactly summarizes the basic ideas.

Kurzweil's The Law of Accelerating Returns

In his 1999 book The Age of Spiritual Machines, Ray Kurzweil proposed "The Law of Accelerating Returns", according to which the rate of change in a wide variety of evolutionary systems (including but not limited to the growth of technologies) tends to increase exponentially. He gave further focus to this issue in a 2001 essay entitled "The Law of Accelerating Returns". In it, Kurzweil, after Moravec, argued for extending Moore's Law to describe exponential growth of diverse forms of technological progress. Whenever a technology approaches some kind of a barrier, according to Kurzweil, a new technology will be invented to allow us to cross that barrier. He cites numerous past examples of this to substantiate his assertions. He predicts that such paradigm shifts have and will continue to become increasingly common, leading to "technological change so rapid and profound it represents a rupture in the fabric of human history". He believes the Law of Accelerating Returns implies that a technological singularity will occur before the end of the 21st century, around 2045. The essay begins:

An analysis of the history of technology shows that technological change is exponential, contrary to the common-sense 'intuitive linear' view. So we won't experience 100 years of progress in the 21st century—it will be more like 20,000 years of progress (at today's rate). The 'returns,' such as chip speed and cost-effectiveness, also increase exponentially. There's even exponential growth in the rate of exponential growth. Within a few decades, machine intelligence will surpass human intelligence, leading to the Singularity—technological change so rapid and profound it represents a rupture in the fabric of human history. The implications include the merger of biological and nonbiological intelligence, immortal software-based humans, and ultra-high levels of intelligence that expand outward in the universe at the speed of light.

Moore's Law expanded to other technologies.
 
An updated version of Moore's Law over 120 years (based on Kurzweil's graph). The seven most recent data points are all Nvidia GPUs.

The Law of Accelerating Returns has in many ways altered public perception of Moore's law. It is a common (but mistaken) belief that Moore's law makes predictions regarding all forms of technology, when really it only concerns semiconductor circuits. Many futurists still use the term "Moore's law" to describe ideas like those put forth by Moravec, Kurzweil and others.

According to Kurzweil, since the beginning of evolution, more complex life forms have been evolving exponentially faster, with shorter and shorter intervals between the emergence of radically new life forms, such as human beings, who have the capacity to engineer (i.e. intentionally design with efficiency) a new trait which replaces relatively blind evolutionary mechanisms of selection for efficiency. By extension, the rate of technical progress among humans has also been increasing exponentially: as we discover more effective ways to do things, we also discover more effective ways to learn (language, numbers, written language, philosophy, the scientific method, instruments of observation, tallying devices, mechanical calculators, computers), and each of these major advances in our ability to account for information occurs increasingly close to the previous one. Already within the past sixty years, life in the industrialized world has changed almost beyond recognition except for living memories from the first half of the 20th century. This pattern will culminate in unimaginable technological progress in the 21st century, leading to a singularity. Kurzweil elaborates on his views in his books The Age of Spiritual Machines and The Singularity Is Near.

Limits of accelerating change

Applying a scientific approach, we notice that in the natural sciences it is typical for processes characterized by exponential acceleration in their initial stages to enter a saturation phase. This makes it clear that observing growth with acceleration over a certain period of time does not imply an endless continuation of the process; on the contrary, in many cases it means an early arrival at a plateau. The processes studied in the natural sciences therefore suggest that the observed picture of accelerating scientific and technological progress will, after some time (in physical processes, as a rule, a short one), be replaced by a slowdown and eventually a complete stop. Despite the possible termination or attenuation of the acceleration of scientific and technological progress in the foreseeable future, progress itself, and as a result social transformation, will not stop or even slow down; it will continue at the achieved (possibly enormous) speed, which will have become constant.

Accelerating change may not be restricted to the Anthropocene Epoch, but a general and predictable developmental feature of the universe. The physical processes that generate an acceleration such as Moore's law are positive feedback loops giving rise to exponential or superexponential technological change. These dynamics lead to increasingly efficient and dense configurations of Space, Time, Energy, and Matter (STEM efficiency and density, or STEM "compression"). At the physical limit, this developmental process of accelerating change leads to black hole density organizations, a conclusion also reached by studies of the ultimate physical limits of computation in the universe.

Applying this vision to the search for extraterrestrial intelligence leads to the idea that advanced intelligent life reconfigures itself into a black hole. Such advanced life forms would be interested in inner space rather than outer space and interstellar expansion. They would thus in some way transcend reality and not be observable, which would be a solution to Fermi's paradox called the "transcension hypothesis". Another solution is that the black holes we observe could actually be interpreted as intelligent super-civilizations feeding on stars, or "stellivores". These dynamics of evolution and development are an invitation to study the universe itself as evolving and developing. If the universe is a kind of superorganism, it may possibly tend to reproduce, naturally or artificially, with intelligent life playing a role.

Other estimates

Dramatic changes in the rate of economic growth have occurred in the past because of some technological advancement. Based on population growth, the economy doubled every 250,000 years from the Paleolithic era until the Neolithic Revolution. The new agricultural economy doubled every 900 years, a remarkable increase. In the current era, beginning with the Industrial Revolution, the world's economic output doubles every fifteen years, sixty times faster than during the agricultural era. If the rise of superhuman intelligence causes a similar revolution, argues Robin Hanson, then one would expect the economy to double at least quarterly and possibly on a weekly basis.

In his 1981 book Critical Path, futurist and inventor R. Buckminster Fuller estimated that if we took all the knowledge that mankind had accumulated and transmitted by the year One CE as equal to one unit of information, it probably took about 1500 years (or until the sixteenth century) for that amount of knowledge to double. The next doubling of knowledge from two to four 'knowledge units' took only 250 years, until about 1750 CE. By 1900, one hundred and fifty years later, knowledge had doubled again to 8 units. The observed speed at which information doubled was getting faster and faster. In modern times, exponential knowledge progressions therefore change at an ever-increasing rate. Depending on the progression, this tends to lead toward explosive growth at some point. A simple exponential curve that represents this accelerating change phenomenon could be modeled by a doubling function. This fast rate of knowledge doubling leads up to the basic proposed hypothesis of the technological singularity: the rate at which technology progression surpasses human biological evolution.

Criticisms

Both Theodore Modis and Jonathan Huebner have argued—each from different perspectives—that the rate of technological innovation has not only ceased to rise, but is actually now declining.

Gallery

Kurzweil created the following graphs to illustrate his beliefs concerning and his justification for his Law of Accelerating Returns.

 

Neurosurgery

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Neurosurg...