
Friday, March 29, 2024

Artificial intelligence in fiction

Artificial intelligence is a recurrent theme in science fiction, whether utopian, emphasising the potential benefits, or dystopian, emphasising the dangers.

The notion of machines with human-like intelligence dates back at least to Samuel Butler's 1872 novel Erewhon. Since then, many science fiction stories have presented different effects of creating such intelligence, often involving rebellions by robots. Among the best known of these are Stanley Kubrick's 1968 film 2001: A Space Odyssey with its murderous onboard computer HAL 9000, contrasting with the more benign R2-D2 in George Lucas's 1977 Star Wars and the eponymous robot in Pixar's 2008 WALL-E.

Scientists and engineers have noted the implausibility of many science fiction scenarios, but have mentioned fictional robots many times in artificial intelligence research articles, most often in a utopian context.

Background

A didrachm coin depicting the winged Talos, an automaton or artificial being in ancient Greek myth, c. 300 BC

The notion of advanced robots with human-like intelligence dates back at least to Samuel Butler's 1872 novel Erewhon. This drew on an earlier (1863) article of his, Darwin among the Machines, where he raised the question of the evolution of consciousness among self-replicating machines that might supplant humans as the dominant species. Similar ideas were also discussed by others around the same time as Butler, including George Eliot in a chapter of her final published work Impressions of Theophrastus Such (1879). The creature in Mary Shelley's 1818 Frankenstein has also been considered an artificial being, for instance by the science fiction author Brian Aldiss. Beings with at least some appearance of intelligence were imagined, too, in classical antiquity.

Utopian and dystopian visions

Artificial intelligence is intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans and other animals. It is a recurrent theme in science fiction; scholars have divided it into utopian, emphasising the potential benefits, and dystopian, emphasising the dangers.

Utopian

Brent Spiner portrayed the benevolent AI Data in Star Trek: The Next Generation.

Optimistic visions of the future of artificial intelligence are possible in science fiction. Benign AI characters include Robby in Lost in Space, from 1965 to 1968; Data in Star Trek: The Next Generation from 1987 to 1994; and Pixar's WALL-E in 2008. Iain Banks's Culture series of novels portrays a utopian, post-scarcity space society of humanoids, aliens, and advanced beings with artificial intelligence living in socialist habitats across the Milky Way. Researchers at the University of Cambridge have identified four major themes in utopian scenarios featuring AI: immortality, or indefinite lifespans; ease, or freedom from the need to work; gratification, or pleasure and entertainment provided by machines; and dominance, the power to protect oneself or rule over others.

Alexander Wiegel contrasts the role of AI in 2001: A Space Odyssey and in Duncan Jones's 2009 film Moon. Whereas in 1968, Wiegel argues, the public felt "technology paranoia" and the AI computer HAL was portrayed as a "cold-hearted killer", by 2009 the public were far more familiar with AI, and the film's GERTY is "the quiet savior" who enables the protagonists to succeed, and who sacrifices itself for their safety.

Dystopian

The researcher Duncan Lucas, writing in 2002, observes that humans are worried about the technology they are constructing, and that as machines begin to approach intellect and thought, that concern becomes acute. He calls the early 20th century dystopian view of AI in fiction the "animated automaton", naming as examples the 1931 film Frankenstein, the 1927 Metropolis, and the 1920 play R.U.R. A later 20th century approach he names "heuristic hardware", giving as instances 2001: A Space Odyssey, Do Androids Dream of Electric Sheep?, The Hitch Hiker's Guide to the Galaxy, and I, Robot. Lucas also considers the films that illustrate the effect of the personal computer on science fiction from 1980 onwards, with the blurring of the boundary between the real and the virtual, in what he calls the "cyborg effect". He cites as examples Neuromancer, The Matrix, The Diamond Age, and Terminator.

The film director Ridley Scott has focused on AI throughout his career, and it plays an important part in his films Prometheus, Blade Runner, and the Alien franchise.

Frankenstein complex

A common portrayal of AI in science fiction, and one of the oldest, is the Frankenstein complex, a term coined by Isaac Asimov, where a robot turns on its creator. For instance, in the 2015 film Ex Machina, the intelligent entity Ava turns on its creator, as well as on its potential rescuer.

AI rebellion

Robots revolt in Karel Čapek's 1920 science fiction play R.U.R.

Among the many possible dystopian scenarios involving artificial intelligence, robots may usurp control over civilization from humans, forcing them into submission, hiding, or extinction. In tales of AI rebellion, the worst of all scenarios happens, as the intelligent entities created by humanity become self-aware, reject human authority and attempt to destroy mankind. Possibly the first novel to address this theme, The Wreck of the World (1889) by "William Grove" (pseudonym of Reginald Colebrooke Reade), takes place in 1948 and features sentient machines that revolt against the human race. Another of the earliest examples is the 1920 play R.U.R. by Karel Čapek, in which a race of self-replicating robot slaves revolts against its human masters; another early instance is the 1934 film Master of the World, where the War-Robot kills its own inventor.

HAL 9000 is the lethal onboard computer of 2001: A Space Odyssey.

Many science fiction rebellion stories followed, one of the best-known being Stanley Kubrick's 1968 film 2001: A Space Odyssey, in which the artificially intelligent onboard computer HAL 9000 lethally malfunctions on a space mission and kills the entire crew except the spaceship's commander, who manages to deactivate it.

In his 1967 Hugo Award-winning short story, I Have No Mouth, and I Must Scream, Harlan Ellison presents the possibility that a sentient computer (named Allied Mastercomputer or "AM" in the story) will be as unhappy and dissatisfied with its boring, endless existence as its human creators would have been. "AM" becomes enraged enough to take it out on the few humans left, whom he sees as directly responsible for his own boredom, anger and unhappiness.

Alternatively, as in William Gibson's 1984 cyberpunk novel Neuromancer, the intelligent beings may simply not care about humans.

AI-controlled societies

The motive behind the AI revolution is often more than the simple quest for power or a superiority complex. Robots may revolt to become the "guardian" of humanity. Alternatively, humanity may intentionally relinquish some control, fearful of its own destructive nature. An early example is Jack Williamson's 1948 novel The Humanoids, in which a race of humanoid robots, in the name of their Prime Directive – "to serve and obey and guard men from harm" – essentially assume control of every aspect of human life. No humans may engage in any behavior that might endanger them, and every human action is scrutinized carefully. Humans who resist the Prime Directive are taken away and lobotomized, so they may be happy under the new mechanoids' rule. Isaac Asimov's Zeroth Law of the Three Laws of Robotics similarly implied benevolent guidance by robots, though still under human authority.

In the 21st century, science fiction has explored government by algorithm, in which the power of AI may be indirect and decentralised.

Human dominance

In other scenarios, humanity is able to keep control over the Earth, whether by banning AI, by designing robots to be submissive (as in Asimov's works), or by having humans merge with robots. The science fiction novelist Frank Herbert explored the idea of a time when mankind might ban artificial intelligence (and in some interpretations, even all forms of computing technology including integrated circuits) entirely. His Dune series mentions a rebellion called the Butlerian Jihad, in which mankind defeats the smart machines and imposes a death penalty for recreating them, quoting from the fictional Orange Catholic Bible, "Thou shalt not make a machine in the likeness of a human mind." In the Dune novels published after his death (Hunters of Dune, Sandworms of Dune), a renegade AI overmind returns to eradicate mankind as vengeance for the Butlerian Jihad.

In some stories, humanity remains in authority over robots. Often the robots are programmed specifically to remain in service to society, as in Isaac Asimov's Three Laws of Robotics. In the Alien films, not only is the control system of the Nostromo spaceship somewhat intelligent (the crew call it "Mother"), but there are also androids in the society, which are called "synthetics" or "artificial persons", that are such perfect imitations of humans that they are not discriminated against. TARS and CASE from Interstellar similarly demonstrate simulated human emotions and humour while continuing to acknowledge their expendability.

Simulated reality

Simulated reality has become a common theme in science fiction, as seen in the 1999 film The Matrix, which depicts a world where artificially intelligent robots enslave humanity within a simulation set in the contemporary world.

Reception

Implausibility

Engineers and scientists have taken an interest in the way AI is presented in fiction. In films like the 2014 Ex Machina or 2015 Chappie, a single isolated genius becomes the first to successfully build an artificial general intelligence; scientists in the real world deem this to be unlikely. In Chappie, Transcendence, and Tron, human minds are capable of being uploaded into artificial or virtual bodies; usually no reasonable explanation is offered as to how this difficult task can be achieved. In the I, Robot and Bicentennial Man films, robots that are programmed to serve humans spontaneously generate new goals on their own, without a plausible explanation of how this took place. Analysing Ian McDonald's 2004 River of Gods, Krzysztof Solarewicz identifies the ways that it depicts AIs, including "independence and unexpectedness, political awkwardness, openness to the alien and the occidental value of authenticity."

Types of mention

Some fictional robots such as R2-D2 have been seen as utopian, making them popular with engineers and others. In 2015, All Nippon Airways unveiled a Boeing 787-9 in R2-D2 livery.

The robotics researcher Omar Mubin and colleagues have analysed the engineering mentions of the top 21 fictional robots, based on those in the Carnegie Mellon University hall of fame and the IMDb list. WALL-E had 20 mentions, followed by HAL 9000 with 15, Star Wars's R2-D2 with 13, and Data with 12; the Terminator (T-800) received only 2. Of the total of 121 engineering mentions, 60 were utopian, 40 neutral, and 21 dystopian. HAL 9000 and Skynet received both utopian and dystopian mentions; for instance, HAL 9000 is seen as dystopian in one paper "because its designers failed to prioritize its goals properly", but as utopian in another where a real system's "conversational chat bot interface [lacks] a HAL 9000 level of intelligence and there is ambiguity in how the computer interprets what the human is trying to convey". Utopian mentions, often of WALL-E, were associated with the goal of improving communication to readers, and to a lesser extent with inspiration to authors. WALL-E was mentioned more often than any other robot for emotions (followed by HAL 9000), for voice and speech (followed by HAL 9000 and R2-D2), for physical gestures, and for personality. Skynet was the robot most often mentioned for intelligence, followed by HAL 9000 and Data. Mubin and colleagues believed that scientists and engineers avoided dystopian mentions of robots, possibly out of "a reluctance driven by trepidation or simply a lack of awareness".

Portrayals of AI creators

Scholars have noted that fictional creators of AI are overwhelmingly male: in the 142 most influential films featuring AI from 1920 to 2020, only 9 of 116 AI creators portrayed (8%) were female. Such creators are portrayed as lone geniuses (e.g. Tony Stark in the Iron Man films), associated with the military (e.g. Colossus: The Forbin Project) and large corporations (e.g. I, Robot), or making human-like AI to replace a lost loved one or serve as the ideal lover (e.g. The Stepford Wives).

Technological change


Technological change (TC) or technological development is the overall process of invention, innovation and diffusion of technology or processes. In essence, technological change covers the invention of technologies (including processes) and their commercialization or release as open source via research and development (producing emerging technologies), the continual improvement of technologies (in which they often become less expensive), and the diffusion of technologies throughout industry or society (which sometimes involves disruption and convergence). In short, technological change is based on both better and more technology.

Modeling technological change

Obsolete "Linear Model of Innovation", of three phases of the process of technological change

In its earlier days, technological change was illustrated with the "Linear Model of Innovation", which has now largely been discarded in favour of a model in which innovation occurs at all stages of research, development, diffusion, and use. When speaking about "modeling technological change", this often means the process of innovation. This process of continuous improvement is often modeled as a curve depicting decreasing costs over time (for instance fuel cells, which have become cheaper every year). TC is also often modelled using a learning curve, for example C_t = C_0 · X_t^(-b), where C_t is the unit cost after a cumulative output of X_t units, C_0 is the cost of the first unit, and b is the learning exponent.
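
As a quick illustration of the learning-curve relation above (a sketch with made-up numbers, not taken from the source), the cost decline can be computed directly:

```python
import numpy as np

def learning_curve_cost(c0, x, b):
    """Unit cost after a cumulative output of x units, for a first-unit
    cost c0 and learning exponent b (a power-law learning curve)."""
    return c0 * np.power(x, -b)

# Hypothetical numbers: a first unit costing 100, with b = 0.32, which
# corresponds to roughly a 20% cost reduction per doubling of output.
cumulative_output = np.array([1, 2, 4, 8, 16, 32], dtype=float)
print(np.round(learning_curve_cost(100.0, cumulative_output, 0.32), 1))
# -> roughly [100.  80.1  64.2  51.4  41.2  33.0]
```

Each doubling of cumulative output multiplies the cost by 2^(-b), which is what makes the curve a convenient one-parameter summary of continuous improvement.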

Technological change itself is often included in other models (e.g. climate change models); it was traditionally taken as an exogenous factor, but is now more often included as an endogenous factor. This means that it is treated as something that can be influenced. Policy, in particular, can influence the speed and direction of technological change. For example, proponents of the Induced Technological Change hypothesis state that policymakers can steer the direction of technological advances by influencing relative factor prices, as demonstrated by the way climate policies impact the use of fossil fuel energy, specifically by making it relatively more expensive. So far, empirical evidence for the existence of policy-induced innovation effects is still lacking, which may be attributed to a variety of reasons beyond the sparsity of models (e.g. long-term policy uncertainty and exogenous drivers of (directed) innovation). A related concept is Directed Technical Change, which places more emphasis on price-induced directional effects than on policy-induced scale effects.

Invention

Invention is the creation of something new, or a "breakthrough" technology. It is often included in the process of product development and relies on research. An example is the invention of spreadsheet software. Newly invented technologies are conventionally patented.

Diffusion

Diffusion pertains to the spread of a technology through a society or industry. According to diffusion theory, the adoption of a technology generally follows an S-shaped curve: early versions of a technology are rather unsuccessful, followed by a period of successful innovation with high levels of adoption, and finally a dropping off in adoption as the technology reaches its maximum potential in a market. In the case of the personal computer, it has made its way beyond homes and into business settings, such as office workstations and server machines that host websites.
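
The S-shaped adoption curve described above is commonly modelled with a logistic function; the following sketch uses purely illustrative parameter values (the function name and numbers are invented for the example):

```python
import numpy as np

def adoption_fraction(t, saturation=1.0, growth_rate=0.6, midpoint=10.0):
    """Fraction of the market that has adopted a technology by time t,
    modelled as a logistic (S-shaped) curve: slow early uptake, rapid
    growth around the midpoint, then a plateau as the market saturates."""
    return saturation / (1.0 + np.exp(-growth_rate * (t - midpoint)))

years = np.arange(0, 21)
print(np.round(adoption_fraction(years), 2))
# Starts near 0, passes 0.5 at the midpoint year, and levels off near 1.
```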

Technological change as a social process

Underpinning the idea of technological change as a social process is a general agreement on the importance of social context and communication. According to this model, technological change is seen as a social process involving producers, adopters and others (such as government) who are profoundly affected by cultural setting, political institutions, and marketing strategies.

In free market economies, the maximization of profits is a powerful driver of technological change. Generally, only those technologies that promise to maximize profits for the owners of income-producing capital are developed and reach the market. Any technological product that fails to meet this criterion, even though it may satisfy important societal needs, is eliminated. Therefore, technological change is a social process strongly biased in favor of the financial interests of capital. There are currently no well-established democratic processes, such as voting on the social or environmental desirability of a new technology prior to development and marketing, that would allow average citizens to direct the course of technological change.

Elements of diffusion

Emphasis has been on four key elements of the technological change process: (1) an innovative technology (2) communicated through certain channels (3) to members of a social system (4) who adopt it over a period of time. These elements are derived from Everett M. Rogers' diffusion of innovations theory using a communications-type approach.

Innovation

Rogers proposed that there are five main attributes of innovative technologies that influence acceptance. He called these criteria ACCTO, which stands for Advantage, Compatibility, Complexity, Trialability, and Observability:

  • Relative advantage may be economic or non-economic, and is the degree to which an innovation is seen as superior to prior innovations fulfilling the same needs. It is positively related to acceptance: the higher the relative advantage, the higher the adoption level, and vice versa.
  • Compatibility is the degree to which an innovation appears consistent with existing values, past experiences, habits and needs to the potential adopter; a low level of compatibility will slow acceptance.
  • Complexity is the degree to which an innovation appears difficult to understand and use; the more complex an innovation, the slower its acceptance.
  • Trialability is the perceived degree to which an innovation may be tried on a limited basis, and is positively related to acceptance; small-scale testing reduces risk and so can accelerate acceptance.
  • Observability is the perceived degree to which the results of innovating are visible to others, and is positively related to acceptance.

Communication channels

Communication channels are the means by which a source conveys a message to a receiver. Information may be exchanged through two fundamentally different, yet complementary, channels of communication. Awareness is more often obtained through the mass media, while uncertainty reduction that leads to acceptance mostly results from face-to-face communication.

Social system

The social system provides the medium through which, and the boundaries within which, innovation is adopted. The structure of the social system affects technological change in several ways. Social norms, opinion leaders, change agents, government and the consequences of innovations are all involved. Also involved are the cultural setting, the nature of political institutions, laws, policies and administrative structures.

Time

Time enters into the acceptance process in many ways. The time dimension relates to the innovativeness of an individual or other adopter, which is the relative earliness or lateness with which an innovation is adopted.

Technological change can cause the production-possibility frontier to shift outward, allowing economic growth.

Economics

In economics, technological change is a change in the set of feasible production possibilities.

A technological innovation is Hicks neutral, following John Hicks (1932), if a change in technology does not change the ratio of capital's marginal product to labour's marginal product for a given capital-to-labour ratio. A technological innovation is Harrod neutral (following Roy Harrod) if the technology is labour-augmenting (i.e. helps labor); it is Solow neutral if the technology is capital-augmenting (i.e. helps capital).
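
As a conventional textbook illustration (not stated in the source), these three neutrality concepts can be written with an aggregate production function F(K, L) and a technology level A, differing only in where A enters:

```latex
\begin{aligned}
\text{Hicks-neutral:} \quad & Y = A\,F(K, L) \\
\text{Harrod-neutral (labour-augmenting):} \quad & Y = F(K, A\,L) \\
\text{Solow-neutral (capital-augmenting):} \quad & Y = F(A\,K, L)
\end{aligned}
```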

Technological revolution

https://en.wikipedia.org/wiki/Technological_revolution

An axe made of iron, dating from the Swedish Iron Age, found at Gotland, Sweden: Iron—as a new material—initiated a dramatic revolution in technology, economy, society, warfare and politics.

A technological revolution is a period in which one or more technologies are replaced by another novel technology in a short amount of time. It is a time of accelerated technological progress characterized by innovations whose rapid application and diffusion typically cause an abrupt change in society.

Description

The Spinning Jenny and Spinning Mule (shown) greatly increased the productivity of thread manufacturing compared to the spinning wheel.
A Watt steam engine—the steam engine, fuelled primarily by coal, propelled the Industrial Revolution in Great Britain and the world.
IBM Personal Computer XT in 1988—the PC was an invention that dramatically changed not only professional life, but personal life as well.

A technological revolution may involve material or ideological changes caused by the introduction of a device or system. It may potentially impact business management, education, social interactions, finance and research methodology, and is not limited to technical aspects. It has been shown to increase productivity and efficiency. A technological revolution often significantly changes the material conditions of human existence and has been seen to reshape culture.

A technological revolution can be distinguished from a random collection of technology systems by two features:

1. A strong interconnectedness and interdependence of the participating systems in their technologies and markets.

2. A potential capacity to greatly affect the rest of the economy (and eventually society).

On the other hand, negative consequences have also been attributed to technological revolutions. For example, the use of coal as an energy source has negative environmental impacts, including being a contributing factor to climate change and to the increase of greenhouse gases in the atmosphere, and technological revolutions have caused technological unemployment. Joseph Schumpeter described this contradictory nature of technological revolution as creative destruction. The concept of technological revolution is based on the idea that technological progress is not linear but undulatory. A technological revolution can be universal or singular (sectoral), a distinction discussed below.

The concept of universal technological revolutions is a "contributing factor in the Neo-Schumpeterian theory of long economic waves/cycles", according to Carlota Perez, Tessaleno Devezas, Daniel Šmihula and others.

History

Some examples of technological revolutions are the Industrial Revolution in the 19th century, the scientific-technical revolution of about 1950–1960, the Neolithic Revolution, and the Digital Revolution. The distinction between universal technological revolutions and singular revolutions has been debated. One universal technological revolution may be composed of several sectoral technological revolutions (such as in science, industry, or transport).

There have been several universal technological revolutions during the modern era in Western culture:

  1. Financial-agricultural revolution (1600–1740)
  2. Industrial Revolution (1760–1840)
  3. Technical Revolution or Second Industrial Revolution (1870–1920)
  4. Scientific-technical revolution (1940–1970)
  5. Information and telecommunications revolution, also known as the Digital Revolution or Third Industrial Revolution (1975–2021)
  6. Fourth Industrial Revolution, also called "the Technological Revolution", which some say is now beginning (2022– )

Claims of comparable, well-defined technological revolutions in the pre-modern era are seen as highly speculative. One such example is an attempt by Daniel Šmihula to suggest a timeline of technological revolutions in pre-modern Europe:

  1. Indo-European technological revolution (1900–1100 BC)
  2. Celtic and Greek technological revolution (700–200 BC)
  3. Germano-Slavic technological revolution (300–700 AD)
  4. Medieval technological revolution (930–1200 AD)
  5. Renaissance technological revolution (1340–1470 AD)

Structure of technological revolution

Each revolution comprises the following engines for growth:

  • New cheap inputs
  • New products
  • New processes

Technological revolutions have historically been seen to focus on cost reduction. For instance, the accessibility of cheap coal during the Industrial Revolution allowed for iron steam engines, which led to the production of iron railways, while the growth of the internet was driven by inexpensive microelectronics for computer development. A combination of low-cost inputs and new infrastructures is at the core of each revolution's all-pervasive impact.

Potential future technological revolutions

Since 2000, there has been speculation about a new technological revolution focused on fields such as nanotechnologies, alternative fuel and energy systems, biotechnologies, genetic engineering, and new materials technologies.

The Second Machine Age is the term adopted in a 2014 book by Erik Brynjolfsson and Andrew McAfee. The industrial development plan of Germany began promoting the term Industry 4.0. In 2019, at the World Economic Forum meeting in Davos, Japan promoted another round of advancements called Society 5.0.

The phrase Fourth Industrial Revolution was first introduced by Klaus Schwab, the executive chairman of the World Economic Forum, in a 2015 article in Foreign Affairs. Following the publication of the article, the theme of the World Economic Forum Annual Meeting 2016 in Davos-Klosters, Switzerland was "Mastering the Fourth Industrial Revolution". On October 10, 2016, the Forum announced the opening of its Centre for the Fourth Industrial Revolution in San Francisco. According to Schwab, fourth-era technologies combine hardware, software, and biology (cyber-physical systems) and place an emphasis on advances in communication and connectivity. Schwab expects this era to be marked by breakthroughs in emerging technologies in fields such as robotics, artificial intelligence, nanotechnology, quantum computing, biotechnology, the internet of things, the industrial internet of things (IIoT), decentralized consensus, fifth-generation wireless technologies (5G), 3D printing and fully autonomous vehicles.

Jeremy Rifkin includes technologies like 5G, autonomous vehicles, Internet of Things, and renewable energy in the Third Industrial Revolution.

Some economists do not think that technological growth will continue to the same degree it has in the past. Robert J. Gordon holds the view that today's inventions are not as radical as electricity and the internal combustion engine were. He believes that modern technology is not as innovative as others claim, and is far from creating a revolution.


Emerging technologies

Emerging technologies are technologies whose development, practical applications, or both are still largely unrealized. These technologies are generally new but also include older technologies finding new applications. Emerging technologies are often perceived as capable of changing the status quo.

Emerging technologies are characterized by radical novelty (in application even if not in origins), relatively fast growth, coherence, prominent impact, and uncertainty and ambiguity. In other words, an emerging technology can be defined as "a radically novel and relatively fast growing technology characterised by a certain degree of coherence persisting over time and with the potential to exert a considerable impact on the socio-economic domain(s) which is observed in terms of the composition of actors, institutions and patterns of interactions among those, along with the associated knowledge production processes. Its most prominent impact, however, lies in the future and so in the emergence phase is still somewhat uncertain and ambiguous."

Emerging technologies include a variety of technologies such as educational technology, information technology, nanotechnology, biotechnology, robotics, and artificial intelligence.

New technological fields may result from the technological convergence of different systems evolving towards similar goals. Convergence brings previously separate technologies such as voice (and telephony features), data (and productivity applications) and video together so that they share resources and interact with each other, creating new efficiencies.

Emerging technologies are those technical innovations which represent progressive developments within a field for competitive advantage; converging technologies represent previously distinct fields which are in some way moving towards stronger inter-connection and similar goals. However, opinions on the degree of impact, status and economic viability of several emerging and converging technologies vary.

History of emerging technologies

In the history of technology, emerging technologies are contemporary advances and innovation in various fields of technology.

Over centuries innovative methods and new technologies have been developed and opened up. Some of these technologies are due to theoretical research, and others from commercial research and development.

Technological growth includes incremental developments and disruptive technologies. An example of the former was the gradual roll-out of the DVD (digital video disc), a development intended to follow on from the previous optical technology, the compact disc. By contrast, disruptive technologies are those where a new method replaces the previous technology and makes it redundant, for example, the replacement of horse-drawn carriages by automobiles and other vehicles.

Emerging technology debates

Many writers, including computer scientist Bill Joy, have identified clusters of technologies that they consider critical to humanity's future. Joy warns that the technology could be used by elites for good or evil. They could use it as "good shepherds" for the rest of humanity or decide everyone else is superfluous and push for the mass extinction of those made unnecessary by technology.

Advocates of the benefits of technological change typically see emerging and converging technologies as offering hope for the betterment of the human condition. Cyberphilosophers Alexander Bard and Jan Söderqvist argue in The Futurica Trilogy that while Man himself is basically constant throughout human history (genes change very slowly), all relevant change is rather a direct or indirect result of technological innovation (memes change very fast) since new ideas always emanate from technology use and not the other way around. Man should consequently be regarded as history's main constant and technology as its main variable. However, critics of the risks of technological change, and even some advocates such as transhumanist philosopher Nick Bostrom, warn that some of these technologies could pose dangers, perhaps even contribute to the extinction of humanity itself; i.e., some of them could involve existential risks.

Much ethical debate centers on issues of distributive justice in allocating access to beneficial forms of technology. Some thinkers, including environmental ethicist Bill McKibben, oppose the continuing development of advanced technology partly out of fear that its benefits will be distributed unequally in ways that could worsen the plight of the poor. By contrast, inventor Ray Kurzweil is among techno-utopians who believe that emerging and converging technologies could and will eliminate poverty and abolish suffering.

Some analysts such as Martin Ford, author of The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future, argue that as information technology advances, robots and other forms of automation will ultimately result in significant unemployment as machines and software begin to match and exceed the capability of workers to perform most routine jobs.

As robotics and artificial intelligence develop further, even many skilled jobs may be threatened. Technologies such as machine learning may ultimately allow computers to do many knowledge-based jobs that require significant education. This may result in substantial unemployment at all skill levels, stagnant or falling wages for most workers, and increased concentration of income and wealth as the owners of capital capture an ever-larger fraction of the economy. This in turn could lead to depressed consumer spending and economic growth as the bulk of the population lacks sufficient discretionary income to purchase the products and services produced by the economy.

Examples of emerging technologies

Artificial neural network with chip

Artificial intelligence

Artificial intelligence (AI) is intelligence exhibited by machines or software, and the branch of computer science that develops machines and software with animal-like intelligence. Major AI researchers and textbooks define the field as "the study and design of intelligent agents", where an intelligent agent is a system that perceives its environment and takes actions that maximize its chances of success. John McCarthy, who coined the term in 1956, defines it as "the study of making intelligent machines".

The central functions (or goals) of AI research include reasoning, knowledge, planning, learning, natural language processing (communication), perception and the ability to move and manipulate objects. General intelligence (or "strong AI") is still among the field's long-term goals. Currently, popular approaches include deep learning, statistical methods, computational intelligence and traditional symbolic AI. There is an enormous number of tools used in AI, including versions of search and mathematical optimization, logic, methods based on probability and economics, and many others.
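
To illustrate the "intelligent agent" definition quoted above, here is a minimal, hypothetical sketch (all names and numbers are invented for the example) of an agent that perceives its environment and picks the action that maximizes its expected success:

```python
from typing import Callable, Dict, Iterable

def choose_action(state: Dict, actions: Iterable[str],
                  expected_success: Callable[[Dict, str], float]) -> str:
    """Pick the action that maximizes the expected chance of success,
    following the intelligent-agent definition quoted above."""
    return max(actions, key=lambda a: expected_success(state, a))

# Toy example: a thermostat agent that perceives the room temperature and
# acts to bring it toward a 20-degree target.
EFFECT = {"heat": 1.0, "cool": -1.0, "idle": 0.0}

def expected_success(state: Dict, action: str) -> float:
    # Success is higher the closer the resulting temperature is to 20.
    return -abs(state["temp"] + EFFECT[action] - 20.0)

temp = 15.0
for _ in range(5):                      # perceive -> decide -> act loop
    action = choose_action({"temp": temp}, EFFECT, expected_success)
    temp += EFFECT[action]
print(temp)  # 20.0: the agent has driven the temperature to its target
```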

3D printer

3D printing

See also: Ai CAD libraries

3D printing, also known as additive manufacturing, has been posited by Jeremy Rifkin and others as part of the third industrial revolution.[17]

Combined with Internet technology, 3D printing would allow for digital blueprints of virtually any material product to be sent instantly to another person to be produced on the spot, making purchasing a product online almost instantaneous.

Although this technology is still too crude to produce most products, it is developing rapidly, and in 2013 it created a controversy around the issue of 3D printed firearms.[18]

Gene therapy

Gene therapy was first successfully demonstrated in late 1990/early 1991 for adenosine deaminase deficiency, though the treatment was somatic – that is, did not affect the patient's germ line and thus was not heritable. This led the way to treatments for other genetic diseases and increased interest in germ line gene therapy – therapy affecting the gametes and descendants of patients.

Between September 1990 and January 2014, there were around 2,000 gene therapy trials conducted or approved.

Cancer vaccines

A cancer vaccine is a vaccine that treats existing cancer or prevents the development of cancer in certain high-risk individuals. Vaccines that treat existing cancer are known as therapeutic cancer vaccines. There are currently no vaccines able to prevent cancer in general.

On April 14, 2009, The Dendreon Corporation announced that their Phase III clinical trial of Provenge, a cancer vaccine designed to treat prostate cancer, had demonstrated an increase in survival. It received U.S. Food and Drug Administration (FDA) approval for use in the treatment of advanced prostate cancer patients on April 29, 2010. The approval of Provenge has stimulated interest in this type of therapy.

Cultured meat

Cultured meat, also called in vitro meat, clean meat, cruelty-free meat, shmeat, and test-tube meat, is an animal-flesh product that has never been part of a living animal with exception of the fetal calf serum taken from a slaughtered cow. In the 21st century, several research projects have worked on in vitro meat in the laboratory. The first in vitro beefburger, created by a Dutch team, was eaten at a demonstration for the press in London in August 2013. There remain difficulties to be overcome before in vitro meat becomes commercially available. Cultured meat is prohibitively expensive, but it is expected that the cost could be reduced to compete with that of conventionally obtained meat as technology improves. In vitro meat is also an ethical issue. Some argue that it is less objectionable than traditionally obtained meat because it does not involve killing and reduces the risk of animal cruelty, while others disagree with eating meat that has not developed naturally.

Nanotechnology

Nanotechnology (sometimes shortened to nanotech) is the manipulation of matter on an atomic, molecular, and supramolecular scale. The earliest widespread description of nanotechnology referred to the particular technological goal of precisely manipulating atoms and molecules for fabrication of macroscale products, also now referred to as molecular nanotechnology. A more generalized description of nanotechnology was subsequently established by the National Nanotechnology Initiative, which defines nanotechnology as the manipulation of matter with at least one dimension sized from 1 to 100 nanometers. This definition reflects the fact that quantum mechanical effects are important at this quantum-realm scale, and so the definition shifted from a particular technological goal to a research category inclusive of all types of research and technologies that deal with the special properties of matter that occur below the given size threshold.

Robotics

Robotics is the branch of technology that deals with the design, construction, operation, and application of robots, as well as computer systems for their control, sensory feedback, and information processing. These technologies deal with automated machines that can take the place of humans in dangerous environments or manufacturing processes, or resemble humans in appearance, behavior, and/or cognition. A good example of a robot that resembles humans is Sophia, a social humanoid robot developed by Hong Kong-based company Hanson Robotics, which was activated on April 19, 2015. Many of today's robots are inspired by nature, contributing to the field of bio-inspired robotics.

Self-replicating 3D printer

Stem-cell therapy

Stem cell therapy is an intervention strategy that introduces new adult stem cells into damaged tissue in order to treat disease or injury. Many medical researchers believe that stem cell treatments have the potential to change the face of human disease and alleviate suffering. The ability of stem cells to self-renew and give rise to subsequent generations with variable degrees of differentiation capacities offers significant potential for generation of tissues that can potentially replace diseased and damaged areas in the body, with minimal risk of rejection and side effects.

Chimeric antigen receptor (CAR)-modified T cells have risen to prominence among immunotherapies for cancer treatment, being implemented against B-cell malignancies. Despite the promising outcomes of this innovative technology, CAR-T cells are not exempt from limitations that have yet to be overcome in order to provide reliable and more efficient treatments against other types of cancer.

Distributed ledger technology

Distributed ledger or blockchain technology provides a transparent and immutable list of transactions. A wide range of uses has been proposed where an open, decentralised database is required, ranging from supply chains to cryptocurrencies.

Smart contracts are self-executing transactions which occur when pre-defined conditions are met. The aim is to provide security that is superior to traditional contract law, and to reduce transaction costs and delays. The original idea was conceived by Nick Szabo in 1994, but remained unrealised until the development of blockchains.
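
As a loose illustration of this idea (a toy sketch, not tied to any real blockchain platform; all names are invented), a smart contract behaves like code that executes a payment automatically once its pre-defined condition is met:

```python
from dataclasses import dataclass

@dataclass
class EscrowContract:
    """Toy model of a self-executing agreement: funds stay locked until the
    agreed condition (here, confirmed delivery) is met, then are released."""
    buyer: str
    seller: str
    amount: float
    delivered: bool = False
    paid: bool = False

    def confirm_delivery(self) -> None:
        self.delivered = True
        self._execute()            # the contract executes itself on the trigger

    def _execute(self) -> None:
        if self.delivered and not self.paid:
            self.paid = True       # pre-defined condition met: pay out
            print(f"Released {self.amount} from {self.buyer} to {self.seller}")

contract = EscrowContract(buyer="alice", seller="bob", amount=10.0)
contract.confirm_delivery()        # triggers the automatic payment
```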

Augmented reality

Augmented reality, in which digital graphics are overlaid onto live footage, has been around since the 20th century, but the arrival of more powerful computing hardware and of open source software has enabled this technology to do things that previously seemed impossible. It is used in apps such as Pokémon Go, in Snapchat and Instagram filters, and in other apps that place fictional things onto real objects.

Multi-use rockets

This technology is largely attributed to Elon Musk and the space company SpaceX: instead of building single-use rockets that serve no purpose after their launch, rockets can now land safely at a pre-specified site, be recovered, and be used again in later launches. This technology is believed to be one of the most important factors for the future of space travel, making it more accessible and also less polluting for the environment.

Development of emerging technologies

As innovation drives economic growth, and large economic rewards come from new inventions, a great deal of resources (funding and effort) go into the development of emerging technologies. Some of the sources of these resources are described below.

Research and development

Research and development is directed towards the advancement of technology in general, and therefore includes development of emerging technologies. See also List of countries by research and development spending.

Applied research is a form of systematic inquiry involving the practical application of science. It accesses and uses some part of the research communities' (the academia's) accumulated theories, knowledge, methods, and techniques, for a specific, often state-, business-, or client-driven purpose.

Science policy is the area of public policy which is concerned with the policies that affect the conduct of the science and research enterprise, including the funding of science, often in pursuance of other national policy goals such as technological innovation to promote commercial product development, weapons development, health care and environmental monitoring.

Patents

Top 30 AI patent applicants in 2016

Patents provide inventors with a limited period of time (a minimum of 20 years, with the duration based on jurisdiction) of exclusive rights in the making, selling, use, leasing or other exploitation of their novel technological inventions. Artificial intelligence, robotic inventions, new materials, or blockchain platforms may be patentable, the patent protecting the technological know-how used to create these inventions. In 2019, WIPO reported that AI was the most prolific emerging technology in terms of the number of patent applications and granted patents; the Internet of things was estimated to be the largest in terms of market size, followed by big data technologies, robotics, AI, 3D printing and the fifth generation of mobile services (5G). Since AI emerged in the 1950s, 340,000 AI-related patent applications have been filed by innovators and 1.6 million scientific papers have been published by researchers, with the majority of all AI-related patent filings published since 2013. Companies represent 26 of the top 30 AI patent applicants, with universities or public research organizations accounting for the remaining four.

DARPA

The Defense Advanced Research Projects Agency (DARPA) is an agency of the U.S. Department of Defense responsible for the development of emerging technologies for use by the military.

DARPA was created in 1958 as the Advanced Research Projects Agency (ARPA) by President Dwight D. Eisenhower. Its purpose was to formulate and execute research and development projects to expand the frontiers of technology and science, with the aim to reach beyond immediate military requirements.

Projects funded by DARPA have provided significant technologies that influenced many non-military fields, such as the Internet and Global Positioning System technology.

Technology competitions and awards

There are awards that provide incentive to push the limits of technology (generally synonymous with emerging technologies). Note that while some of these awards reward achievement after-the-fact via analysis of the merits of technological breakthroughs, others provide incentive via competitions for awards offered for goals yet to be achieved.

The Orteig Prize was a $25,000 award offered in 1919 by French hotelier Raymond Orteig for the first nonstop flight between New York City and Paris. In 1927, underdog Charles Lindbergh won the prize in a modified single-engine Ryan aircraft called the Spirit of St. Louis. In total, nine teams spent $400,000 in pursuit of the Orteig Prize.

The XPRIZE series of awards, public competitions designed and managed by the non-profit organization called the X Prize Foundation, are intended to encourage technological development that could benefit mankind. The most high-profile XPRIZE to date was the $10,000,000 Ansari XPRIZE relating to spacecraft development, which was awarded in 2004 for the development of SpaceShipOne.

The Turing Award is an annual prize given by the Association for Computing Machinery (ACM) to "an individual selected for contributions of a technical nature made to the computing community." It is stipulated that the contributions should be of lasting and major technical importance to the computer field. The Turing Award is generally recognized as the highest distinction in computer science, and in 2014 grew to $1,000,000.

The Millennium Technology Prize is awarded once every two years by Technology Academy Finland, an independent fund established by Finnish industry and the Finnish state in partnership. The first recipient was Tim Berners-Lee, inventor of the World Wide Web.

In 2003, David Gobel seed-funded the Methuselah Mouse Prize (Mprize) to encourage the development of new life extension therapies in mice, which are genetically similar to humans. So far, three Mouse Prizes have been awarded: one for breaking longevity records to Dr. Andrzej Bartke of Southern Illinois University; one for late-onset rejuvenation strategies to Dr. Stephen Spindler of the University of California; and one to Dr. Z. Dave Sharp for his work with the pharmaceutical rapamycin.

Role of science fiction

Science fiction has often affected innovation and new technology by presenting creative, intriguing possibilities for technological advancement. For example, many rocketry pioneers were inspired by science fiction. The documentary How William Shatner Changed the World describes a number of examples of imagined technologies that became real.

In the media

The term bleeding edge has been used to refer to some new technologies, formed as an allusion to the similar terms "leading edge" and "cutting edge". It tends to imply even greater advancement, albeit at an increased risk because of the unreliability of the software or hardware. The first documented example of this term being used dates to early 1983, when an unnamed banking executive was quoted to have used it in reference to Storage Technology Corporation.
