
Thursday, November 26, 2020

Vertical farming

From Wikipedia, the free encyclopedia
 
Lettuce grown in indoor vertical farming system

Vertical farming is the practice of growing crops in vertically stacked layers. It often incorporates controlled-environment agriculture, which aims to optimize plant growth, and soilless farming techniques such as hydroponics, aquaponics, and aeroponics. Common choices of structures to house vertical farming systems include buildings, shipping containers, tunnels, and abandoned mine shafts. As of 2020, there is the equivalent of about 30 ha (74 acres) of operational vertical farmland in the world. The modern concept of vertical farming was proposed in 1999 by Dickson Despommier, professor of Public and Environmental Health at Columbia University. Despommier and his students developed a design for a skyscraper farm that could feed 50,000 people. Although the design has not yet been built, it successfully popularized the idea of vertical farming. Current applications of vertical farming, coupled with other state-of-the-art technologies such as specialized LED lighting, have achieved more than ten times the crop yield of traditional farming methods.

The main advantage of vertical farming technologies is the increased crop yield obtained from a smaller area of land. Another sought-after advantage is the ability to cultivate a larger variety of crops at once, because crops do not share the same plots of land while growing. Additionally, because they are grown indoors, crops are resistant to weather disruptions, meaning fewer crops are lost to extreme or unexpected weather events. Because of its limited land usage, vertical farming is less disruptive to native plants and animals, leading to further conservation of local flora and fauna.

Vertical farming technologies face economic challenges with large start-up costs compared to traditional farms. In Victoria, Australia, a “hypothetical 10 level vertical farm” would cost over 850 times more per cubic meter of arable land than a traditional farm in rural Victoria. Vertical farms also face large energy demands due to the use of supplementary light like LEDs. Moreover, if non-renewable energy is used to meet these energy demands, vertical farms could produce more pollution than traditional farms or greenhouses.

Techniques of vertical farming

Indoor Hydroponics of Morus, Japan

Hydroponics

Hydroponics refers to the technique of growing plants without soil. In hydroponic systems, the roots of plants are submerged in liquid solutions containing macronutrients, such as nitrogen, phosphorus, sulphur, potassium, calcium, and magnesium, as well as trace elements, including iron, chlorine, manganese, boron, zinc, copper, and molybdenum. Additionally, inert (chemically inactive) mediums such as gravel, sand, and sawdust are used as soil substitutes to provide support for the roots.

The advantages of hydroponics include the ability to increase yield per area and reduce water usage. A study has shown that, compared to conventional farming, hydroponic farming could increase the yield per area of lettuce by around 11 times while requiring 13 times less water. Due to these advantages, hydroponics is the predominant growing system used in vertical farming.
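As a rough illustration of what those ratios imply, the short Python sketch below applies the cited factors (about 11 times the yield per area and about 13 times less water) to a purely hypothetical conventional-farming baseline; the baseline figures are placeholders, not values from the study.

# Illustrative arithmetic only: the baseline figures below are hypothetical
# placeholders; the 11x and 13x factors are the ones cited above.

conventional_yield_kg_per_m2 = 4.0    # hypothetical lettuce yield, kg per m^2 per year
conventional_water_l_per_kg = 250.0   # hypothetical water use, litres per kg

hydroponic_yield_kg_per_m2 = conventional_yield_kg_per_m2 * 11   # ~11x the yield per area
hydroponic_water_l_per_kg = conventional_water_l_per_kg / 13     # ~13x less water

print(f"Hydroponic yield: {hydroponic_yield_kg_per_m2:.1f} kg/m^2/yr")
print(f"Hydroponic water use: {hydroponic_water_l_per_kg:.1f} L/kg")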

Aquaponics with catfish

Aquaponics

The term aquaponics was coined by combining two words: aquaculture, which refers to fish farming, and hydroponics, the technique of growing plants without soil. Aquaponics takes hydroponics one step further by integrating the production of terrestrial plants with the production of aquatic organisms in a closed-loop system that mimics nature itself. Nutrient-rich wastewater from the fish tanks is filtered by a solids-removal unit and then led to a bio-filter, where toxic ammonia is converted to nutritious nitrate. While absorbing these nutrients, the plants purify the wastewater, which is recycled back to the fish tanks. Moreover, the plants consume carbon dioxide produced by the fish, and the water in the fish tanks retains heat, which helps the greenhouse maintain its temperature at night and saves energy. As most commercial vertical farming systems focus on producing a few fast-growing vegetable crops, aquaponics, which also includes an aquacultural component, is currently not as widely used as conventional hydroponics.
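The closed-loop exchange described above can be sketched as a toy simulation. The rate constants in the following Python snippet are invented for illustration and do not come from any aquaponics study; the point is only to show the ammonia-to-nitrate-to-plant cycle.

ammonia = 0.0   # mg/L of ammonia in the fish tank
nitrate = 0.0   # mg/L of nitrate available to the plants

AMMONIA_FROM_FISH = 1.0    # mg/L added per day by fish waste (hypothetical)
NITRIFICATION_RATE = 0.8   # fraction of ammonia the bio-filter converts per day (hypothetical)
PLANT_UPTAKE_RATE = 0.7    # fraction of nitrate the plants absorb per day (hypothetical)

for day in range(30):
    ammonia += AMMONIA_FROM_FISH               # fish waste enters the water
    converted = ammonia * NITRIFICATION_RATE   # bio-filter: toxic ammonia -> nutritious nitrate
    ammonia -= converted
    nitrate += converted
    nitrate -= nitrate * PLANT_UPTAKE_RATE     # plants absorb nutrients, purifying the water

print(f"After 30 days: ammonia ~ {ammonia:.2f} mg/L, nitrate ~ {nitrate:.2f} mg/L")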

Aeroponics

Aeroponically-grown chives

The invention of aeroponics was motivated by the initiative of NASA (the National Aeronautics and Space Administration) to find an efficient way to grow plants in space in the 1990s. Unlike conventional hydroponics and aquaponics, aeroponics does not require any liquid or solid medium to grow plants in. Instead, a liquid solution with nutrients is misted in air chambers where the plants are suspended. By far, aeroponics is the most sustainable soilless growing technique, as it uses up to 90% less water than the most efficient conventional hydroponic systems and requires no replacement of growing medium. Moreover, the absence of a growing medium allows aeroponic systems to adopt a vertical design, which further saves energy because gravity automatically drains away excess liquid, whereas conventional horizontal hydroponic systems often require water pumps to control excess solution. Currently, aeroponic systems have not been widely applied to vertical farming, but they are starting to attract significant attention.

Controlled-environment agriculture

Controlled-environment agriculture (CEA) is the modification of the natural environment to increase crop yield or extend the growing season. CEA systems are typically hosted in enclosed structures such as greenhouses or buildings, where control can be imposed on environmental factors including air, temperature, light, water, humidity, carbon dioxide, and plant nutrition. In vertical farming systems, CEA is often used in conjunction with soilless farming techniques such as hydroponics, aquaponics, and aeroponics.
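A minimal sketch of the kind of setpoint logic a CEA controller might apply to those factors is shown below; the ranges and sensor readings are hypothetical examples, not recommendations for any particular crop or system.

# Hypothetical setpoint ranges for some of the environmental factors listed above.
setpoints = {
    "temperature_c": (18.0, 24.0),
    "humidity_pct": (60.0, 80.0),
    "co2_ppm": (800.0, 1200.0),
    "light_ppfd": (200.0, 400.0),   # photosynthetic photon flux density
}

# Hypothetical current sensor readings.
readings = {"temperature_c": 26.5, "humidity_pct": 72.0, "co2_ppm": 650.0, "light_ppfd": 180.0}

for factor, (low, high) in setpoints.items():
    value = readings[factor]
    if value < low:
        print(f"{factor}: {value} is below range -> increase")
    elif value > high:
        print(f"{factor}: {value} is above range -> decrease")
    else:
        print(f"{factor}: {value} is within range")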

Types of vertical farming

Building-based vertical farms

Vertical farm in Moscow.

Abandoned buildings are often reused for vertical farming, such as a farm in Chicago called “The Plant,” which was converted from an old meatpacking plant. However, new buildings are also sometimes constructed to house vertical farming systems.

Shipping-container vertical farms

Recycled shipping containers are an increasingly popular option for housing vertical farming systems. The shipping containers serve as standardized, modular chambers for growing a variety of plants, and are often equipped with LED lighting, vertically stacked hydroponics, smart climate controls, and monitoring sensors. Moreover, by stacking the shipping containers, farms can save space even further and achieve higher yield per square foot.

Deep farms

A “deep farm” is a vertical farm built from refurbished underground tunnels or abandoned mine shafts. As temperature and humidity underground are generally temperate and constant, deep farms require less energy for heating. Deep farms can also use nearby groundwater to reduce the cost of water supply. Despite low costs, a deep farm can produce 7 to 9 times more food than a conventional farm above ground on the same area of land, according to Saffa Riffat, chair in Sustainable Energy at the University of Nottingham. Coupled with automated harvesting systems, these underground farms can be fully self-sufficient.

History

Initial propositions

Dickson Despommier, professor of Public and Environmental Health at Columbia University, laid the groundwork for the modern concept of vertical farming. In 1999, he challenged his class of graduate students to calculate how much food they could grow on the rooftops of New York. The students concluded that they could feed only about 1,000 people. Unsatisfied with the results, Despommier suggested instead growing plants indoors, on multiple layers stacked vertically. Despommier and his students then proposed a design for a 30-story vertical farm equipped with artificial lighting, advanced hydroponics, and aeroponics that could produce enough food for 50,000 people. They further outlined that approximately 100 kinds of fruits and vegetables would grow on the upper floors, while the lower floors would house chickens and fish subsisting on the plant waste. Although Despommier's skyscraper farm has not yet been built, it popularized the idea of vertical farming and inspired many later designs.

Implementations

Developers and local governments in multiple cities have expressed interest in establishing a vertical farm, including Incheon (South Korea), Abu Dhabi (United Arab Emirates), Dongtan (China), New York City, Portland, Los Angeles, Las Vegas, Seattle, Surrey, Toronto, Paris, Bangalore, Dubai, Shanghai, and Beijing. Around US$1.8 billion was invested in startups operating in the sector between 2014 and November 2020.

In 2009, the world's first pilot production system was installed at Paignton Zoo Environmental Park in the United Kingdom. The project showcased vertical farming and provided a solid base to research sustainable urban food production. The produce is used to feed the zoo's animals while the project enables evaluation of the systems and provides an educational resource to advocate for change in unsustainable land-use practices that impact upon global biodiversity and ecosystem services.

In 2010 the Green Zionist Alliance proposed a resolution at the 36th World Zionist Congress calling on Keren Kayemet L'Yisrael (Jewish National Fund in Israel) to develop vertical farms in Israel. Moreover, a company named "Podponics" built a vertical farm in Atlanta consisting of over 100 stacked "growpods" in 2010 but reportedly went bankrupt in May 2016.

In 2012, the world's first commercial vertical farm, developed by Sky Greens Farms, opened in Singapore; it is three stories high. The company currently operates over 100 nine-meter-tall towers.

In 2012, a company named The Plant debuted its newly developed vertical farming system, housed in an abandoned meatpacking building in Chicago, Illinois. The use of abandoned buildings to house vertical farms and other sustainable farming methods reflects the rapid urbanization of modern communities.

In 2013, the Association for Vertical Farming (AVF) was founded in Munich, Germany. By May 2015, the AVF had expanded with regional chapters across Europe, Asia, the United States, Canada, and the United Kingdom. The organization unites growers and inventors to improve food security and sustainable development. The AVF focuses on advancing vertical farming technologies, designs, and businesses by hosting international info-days, workshops, and summits.

In 2015, the London company Growing Underground began producing leafy greens in abandoned World War II tunnels beneath the city.

In 2016, a startup called Local Roots launched the "TerraFarm", a vertical farming system housed in a 40-foot shipping container that uses computer vision integrated with an artificial neural network to monitor the plants and is operated remotely from California. It is claimed that the TerraFarm system "has achieved cost parity with traditional, outdoor farming", with each unit producing the equivalent of "three to five acres of farmland" and using 97% less water by recapturing and harvesting evaporated water through the air conditioning. The first vertical farm in a US grocery store opened in Dallas, Texas, in 2016; it has since closed.

In 2017, a Japanese company, Mirai, began marketing its multi-level vertical farming system. The company states that it can produce 10,000 heads of lettuce a day, 100 times the amount that could be produced with traditional agricultural methods, because its special-purpose LED lights can decrease growing times by a factor of 2.5. Additionally, this can all be achieved with 40% less energy usage, 80% less food waste, and 99% less water usage than in traditional farming methods. Further requests have been made to implement this technology in several other Asian countries.

In 2019, Kroger partnered with German startup Infarm to install modular vertical farms in two Seattle-area grocery stores.

Advantages

Efficiency

Traditional farming's arable land requirements are too large and invasive to remain sustainable for future generations. With rapid population growth, arable land per person is expected to drop by about 66% by 2050 compared with 1970. Vertical farming allows for, in some cases, over ten times the crop yield per acre of traditional methods. Unlike traditional farming in non-tropical areas, indoor farming can produce crops year-round. All-season farming multiplies the productivity of the farmed surface by a factor of 4 to 6 depending on the crop. With crops such as strawberries, the factor may be as high as 30.
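The quick calculation below puts those figures together; the 1970 arable-land baseline is a hypothetical placeholder, and only the percentage drop and the season factors come from the text above.

# Arable land per person: the text above cites a ~66% drop by 2050 relative to 1970.
arable_land_1970_ha_per_person = 0.4   # hypothetical baseline, hectares per person
arable_land_2050 = arable_land_1970_ha_per_person * (1 - 0.66)
print(f"Arable land per person by 2050: ~{arable_land_2050:.2f} ha "
      f"(from a {arable_land_1970_ha_per_person} ha baseline)")

# All-season growing multiplies the output of the same surface 4-6x for typical crops,
# and up to ~30x for crops such as strawberries.
for factor in (4, 6, 30):
    print(f"Season factor {factor}: one indoor hectare ~ {factor} hectares of single-season output")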

Vertical farming also allows for the production of a larger variety of harvestable crops because it uses isolated crop sectors. As opposed to a traditional farm, where one type of crop is harvested per season, vertical farms allow a multitude of different crops to be grown and harvested at once because each crop occupies its own plot.

According to the USDA, vertical-farm produce travels only a short distance to reach stores compared with produce from traditional farming.

The United States Department of Agriculture predicts that the worldwide population will exceed 9 billion by 2050, most of whom will live in urban areas. The USDA sees vertical farming as a likely answer to the potential food shortage as the population increases. This method of farming is environmentally responsible because it lowers emissions and reduces water use. This type of urban farming, which would allow nearly immediate farm-to-store transport, would reduce distribution costs.

In a workshop on vertical farming put on by the USDA and the Department of Energy, experts in vertical farming discussed plant breeding, pest management, and engineering. Control of pests (such as insects, birds, and rodents) is easily managed in vertical farms because the area is so well controlled. Without the need for chemical pesticides, growing organic crops is easier than in traditional farming.

Resistance to weather

Crops grown in traditional outdoor farming depend on supportive weather and suffer from undesirable temperatures, rain, monsoon, hailstorm, tornado, flooding, wildfires, and drought. "Three recent floods (in 1993, 2007 and 2008) cost the United States billions of dollars in lost crops, with even more devastating losses in topsoil. Changes in rain patterns and temperature could diminish India's agricultural output by 30 percent by the end of the century."

The issue of adverse weather conditions is especially relevant for arctic and sub-arctic areas like Alaska and northern Canada, where traditional farming is largely impossible. Food insecurity has been a long-standing problem in remote northern communities, where fresh produce has to be shipped over large distances, resulting in high costs and poor nutrition. Container-based farms can provide fresh produce year-round at a lower cost than shipping in supplies from more southerly locations, with a number of farms operating in locations such as Churchill, Manitoba, and Unalaska, Alaska. Beyond protecting crop growing itself, local container-based farms are also less susceptible to disruption than the long supply chains necessary to deliver traditionally grown produce to remote communities. Food prices in Churchill spiked substantially after floods in May and June 2017 forced the closure of the rail line that forms the only permanent overland connection between Churchill and the rest of Canada.

Environmental conservation

Up to 20 units of outdoor farmland per unit of vertical farming could return to its natural state, due to vertical farming's increased productivity. Vertical farming would reduce the amount of farmland, thus saving many natural resources.

Deforestation and desertification caused by agricultural encroachment on natural biomes could be avoided. Producing food indoors reduces or eliminates conventional plowing, planting, and harvesting by farm machinery, protecting soil, and reducing emissions.

Traditional farming is often invasive to the native flora and fauna because it requires such a large area of arable land. One study showed that wood mouse populations dropped from 25 per hectare to 5 per hectare after harvest, estimating 10 animals killed per hectare each year with conventional farming. In comparison, vertical farming would cause nominal harm to wildlife because of its limited space usage.

Problems

Economics

Vertical farms must overcome the financial challenge of large startup costs. The initial building costs could exceed $100 million for a 60-hectare vertical farm. Urban occupancy costs can be high, resulting in much higher startup costs and a longer break-even time than for a traditional farm in rural areas.

Opponents question the potential profitability of vertical farming. For vertical farms to be successful financially, high-value crops must be grown, since traditional farms provide low-value crops such as wheat at lower cost than a vertical farm could. Louis Albright, a professor in biological and environmental engineering at Cornell, stated that a loaf of bread made from wheat grown in a vertical farm would cost US$27. By comparison, according to the US Bureau of Labor Statistics, the average loaf of bread cost US$1.296 in September 2019, showing how crops grown in vertical farms would be uncompetitive with crops grown in traditional outdoor farms. For vertical farms to be profitable, their operating costs must decrease. The developers of the TerraFarm system, built from second-hand, 40-foot shipping containers, claimed that their system "has achieved cost parity with traditional, outdoor farming".
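For reference, the price gap implied by the two figures above can be checked directly:

vertical_farm_loaf_usd = 27.00   # Albright's estimate for a loaf from vertically farmed wheat
average_loaf_usd = 1.296         # US Bureau of Labor Statistics average, September 2019

print(f"A vertically farmed loaf would cost ~{vertical_farm_loaf_usd / average_loaf_usd:.0f}x the average US loaf")
# prints roughly 21x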

A theoretical 10-storey vertical wheat farm could produce up to 1,940 tons of wheat per hectare, compared with a global average of 3.2 tons of wheat per hectare (roughly 600 times the yield). Current methods require enormous energy consumption for lighting, temperature and humidity control, carbon dioxide input, and fertilizer, and consequently the authors concluded it was "unlikely to be economically competitive with current market prices".
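The "roughly 600 times" figure follows directly from the two yields quoted above:

vertical_farm_yield_t_per_ha = 1940   # theoretical 10-storey vertical wheat farm
global_average_yield_t_per_ha = 3.2   # global average wheat yield

ratio = vertical_farm_yield_t_per_ha / global_average_yield_t_per_ha
print(f"Yield ratio: ~{ratio:.0f}x the global average")   # about 606, i.e. roughly 600 times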

According to a report in The Financial Times as of 2020, most vertical farming companies have been unprofitable, except for a number of Japanese companies.

Energy use

During the growing season, the sun strikes a vertical surface at an extreme angle, so much less light is available to crops than when they are planted on flat land. Supplemental lighting would therefore be required. Bruce Bugbee claimed that the power demands of vertical farming would make it uncompetitive with traditional farms using only natural light. Environmental writer George Monbiot calculated that the cost of providing enough supplementary light to grow the grain for a single loaf of bread would be about $15. An article in The Economist argued that "even though crops growing in a glass skyscraper will get some natural sunlight during the day, it won't be enough" and "the cost of powering artificial lights will make indoor farming prohibitively expensive". Moreover, researchers determined that if solar panels alone were used to meet the energy consumption of a vertical farm, “the area of solar panels required would need to be a factor of twenty times greater than the arable area on a multi-level indoor farm”, which would be hard to accomplish with larger vertical farms. A hydroponic farm growing lettuce in Arizona would require 15,000 kJ of energy per kilogram of lettuce produced. To put this amount of energy into perspective, a traditional outdoor lettuce farm in Arizona requires only 1,100 kJ of energy per kilogram of lettuce grown.
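The Arizona lettuce figures above imply the following per-kilogram energy ratio:

hydroponic_kj_per_kg = 15_000   # energy per kilogram of hydroponic lettuce (Arizona)
outdoor_kj_per_kg = 1_100       # energy per kilogram of outdoor-grown lettuce (Arizona)

print(f"Hydroponic lettuce uses ~{hydroponic_kj_per_kg / outdoor_kj_per_kg:.1f}x "
      f"the energy of outdoor lettuce per kilogram")   # roughly 14x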

Because Dickson Despommier's book The Vertical Farm proposes a controlled environment, heating and cooling costs would resemble those of any other multi-story building. Plumbing and elevator systems are necessary to distribute nutrients and water. In the northern continental United States, fossil-fuel heating costs can exceed $200,000 per hectare. Research conducted in 2015 compared the growth of lettuce in Arizona using conventional agricultural methods with growth in a hydroponic farm. The researchers determined that heating and cooling made up more than 80% of the energy consumption in the hydroponic farm, with heating and cooling requiring 7,400 kJ per kilogram of lettuce produced. According to the same study, the total energy consumption of the hydroponic farm is 90,000 kJ per kilogram of lettuce. If this energy consumption is not addressed, vertical farms may be an unsustainable alternative to traditional agriculture.

Pollution

There are a number of interrelated challenges with some potential solutions:

  • Power needs: If power needs are met by fossil fuels, the environmental effect may be a net loss; even building low-carbon capacity to power the farms may not make as much sense as simply leaving traditional farms in place while burning less coal. Louis Albright argued that in a “closed-system urban farming based on electrically generated photosynthetic light”, a pound of lettuce would result in 8 pounds of carbon dioxide being produced at a power plant, and 4,000 pounds of lettuce produced would be equivalent to the annual emissions of a family car. He also argues that the carbon footprint of tomatoes grown in a similar system would be twice that of lettuce. However, lettuce produced in a greenhouse that allows sunlight to reach the crops showed roughly a three-fold reduction in carbon dioxide emissions per head of lettuce. As vertical farm systems become more efficient at harnessing sunlight, they will produce less pollution.
  • Carbon emission: A vertical farm requires a CO2 source, most likely from combustion if colocated with electric utility plants; absorbing CO2 that would otherwise be jettisoned is possible. Greenhouses commonly supplement carbon dioxide levels to 3–4 times the atmospheric rate. This increase in CO2 increases photosynthesis at varying rates, averaging 50%, contributing not only to higher yields but also to faster plant maturation, shrinking of pores, and greater resilience to water stress (both too much and too little). Vertical farms need not exist in isolation: hardier mature plants could be transferred to traditional greenhouses, freeing up space and increasing cost flexibility.
  • Crop damage: Some greenhouses burn fossil fuels, for example in furnaces, purely to produce CO2; the exhaust contains pollutants such as sulphur dioxide and ethylene. These pollutants can significantly damage plants, so gas filtration is a component of high-production systems.
  • Ventilation: "Necessary" ventilation may allow CO2 to leak into the atmosphere, though recycling systems could be devised. This is not limited to humidity tolerant and humidity intolerant crop polyculture cycling (as opposed to monoculture).
  • Light Pollution: Greenhouse growers commonly exploit photoperiodism in plants to control whether the plants are in a vegetative or reproductive stage. As part of this control, the lights stay on past sunset and before sunrise or periodically throughout the night. Single story greenhouses have attracted criticism over light pollution, though a typical urban vertical farm may also produce light pollution.
  • Water Pollution: Hydroponic greenhouses regularly change the water, producing water containing fertilizers and pesticides that must be disposed of. Spreading the effluent over neighboring farmland or wetlands would be difficult for an urban vertical farm, while water treatment remedies (natural or otherwise) could be part of a solution.

 

Molecular nanotechnology

From Wikipedia, the free encyclopedia
 
 

Molecular nanotechnology (MNT) is a technology based on the ability to build structures to complex, atomic specifications by means of mechanosynthesis. This is distinct from nanoscale materials. Based on Richard Feynman's vision of miniature factories using nanomachines to build complex products (including additional nanomachines), this advanced form of nanotechnology (or molecular manufacturing) would make use of positionally-controlled mechanosynthesis guided by molecular machine systems. MNT would involve combining physical principles demonstrated by biophysics, chemistry, other nanotechnologies, and the molecular machinery of life with the systems engineering principles found in modern macroscale factories.

Introduction

While conventional chemistry uses inexact processes obtaining inexact results, and biology exploits inexact processes to obtain definitive results, molecular nanotechnology would employ original definitive processes to obtain definitive results. The desire in molecular nanotechnology would be to balance molecular reactions in positionally-controlled locations and orientations to obtain desired chemical reactions, and then to build systems by further assembling the products of these reactions.

A roadmap for the development of MNT is an objective of a broadly based technology project led by Battelle (the manager of several U.S. National Laboratories) and the Foresight Institute. The roadmap was originally scheduled for completion by late 2006, but was released in January 2008. The Nanofactory Collaboration is a more focused ongoing effort involving 23 researchers from 10 organizations and 4 countries that is developing a practical research agenda specifically aimed at positionally-controlled diamond mechanosynthesis and diamondoid nanofactory development. In August 2005, a task force consisting of 50+ international experts from various fields was organized by the Center for Responsible Nanotechnology to study the societal implications of molecular nanotechnology.

Projected applications and capabilities

Smart materials and nanosensors

One proposed application of MNT is so-called smart materials. This term refers to any sort of material designed and engineered at the nanometer scale for a specific task. It encompasses a wide variety of possible commercial applications. One example would be materials designed to respond differently to various molecules; such a capability could lead, for example, to artificial drugs which would recognize and render inert specific viruses. Another is the idea of self-healing structures, which would repair small tears in a surface naturally in the same way as self-sealing tires or human skin.

An MNT nanosensor would resemble a smart material, involving a small component within a larger machine that would react to its environment and change in some fundamental, intentional way. A very simple example: a photosensor might passively measure the incident light and discharge its absorbed energy as electricity when the light passes above or below a specified threshold, sending a signal to a larger machine. Such a sensor would supposedly cost less and use less power than a conventional sensor, and yet function usefully in all the same applications, for example, turning on parking lot lights when it gets dark.
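The threshold behaviour described in that example can be sketched in a few lines of Python. Everything here is hypothetical and illustrative; no real sensor hardware or API is implied.

from typing import Optional

LIGHT_THRESHOLD_LUX = 50.0   # hypothetical "dark enough" threshold

def photosensor_signal(previous_lux: float, current_lux: float) -> Optional[str]:
    """Emit a signal only when the light level crosses the threshold."""
    if previous_lux >= LIGHT_THRESHOLD_LUX > current_lux:
        return "LIGHTS_ON"    # it just got dark, e.g. turn the parking lot lights on
    if previous_lux < LIGHT_THRESHOLD_LUX <= current_lux:
        return "LIGHTS_OFF"   # it just got light again
    return None               # no threshold crossing, no signal

print(photosensor_signal(120.0, 30.0))   # dusk: prints LIGHTS_ON
print(photosensor_signal(30.0, 25.0))    # still dark: prints None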

While smart materials and nanosensors both exemplify useful applications of MNT, they pale in comparison with the complexity of the technology most popularly associated with the term: the replicating nanorobot.

Replicating nanorobots

MNT nanofacturing is popularly linked with the idea of swarms of coordinated nanoscale robots working together, a popularization of an early proposal by K. Eric Drexler in his 1986 discussions of MNT, but superseded in 1992. In this early proposal, sufficiently capable nanorobots would construct more nanorobots in an artificial environment containing special molecular building blocks.

Critics have doubted both the feasibility of self-replicating nanorobots and the feasibility of control if self-replicating nanorobots could be achieved: they cite the possibility of mutations removing any control and favoring reproduction of mutant pathogenic variations. Advocates address the first doubt by pointing out that the first macroscale autonomous machine replicator, made of Lego blocks, was built and operated experimentally in 2002. While there are sensory advantages present at the macroscale compared to the limited sensorium available at the nanoscale, proposals for positionally controlled nanoscale mechanosynthetic fabrication systems employ dead reckoning of tooltips combined with reliable reaction sequence design to ensure reliable results, hence a limited sensorium is no handicap; similar considerations apply to the positional assembly of small nanoparts. Advocates address the second doubt by arguing that bacteria are (of necessity) evolved to evolve, while nanorobot mutation could be actively prevented by common error-correcting techniques. Similar ideas are advocated in the Foresight Guidelines on Molecular Nanotechnology, and a map of the 137-dimensional replicator design space recently published by Freitas and Merkle provides numerous proposed methods by which replicators could, in principle, be safely controlled by good design.

However, the concept of suppressing mutation raises the question: How can design evolution occur at the nanoscale without a process of random mutation and deterministic selection? Critics argue that MNT advocates have not provided a substitute for such a process of evolution in this nanoscale arena where conventional sensory-based selection processes are lacking. The limits of the sensorium available at the nanoscale could make it difficult or impossible to winnow successes from failures. Advocates argue that design evolution should occur deterministically and strictly under human control, using the conventional engineering paradigm of modeling, design, prototyping, testing, analysis, and redesign.

In any event, since 1992 technical proposals for MNT do not include self-replicating nanorobots, and recent ethical guidelines put forth by MNT advocates prohibit unconstrained self-replication.

Medical nanorobots

One of the most important applications of MNT would be medical nanorobotics or nanomedicine, an area pioneered by Robert Freitas in numerous books and papers. The ability to design, build, and deploy large numbers of medical nanorobots would, at a minimum, make possible the rapid elimination of disease and the reliable and relatively painless recovery from physical trauma. Medical nanorobots might also make possible the convenient correction of genetic defects, and help to ensure a greatly expanded lifespan. More controversially, medical nanorobots might be used to augment natural human capabilities. One study has reported on how conditions like tumors, arteriosclerosis, blood clots leading to stroke, accumulation of scar tissue and localized pockets of infection can possibly be addressed by employing medical nanorobots.

Utility fog

Diagram of a 100 micrometer foglet

Another proposed application of molecular nanotechnology is "utility fog" — in which a cloud of networked microscopic robots (simpler than assemblers) would change its shape and properties to form macroscopic objects and tools in accordance with software commands. Rather than modify the current practices of consuming material goods in different forms, utility fog would simply replace many physical objects.

Phased-array optics

Yet another proposed application of MNT would be phased-array optics (PAO). However, this appears to be a problem addressable by ordinary nanoscale technology. PAO would use the principle of phased-array millimeter technology but at optical wavelengths. This would permit the duplication of any sort of optical effect but virtually. Users could request holograms, sunrises and sunsets, or floating lasers as the mood strikes. PAO systems were described in BC Crandall's Nanotechnology: Molecular Speculations on Global Abundance in the Brian Wowk article "Phased-Array Optics."

Potential social impacts

Molecular manufacturing is a potential future subfield of nanotechnology that would make it possible to build complex structures at atomic precision. Molecular manufacturing requires significant advances in nanotechnology, but once achieved it could produce highly advanced products at low cost and in large quantities in nanofactories weighing a kilogram or more. When nanofactories gain the ability to produce other nanofactories, production may be limited only by relatively abundant factors such as input materials, energy, and software.

The products of molecular manufacturing could range from cheaper, mass-produced versions of known high-tech products to novel products with added capabilities in many areas of application. Some applications that have been suggested are advanced smart materials, nanosensors, medical nanorobots, and space travel. Additionally, molecular manufacturing could be used to cheaply produce highly advanced, durable weapons, which is an area of special concern regarding the impact of nanotechnology. Equipped with compact computers and motors, such weapons could be increasingly autonomous and have a large range of capabilities.

According to Chris Phoenix and Mike Treder of the Center for Responsible Nanotechnology, as well as Anders Sandberg of the Future of Humanity Institute, molecular manufacturing is the application of nanotechnology that poses the most significant global catastrophic risk. Several nanotechnology researchers state that the bulk of the risk from nanotechnology comes from its potential to lead to war, arms races, and destructive global government. Several reasons have been suggested why the availability of nanotech weaponry may, with significant likelihood, lead to unstable arms races (compared to, e.g., nuclear arms races): (1) a large number of players may be tempted to enter the race since the threshold for doing so is low; (2) the ability to make weapons with molecular manufacturing will be cheap and easy to hide; (3) therefore, lack of insight into other parties' capabilities can tempt players to arm out of caution or to launch preemptive strikes; (4) molecular manufacturing may reduce dependency on international trade, a potential peace-promoting factor; (5) wars of aggression may pose a smaller economic threat to the aggressor since manufacturing is cheap and humans may not be needed on the battlefield.

Since self-regulation by all state and non-state actors seems hard to achieve, measures to mitigate war-related risks have mainly been proposed in the area of international cooperation. International infrastructure may be expanded, giving more sovereignty to the international level. This could help coordinate efforts for arms control. International institutions dedicated specifically to nanotechnology (perhaps analogous to the International Atomic Energy Agency, IAEA) or to general arms control may also be designed. Parties could also jointly make differential technological progress on defensive technologies, a policy that players should usually favour. The Center for Responsible Nanotechnology also suggests some technical restrictions. Improved transparency regarding technological capabilities may be another important facilitator for arms control.

Grey goo is another catastrophic scenario. It was proposed by Eric Drexler in his 1986 book Engines of Creation, has been analyzed by Freitas in "Some Limits to Global Ecophagy by Biovorous Nanoreplicators, with Public Policy Recommendations", and has been a theme in mainstream media and fiction. The scenario involves tiny self-replicating robots that consume the entire biosphere, using it as a source of energy and building blocks. Nanotech experts, including Drexler, now discredit the scenario. According to Chris Phoenix, "So-called grey goo could only be the product of a deliberate and difficult engineering process, not an accident". With the advent of nano-biotech, a different scenario called green goo has been put forward. Here, the malignant substance is not nanobots but rather self-replicating biological organisms engineered through nanotechnology.

Benefits

Nanotechnology (or molecular nanotechnology to refer more specifically to the goals discussed here) will let us continue the historical trends in manufacturing right up to the fundamental limits imposed by physical law. It will let us make remarkably powerful molecular computers. It will let us make materials over fifty times lighter than steel or aluminium alloy but with the same strength. We'll be able to make jets, rockets, cars or even chairs that, by today's standards, would be remarkably light, strong, and inexpensive. Molecular surgical tools, guided by molecular computers and injected into the blood stream could find and destroy cancer cells or invading bacteria, unclog arteries, or provide oxygen when the circulation is impaired.

Nanotechnology will replace our entire manufacturing base with a new, radically more precise, radically less expensive, and radically more flexible way of making products. The aim is not simply to replace today's computer chip making plants, but also to replace the assembly lines for cars, televisions, telephones, books, surgical tools, missiles, bookcases, airplanes, tractors, and all the rest. The objective is a pervasive change in manufacturing, a change that will leave virtually no product untouched. Economic progress and military readiness in the 21st Century will depend fundamentally on maintaining a competitive position in nanotechnology.

Despite the current early developmental status of nanotechnology and molecular nanotechnology, much concern surrounds MNT's anticipated impact on economics and on law. Whatever the exact effects, MNT, if achieved, would tend to reduce the scarcity of manufactured goods and make many more goods (such as food and health aids) manufacturable.

MNT should make possible nanomedical capabilities able to cure any medical condition not already cured by advances in other areas. Good health would be common, and poor health of any form would be as rare as smallpox and scurvy are today. Even cryonics would be feasible, as cryopreserved tissue could be fully repaired.

Risks

Molecular nanotechnology is one of the technologies that some analysts believe could lead to a technological singularity, in which technological growth has accelerated to the point of having unpredictable effects. Some effects could be beneficial, while others could be detrimental, such as the utilization of molecular nanotechnology by an unfriendly artificial general intelligence. Some feel that molecular nanotechnology would have daunting risks. It conceivably could enable cheaper and more destructive conventional weapons. Also, molecular nanotechnology might permit weapons of mass destruction that could self-replicate, as viruses and cancer cells do when attacking the human body. Commentators generally agree that, in the event molecular nanotechnology were developed, its self-replication should be permitted only under very controlled or "inherently safe" conditions.

A fear exists that nanomechanical robots, if achieved, and if designed to self-replicate using naturally occurring materials (a difficult task), could consume the entire planet in their hunger for raw materials, or simply crowd out natural life, out-competing it for energy (as happened historically when blue-green algae appeared and outcompeted earlier life forms). Some commentators have referred to this situation as the "grey goo" or "ecophagy" scenario. K. Eric Drexler considers an accidental "grey goo" scenario extremely unlikely and says so in later editions of Engines of Creation.

In light of this perception of potential danger, the Foresight Institute, founded by Drexler, has prepared a set of guidelines for the ethical development of nanotechnology. These include the banning of free-foraging self-replicating pseudo-organisms on the Earth's surface, at least, and possibly in other places.

Technical issues and criticism

The feasibility of the basic technologies analyzed in Nanosystems has been the subject of a formal scientific review by the U.S. National Academy of Sciences, and has also been the focus of extensive debate on the internet and in the popular press.

Study and recommendations by the U.S. National Academy of Sciences

In 2006, the U.S. National Academy of Sciences released the report of a study of molecular manufacturing as part of a longer report, A Matter of Size: Triennial Review of the National Nanotechnology Initiative. The study committee reviewed the technical content of Nanosystems, and in its conclusion states that no current theoretical analysis can be considered definitive regarding several questions of potential system performance, and that optimal paths for implementing high-performance systems cannot be predicted with confidence. It recommends experimental research to advance knowledge in this area:

"Although theoretical calculations can be made today, the eventually attainable range of chemical reaction cycles, error rates, speed of operation, and thermodynamic efficiencies of such bottom-up manufacturing systems cannot be reliably predicted at this time. Thus, the eventually attainable perfection and complexity of manufactured products, while they can be calculated in theory, cannot be predicted with confidence. Finally, the optimum research paths that might lead to systems which greatly exceed the thermodynamic efficiencies and other capabilities of biological systems cannot be reliably predicted at this time. Research funding that is based on the ability of investigators to produce experimental demonstrations that link to abstract models and guide long-term vision is most appropriate to achieve this goal."

Assemblers versus nanofactories

A section heading in Drexler's Engines of Creation reads "Universal Assemblers", and the following text speaks of multiple types of assemblers which, collectively, could hypothetically "build almost anything that the laws of nature allow to exist." Drexler's colleague Ralph Merkle has noted that, contrary to widespread legend, Drexler never claimed that assembler systems could build absolutely any molecular structure. The endnotes in Drexler's book explain the qualification "almost": "For example, a delicate structure might be designed that, like a stone arch, would self-destruct unless all its pieces were already in place. If there were no room in the design for the placement and removal of a scaffolding, then the structure might be impossible to build. Few structures of practical interest seem likely to exhibit such a problem, however."

In 1992, Drexler published Nanosystems: Molecular Machinery, Manufacturing, and Computation, a detailed proposal for synthesizing stiff covalent structures using a table-top factory. Diamondoid structures and other stiff covalent structures, if achieved, would have a wide range of possible applications, going far beyond current MEMS technology. An outline of a path was put forward in 1992 for building a table-top factory in the absence of an assembler. Other researchers have begun advancing tentative, alternative proposed paths for this in the years since Nanosystems was published.

Hard versus soft nanotechnology

In 2004, Richard Jones wrote Soft Machines (Nanotechnology and Life), a book for lay audiences published by Oxford University Press. In this book he describes radical nanotechnology (as advocated by Drexler) as a deterministic, mechanistic idea of nano-engineered machines that does not take into account nanoscale challenges such as wetness, stickiness, Brownian motion, and high viscosity. He also explains what soft nanotechnology, or more appropriately biomimetic nanotechnology, is: in his view, the way forward, if not the best way, to design functional nanodevices that can cope with all the problems at the nanoscale. One can think of soft nanotechnology as the development of nanomachines that uses lessons learned from biology about how things work, chemistry to precisely engineer such devices, and stochastic physics to model the system and its natural processes in detail.

The Smalley–Drexler debate

Several researchers, including Nobel Prize winner Dr. Richard Smalley (1943–2005), attacked the notion of universal assemblers, leading to a rebuttal from Drexler and colleagues, and eventually to an exchange of letters. Smalley argued that chemistry is extremely complicated, reactions are hard to control, and that a universal assembler is science fiction. Drexler and colleagues, however, noted that Drexler never proposed universal assemblers able to make absolutely anything, but instead proposed more limited assemblers able to make a very wide variety of things. They challenged the relevance of Smalley's arguments to the more specific proposals advanced in Nanosystems. Also, Smalley argued that nearly all of modern chemistry involves reactions that take place in a solvent (usually water), because the small molecules of a solvent contribute many things, such as lowering binding energies for transition states. Since nearly all known chemistry requires a solvent, Smalley felt that Drexler's proposal to use a high vacuum environment was not feasible. However, Drexler addresses this in Nanosystems by showing mathematically that well designed catalysts can provide the effects of a solvent and can fundamentally be made even more efficient than a solvent/enzyme reaction could ever be. It is noteworthy that, contrary to Smalley's opinion that enzymes require water, "Not only do enzymes work vigorously in anhydrous organic media, but in this unnatural milieu they acquire remarkable properties such as greatly enhanced stability, radically altered substrate and enantiomeric specificities, molecular memory, and the ability to catalyse unusual reactions."

Redefining of the word "nanotechnology"

For the future, some means have to be found for MNT design evolution at the nanoscale which mimics the process of biological evolution at the molecular scale. Biological evolution proceeds by random variation in ensemble averages of organisms combined with culling of the less-successful variants and reproduction of the more-successful variants, and macroscale engineering design also proceeds by a process of design evolution from simplicity to complexity as set forth somewhat satirically by John Gall: "A complex system that works is invariably found to have evolved from a simple system that worked. . . . A complex system designed from scratch never works and can not be patched up to make it work. You have to start over, beginning with a system that works." A breakthrough in MNT is needed which proceeds from the simple atomic ensembles that can be built with, e.g., an STM to complex MNT systems via a process of design evolution. A handicap in this process is the difficulty of seeing and manipulating at the nanoscale compared to the macroscale, which makes deterministic selection of successful trials difficult; in contrast, biological evolution proceeds via the action of what Richard Dawkins has called the "blind watchmaker", comprising random molecular variation and deterministic reproduction/extinction.

As of 2007, the practice of nanotechnology embraces both stochastic approaches (in which, for example, supramolecular chemistry creates waterproof pants) and deterministic approaches, wherein single molecules (created by stochastic chemistry) are manipulated on substrate surfaces (created by stochastic deposition methods) by deterministic methods comprising nudging them with STM or AFM probes and causing simple binding or cleavage reactions to occur. The dream of a complex, deterministic molecular nanotechnology remains elusive. Since the mid-1990s, thousands of surface scientists and thin-film technocrats have latched on to the nanotechnology bandwagon and redefined their disciplines as nanotechnology. This has caused much confusion in the field and has spawned thousands of "nano"-papers in the peer-reviewed literature. Most of these reports are extensions of more ordinary research done in the parent fields.

The feasibility of the proposals in Nanosystems

Top, a molecular propeller. Bottom, a molecular planetary gear system. The feasibility of devices like these has been questioned.

The feasibility of Drexler's proposals largely depends, therefore, on whether designs like those in Nanosystems could be built in the absence of a universal assembler to build them and would work as described. Supporters of molecular nanotechnology frequently claim that no significant errors have been discovered in Nanosystems since 1992. Even some critics concede that "Drexler has carefully considered a number of physical principles underlying the 'high level' aspects of the nanosystems he proposes and, indeed, has thought in some detail" about some issues.

Other critics claim, however, that Nanosystems omits important chemical details about the low-level 'machine language' of molecular nanotechnology. They also claim that much of the other low-level chemistry in Nanosystems requires extensive further work, and that Drexler's higher-level designs therefore rest on speculative foundations. Recent such further work by Freitas and Merkle is aimed at strengthening these foundations by filling the existing gaps in the low-level chemistry.

Drexler argues that we may need to wait until our conventional nanotechnology improves before solving these issues: "Molecular manufacturing will result from a series of advances in molecular machine systems, much as the first Moon landing resulted from a series of advances in liquid-fuel rocket systems. We are now in a position like that of the British Interplanetary Society of the 1930s which described how multistage liquid-fueled rockets could reach the Moon and pointed to early rockets as illustrations of the basic principle." However, Freitas and Merkle argue  that a focused effort to achieve diamond mechanosynthesis (DMS) can begin now, using existing technology, and might achieve success in less than a decade if their "direct-to-DMS approach is pursued rather than a more circuitous development approach that seeks to implement less efficacious nondiamondoid molecular manufacturing technologies before progressing to diamondoid".

To summarize the arguments against feasibility: First, critics argue that a primary barrier to achieving molecular nanotechnology is the lack of an efficient way to create machines on a molecular/atomic scale, especially in the absence of a well-defined path toward a self-replicating assembler or diamondoid nanofactory. Advocates respond that a preliminary research path leading to a diamondoid nanofactory is being developed.

A second difficulty in reaching molecular nanotechnology is design. Hand design of a gear or bearing at the level of atoms might take a few to several weeks. While Drexler, Merkle and others have created designs of simple parts, no comprehensive design effort for anything approaching the complexity of a Model T Ford has been attempted. Advocates respond that it is difficult to undertake a comprehensive design effort in the absence of significant funding for such efforts, and that despite this handicap much useful design-ahead has nevertheless been accomplished with new software tools that have been developed, e.g., at Nanorex.

In the latest report A Matter of Size: Triennial Review of the National Nanotechnology Initiative put out by the National Academies Press in December 2006 (roughly twenty years after Engines of Creation was published), no clear way forward toward molecular nanotechnology could yet be seen, as per the conclusion on page 108 of that report: "Although theoretical calculations can be made today, the eventually attainable range of chemical reaction cycles, error rates, speed of operation, and thermodynamic efficiencies of such bottom-up manufacturing systems cannot be reliably predicted at this time. Thus, the eventually attainable perfection and complexity of manufactured products, while they can be calculated in theory, cannot be predicted with confidence. Finally, the optimum research paths that might lead to systems which greatly exceed the thermodynamic efficiencies and other capabilities of biological systems cannot be reliably predicted at this time. Research funding that is based on the ability of investigators to produce experimental demonstrations that link to abstract models and guide long-term vision is most appropriate to achieve this goal." This call for research leading to demonstrations is welcomed by groups such as the Nanofactory Collaboration who are specifically seeking experimental successes in diamond mechanosynthesis. The "Technology Roadmap for Productive Nanosystems" aims to offer additional constructive insights.

It is perhaps interesting to ask whether or not most structures consistent with physical law can in fact be manufactured. Advocates assert that to achieve most of the vision of molecular manufacturing it is not necessary to be able to build "any structure that is compatible with natural law." Rather, it is necessary to be able to build only a sufficient (possibly modest) subset of such structures—as is true, in fact, of any practical manufacturing process used in the world today, and is true even in biology. In any event, as Richard Feynman once said, "It is scientific only to say what's more likely or less likely, and not to be proving all the time what's possible or impossible."

Existing work on diamond mechanosynthesis

There is a growing body of peer-reviewed theoretical work on synthesizing diamond by mechanically removing/adding hydrogen atoms  and depositing carbon atoms  (a process known as mechanosynthesis). This work is slowly permeating the broader nanoscience community and is being critiqued. For instance, Peng et al. (2006) (in the continuing research effort by Freitas, Merkle and their collaborators) reports that the most-studied mechanosynthesis tooltip motif (DCB6Ge) successfully places a C2 carbon dimer on a C(110) diamond surface at both 300 K (room temperature) and 80 K (liquid nitrogen temperature), and that the silicon variant (DCB6Si) also works at 80 K but not at 300 K. Over 100,000 CPU hours were invested in this latest study. The DCB6 tooltip motif, initially described by Merkle and Freitas at a Foresight Conference in 2002, was the first complete tooltip ever proposed for diamond mechanosynthesis and remains the only tooltip motif that has been successfully simulated for its intended function on a full 200-atom diamond surface.

The tooltips modeled in this work are intended to be used only in carefully controlled environments (e.g., vacuum). Maximum acceptable limits for tooltip translational and rotational misplacement errors are reported in Peng et al. (2006): tooltips must be positioned with great accuracy to avoid bonding the dimer incorrectly. Peng et al. (2006) report that increasing the handle thickness from 4 support planes of C atoms above the tooltip to 5 planes decreases the resonance frequency of the entire structure from 2.0 THz to 1.8 THz. More importantly, the vibrational footprints of a DCB6Ge tooltip mounted on a 384-atom handle and of the same tooltip mounted on a similarly constrained but much larger 636-atom "crossbar" handle are virtually identical in the non-crossbar directions. Additional computational studies modeling still bigger handle structures would be welcome, but the ability to precisely position SPM tips to the requisite atomic accuracy has been repeatedly demonstrated experimentally at low temperature, and even at room temperature, constituting a basic existence proof for this capability.

Further research to consider additional tooltips will require time-consuming computational chemistry and difficult laboratory work.

A working nanofactory would require a variety of well-designed tips for different reactions, and detailed analyses of placing atoms on more complicated surfaces. Although this appears a challenging problem given current resources, many tools will be available to help future researchers: Moore's law predicts further increases in computer power, semiconductor fabrication techniques continue to approach the nanoscale, and researchers grow ever more skilled at using proteins, ribosomes and DNA to perform novel chemistry.

Works of fiction

  • In The Diamond Age by Neal Stephenson, diamond can be built directly out of carbon atoms. All sorts of devices from dust-size detection devices to giant diamond zeppelins are constructed atom by atom using only carbon, oxygen, nitrogen and chlorine atoms.
  • In the novel Tomorrow by Andrew Saltzman (ISBN 1-4243-1027-X), a scientist uses nanorobotics to create a liquid that when inserted into the bloodstream, renders one nearly invincible given that the microscopic machines repair tissue almost instantaneously after it is damaged.
  • In the roleplaying game Splicers by Palladium Books, humanity has succumbed to a "nanobot plague" that causes any object made of a non-precious metal to twist and change shape (sometimes into a type of robot) moments after being touched by a human. The object will then proceed to attack the human. This has forced humanity to develop "biotechnological" devices to replace those previously made of metal.
  • On the television show Mystery Science Theater 3000, the Nanites (voiced variously by Kevin Murphy, Paul Chaplin, Mary Jo Pehl, and Bridget Jones) are self-replicating, bio-engineered organisms that work on the ship; these microscopic creatures reside in the Satellite of Love's computer systems. (They are similar to the creatures in the Star Trek: The Next Generation episode "Evolution", which featured "nanites" taking over the Enterprise.) The Nanites made their first appearance in season 8. Based on the concept of nanotechnology, their comical deus ex machina activities included such diverse tasks as instant repair and construction, hairstyling, performing a Nanite variation of a flea circus, conducting a microscopic war, and even destroying the Observers' planet after a dangerously vague request from Mike to "take care of [a] little problem". They also ran a microbrewery.
  • Stargate Atlantis has an enemy made of self-assembling nanorobots, which also convert a planet into grey goo.
  • In the novel "Prey" by Michael Crichton, self replicating nanobots create autonomous nano-swarms with predatory behaviors. The protagonist must stop the swarm before it evolves into a grey goo plague.
  • In the films Avengers: Infinity War and Avengers: Endgame, Tony Stark's Iron Man suit is constructed using nanotechnology.

 

Emerging technologies

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Emerging_technologies

Emerging technologies are technologies whose development, practical applications, or both are still largely unrealized, such that they are figuratively emerging into prominence from a background of nonexistence or obscurity. These technologies are new, such as various applications of biotechnology including gene therapy (which dates to circa 1990 but even today has large undeveloped potential). Emerging technologies are often perceived as capable of changing the status quo.

Emerging technologies are characterized by radical novelty (in application even if not in origins), relatively fast growth, coherence, prominent impact, and uncertainty and ambiguity. In other words, an emerging technology can be defined as "a radically novel and relatively fast growing technology characterised by a certain degree of coherence persisting over time and with the potential to exert a considerable impact on the socio-economic domain(s) which is observed in terms of the composition of actors, institutions and patterns of interactions among those, along with the associated knowledge production processes. Its most prominent impact, however, lies in the future and so in the emergence phase is still somewhat uncertain and ambiguous."

Emerging technologies include a variety of technologies such as educational technology, information technology, nanotechnology, biotechnology, cognitive science, psychotechnology, robotics, and artificial intelligence.

New technological fields may result from the technological convergence of different systems evolving towards similar goals. Convergence brings previously separate technologies such as voice (and telephony features), data (and productivity applications) and video together so that they share resources and interact with each other, creating new efficiencies.

Emerging technologies are those technical innovations which represent progressive developments within a field for competitive advantage; converging technologies represent previously distinct fields which are in some way moving towards stronger inter-connection and similar goals. However, the opinion on the degree of the impact, status and economic viability of several emerging and converging technologies varies.

History of emerging technologies

In the history of technology, emerging technologies are contemporary advances and innovation in various fields of technology.

Over the centuries, innovative methods and new technologies have been developed and opened up. Some of these technologies arise from theoretical research, and others from commercial research and development.

Technological growth includes incremental developments and disruptive technologies. An example of the former was the gradual roll-out of DVD (digital video disc) as a development intended to follow on from the previous optical technology compact disc. By contrast, disruptive technologies are those where a new method replaces the previous technology and makes it redundant, for example, the replacement of horse-drawn carriages by automobiles and other vehicles.

Emerging technology debates

Many writers, including computer scientist Bill Joy, have identified clusters of technologies that they consider critical to humanity's future. Joy warns that the technology could be used by elites for good or evil. They could use it as "good shepherds" for the rest of humanity or decide everyone else is superfluous and push for mass extinction of those made unnecessary by technology.

Advocates of the benefits of technological change typically see emerging and converging technologies as offering hope for the betterment of the human condition. Cyberphilosophers Alexander Bard and Jan Söderqvist argue in The Futurica Trilogy that while Man himself is basically constant throughout human history (genes change very slowly), all relevant change is rather a direct or indirect result of technological innovation (memes change very fast) since new ideas always emanate from technology use and not the other way around. Man should consequently be regarded as history's main constant and technology as its main variable. However, critics of the risks of technological change, and even some advocates such as transhumanist philosopher Nick Bostrom, warn that some of these technologies could pose dangers, perhaps even contribute to the extinction of humanity itself; i.e., some of them could involve existential risks.

Much ethical debate centers on issues of distributive justice in allocating access to beneficial forms of technology. Some thinkers, including environmental ethicist Bill McKibben, oppose the continuing development of advanced technology partly out of fear that its benefits will be distributed unequally in ways that could worsen the plight of the poor. By contrast, inventor Ray Kurzweil is among techno-utopians who believe that emerging and converging technologies could and will eliminate poverty and abolish suffering.

Some analysts such as Martin Ford, author of The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future, argue that as information technology advances, robots and other forms of automation will ultimately result in significant unemployment as machines and software begin to match and exceed the capability of workers to perform most routine jobs.

As robotics and artificial intelligence develop further, even many skilled jobs may be threatened. Technologies such as machine learning may ultimately allow computers to do many knowledge-based jobs that require significant education. This may result in substantial unemployment at all skill levels, stagnant or falling wages for most workers, and increased concentration of income and wealth as the owners of capital capture an ever-larger fraction of the economy. This in turn could lead to depressed consumer spending and economic growth as the bulk of the population lacks sufficient discretionary income to purchase the products and services produced by the economy.

Emerging technologies

Examples

Artificial intelligence

Artificial intelligence (AI) is the intelligence exhibited by machines or software, and the branch of computer science that develops machines and software with such intelligence. Major AI researchers and textbooks define the field as "the study and design of intelligent agents," where an intelligent agent is a system that perceives its environment and takes actions that maximize its chances of success. John McCarthy, who coined the term in 1955, defines it as "the study of making intelligent machines".

The central functions (or goals) of AI research include reasoning, knowledge, planning, learning, natural language processing (communication), perception and the ability to move and manipulate objects. General intelligence (or "strong AI") is still among the field's long-term goals. Currently, popular approaches include deep learning, statistical methods, computational intelligence and traditional symbolic AI. There is an enormous number of tools used in AI, including versions of search and mathematical optimization, logic, methods based on probability and economics, and many others.
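To make the agent abstraction above concrete, here is a minimal sketch in Python of a system that perceives a toy environment and chooses the action expected to maximize its reward. The names GridEnvironment and greedy_agent are purely illustrative and do not come from any particular AI library; this is a teaching sketch, not a real AI system.

    import random

    class GridEnvironment:
        """A toy environment: the agent tries to reach position `goal` on a line."""
        def __init__(self, size=10, goal=7):
            self.size, self.goal = size, goal
            self.position = random.randrange(size)

        def percept(self):
            # What the agent can observe: its current position.
            return self.position

        def step(self, action):
            # action is -1 (move left) or +1 (move right); position is clamped.
            self.position = max(0, min(self.size - 1, self.position + action))
            return -abs(self.goal - self.position)   # reward: closer to goal is better

    def greedy_agent(percept, goal=7):
        """Pick the action expected to maximize reward: move toward the known goal."""
        return 1 if percept < goal else -1

    env = GridEnvironment()
    for _ in range(20):
        env.step(greedy_agent(env.percept()))
    print("final position:", env.percept(), "goal:", env.goal)

Even this trivial loop shows the structure the definition describes: a percept comes in, an action goes out, and the action is chosen with respect to a measure of success.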

3D printing

3D printing, also known as additive manufacturing, has been posited by Jeremy Rifkin and others as part of the third industrial revolution.[17]

Combined with Internet technology, 3D printing would allow for digital blueprints of virtually any material product to be sent instantly to another person to be produced on the spot, making purchasing a product online almost instantaneous.

Although this technology is still too crude to produce most products, it is developing rapidly, and in 2013 it created a controversy around the issue of 3D-printed guns.

Gene therapy

Gene therapy was first successfully demonstrated in late 1990/early 1991 for adenosine deaminase deficiency, though the treatment was somatic – that is, did not affect the patient's germ line and thus was not heritable. This led the way to treatments for other genetic diseases and increased interest in germ line gene therapy – therapy affecting the gametes and descendants of patients.

Between September 1990 and January 2014, there were around 2,000 gene therapy trials conducted or approved.

Cancer vaccines

A cancer vaccine is a vaccine that treats existing cancer or prevents the development of cancer in certain high-risk individuals. Vaccines that treat existing cancer are known as therapeutic cancer vaccines. There are currently no vaccines able to prevent cancer in general.

On April 14, 2009, The Dendreon Corporation announced that their Phase III clinical trial of Provenge, a cancer vaccine designed to treat prostate cancer, had demonstrated an increase in survival. It received U.S. Food and Drug Administration (FDA) approval for use in the treatment of advanced prostate cancer patients on April 29, 2010. The approval of Provenge has stimulated interest in this type of therapy.

In vitro meat

In vitro meat, also called cultured meat, clean meat, cruelty-free meat, shmeat, and test-tube meat, is an animal-flesh product that has never been part of a living animal, with the exception of the fetal calf serum taken from a slaughtered cow. In the 21st century, several research projects have worked on in vitro meat in the laboratory. The first in vitro beefburger, created by a Dutch team, was eaten at a demonstration for the press in London in August 2013. There remain difficulties to be overcome before in vitro meat becomes commercially available. Cultured meat is currently prohibitively expensive, but it is expected that the cost could be reduced to compete with that of conventionally obtained meat as the technology improves. In vitro meat also raises ethical questions. Some argue that it is less objectionable than traditionally obtained meat because it doesn't involve killing and reduces the risk of animal cruelty, while others disagree with eating meat that has not developed naturally.

Nanotechnology

Nanotechnology (sometimes shortened to nanotech) is the manipulation of matter on an atomic, molecular, and supramolecular scale. The earliest widespread description of nanotechnology referred to the particular technological goal of precisely manipulating atoms and molecules for fabrication of macroscale products, also now referred to as molecular nanotechnology. A more generalized description of nanotechnology was subsequently established by the National Nanotechnology Initiative, which defines nanotechnology as the manipulation of matter with at least one dimension sized from 1 to 100 nanometers. This definition reflects the fact that quantum mechanical effects are important at this quantum-realm scale, and so the definition shifted from a particular technological goal to a research category inclusive of all types of research and technologies that deal with the special properties of matter that occur below the given size threshold.

Robotics

Robotics is the branch of technology that deals with the design, construction, operation, and application of robots, as well as computer systems for their control, sensory feedback, and information processing. These technologies deal with automated machines that can take the place of humans in dangerous environments or manufacturing processes, or resemble humans in appearance, behavior, and/or cognition. A good example of a robot that resembles humans is Sophia, a social humanoid robot developed by the Hong Kong-based company Hanson Robotics and activated on April 19, 2015. Many of today's robots are inspired by nature, contributing to the field of bio-inspired robotics.

Stem-cell therapy

Stem cell therapy is an intervention strategy that introduces new adult stem cells into damaged tissue in order to treat disease or injury. Many medical researchers believe that stem cell treatments have the potential to change the face of human disease and alleviate suffering. The ability of stem cells to self-renew and give rise to subsequent generations with variable degrees of differentiation capacities offers significant potential for generation of tissues that can potentially replace diseased and damaged areas in the body, with minimal risk of rejection and side effects.

Distributed ledger technology

Distributed ledger or blockchain technology provides a transparent and immutable list of transactions. A wide range of uses has been proposed for situations where an open, decentralised database is required, ranging from supply chains to cryptocurrencies.
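As a rough illustration of how such an immutable transaction list can be built, the Python sketch below chains blocks together by storing each predecessor's hash, so that altering any earlier transaction breaks verification of every later block. It is a toy model under simplifying assumptions (no network, no consensus, hypothetical helper names), not a description of any real blockchain.

    import hashlib, json

    def block_hash(block):
        # Deterministic hash of a block's contents.
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def append_block(chain, transactions):
        # Each new block records the hash of the previous block.
        prev = block_hash(chain[-1]) if chain else "0" * 64
        chain.append({"prev_hash": prev, "transactions": transactions})
        return chain

    def verify(chain):
        # The chain is valid only if every stored prev_hash still matches.
        return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
                   for i in range(1, len(chain)))

    ledger = []
    append_block(ledger, [{"from": "alice", "to": "bob", "amount": 5}])
    append_block(ledger, [{"from": "bob", "to": "carol", "amount": 2}])
    print(verify(ledger))                            # True
    ledger[0]["transactions"][0]["amount"] = 500     # tamper with history
    print(verify(ledger))                            # False: tampering is detected

The point of the sketch is the tamper-evidence: because each block commits to the hash of the one before it, history cannot be rewritten without recomputing (and, in a real distributed system, re-agreeing on) every subsequent block.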

Smart contracts are self-executing transactions which occur when pre-defined conditions are met. The aim is to provide security that is superior to traditional contract law, and to reduce transaction costs and delays. The original idea was conceived by Nick Szabo in 1994, but remained unrealised until the development of blockchains.
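Very loosely, a smart contract can be sketched as code that holds funds and releases them automatically once its pre-defined condition evaluates to true. The hypothetical Python example below illustrates the idea outside of any real blockchain platform; the class and variable names are invented for this sketch.

    class EscrowContract:
        def __init__(self, buyer, seller, amount, condition):
            self.buyer, self.seller, self.amount = buyer, seller, amount
            self.condition = condition      # callable returning True when satisfied
            self.settled = False

        def execute(self):
            """Runs automatically (e.g., whenever state changes); no third party needed."""
            if not self.settled and self.condition():
                self.buyer["balance"] -= self.amount
                self.seller["balance"] += self.amount
                self.settled = True
            return self.settled

    delivery_confirmed = {"value": False}
    buyer, seller = {"balance": 100}, {"balance": 0}
    contract = EscrowContract(buyer, seller, 40,
                              condition=lambda: delivery_confirmed["value"])

    print(contract.execute())             # False: condition not yet met, funds stay put
    delivery_confirmed["value"] = True
    print(contract.execute(), seller)     # True, seller credited with 40

The security and cost advantages claimed for smart contracts come from removing the trusted intermediary: once deployed, the transfer happens exactly when the condition is met, with no separate enforcement step.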

Medical field advancements

With cloud computing delivering data faster, the medical field is taking advantage of this capability by creating digital health records. Digital health records can greatly improve the efficiency with which hospitals handle patients: hospitals can improve public health by sharing valuable information about an illness, smooth the workflow by letting doctors pull up a patient's records with ease, and even lower healthcare costs by reducing paper use (Banova). As cloud computing advances, information can be delivered to doctors ever faster, helping the medical field grow.

Development of emerging technologies

As innovation drives economic growth, and large economic rewards come from new inventions, a great deal of resources (funding and effort) goes into the development of emerging technologies. Some of the sources of these resources are described below.

Research and development

Research and development is directed towards the advancement of technology in general, and therefore includes development of emerging technologies. See also List of countries by research and development spending.

Applied research is a form of systematic inquiry involving the practical application of science. It accesses and uses some part of the research community's (academia's) accumulated theories, knowledge, methods, and techniques, for a specific, often state-, business-, or client-driven purpose.

Science policy is the area of public policy which is concerned with the policies that affect the conduct of the science and research enterprise, including the funding of science, often in pursuance of other national policy goals such as technological innovation to promote commercial product development, weapons development, health care and environmental monitoring.

DARPA

The Defense Advanced Research Projects Agency (DARPA) is an agency of the U.S. Department of Defense responsible for the development of emerging technologies for use by the military.

DARPA was created in 1958 as the Advanced Research Projects Agency (ARPA) by President Dwight D. Eisenhower. Its purpose was to formulate and execute research and development projects to expand the frontiers of technology and science, with the aim to reach beyond immediate military requirements.

Projects funded by DARPA have provided significant technologies that influenced many non-military fields, such as the Internet and Global Positioning System technology.

Technology competitions and awards

There are awards that provide incentive to push the limits of technology (generally synonymous with emerging technologies). Note that while some of these awards reward achievement after-the-fact via analysis of the merits of technological breakthroughs, others provide incentive via competitions for awards offered for goals yet to be achieved.

The Orteig Prize was a $25,000 award offered in 1919 by French hotelier Raymond Orteig for the first nonstop flight between New York City and Paris. In 1927, underdog Charles Lindbergh won the prize in a modified single-engine Ryan aircraft called the Spirit of St. Louis. In total, nine teams spent $400,000 in pursuit of the Orteig Prize.

The XPRIZE series of awards, public competitions designed and managed by the non-profit organization called the X Prize Foundation, are intended to encourage technological development that could benefit mankind. The most high-profile XPRIZE to date was the $10,000,000 Ansari XPRIZE relating to spacecraft development, which was awarded in 2004 for the development of SpaceShipOne.

The Turing Award is an annual prize given by the Association for Computing Machinery (ACM) to "an individual selected for contributions of a technical nature made to the computing community." It is stipulated that the contributions should be of lasting and major technical importance to the computer field. The Turing Award is generally recognized as the highest distinction in computer science, and in 2014 grew to $1,000,000.

The Millennium Technology Prize is awarded once every two years by Technology Academy Finland, an independent fund established by Finnish industry and the Finnish state in partnership. The first recipient was Tim Berners-Lee, inventor of the World Wide Web.

In 2003, David Gobel seed-funded the Methuselah Mouse Prize (Mprize) to encourage the development of new life extension therapies in mice, which are genetically similar to humans. So far, three Mouse Prizes have been awarded: one for breaking longevity records to Dr. Andrzej Bartke of Southern Illinois University; one for late-onset rejuvenation strategies to Dr. Stephen Spindler of the University of California; and one to Dr. Z. Dave Sharp for his work with the pharmaceutical rapamycin.

Role of science fiction

Science fiction has often affected innovation and new technology; for example, many rocketry pioneers were inspired by science fiction. The documentary How William Shatner Changed the World gives a number of examples of imagined technologies being actualized.

 

Entropy (information theory)

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Entropy_(information_theory)

In info...