Thursday, March 11, 2021

Autonomous robot

From Wikipedia, the free encyclopedia

An autonomous robot, also known as simply an autorobot or autobot, is a robot that performs behaviors or tasks with a high degree of autonomy (without external influence). Autonomous robotics is usually considered to be a subfield of artificial intelligence, robotics, and information engineering. Early versions were proposed and demonstrated by author/inventor David L. Heiserman.

Autonomous robots are particularly desirable in fields such as spaceflight, household maintenance (such as cleaning), waste water treatment, and delivering goods and services.

Some modern factory robots are "autonomous" within the strict confines of their direct environment. Not every degree of freedom exists in their surroundings, but the factory robot's workplace is still challenging and can often contain chaotic, unpredictable variables. The exact orientation and position of the next object of work and (in the more advanced factories) even the type of object and the required task must be determined. This can vary unpredictably (at least from the robot's point of view).

One important area of robotics research is to enable the robot to cope with its environment whether this be on land, underwater, in the air, underground, or in space.

A fully autonomous robot can:

  • Gain information about the environment
  • Work for an extended period without human intervention
  • Move either all or part of itself throughout its operating environment without human assistance
  • Avoid situations that are harmful to people, property, or itself unless those are part of its design specifications

An autonomous robot may also learn or gain new knowledge, such as adjusting to new methods of accomplishing its tasks or adapting to changing surroundings.

Like other machines, autonomous robots still require regular maintenance.

Components and criteria of robotic autonomy

Self-maintenance

The first requirement for complete physical autonomy is the ability for a robot to take care of itself. Many of the battery-powered robots on the market today can find and connect to a charging station, and some toys like Sony's Aibo are capable of self-docking to charge their batteries.

Self-maintenance is based on "proprioception", or sensing one's own internal status. In the battery charging example, the robot can tell proprioceptively that its batteries are low and it then seeks the charger. Another common proprioceptive sensor is for heat monitoring. Increased proprioception will be required for robots to work autonomously near people and in harsh environments. Common proprioceptive sensors include thermal, optical, and haptic sensing, as well as the Hall effect (electric).
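The battery-charging example can be sketched in code. This is a minimal, hypothetical illustration of proprioceptive self-maintenance, not any specific robot's API: the class name, thresholds, and action strings are all invented for this example.

```python
# Hypothetical sketch: a robot that decides its next action purely from
# proprioceptive (internal) readings. Thresholds are illustrative.

class ProprioceptiveMonitor:
    LOW_BATTERY_V = 11.1   # assumed cutoff for a nominal 12 V pack
    MAX_TEMP_C = 70.0      # assumed motor-driver thermal limit

    def __init__(self, battery_v, temp_c):
        self.battery_v = battery_v   # battery voltage sensor reading
        self.temp_c = temp_c         # thermal sensor reading

    def next_action(self):
        """Choose an action from internal state alone, with safety first."""
        if self.temp_c > self.MAX_TEMP_C:
            return "pause_and_cool"
        if self.battery_v < self.LOW_BATTERY_V:
            return "seek_charger"
        return "continue_task"

print(ProprioceptiveMonitor(10.8, 35.0).next_action())  # prints "seek_charger"
```

The point of the sketch is the ordering: thermal safety is checked before the battery, and the task continues only when all internal readings are nominal, so no human intervention is needed in the loop.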

Robot GUI display showing battery voltage and other proprioceptive data in lower right-hand corner. The display is for user information only. Autonomous robots monitor and respond to proprioceptive sensors without human intervention to keep themselves safe and operating properly.

Sensing the environment

Exteroception is sensing things about the environment. Autonomous robots must have a range of environmental sensors to perform their task and stay out of trouble.

Some robotic lawn mowers adapt their programming by detecting the rate at which grass grows, as needed to maintain a perfectly cut lawn, and some robotic vacuum cleaners have dirt detectors that sense how much dirt is being picked up and use this information to stay in one area longer.
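The dirt-detector idea can be expressed as a simple control rule. The following is an illustrative sketch only, not the proprietary algorithm of any commercial vacuum; the function, sensor units, and constants are assumptions.

```python
# Hypothetical sketch: extend cleaning time at the current spot in
# proportion to the dirt sensor's reading, up to a cap.

def dwell_seconds(dirt_rate, base=5.0, gain=2.0, cap=60.0):
    """Return how long to keep cleaning the current spot.

    dirt_rate: assumed sensor reading, particles picked up per second.
    base: minimum dwell time; gain: extra seconds per unit of dirt.
    """
    return min(base + gain * dirt_rate, cap)

print(dwell_seconds(0))    # clean spot: 5.0 (base time only)
print(dwell_seconds(10))   # dirty spot: 25.0
print(dwell_seconds(100))  # very dirty spot: capped at 60.0
```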

Task performance

The next step in autonomous behavior is to actually perform a physical task. A new area showing commercial promise is domestic robots, with a flood of small vacuuming robots beginning with iRobot and Electrolux in 2002. While the level of intelligence is not high in these systems, they navigate over wide areas and pilot in tight situations around homes using contact and non-contact sensors. Both of these robots use proprietary algorithms to increase coverage over simple random bounce.

The next level of autonomous task performance requires a robot to perform conditional tasks. For instance, security robots can be programmed to detect intruders and respond in a particular way depending upon where the intruder is.
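A conditional task like the security example above amounts to a mapping from sensed conditions to responses. The zones and response names below are invented for illustration; real security robots use richer state and escalation policies.

```python
# Hypothetical sketch of conditional task performance: a security robot
# picks its response based on where an intruder was detected.

def respond_to_intruder(zone):
    responses = {
        "perimeter": "log_and_observe",
        "lobby": "issue_verbal_warning",
        "server_room": "sound_alarm_and_notify_guards",
    }
    # Unknown or empty zones fall back to normal patrol behavior.
    return responses.get(zone, "patrol_normally")

print(respond_to_intruder("server_room"))  # prints "sound_alarm_and_notify_guards"
```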

Autonomous navigation

Indoor navigation

For a robot to associate behaviors with a place (localization), it must know where it is and be able to navigate point-to-point. Such navigation began with wire-guidance in the 1970s and progressed in the early 2000s to beacon-based triangulation. Current commercial robots autonomously navigate based on sensing natural features. The first commercial robots to achieve this were Pyxus' HelpMate hospital robot and the CyberMotion guard robot, both designed by robotics pioneers in the 1980s. These robots originally used manually created CAD floor plans, sonar sensing and wall-following variations to navigate buildings. The next generation, such as MobileRobots' PatrolBot and autonomous wheelchair, both introduced in 2004, have the ability to create their own laser-based maps of a building and to navigate open areas as well as corridors. Their control system changes its path on the fly if something blocks the way.

At first, autonomous navigation was based on planar sensors, such as laser range-finders, that can only sense at one level. The most advanced systems now fuse information from various sensors for both localization (position) and navigation. Systems such as Motivity can rely on different sensors in different areas, depending upon which provides the most reliable data at the time, and can re-map a building autonomously.
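One simple way to picture this kind of fusion is selecting, at each moment, the pose estimate from whichever sensor currently reports the highest confidence. This is a toy sketch of the idea, not how Motivity or any named system is actually implemented; the sensor names, pose format, and confidence scores are all hypothetical.

```python
# Hypothetical sketch: choose the most reliable localization estimate
# from several sensors, e.g. favoring the laser in a feature-rich corridor.

def best_pose(estimates):
    """estimates: list of (sensor_name, (x, y, heading), confidence)."""
    sensor, pose, conf = max(estimates, key=lambda e: e[2])
    return sensor, pose

readings = [
    ("laser", (4.2, 1.1, 0.05), 0.95),  # crisp wall features nearby
    ("sonar", (4.0, 1.3, 0.00), 0.40),  # noisier in this area
]
print(best_pose(readings))  # the laser estimate wins here
```

Real systems typically blend estimates probabilistically (e.g. with Kalman or particle filters) rather than switching outright, but the principle of weighting sensors by reliability per area is the same.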

Rather than climb stairs, which requires highly specialized hardware, most indoor robots navigate handicapped-accessible areas, controlling elevators and electronic doors. With such electronic access-control interfaces, robots can now freely navigate indoors. Autonomously climbing stairs and opening doors manually are current topics of research.

As these indoor techniques continue to develop, vacuuming robots will gain the ability to clean a specific user-specified room or a whole floor. Security robots will be able to cooperatively surround intruders and cut off exits. These advances also bring concomitant protections: robots' internal maps typically permit "forbidden areas" to be defined to prevent robots from autonomously entering certain regions.
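A "forbidden area" check is straightforward to sketch: before committing to a waypoint, the planner tests it against user-defined keep-out regions. The rectangle representation and coordinates below are illustrative assumptions.

```python
# Hypothetical sketch: reject waypoints inside user-defined keep-out
# boxes on the robot's internal map.

def waypoint_allowed(x, y, forbidden):
    """forbidden: list of axis-aligned (xmin, ymin, xmax, ymax) boxes."""
    return not any(x0 <= x <= x1 and y0 <= y <= y1
                   for x0, y0, x1, y1 in forbidden)

keep_out = [(0.0, 0.0, 2.0, 2.0)]            # e.g. a restricted room
print(waypoint_allowed(1.0, 1.0, keep_out))  # False: inside the box
print(waypoint_allowed(3.0, 1.0, keep_out))  # True: outside the box
```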

Outdoor navigation

Outdoor autonomy is most easily achieved in the air, since obstacles are rare. Cruise missiles are rather dangerous highly autonomous robots. Pilotless drone aircraft are increasingly used for reconnaissance. Some of these unmanned aerial vehicles (UAVs) are capable of flying their entire mission without any human interaction at all, except possibly for the landing, where a person intervenes using radio remote control; some drones, however, are capable of safe, automatic landings. An autonomous ship, the Autonomous spaceport drone ship, was announced in 2014 and was scheduled to make its first operational test in December 2014.

Outdoor autonomy is the most difficult for ground vehicles, due to:

  • Three-dimensional terrain
  • Great disparities in surface density
  • Weather exigencies
  • Instability of the sensed environment

Open problems in autonomous robotics

There are several open problems in autonomous robotics which are special to the field rather than being part of the general pursuit of AI. According to George A. Bekey's Autonomous Robots: From Biological Inspiration to Implementation and Control, these include ensuring that the robot functions correctly and avoids obstacles autonomously.

Energy autonomy and foraging

Researchers concerned with creating true artificial life are concerned not only with intelligent control, but further with the capacity of the robot to find its own resources through foraging (looking for food, which includes both energy and spare parts).

This is related to autonomous foraging, a concern within the sciences of behavioral ecology, social anthropology, and human behavioral ecology; as well as robotics, artificial intelligence, and artificial life.

History and development

The Seekur and MDARS robots demonstrate their autonomous navigation and security capabilities at an airbase.

The Seekur robot was the first commercially available robot to demonstrate MDARS-like capabilities for general use by airports, utility plants, corrections facilities and Homeland Security.

The Mars rovers MER-A and MER-B (now known as Spirit rover and Opportunity rover) can find the position of the sun and navigate their own routes to destinations, on the fly, by:

  • Mapping the surface with 3D vision
  • Computing safe and unsafe areas on the surface within that field of vision
  • Computing optimal paths across the safe area towards the desired destination
  • Driving along the calculated route;
  • Repeating this cycle until either the destination is reached, or there is no known path to the destination
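The cycle above can be sketched as a sense-plan-drive loop on a toy grid world. This is an illustration of the general pattern, not the rovers' actual 3D-vision pipeline or path planner: the grid, safety encoding, and breadth-first search are stand-ins chosen for brevity.

```python
# Toy sketch of the rover navigation cycle: map safe/unsafe cells, then
# plan a path across the safe area, or report that none is known.

from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over safe cells (grid value 0 = safe)."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                      # destination reached:
            path = []                         # walk back through prev
            while cell is not None:           # to reconstruct the route
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no known path to the destination

terrain = [          # 0 = computed safe, 1 = computed unsafe
    [0, 0, 1],
    [1, 0, 1],
    [1, 0, 0],
]
print(plan_path(terrain, (0, 0), (2, 2)))
# prints [(0, 0), (0, 1), (1, 1), (2, 1), (2, 2)]
```

In the real cycle, driving the computed route changes what the cameras see, so mapping and planning repeat until the destination is reached or declared unreachable.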

The planned ESA rover, the ExoMars Rover, is capable of vision-based relative and absolute localisation, allowing it to autonomously navigate safe and efficient trajectories to targets by:

  • Reconstructing 3D models of the terrain surrounding the Rover using a pair of stereo cameras
  • Determining safe and unsafe areas of the terrain and the general "difficulty" for the Rover to navigate the terrain
  • Computing efficient paths across the safe area towards the desired destination
  • Driving the Rover along the planned path
  • Building up a navigation map of all previous navigation data

During the final NASA Sample Return Robot Centennial Challenge in 2016, a rover named Cataglyphis successfully demonstrated fully autonomous navigation, decision-making, and sample detection, retrieval, and return capabilities. The rover relied on a fusion of measurements from inertial sensors, wheel encoders, lidar, and cameras for navigation and mapping, instead of using GPS or magnetometers. During the two-hour challenge, Cataglyphis traversed over 2.6 km and returned five different samples to its starting position.

The DARPA Grand Challenge and DARPA Urban Challenge have encouraged development of even more autonomous capabilities for ground vehicles, while this has been the demonstrated goal for aerial robots since 1990 as part of the AUVSI International Aerial Robotics Competition.

Between 2013 and 2017, Total S.A. held the ARGOS Challenge to develop the first autonomous robot for oil and gas production sites. The robots had to face adverse outdoor conditions such as rain, wind and extreme temperatures.

Delivery robot

A food delivery robot

A delivery robot is an autonomous robot used for delivering goods.

Construction robots

Construction robots are used directly on job sites and perform work such as building, material handling, earthmoving, and surveillance.

Numerous companies have robotics applications in stages ranging from R&D to fully-commercialized.

  • ASI Robots: heavy equipment automation and autonomy platform
  • Builder [X]: heavy equipment automation
  • Built Robotics: heavy equipment automation
  • Doxel: autonomous surveillance and job site tracking
  • EquipmentShare: equipment automation and remote control
  • Fastbrick Robotics: bricklaying robot
  • Jaybridge Robotics: heavy equipment automation
  • Robo Industries: heavy equipment automation
  • SafeAI: heavy equipment automation
  • Scaled Robotics: autonomous surveillance and job site tracking
  • Semcon: autonomous compactors and plows
  • Steer: remote control operation
  • Zoomlion: heavy equipment automation

Research and education mobile robots

Research and education mobile robots are mainly used during the prototyping phase in the process of building full-scale robots. They are scaled-down versions of bigger robots with the same types of sensors, kinematics and software stack (e.g. ROS). They are often extendable and provide a comfortable programming interface and development tools. Besides full-scale robot prototyping, they are also used for education, especially at the university level, where more and more labs on programming autonomous vehicles are being introduced. Some of the popular research and education robots are:

  • TurtleBot
  • ROSbot 2.0

Legislation

In March 2016, a bill was introduced in Washington, D.C., allowing a pilot program of ground robotic deliveries. The program was to take place from September 15 through the end of December 2017. The robots were limited to a weight of 50 pounds unloaded and a maximum speed of 10 miles per hour. If a robot stopped moving because of a malfunction, the company was required to remove it from the streets within 24 hours. Only five robots were allowed to be tested per company at a time. A 2017 version of the Personal Delivery Device Act bill was under review as of March 2017.

In February 2017, a bill was passed in the US state of Virginia via the House bill, HB2016, and the Senate bill, SB1207, allowing autonomous delivery robots to travel on sidewalks and use crosswalks statewide beginning on July 1, 2017. The robots are limited to a maximum speed of 10 mph and a maximum weight of 50 pounds. In the states of Idaho and Florida there are also talks about passing similar legislation.
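The statutory limits quoted above are simple enough to encode directly. This sketch is only an illustration of the two numeric limits mentioned (50 lb, 10 mph); the function name is invented and this is not legal advice or a complete reading of the bills.

```python
# Illustrative check against the two numeric limits in the Virginia
# bills quoted above: max 50 lb weight, max 10 mph speed.

def within_virginia_limits(weight_lb, speed_mph):
    return weight_lb <= 50 and speed_mph <= 10

print(within_virginia_limits(45, 9))   # True: inside both limits
print(within_virginia_limits(55, 9))   # False: too heavy
```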

It has been discussed that robots with characteristics similar to invalid carriages (e.g. 10 mph maximum, limited battery life) might be a workaround for certain classes of applications. If the robot were sufficiently intelligent and able to recharge itself using the existing electric vehicle (EV) charging infrastructure, it would need only minimal supervision, and a single arm with low dexterity might be enough to enable this function if its visual systems had enough resolution.[citation needed]

In November 2017, the San Francisco Board of Supervisors announced that companies would need to get a city permit in order to test these robots. In addition, sidewalk delivery robots have been banned from making non-research deliveries.

 

Technological fix

From Wikipedia, the free encyclopedia
 
Renewable energy is one primary example of a technological fix, as it has been designed to combat the issues with global warming and climate change.

A technological fix, technical fix, technological shortcut or solutionism refers to the attempt of using engineering or technology to solve a problem (often created by earlier technological interventions).

Some references define technological fix as an "attempt to repair the harm of a technology by modification of the system", that might involve modification of the machine and/or modification of the procedures for operating and maintaining it.

Technological fixes are inevitable in modern technology: many technologies, although invented and developed to solve certain perceived problems, often create other problems in the process, known as externalities. Fixing these may involve modification of the basic hardware, modification of techniques and procedures, or both.

The technological fix is the idea that all problems can find solutions in better and new technologies. It now is used as a dismissive phrase to describe cheap, quick fixes by using inappropriate technologies; these fixes often create more problems than they solve, or give people a sense that they have solved the problem.

Contemporary context

In the contemporary context, technological fix is sometimes used to refer to the idea of using data and intelligent algorithms to supplement and improve human decision making in the hope that this would ameliorate the bigger problem. One critic, Evgeny Morozov, defines this as "Recasting all complex social situations either as neat problems with definite, computable solutions or as transparent and self-evident processes that can be easily optimized--if only the right algorithms are in place." While some criticize this approach as detrimental to efforts to truly solve these problems, others find merit in it as a complement to existing activist and policy efforts.

An example of the criticism is how policy makers may be tempted to think that installing smart energy monitors would help people conserve energy better, thus mitigating global warming, rather than focusing on the arduous process of passing laws to tax carbon, etc. Another example is thinking of obesity as a lifestyle choice of eating high-calorie foods and not exercising enough, rather than viewing obesity as more of a social and class problem, in which individuals are predisposed to eat certain kinds of foods (due to the lack of affordable health-supporting food in urban food deserts), lack optimally evidence-based health behaviors, and lack proper health care to mitigate behavioral outcomes.

Climate change

The technological fix for climate change is an example of the use of technology to restore the environment. This can be seen in various strategies, such as geo-engineering and renewable energy.

Geo-engineering

Geo-engineering is referred to as "the artificial modification of Earth's climate systems through two primary ideologies, Solar Radiation Management (SRM) and Carbon Dioxide Removal (CDR)". Different schemes, projects and technologies have been designed to tackle the effects of climate change, usually by removing CO2 from the air, as seen in Klaus Lackner's prototype CO2-capture device, or by limiting the amount of sunlight that reaches the Earth's surface, for example with space mirrors. However, "critics by contrast claim that geoengineering isn't realistic – and may be a distraction from reducing emissions." It has been argued that geo-engineering is an adaptation to global warming rather than a solution: it allows transnational corporations, individuals and governments to avoid facing the fact that global warming is a crisis that needs to be dealt with head-on by reducing emissions and implementing green technologies, rather than by developing ways to control the environment while greenhouse gases continue to be released into the atmosphere.

Renewable energy

Wind turbine

Renewable energy is another example of a technological fix, as technology is being used in attempts to reduce and mitigate the effects of global warming. Renewable energy refers to energy technologies that have been designed to be eco-friendly and efficient for the well-being of the Earth. They are generally regarded as infinite energy sources that will never run out, unlike fossil fuels such as oil and coal, which are finite. They also release no greenhouse gases such as carbon dioxide, which harms the planet by trapping heat in the atmosphere. Examples of renewable energy include wind turbines, solar panels and kinetic energy from waves. These energies are regarded as a technological fix because they have been designed and innovated to overcome issues with energy insecurity, as well as to help protect the Earth from the harmful emissions released by non-renewable energy sources, and thus to combat global warming. Such technologies will in turn require their own technological fixes: for example, some types of solar energy have local impacts on ambient temperature, which can be a hazard to birdlife.

Food famine

The world's population is rapidly increasing, with "UNICEF estimating that an average of 353,000 babies are born each day around the world." It is therefore expected that the production of food will not be able to keep up with the population's needs. Ester Boserup argued in 1965 that when the human population increases and food production falls short, innovation takes place. This can be demonstrated in the technological development of hydroponics and genetically modified crops.

Hydroponics

Hydroponics is an example of a technological fix. It demonstrates the ability of humans to recognise a problem within society, such as the lack of food for an increasing population, and to attempt to fix this problem with the development of an innovative technology. Hydroponics is a method of food production that increases productivity in an "artificial environment." The soil is replaced by a mineral solution left around the plant roots. Removing the soil allows a greater crop yield, as there is less chance of soil-borne disease, and makes it easier to monitor plant growth and mineral concentrations. This innovative way to yield more food reflects the ability of humans to develop their way out of a problem, portraying a technological fix.

Genetically modified organism

Genetically modified organisms (GMOs) reflect the use of technology to innovate our way out of a problem such as the lack of food for a growing population, demonstrating a technological fix. GM crops can bring many advantages, such as higher food yields, added vitamins and increased farm profits. Depending on the modifications, they may also introduce the problem of increasing resistance to pesticides and herbicides, which may inevitably precipitate the need for further fixes in the future.

Golden rice

Golden rice is one example of a technological fix. It demonstrates the ability of humans to develop and innovate themselves out of problems such as vitamin A deficiency in Taiwan and the Philippines, which the World Health Organization reported affects about 250 million preschool children. Through the technological development of GM crops, scientists were able to develop golden rice that can be grown in these countries with genetically higher levels of beta-carotene (a precursor of vitamin A). This enables healthier and more fulfilling lifestyles for these individuals and consequently helps to reduce deaths caused by the deficiency.

Externalities

Externalities refer to the unforeseen or unintended consequences of technology. It is evident that everything new and innovative can potentially have negative effects, especially if it is a new area of development. Although technologies are invented and developed to solve certain perceived problems, they often create other problems in the process.

DDT

DDT was initially used by the military in World War II to control a range of illnesses, from malaria to bubonic plague and body lice. Because of its efficiency, DDT was soon adopted as a farm pesticide to help maximise crop yields and so cope with the rising population's food demands after the war. The pesticide proved extremely effective at killing bugs and animals on crops and was often referred to as a "wonder-chemical." However, despite being banned for over forty years, we are still facing the externalities of this technology. DDT was found to have major health impacts on both humans and animals, accumulating in their fatty cells, which highlights that technological fixes have their negatives as well as their positives.

DDT being sprayed (1958, The United States' National Malaria Eradication Program)

Humans

  • Breast & other cancers
  • Male infertility
  • Miscarriages & low birth weight
  • Developmental delay
  • Nervous system & liver damage

Animals

  • DDT is toxic to birds when eaten.
  • Decreases the reproductive rate of birds by causing eggshell thinning and embryo deaths.
  • Highly toxic to aquatic animals. DDT affects various systems in aquatic animals including the heart and brain.
  • DDT is moderately toxic to amphibians like frogs, toads, and salamanders. Immature amphibians are more sensitive to the effects of DDT than adults.

Global warming

Global warming can be a natural phenomenon that occurs in long (geologic) cycles. However, it has been found that the release of greenhouse gases through industry and traffic causes the earth to warm. This is causing externalities on the environment, such as melting icecaps, shifting biomes, and extinction of many aquatic species through ocean acidification and changing ocean temperatures.

Automobiles

Automobiles with internal combustion engines have revolutionised civilisation and technology. However, while the technology was new and innovative, helping to connect places through transport, it was not recognised at the time that burning fossil fuels, such as coal and oil, inside the engines would release pollutants. This is an explicit example of an externality caused by a technological fix, as the problems caused by the development of the technology were not recognised at the time.

Different types of technological fixes

High-tech megaprojects

High-tech megaprojects are large scale and require huge sums of investment and revenue to be created. Examples of these high technologies are dams, nuclear power plants, and airports. They usually impose externalities on other factors such as the environment, are highly expensive, and are top-down governmental plans.

Three Gorges Dam

The Three Gorges Dam is an example of a high-tech technological fix. The multi-purpose navigation, hydropower and flood control scheme was designed to fix the issues with flooding whilst providing efficient, clean, renewable hydro-electric power in China. The Three Gorges Dam is the world's largest power station in terms of installed capacity (22,500 MW). The dam is the largest operating hydroelectric facility in terms of annual energy generation, generating 83.7 TWh in 2013 and 98.8 TWh in 2014, while the annual energy generation of the Itaipú Dam in Brazil and Paraguay was 98.6 TWh in 2013 and 87.8 TWh in 2014. It was estimated to have cost over £25 billion. There have been many externalities from this technology, such as the extinction of the Chinese river dolphin, an increase in pollution, as the river can no longer 'flush' itself, and the displacement of over 4 million local people.
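A quick back-of-the-envelope check on the figures above: 98.8 TWh generated in 2014 against 22,500 MW of installed capacity implies a capacity factor of roughly 50%, i.e. the dam produced about half of what it could at continuous full output.

```python
# Capacity factor from the figures quoted above: annual output divided
# by the output at continuous full capacity over a year (8760 hours).

installed_gw = 22.5
annual_twh = 98.8
max_possible_twh = installed_gw * 8760 / 1000  # GW * hours -> TWh
capacity_factor = annual_twh / max_possible_twh
print(round(capacity_factor, 2))  # prints 0.5
```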

Intermediate technology

Intermediate technology usually consists of small-scale, cheap technologies most often seen in developing countries. The capital needed to build and create these technologies is usually low, while the labour input is high. Local expertise can be used to maintain them, making them very quick and effective to build and repair. Examples of intermediate technology include water wells, rain barrels and pumpkin tanks.

Appropriate technologies

Appropriate technology is technology that suits the level of income, skills and needs of the people; it therefore encompasses both high and low technologies.

An example of this can be seen in developing countries that implement technologies suiting their expertise, such as rain barrels and hand pumps. These technologies are low-cost and can be maintained with local skills, making them affordable and efficient. However, implementing rain barrels in a developed country would not be appropriate, as it would not match the level of technological advancement in those countries. Appropriate technological fixes therefore take a country's level of development into consideration before implementation.

Concerns

Michael and Joyce Huesemann caution against the hubris of large-scale techno-fixes. In their book Techno-Fix: Why Technology Won't Save Us Or the Environment, they argue that negative unintended consequences of science and technology are inherently unavoidable and unpredictable, that counter-technologies or techno-fixes are no lasting solutions, and that modern technology, in its current context, promotes collapse rather than sustainability.

Naomi Klein is a prominent opponent of the view that technological fixes alone will solve our problems. She explains her concerns in her book This Changes Everything: Capitalism vs. the Climate, arguing that technical fixes for climate change such as geoengineering bring significant risks, as "we simply don't know enough about the Earth system to be able to re-engineer it safely". In her view, the proposed technique of dimming the sun's rays with sulphate-spraying helium balloons, mimicking the cooling effect that large volcanic eruptions have on the atmosphere, is highly dangerous, and such schemes will surely be attempted if abrupt climate change gets seriously under way. These concerns are explored in their complexity in Elizabeth Kolbert's Under a White Sky.

Various experts and environmental groups have also voiced concerns over views and approaches that look to techno-fixes for solutions, warning that these would be "misguided, unjust, profoundly arrogant and endlessly dangerous", and that the prospect of a technological 'fix' for global warming, however impractical, could lessen political pressure for a real solution.

Developmental robotics

From Wikipedia, the free encyclopedia

Developmental robotics (DevRob), sometimes called epigenetic robotics, is a scientific field which aims at studying the developmental mechanisms, architectures and constraints that allow lifelong and open-ended learning of new skills and new knowledge in embodied machines. As in human children, learning is expected to be cumulative and of progressively increasing complexity, and to result from self-exploration of the world in combination with social interaction. The typical methodological approach consists in starting from theories of human and animal development elaborated in fields such as developmental psychology, neuroscience, developmental and evolutionary biology, and linguistics, then to formalize and implement them in robots, sometimes exploring extensions or variants of them. The experimentation of those models in robots allows researchers to confront them with reality, and as a consequence, developmental robotics also provides feedback and novel hypotheses on theories of human and animal development.

Developmental robotics is related to but differs from evolutionary robotics (ER). ER uses populations of robots that evolve over time, whereas DevRob is interested in how the organization of a single robot's control system develops through experience, over time.

DevRob is also related to work done in the domains of robotics and artificial life.

Background

Can a robot learn like a child? Can it learn a variety of new skills and new knowledge unspecified at design time and in a partially unknown and changing environment? How can it discover its body and its relationships with the physical and social environment? How can its cognitive capacities continuously develop without the intervention of an engineer once it is "out of the factory"? What can it learn through natural social interactions with humans? These are the questions at the center of developmental robotics. Alan Turing, as well as a number of other pioneers of cybernetics, already formulated those questions and the general approach in 1950, but it is only since the end of the 20th century that they began to be investigated systematically.

Because the concept of adaptive intelligent machines is central to developmental robotics, it has relationships with fields such as artificial intelligence, machine learning, cognitive robotics and computational neuroscience. Yet, while it may reuse some of the techniques elaborated in these fields, it differs from them in many respects. It differs from classical artificial intelligence because it does not assume the capability of advanced symbolic reasoning and focuses on embodied and situated sensorimotor and social skills rather than on abstract symbolic problems. It differs from traditional machine learning because it targets task-independent self-determined learning rather than task-specific inference over "spoon-fed human-edited sensory data" (Weng et al., 2001). It differs from cognitive robotics because it focuses on the processes that allow the formation of cognitive capabilities rather than on those capabilities themselves. It differs from computational neuroscience because it focuses on functional modeling of integrated architectures of development and learning. More generally, developmental robotics is uniquely characterized by the following three features:

  1. It targets task-independent architectures and learning mechanisms, i.e. the machine/robot has to be able to learn new tasks that are unknown to the engineer;
  2. It emphasizes open-ended development and lifelong learning, i.e. the capacity of an organism to continuously acquire novel skills. This should not be understood as a capacity for learning "anything" or even "everything", but just that the set of skills that is acquired can be infinitely extended at least in some (not all) directions;
  3. The complexity of acquired knowledge and skills shall increase (and the increase be controlled) progressively.

Developmental robotics emerged at the crossroads of several research communities including embodied artificial intelligence, enactive and dynamical systems cognitive science, and connectionism. Starting from the essential idea that learning and development happen as the self-organized result of the dynamical interactions among brains, bodies and their physical and social environment, and trying to understand how this self-organization can be harnessed to provide task-independent lifelong learning of skills of increasing complexity, developmental robotics strongly interacts with fields such as developmental psychology, developmental and cognitive neuroscience, developmental biology (embryology), evolutionary biology, and cognitive linguistics. As many of the theories coming from these sciences are verbal and/or descriptive, this implies a crucial formalization and computational modeling activity in developmental robotics. These computational models are then not only used as ways to explore how to build more versatile and adaptive machines but also as a way to evaluate their coherence and possibly explore alternative explanations for understanding biological development.

Research directions

Skill domains

Due to the general approach and methodology, developmental robotics projects typically focus on having robots develop the same types of skills as human infants. A first important category being investigated is the acquisition of sensorimotor skills. These include the discovery of one's own body, including its structure and dynamics such as hand-eye coordination, locomotion, and interaction with objects as well as tool use, with a particular focus on the discovery and learning of affordances. A second category of skills targeted by developmental robotics comprises social and linguistic skills: the acquisition of simple social behavioural games such as turn-taking, coordinated interaction, lexicons, syntax and grammar, and the grounding of these linguistic skills in sensorimotor skills (sometimes referred to as symbol grounding). In parallel, the acquisition of associated cognitive skills is being investigated, such as the emergence of the self/non-self distinction, the development of attentional capabilities, of categorization systems and higher-level representations of affordances or social constructs, and the emergence of values, empathy, or theories of mind.

Mechanisms and constraints

The sensorimotor and social spaces in which humans and robots live are so large and complex that only a small part of potentially learnable skills can actually be explored and learnt within a lifetime. Thus, mechanisms and constraints are necessary to guide developmental organisms in their development and control of the growth of complexity. There are several important families of these guiding mechanisms and constraints which are studied in developmental robotics, all inspired by human development:

  1. Motivational systems, generating internal reward signals that drive exploration and learning, which can be of two main types:
    • extrinsic motivations push robots/organisms to maintain specific basic internal properties such as food and water levels, physical integrity, or light (e.g. in phototropic systems);
    • intrinsic motivations push robots to search for novelty, challenge, compression or learning progress per se, thus generating what is sometimes called curiosity-driven learning and exploration, or alternatively active learning and exploration;
  2. Social guidance: as humans learn a lot by interacting with their peers, developmental robotics investigates mechanisms that can allow robots to participate in human-like social interaction. By perceiving and interpreting social cues, this may allow robots both to learn from humans (through diverse means such as imitation, emulation, stimulus enhancement, demonstration, etc.) and to trigger natural human pedagogy. Thus, social acceptance of developmental robots is also investigated;
  3. Statistical inference biases and cumulative knowledge/skill reuse: biases characterizing both representations/encodings and inference mechanisms can typically allow considerable improvement of the efficiency of learning and are thus studied. Relatedly, mechanisms that allow an organism to infer new knowledge and acquire new skills by reusing previously learnt structures are also an essential field of study;
  4. The properties of embodiment, including geometry, materials, or innate motor primitives/synergies often encoded as dynamical systems, can considerably simplify the acquisition of sensorimotor or social skills; this is sometimes referred to as morphological computation. The interaction of these constraints with other constraints is an important axis of investigation;
  5. Maturational constraints: in human infants, both the body and the neural system grow progressively, rather than being full-fledged already at birth. This implies, for example, that new degrees of freedom, as well as increases in the volume and resolution of available sensorimotor signals, may appear as learning and development unfold. Transposing these mechanisms to developmental robots, and understanding how they may hinder or, on the contrary, ease the acquisition of novel complex skills, is a central question in developmental robotics.
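The intrinsic motivations of item 1 are often operationalized as learning progress: the agent monitors how fast its prediction errors shrink in different regions of its sensorimotor space and preferentially explores where progress is highest. The following is a minimal, hypothetical Python sketch of that idea, not a specific system from the literature; all names are invented for illustration.

```python
import random

class LearningProgressExplorer:
    """Toy curiosity-driven exploration: track recent prediction errors
    per region of task space and preferentially sample regions where the
    errors are decreasing fastest (i.e. where learning progress is high)."""

    def __init__(self, n_regions, window=10):
        self.errors = [[] for _ in range(n_regions)]  # error history per region
        self.window = window                          # smoothing window size

    def learning_progress(self, region):
        """Difference between older and recent mean errors; positive
        when the agent's predictions in this region are improving."""
        hist = self.errors[region]
        if len(hist) < 2 * self.window:
            return float("inf")  # barely explored regions look maximally promising
        older = sum(hist[-2 * self.window:-self.window]) / self.window
        recent = sum(hist[-self.window:]) / self.window
        return older - recent

    def choose_region(self):
        """Epsilon-greedy choice over learning progress."""
        if random.random() < 0.2:
            return random.randrange(len(self.errors))
        return max(range(len(self.errors)), key=self.learning_progress)

    def record(self, region, error):
        """Log the prediction error observed after practicing in a region."""
        self.errors[region].append(error)

# Minimal usage: pick where to practice next, then log the observed error.
explorer = LearningProgressExplorer(n_regions=3)
region = explorer.choose_region()
explorer.record(region, error=0.8)
```

One design point this sketch illustrates: because progress (not raw error) drives exploration, unlearnable regions such as pure noise eventually show flat error curves and are abandoned, which is the usual argument for progress-based over novelty-based intrinsic rewards.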

From bio-mimetic development to functional inspiration

While most developmental robotics projects interact closely with theories of animal and human development, the degree of similarity between identified biological mechanisms and their counterparts in robots, as well as the abstraction level of the modeling, may vary considerably. While some projects aim at modeling precisely both the function and the biological implementation (neural or morphological models), such as in Neurorobotics, other projects focus only on functional modeling of the mechanisms and constraints described above, and may for example reuse in their architectures techniques coming from applied mathematics or engineering fields.

Open questions

As developmental robotics is a relatively new research field and at the same time very ambitious, many fundamental open challenges remain to be solved.

First of all, existing techniques are far from allowing real-world high-dimensional robots to learn an open-ended repertoire of increasingly complex skills over a lifetime. High-dimensional continuous sensorimotor spaces are one significant obstacle; lifelong cumulative learning is another. Indeed, no experiments lasting more than a few days have been set up so far, which contrasts severely with the time needed by human infants to learn basic sensorimotor skills, even though infants are equipped with brains and morphologies that are tremendously more powerful than existing computational mechanisms.

Among the strategies for making progress towards this target, the interaction between the mechanisms and constraints described in the previous section should be investigated more systematically. Indeed, they have so far mainly been studied in isolation. For example, the interaction of intrinsically motivated learning and socially guided learning, possibly constrained by maturation, is an essential issue to be investigated.

Another important challenge is to allow robots to perceive, interpret and leverage the diversity of multimodal social cues provided by non-engineer humans during human-robot interaction. These capacities are, so far, mostly too limited to allow efficient general-purpose teaching from humans.

A fundamental scientific issue to be understood and resolved, which applies equally to human development, is how compositionality, functional hierarchies, primitives, and modularity, at all levels of sensorimotor and social structures, can be formed and leveraged during development. This is deeply linked with the problem of the emergence of symbols, sometimes referred to as the "symbol grounding problem" when it comes to language acquisition. Indeed, the very existence of and need for symbols in the brain are actively questioned, and alternative concepts, still allowing for compositionality and functional hierarchies, are being investigated.

During biological epigenesis, morphology is not fixed but rather develops in constant interaction with the development of sensorimotor and social skills. The development of morphology poses obvious practical problems with robots, but it may be a crucial mechanism that should be further explored, at least in simulation, such as in morphogenetic robotics.

Another open problem is the understanding of the relation between the key phenomena investigated by developmental robotics (e.g., hierarchical and modular sensorimotor systems, intrinsic/extrinsic/social motivations, and open-ended learning) and the underlying brain mechanisms.

Similarly, in biology, developmental mechanisms (operating at the ontogenetic time scale) interact closely with evolutionary mechanisms (operating at the phylogenetic time scale) as shown in the flourishing "evo-devo" scientific literature. However, the interaction of those mechanisms in artificial organisms, developmental robots, in particular, is still vastly understudied. The interaction of evolutionary mechanisms, unfolding morphologies and developing sensorimotor and social skills will thus be a highly stimulating topic for the future of developmental robotics.

Main conferences

The NSF/DARPA funded Workshop on Development and Learning was held April 5–7, 2000 at Michigan State University. It was the first international meeting devoted to computational understanding of mental development by robots and animals. The term "by" was used since the agents are active during development.

 

Technological determinism

From Wikipedia, the free encyclopedia

Technological determinism is a reductionist theory that assumes that a society's technology determines the development of its social structure and cultural values. The term is believed to have originated with Thorstein Veblen (1857–1929), an American sociologist and economist. The most radical technological determinist in the United States in the 20th century was most likely Clarence Ayres, a follower of Thorstein Veblen and John Dewey. William Ogburn was also known for his radical technological determinism.

The first major elaboration of a technological determinist view of socioeconomic development came from the German philosopher and economist Karl Marx, who argued that changes in technology, and specifically productive technology, are the primary influence on human social relations and organizational structure, and that social relations and cultural practices ultimately revolve around the technological and economic base of a given society. Marx's position has become embedded in contemporary society, where the idea that fast-changing technologies alter human lives is pervasive. Although many authors attribute a technologically determined view of human history to Marx's insights, not all Marxists are technological determinists, and some authors question the extent to which Marx himself was a determinist. Furthermore, there are multiple forms of technological determinism.

Origin

The term is believed to have been coined by Thorstein Veblen (1857–1929), an American social scientist. Veblen's contemporary, popular historian Charles A. Beard, provided this apt determinist image, "Technology marches in seven-league boots from one ruthless, revolutionary conquest to another, tearing down old factories and industries, flinging up new processes with terrifying rapidity." As to the meaning, it is described as the ascription to machines of "powers" that they do not have. Veblen, for instance, asserted that "the machine throws out anthropomorphic habits of thought." There is also the case of Karl Marx who expected that the construction of the railway in India would dissolve the caste system. The general idea, according to Robert Heilbroner, is that technology, by way of its machines, can cause historical change by changing the material conditions of human existence.

One of the most radical technological determinists was Clarence Ayres, a 20th-century follower of Veblen's theory. Ayres is best known for developing economic philosophies, but he also worked closely with Veblen, who originated the theory of technological determinism. He often discussed the struggle between technology and ceremonial structure. One of his most notable theories involved the concept of "technological drag", in which he describes technology as a self-generating process and institutions as ceremonial, a notion that produces a technological over-determinism in the process.

Explanation

Technological determinism seeks to show technical developments, media, or technology as a whole, as the key mover in history and social change. It is a theory subscribed to by "hyperglobalists" who claim that as a consequence of the wide availability of technology, accelerated globalization is inevitable. Therefore, technological development and innovation become the principal motor of social, economic or political change.

Strict adherents to technological determinism do not believe the influence of technology differs based on how much a technology is or can be used. Instead of considering technology as part of a larger spectrum of human activity, technological determinism sees technology as the basis for all human activity.

Technological determinism has been summarized as 'The belief in technology as a key governing force in society ...' (Merritt Roe Smith). 'The idea that technological development determines social change ...' (Bruce Bimber). It changes the way people think and how they interact with others and can be described as '...a three-word logical proposition: "Technology determines history"' (Rosalind Williams). It is, '... the belief that social progress is driven by technological innovation, which in turn follows an "inevitable" course.' (Michael L. Smith). This 'idea of progress' or 'doctrine of progress' is centralised around the idea that social problems can be solved by technological advancement, and this is the way that society moves forward. Technological determinists believe that "'You can't stop progress', implying that we are unable to control technology" (Lelia Green). This suggests that we are somewhat powerless and society allows technology to drive social changes because, "societies fail to be aware of the alternatives to the values embedded in it [technology]" (Merritt Roe Smith).

Technological determinism has been defined as an approach that identifies technology, or technological advances, as the central causal element in processes of social change (Croteau and Hoynes). As a technology is stabilized, its design tends to dictate users' behaviors, consequently diminishing human agency. This stance however ignores the social and cultural circumstances in which the technology was developed. Sociologist Claude Fischer (1992) characterized the most prominent forms of technological determinism as "billiard ball" approaches, in which technology is seen as an external force introduced into a social situation, producing a series of ricochet effects.

Rather than acknowledging that a society or culture interacts with and even shapes the technologies that are used, a technological determinist view holds that "the uses made of technology are largely determined by the structure of the technology itself, that is, that its functions follow from its form" (Neil Postman). However, this is not to be confused with Daniel Chandler's "inevitability thesis", which states that once a technology is introduced into a culture, what follows is the inevitable development of that technology.

For example, we could examine why romance novels have become so dominant in our society compared to other forms of novels like the detective or Western novel. We might say that it was because of the invention of the perfect binding system developed by publishers, where glue was used instead of the time-consuming and very costly process of binding books by sewing in separate signatures. This meant that these books could be mass-produced for the wider public. We would not be able to have mass literacy without mass production. This example is closely related to Marshall McLuhan's belief that print helped produce the nation state. This moved society on from an oral culture to a literate culture but also introduced a capitalist society where there was clear class distinction and individualism. As Postman maintains:

The printing press, the computer, and television are not therefore simply machines which convey information. They are metaphors through which we conceptualize reality in one way or another. They will classify the world for us, sequence it, frame it, enlarge it, reduce it, argue a case for what it is like. Through these media metaphors, we do not see the world as it is. We see it as our coding systems are. Such is the power of the form of information.

Hard and soft determinism

In examining determinism, hard determinism can be contrasted with soft determinism. A compatibilist says that it is possible for free will and determinism to exist in the world together, while an incompatibilist would say that they can not and there must be one or the other. Those who support determinism can be further divided.

Hard determinists would view technology as developing independently of social concerns. They would say that technology creates a set of powerful forces acting to regulate our social activity and its meaning. According to this view of determinism we organize ourselves to meet the needs of technology and the outcome of this organization is beyond our control or we do not have the freedom to make a choice regarding the outcome (autonomous technology). The 20th century French philosopher and social theorist Jacques Ellul could be said to be a hard determinist and proponent of autonomous technique (technology). In his 1954 work The Technological Society, Ellul essentially posits that technology, by virtue of its power through efficiency, determines which social aspects are best suited for its own development through a process of natural selection. A social system's values, morals, philosophy etc. that are most conducive to the advancement of technology allow that social system to enhance its power and spread at the expense of those social systems whose values, morals, philosophy etc. are less promoting of technology. While geography, climate, and other "natural" factors largely determined the parameters of social conditions for most of human history, technology has recently become the dominant objective and determining factor, largely due to forces unleashed by the industrial revolution.

Soft determinism, as the name suggests, is a more passive view of the way technology interacts with socio-political situations. Soft determinists still subscribe to the fact that technology is the guiding force in our evolution, but would maintain that we have a chance to make decisions regarding the outcomes of a situation. This is not to say that free will exists, but that the possibility for us to roll the dice and see what the outcome is exists. A slightly different variant of soft determinism is the 1922 technology-driven theory of social change proposed by William Fielding Ogburn, in which society must adjust to the consequences of major inventions, but often does so only after a period of cultural lag.

Technology as neutral

Individuals who consider technology as neutral see technology as neither good nor bad; what matters are the ways in which we use it. An example of a neutral viewpoint is, "guns are neutral and it's up to how we use them whether it would be 'good or bad'" (Green, 2001). Mackenzie and Wajcman believe that technology is neutral only if it's never been used before, or if no one knows what it is going to be used for (Green, 2001). In effect, guns would be classified as neutral if and only if society were none the wiser of their existence and functionality (Green, 2001). Obviously, such a society is non-existent, and once it becomes knowledgeable about technology, society is drawn into a social progression where nothing is 'neutral about society' (Green). According to Lelia Green, if one believes technology is neutral, one would disregard the cultural and social conditions that technology has produced (Green, 2001). This view is also referred to as technological instrumentalism.

In what is often considered a definitive reflection on the topic, the historian Melvin Kranzberg famously wrote in the first of his six laws of technology: "Technology is neither good nor bad; nor is it neutral."

Criticism

Skepticism about technological determinism emerged alongside increased pessimism about techno-science in the mid-20th century, in particular around the use of nuclear energy in the production of nuclear weapons, Nazi human experimentation during World War II, and the problems of economic development in the Third World. As a direct consequence, desire for greater control of the course of development of technology gave rise to disenchantment with the model of technological determinism in academia.

Modern theorists of technology and society no longer consider technological determinism to be a very accurate view of the way in which we interact with technology, even though determinist assumptions and language fairly saturate the writings of many boosters of technology, the business pages of many popular magazines, and much reporting on technology. Instead, research in science and technology studies, social construction of technology and related fields have emphasised more nuanced views that resist easy causal formulations. They emphasise that "The relationship between technology and society cannot be reduced to a simplistic cause-and-effect formula. It is, rather, an 'intertwining'", whereby technology does not determine but "operates, and are operated upon in a complex social field" (Murphie and Potts).

T. Snyder approached technological determinism through his concept of the 'politics of inevitability': a notion, utilized by politicians, that promises society a future that will simply be more of the present, thereby removing responsibility. It could be applied to free markets, the development of nation states and technological progress.

In his article "Subversive Rationalization: Technology, Power and Democracy with Technology," Andrew Feenberg argues that technological determinism is not a very well-founded concept, illustrating that two of the founding theses of determinism are easily questionable, and in doing so calls for what he calls democratic rationalization (Feenberg 210–212).

Prominent opposition to technologically determinist thinking has emerged within work on the social construction of technology (SCOT). SCOT research, such as that of Mackenzie and Wajcman (1997), argues that the path of innovation and its social consequences are strongly, if not entirely, shaped by society itself through the influence of culture, politics, economic arrangements, regulatory mechanisms and the like. In its strongest form, verging on social determinism, "What matters is not the technology itself, but the social or economic system in which it is embedded" (Langdon Winner).

In his influential but contested (see Woolgar and Cooper, 1999) article "Do Artifacts Have Politics?", Langdon Winner illustrates not a form of determinism but the various sources of the politics of technologies. Those politics can stem from the intentions of the designer and the culture of the society in which a technology emerges or can stem from the technology itself, a "practical necessity" for it to function. For instance, New York City urban planner Robert Moses is purported to have built Long Island's parkway tunnels too low for buses to pass in order to keep minorities away from the island's beaches, an example of externally inscribed politics. On the other hand, an authoritarian command-and-control structure is a practical necessity of a nuclear power plant if radioactive waste is not to fall into the wrong hands. As such, Winner neither succumbs to technological determinism nor social determinism. The source of a technology's politics is determined only by carefully examining its features and history.

Although "The deterministic model of technology is widely propagated in society" (Sarah Miller), it has also been widely questioned by scholars. Lelia Green explains that, "When technology was perceived as being outside society, it made sense to talk about technology as neutral". Yet, this idea fails to take into account that culture is not fixed and society is dynamic. When "Technology is implicated in social processes, there is nothing neutral about society" (Lelia Green). This confirms one of the major problems with "technological determinism and the resulting denial of human responsibility for change. There is a loss of human involvement that shape technology and society" (Sarah Miller).

Another conflicting idea is that of technological somnambulism, a term coined by Winner in his essay "Technology as Forms of Life". Winner wonders whether or not we are simply sleepwalking through our existence with little concern or knowledge as to how we truly interact with technology. In this view, it is still possible for us to wake up and once again take control of the direction in which we are traveling (Winner 104). However, it requires society to adopt Ralph Schroeder's claim that, "users don't just passively consume technology, but actively transform it".

In opposition to technological determinism are those who subscribe to the belief of social determinism and postmodernism. Social determinists believe that social circumstances alone select which technologies are adopted, with the result that no technology can be considered "inevitable" solely on its own merits. Technology and culture are not neutral and when knowledge comes into the equation, technology becomes implicated in social processes. The knowledge of how to create and enhance technology, and of how to use technology is socially bound knowledge. Postmodernists take another view, suggesting that what is right or wrong is dependent on circumstance. They believe technological change can have implications on the past, present and future. While they believe technological change is influenced by changes in government policy, society and culture, they consider the notion of change to be a paradox, since change is constant.

Media and cultural studies theorist Brian Winston, in response to technological determinism, developed a model for the emergence of new technologies which is centered on the Law of the suppression of radical potential. In two of his books – Technologies of Seeing: Photography, Cinematography and Television (1997) and Media Technology and Society (1998) – Winston applied this model to show how technologies evolve over time, and how their 'invention' is mediated and controlled by society and societal factors which suppress the radical potential of a given technology.

The stirrup

One continued argument for technological determinism is centered on the stirrup and its impact on the creation of feudalism in Europe in the late 8th century/early 9th century. Lynn White is credited with first drawing this parallel between feudalism and the stirrup in his book Medieval Technology and Social Change, which was published in 1962 and argued that as "it made possible mounted shock combat", the new form of war made the soldier that much more efficient in supporting feudal townships (White, 2). According to White, the superiority of the stirrup in combat was found in the mechanics of the lance charge: "The stirrup made possible- though it did not demand- a vastly more effective mode of attack: now the rider could lay his lance at rest, held between the upper arm and the body, and make at his foe, delivering the blow not with his muscles but with the combined weight of himself and his charging stallion (White, 2)." White draws from a large research base, particularly Heinrich Brunner's "Der Reiterdienst und die Anfänge des Lehnwesens" in substantiating his claim of the emergence of feudalism. In focusing on the evolution of warfare, particularly that of cavalry in connection with Charles Martel's "diversion of a considerable part of the Church's vast military riches...from infantry to cavalry", White draws from Brunner's research and identifies the stirrup as the underlying cause for such a shift in military division and the subsequent emergence of feudalism (White, 4). Under the new brand of warfare garnered from the stirrup, White implicitly argues in favor of technological determinism as the vehicle by which feudalism was created.

Though an accomplished work, White's Medieval Technology and Social Change has since come under heavy scrutiny and condemnation. The most volatile critics of White's argument at the time of its publication, P.H. Sawyer and R.H. Hilton, call the work as a whole "a misleading adventurist cast to old-fashioned platitudes with a chain of obscure and dubious deductions from scanty evidence about the progress of technology (Sawyer and Hilton, 90)." They further condemn his methods and, by association, the validity of technological determinism: "Had Mr. White been prepared to accept the view that the English and Norman methods of fighting were not so very different in the eleventh century, he would have made the weakness of his argument less obvious, but the fundamental failure would remain: the stirrup cannot alone explain the changes it made possible (Sawyer and Hilton, 91)." For Sawyer and Hilton, though the stirrup may be useful in the implementation of feudalism, it cannot be credited for the creation of feudalism alone.

Despite the scathing review of White's claims, the technological determinist aspect of the stirrup is still in debate. Alex Roland, author of "Once More into the Stirrups; Lynne White Jr, Medieval Technology and Social Change", provides an intermediary stance: not necessarily lauding White's claims, but providing a little defense against Sawyer and Hilton's allegations of gross intellectual negligence. Roland views White's focus on technology to be the most relevant and important aspect of Medieval Technology and Social Change rather than the particulars of its execution: "But can these many virtues, can this utility for historians of technology, outweigh the most fundamental standards of the profession? Can historians of technology continue to read and assign a book that is, in the words of a recent critic, "shot through with over-simplification, with a progression of false connexions between cause and effect, and with evidence presented selectively to fit with [White's] own pre-conceived ideas"? The answer, I think, is yes, at least a qualified yes (Roland, 574-575)." Objectively, Roland deems Medieval Technology and Social Change a variable success, at least insofar as "Most of White's argument stands... the rest has sparked useful lines of research (Roland, 584)." This acceptance of technological determinism is ambiguous at best, neither fully supporting the theory at large nor denouncing it, rather placing the construct firmly in the realm of the theoretical. Roland views technological determinism as neither completely dominant over history nor completely absent; in accordance with the above criteria of technological determinist structure, Roland would be classified as a "soft determinist".

Notable technological determinists

Thomas L. Friedman, American journalist, columnist and author, admits to being a technological determinist in his book The World Is Flat.

Futurist Raymond Kurzweil's theories about a technological singularity follow a technologically deterministic view of history.

Some interpret Karl Marx as advocating technological determinism, with such statements as "The Handmill gives you society with the feudal lord: the steam-mill, society with the industrial capitalist" (The Poverty of Philosophy, 1847), but others argue that Marx was not a determinist.

Technological determinist Walter J. Ong reviews the societal transition from an oral culture to a written culture in his work Orality and Literacy: The Technologizing of the Word (1982). He asserts that this particular development is attributable to the use of new technologies of literacy (particularly print and writing) to communicate thoughts which could previously only be verbalized. He furthers this argument by claiming that writing is purely context dependent, as it is a "secondary modelling system" (8). Reliant upon the earlier primary system of spoken language, writing manipulates the potential of language, as it depends purely upon the visual sense to communicate the intended information. Furthermore, because the rather static technology of literacy distinctly limits the usage and influence of knowledge, it unquestionably affects the evolution of society. In fact, Ong asserts that "more than any other single invention, writing has transformed human consciousness" (Ong 1982: 78).

Media determinism as a form of technological determinism

Media determinism is a form of technological determinism, a philosophical and sociological position which posits the power of the media to impact society. Two foundational media determinists are the Canadian scholars Harold Innis and Marshall McLuhan. One of the best examples of technological determinism in media theory is McLuhan's theory that "the medium is the message", together with the ideas of his mentor Innis. Both of these Canadian theorists saw media as the essence of civilization. The association of particular media with particular mental consequences, by McLuhan and others, can be seen as related to technological determinism; it is this variety of determinism that is referred to as media determinism. According to McLuhan, there is an association between communications media/technology and language; similarly, Benjamin Lee Whorf argues that language shapes our perception of thinking (linguistic determinism). For McLuhan, media is a more powerful and explicit determinant than the more general concept of language. McLuhan was not necessarily a hard determinist. As a more moderate version of media determinism, he proposed that our use of particular media may have subtle influences on us, but, more importantly, that it is the social context of use that is crucial. Media determinism is a form of the popular dominant theory of the relationship between technology and society. In a determinist view, technology takes on an active life of its own and is seen as a driver of social phenomena. Innis believed that the social, cultural, political, and economic developments of each historical period can be related directly to the technology of the means of mass communication of that period. In this sense, like Dr. Frankenstein's monster, technology itself appears to be alive, or at least capable of shaping human behavior. However, media determinism has been increasingly subject to critical review by scholars.
For example, the scholar Raymond Williams criticizes media determinism, believing instead that social movements define technological and media processes. With regard to communications media, audience determinism is a viewpoint opposed to media determinism: instead of media being presented as doing things to people, the stress is on the way people do things with media. It is worth noting that the term "deterministic" carries negative connotations for many social scientists and modern sociologists, who often use the word as a term of abuse.
