Original link: http://phys.org/news/2015-04-robot-self-awareness.html
A year ago, researchers at Bielefeld University showed that their software endowed the walking robot Hector with a simple form of consciousness. Their new research goes a step further: they have now developed a software architecture that could enable Hector to see himself as others see him. "With this, he would have reflexive consciousness," explains Dr. Holk Cruse, professor at the Cluster of Excellence Cognitive Interaction Technology (CITEC) at Bielefeld University. The architecture is based on artificial neural networks. Together with his colleague Dr. Malte Schilling, Prof. Dr. Cruse published this new study in the online collection Open MIND, a volume from the MIND Group, a group of philosophers and other scientists studying the mind, consciousness, and cognition.
Both biologists are involved in further developing and enhancing walking robot Hector's software. The robot is modelled after a stick insect. How Hector walks and deals with obstacles in its path was first demonstrated at the end of 2014. Hector's extended software will next be tested in a computer simulation. "What works in the computer simulation must then, in a second phase, be transferred over to the robot and tested on it," explains Cruse. Drs. Schilling and Cruse are investigating to what extent various higher-level mental states, for example aspects of consciousness, may develop in Hector with this software, even though these traits were not specifically built into the robot beforehand. The researchers speak of "emergent" abilities, that is, capabilities that appear on their own rather than having been built in.
Until now, Hector has been a reactive system: it reacts to stimuli in its surroundings. Thanks to the software program "Walknet," Hector can walk with an insect-like gait, and another program called "Navinet" may enable the robot to find a path to a distant target. The two researchers have also developed the software expansion "reaCog," which is activated whenever the other two programs are unable to solve a given problem. This expanded software enables the robot to simulate "imagined behaviour" that may solve the problem: instead of automatically executing a pre-determined operation, it first searches for new solutions and evaluates whether a candidate action makes sense. Being able to perform imagined actions is a central characteristic of a simple form of consciousness.
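To picture this control scheme, the following minimal sketch shows a reactive layer that falls back on internally "imagining" candidate actions when it gets stuck. It is purely illustrative: the class names, action labels, and the internal model are invented here and do not reflect the Bielefeld group's actual reaCog code.

```python
# Hypothetical sketch of a reaCog-style control loop; all names are invented.
import random

class ReactiveWalker:
    """Stand-in for the reactive layer (Walknet/Navinet-like behaviours)."""
    def propose_action(self, stimulus):
        # Map a stimulus directly to a pre-determined reaction, if one exists.
        reactions = {"clear_path": "step_forward", "small_obstacle": "lift_leg_higher"}
        return reactions.get(stimulus)  # None means the reactive layer is stuck

class CognitiveExpansion:
    """Stand-in for the expansion layer: tests 'imagined' actions in an
    internal model before any of them is actually executed."""
    def __init__(self, internal_model, candidate_actions):
        self.internal_model = internal_model        # predicts the outcome of an action
        self.candidate_actions = candidate_actions  # behaviours not tied to this stimulus

    def imagine_solution(self, stimulus):
        # Try candidate actions in random order, purely in simulation.
        for action in random.sample(self.candidate_actions, len(self.candidate_actions)):
            if self.internal_model(stimulus, action) == "problem_solved":
                return action                       # only a successful action is chosen
        return None

def control_step(stimulus, reactive, expansion):
    action = reactive.propose_action(stimulus)
    if action is None:                              # the reactive layer cannot cope
        action = expansion.imagine_solution(stimulus)
    return action or "stop_and_wait"

# Example: a gap in the path is not covered by the reactive repertoire.
model = lambda stimulus, action: ("problem_solved"
                                  if action == "shift_weight_backwards" else "failure")
print(control_step("wide_gap", ReactiveWalker(),
                   CognitiveExpansion(model, ["shift_weight_backwards", "turn_left"])))
```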
In their previous research, both CITEC researchers had already determined that Hector's control system could adopt a number of higher-level mental states. "Intentions, for instance, can be found in the system," explains Malte Schilling. These "inner mental states," such as intentions, make goal-directed behaviour possible, which, for example, may direct the robot to a certain location (like a charging station). The researchers have also identified how properties of emotions may show up in the system. "Emotions can be read from behaviour. For example, a person who is happy takes more risks and makes decisions faster than someone who is anxious," says Holk Cruse. This behaviour could also be implemented in the control model reaCog: "Depending on its inner mental state, the system may adopt quick but risky solutions, and at other times, it may take its time to search for a safer solution."
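One way to picture how such an inner state could bias decisions is a single "mood" parameter that trades risk tolerance against search effort. Again, this is a hedged sketch with invented variable names and thresholds, not the published model.

```python
# Hypothetical illustration of a mood-dependent decision policy.
def choose_solution(candidates, mood):
    """candidates: list of (action, estimated_risk, estimated_cost) tuples.
    mood: 0.0 (anxious) ... 1.0 (confident)."""
    # A confident system accepts more risk and examines fewer options;
    # an anxious one tolerates little risk and keeps searching longer.
    risk_tolerance = 0.2 + 0.6 * mood
    options_examined = max(1, round(len(candidates) * (1.0 - 0.5 * mood)))

    examined = sorted(candidates, key=lambda c: c[2])[:options_examined]  # cheapest first
    acceptable = [c for c in examined if c[1] <= risk_tolerance]
    if acceptable:
        return min(acceptable, key=lambda c: c[2])[0]  # cheapest acceptable action
    return min(examined, key=lambda c: c[1])[0]        # otherwise the safest one examined

# The same options lead to different choices depending on the inner state.
options = [("jump_gap", 0.7, 1.0), ("long_detour", 0.1, 3.0), ("probe_slowly", 0.3, 2.0)]
print(choose_solution(options, mood=0.9))  # -> "jump_gap": quick but risky
print(choose_solution(options, mood=0.1))  # -> "long_detour": slower but safe
```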
To examine which forms of consciousness are present in Hector, the researchers rely in particular on psychological and neurobiological definitions. As Holk Cruse explains, "A human possesses reflexive consciousness when he not only can perceive what he experiences, but also has the ability to experience that he is experiencing something. Reflexive consciousness thus exists if a human or a technical system can see itself 'from the outside,' so to speak."
In their new research, Cruse and Schilling show a way in which reflexive consciousness could emerge. "With the new software, Hector could observe its inner mental state – to a certain extent, its moods – and direct its actions using this information," says Malte Schilling. "What makes this unique, however, is that with our software expansion, the basic faculties are prepared so that Hector may also be able to assess the mental state of others. It may be able to sense other people's intentions or expectations and act accordingly." Dr. Cruse adds: "The robot may then be able to 'think': What does this subject expect from me? And then it can orient its actions accordingly."
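A rough way to sketch this last step is to reuse the robot's model of its own inner state to estimate what another agent is likely to do or expect, and to adjust behaviour accordingly. The study does not publish code like this; the structures below are invented for illustration only.

```python
# Hypothetical sketch: a self-model reused to estimate another agent's expectation.
from dataclasses import dataclass

@dataclass
class InnerState:
    goal: str      # e.g. "charging_station"
    mood: float    # 0.0 anxious ... 1.0 confident

class SelfModel:
    """Predicts which action a given inner state would produce in a situation."""
    def predict_action(self, state: InnerState, situation: str) -> str:
        if situation == "narrow_corridor" and state.mood < 0.5:
            return "wait_and_yield"
        return "walk_towards_" + state.goal

class ReflexiveController:
    def __init__(self, self_model: SelfModel, own_state: InnerState):
        self.self_model = self_model
        self.own_state = own_state

    def act(self, situation: str, estimated_other_state: InnerState) -> str:
        # Step 1: observe the own inner state and derive the default action.
        own_plan = self.self_model.predict_action(self.own_state, situation)
        # Step 2: apply the *same* model to the other agent's estimated state,
        # i.e. ask "what does this subject expect from me?".
        other_plan = self.self_model.predict_action(estimated_other_state, situation)
        # Step 3: adapt the own behaviour if both plans would collide.
        if (situation == "narrow_corridor"
                and own_plan.startswith("walk_towards")
                and other_plan.startswith("walk_towards")):
            return "step_aside"
        return own_plan
```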
Cruse and Schilling's study is part of the online publication "Open MIND." This approximately 2,000-page collection marks the 10-year anniversary of the MIND Group and contains 39 original articles by scientists from the fields of philosophy, psychology, and neuroscience. Dr. Thomas Metzinger of the University of Mainz is the initiator and co-editor of the volume. The collection can be accessed online for free at www.open-mind.net and is also available in print.
More information: Holk Cruse, Malte Schilling: "Mental States as Emergent Properties. From Walking to Consciousness." In: Thomas Metzinger, Jennifer Windt (eds.): Open MIND. MIND Group, 335–373, open-mind.net/papers/mental-states-as-emergent-properties-from-walking-to-consciousness, published 20 January 2015.
Provided by Bielefeld University