Tuesday, May 2, 2023

Cognitive revolution

From Wikipedia, the free encyclopedia

The cognitive revolution was an intellectual movement that began in the 1950s as an interdisciplinary study of the mind and its processes. It later became known collectively as cognitive science. The relevant areas of interchange were between the fields of psychology, linguistics, computer science, anthropology, neuroscience, and philosophy. The approaches used were developed within the then-nascent fields of artificial intelligence, computer science, and neuroscience. In the 1960s, the Harvard Center for Cognitive Studies and the Center for Human Information Processing at the University of California, San Diego were influential in developing the academic study of cognitive science. By the early 1970s, the cognitive movement had surpassed behaviorism as a psychological paradigm. Furthermore, by the early 1980s the cognitive approach had become the dominant line of research inquiry across most branches in the field of psychology.

A key goal of early cognitive psychology was to apply the scientific method to the study of human cognition. Some of the main ideas and developments from the cognitive revolution were the use of the scientific method in cognitive science research, the necessity of mental systems to process sensory input, the innateness of these systems, and the modularity of the mind. Important publications in triggering the cognitive revolution include psychologist George Miller's 1956 article "The Magical Number Seven, Plus or Minus Two" (one of the most frequently cited papers in psychology), linguist Noam Chomsky's Syntactic Structures (1957) and "Review of B. F. Skinner's Verbal Behavior" (1959), and foundational works in the field of artificial intelligence by John McCarthy, Marvin Minsky, Allen Newell, and Herbert Simon, such as the 1958 article "Elements of a Theory of Human Problem Solving". Ulric Neisser's 1967 book Cognitive Psychology was also a landmark contribution.

Historical background

Prior to the cognitive revolution, behaviorism was the dominant trend in psychology in the United States. Behaviorists were interested in "learning," which was seen as "the novel association of stimuli with responses." Animal experiments played a significant role in behaviorist research, and prominent behaviorist J. B. Watson, interested in describing the responses of humans and animals as one group, stated that there was no need to distinguish between the two. Watson hoped to learn to predict and control behavior through his research. The popular Hull-Spence stimulus-response approach was, according to George Mandler, impossible to use to research topics that held the interest of cognitive scientists, like memory and thought, because both the stimulus and the response were thought of as completely physical events. Behaviorists typically did not research these subjects. B. F. Skinner, a functionalist behaviorist, criticized certain mental concepts like instinct as "explanatory fiction(s)," ideas that assume more than humans actually know about a mental concept. Various types of behaviorists had different views on the exact role (if any) that consciousness and cognition played in behavior. Although behaviorism was popular in the United States, Europe was not particularly influenced by it, and research on cognition could easily be found in Europe during this time.

Noam Chomsky has framed the cognitive and behaviorist positions as rationalist and empiricist, respectively, which are philosophical positions that arose long before behaviorism became popular and the cognitive revolution occurred. Empiricists believe that humans acquire knowledge only through sensory input, while rationalists believe that there is something beyond sensory experience that contributes to human knowledge. However, whether Chomsky's position on language fits into the traditional rationalist approach has been questioned by philosopher John Cottingham.

George Miller, one of the scientists involved in the cognitive revolution, sets the date of its beginning as September 11, 1956, when several researchers from fields like experimental psychology, computer science, and theoretical linguistics presented their work on cognitive science-related topics at a meeting of the ‘Special Interest Group in Information Theory’ at the Massachusetts Institute of Technology. This interdisciplinary cooperation went by several names like cognitive studies and information-processing psychology but eventually came to be known as cognitive science. Grants from the Alfred P. Sloan Foundation in the 1970s advanced interdisciplinary understanding in the relevant fields and supported the research that led to the field of cognitive neuroscience.

Main ideas

George Miller states that six fields participated in the development of cognitive science: psychology, linguistics, computer science, anthropology, neuroscience, and philosophy, with the first three playing the main roles.

The scientific method

A key goal of early cognitive psychology was to apply the scientific method to the study of human cognition. This was done by designing experiments that used computational models of artificial intelligence to systematically test theories about human mental processes in a controlled laboratory setting.
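As a rough illustrative sketch (not from the article), the Python snippet below shows the general pattern: a toy capacity-limited memory model, loosely inspired by Miller's "magical number seven," generates quantitative predictions that could then be compared against data from a controlled digit-span experiment. The capacity value and the "observed" figures are assumptions chosen purely for illustration.

```python
# Illustrative sketch only: a toy capacity-limited memory model whose
# predictions could be compared against laboratory digit-span data.
# The capacity value and the "observed" numbers are assumptions.

def predicted_recall(list_length, capacity=7):
    """Toy model: items beyond a fixed capacity are lost."""
    return min(list_length, capacity) / list_length

observed = {5: 1.00, 7: 0.95, 9: 0.80, 12: 0.60}  # hypothetical lab data

for n, obs in observed.items():
    pred = predicted_recall(n)
    print(f"list length {n:2d}: predicted {pred:.2f}, observed {obs:.2f}")
```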

Mediation and information processing

When defining the "Cognitive Approach," Ulric Neisser says that humans can only interact with the "real world" through intermediary systems that process information like sensory input. As understood by a cognitive scientist, the study of cognition is the study of these systems and the ways they process information from the input. The processing includes not just the initial structuring and interpretation of the input but also the storage and later use.

Steven Pinker claims that the cognitive revolution bridged the gap between the physical world and the world of ideas, concepts, meanings and intentions. It unified the two worlds with a theory that mental life can be explained in terms of information, computation and feedback.

Innateness

In his 1975 book Reflections on Language, Noam Chomsky asks how humans can know so much despite relatively limited input. He argues that they must have some kind of innate, domain-specific learning mechanism that processes the input. Chomsky observes that physical organs do not develop based on experience but on inherent genetic coding, and writes that the mind should be treated the same way. He says there is no question that the mind has some kind of innate structure, but it is less agreed upon whether the same structure is used by all organisms for all types of learning. He compares humans and rats on the task of maze running to argue that a single learning theory cannot cover different species: if it did, the two species would be equally good at learning the task, which is not the case. He adds that even within humans, a single learning theory might cover multiple types of learning, but there is no solid evidence to suggest it does. He proposes that there is a biologically based language faculty that organizes the linguistic information in the input and constrains human language to a set of particular types of grammars. He introduces universal grammar, a set of inherent rules and principles shared by all humans that governs language, and says that the components of universal grammar are biological. To support this, he points out that children seem to know that language has a hierarchical structure, and they never make the mistakes one would expect if language were organized linearly.
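A classic example of this point is question formation. The short Python sketch below (an illustration added here, not Chomsky's own demonstration) applies a purely linear rule that fronts the first auxiliary verb in the string; the result is an error of a kind children do not make, whereas a structure-dependent rule that fronts the main-clause auxiliary gives the correct question.

```python
# Illustrative sketch: why a linear rule for question formation fails.

def linear_question(words):
    """Naive linear rule: move the first auxiliary verb to the front."""
    aux = ["is", "are", "was", "were"]
    i = next(idx for idx, w in enumerate(words) if w in aux)
    return [words[i]] + words[:i] + words[i + 1:]

declarative = "the man who is tall is happy".split()

# The linear rule fronts the auxiliary inside the relative clause:
print(" ".join(linear_question(declarative)))
# -> "is the man who tall is happy"   (ungrammatical; children do not say this)

# A structure-dependent rule fronts the main-clause auxiliary instead:
# -> "is the man who is tall happy"   (the question children actually produce)
```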

Steven Pinker has also written on this subject from the perspective of modern-day cognitive science. He says that modern cognitive scientists, like figures in the past such as Gottfried Wilhelm Leibniz (1646-1716), don't believe in the idea of the mind starting as a "blank slate." Though they disagree about where to draw the line between nature and nurture, they all believe that learning is based on something innate to humans; without this innateness, there would be no learning process. He points out that human behavior is open-ended even though the underlying biological machinery is finite. An example of this from linguistics is the fact that humans can produce an unlimited number of sentences, most of which are brand new to the speakers themselves, even though the words and phrases they have heard are finite.
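The Python sketch below (an illustration with an assumed toy grammar, not an example from Pinker) makes the point concrete: a finite lexicon plus one recursive rule yields an open-ended set of novel sentences.

```python
# Illustrative sketch: finite lexicon + recursion -> unbounded novel sentences.
import random

NOUNS = ["the dog", "the cat", "the child"]

def sentence(depth=0):
    """S -> N V N  |  N V that S; the recursive branch makes the set of sentences unbounded."""
    subject = random.choice(NOUNS)
    if depth < 3 and random.random() < 0.5:
        return f"{subject} {random.choice(['said', 'thought'])} that {sentence(depth + 1)}"
    return f"{subject} {random.choice(['chased', 'saw'])} {random.choice(NOUNS)}"

for _ in range(3):
    print(sentence())
# e.g. "the child thought that the dog chased the cat"
```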

Pinker, who agrees with Chomsky's idea of innate universal grammar, claims that although humans speak around six thousand mutually unintelligible languages, the grammatical programs in their minds differ far less than the actual speech. Many different languages can be used to convey the same concepts or ideas, which suggests there may be a common ground for all the languages.

Modularity of the mind

Pinker claims another important idea from the cognitive revolution was that the mind is modular, with many parts cooperating to generate a train of thought or an organized action. It has distinct systems for distinct tasks. Behaviors can vary across cultures, but the mental programs that generate them need not vary.

Criticism

There have been criticisms of the typical characterization of the shift from behaviorism to cognitivism.

Henry L. Roediger III argues that the common narrative most people believe about the cognitive revolution is inaccurate. The narrative he describes states that psychology started out well but lost its way and fell into behaviorism, but this was corrected by the cognitive revolution, which essentially put an end to behaviorism. He claims that behavior analysis is actually still an active area of research that produces successful results in psychology and points to the Association for Behavior Analysis International as evidence. He claims that behaviorist research is responsible for successful treatments of autism, stuttering, and aphasia, and that most psychologists actually study observable behavior, even if they interpret their results cognitively. He believes that the change from behaviorism to cognitivism was gradual, slowly evolving by building on behaviorism.

Lachman and Butterfield were among the first to imply that cognitive psychology has a revolutionary origin. Thomas H. Leahey has criticized the idea that the introduction of behaviorism and the cognitive revolution were actually revolutions and proposed an alternative history of American psychology as "a narrative of research traditions."

Other authors criticize behaviorism, but they also criticize the cognitive revolution for having adopted new forms of anti-mentalism.

Cognitive psychologist Jerome Bruner criticized the adoption of the computational theory of mind and the exclusion of meaning from cognitive science, and he characterized one of the primary objectives of the cognitive revolution as changing the study of psychology so that meaning was its core.

His understanding of the cognitive revolution revolves entirely around "meaning-making" and the hermeneutic description of how people go about this. He believes that the cognitive revolution steered psychology away from behaviorism and this was good, but then another form of anti-mentalism took its place: computationalism. Bruner states that the cognitive revolution should replace behaviorism rather than only modify it.

Neuroscientist Gerald Edelman argues in his book Bright Air, Brilliant Fire (1992) that a positive result of the emergence of "cognitive science" was the departure from "simplistic behaviorism". However, he adds, a negative result was the growing popularity of a total misconception of the nature of thought: the computational theory of mind, or cognitivism, which asserts that the brain is a computer that processes symbols whose meanings are entities of the objective world. In this view, the symbols of the mind correspond exactly to entities or categories in the world defined by criteria of necessary and sufficient conditions, that is, classical categories. These representations would be manipulated according to certain rules that constitute a syntax.

Edelman rejects the idea that objects of the world come in classical categories, and he also rejects the idea that the brain/mind is a computer. He rejects behaviorism (a point he also makes in his 2006 book Second Nature: Brain Science and Human Knowledge), but also cognitivism (the computational-representational theory of the mind), since the latter conceptualizes the mind as a computer and meaning as objective correspondence. Furthermore, Edelman criticizes "functionalism", the idea that the formal and abstract functional properties of the mind can be analyzed without making direct reference to the brain and its processes.

Edelman asserts that most of those who work in cognitive psychology and cognitive science seem to adhere to this computational view, but he mentions some important exceptions: John Searle, Jerome Bruner, George Lakoff, Ronald Langacker, Alan Gauld, Benny Shanon, Claes von Hofsten, and others. Edelman says he agrees with the critical and dissenting approaches of these authors.

Perceptual symbols, imagery and the cognitive neuroscience revolution

In their paper “The cognitive neuroscience revolution”, Gualtiero Piccinini and Worth Boone argue that cognitive neuroscience emerged as a discipline in the late 1980s. Prior to that time, cognitive science and neuroscience had largely developed in isolation. Cognitive science developed between the 1950s and 1970s as an interdisciplinary field composed primarily of aspects of psychology, linguistics, and computer science. However, both classical symbolic computational theories and connectionist models developed largely independently of biological considerations. The authors argue that connectionist models were closer to symbolic models than to neurobiology.

Piccinini and Boone state that a revolutionary change is currently taking place: the move from cognitive science (autonomous from neuroscience) to cognitive neuroscience. The authors point out that many researchers who previously carried out psychological and behavioral studies now give properly cognitive neuroscientific explanations. They mention the example of Stephen Kosslyn, who postulated his theory of the pictorial format of mental images in the 1980s based on behavioral studies. Later, with the advent of magnetic resonance imaging technology, Kosslyn was able to show that when people form mental images, the visual cortex is activated. This provided strong neuroscientific evidence for his theory of the pictorial format, refuting speculation about a supposed non-pictorial format of mental images.

According to Canales Johnson et al. (2021):

Many studies using imaging and neurophysiological techniques have shown several similarities in brain activity between visual imagery and visual perception, and have identified frontoparietal, occipital and temporal neural components of visual imagery.

— Canales Johnson et al.

Neuroscientist Joseph LeDoux argues in his book The Emotional Brain that cognitive science emerged around the middle of the 20th century and is often described as 'the new science of the mind.' In fact, however, cognitive science is a science of only one part of the mind, the part that has to do with thinking, reasoning, and intellect; it leaves emotions out. “And minds without emotions are not really minds at all…”

Psychologist Lawrence Barsalou argues that human cognitive processing involves the simulation of perceptual, motor, and emotional states. The classical, ‘intellectualist’ view of cognition holds that it essentially consists of processing propositional information of a verbal or numerical type. Barsalou's theory, by contrast, explains human conceptual processing through the activation of regions of the sensory cortices of different modalities, as well as of the motor cortex, and through the simulation of embodied experiences (visual, auditory, emotional, motor) that ground meaning in experience situated in the world.

Modal symbols are analogical mental representations linked to a specific sensory channel: for example, representing 'dog' through a visual image resembling a dog or through an auditory image of barking, based on the memory of experiences of seeing a dog or hearing it bark. Lawrence Barsalou's 'perceptual symbols' theory asserts that mental processes operate with modal symbols that retain the sensory properties of the perceptual experiences.

According to Barsalou (2020), the “grounded cognition” perspective in which his theory is framed asserts that cognition emerges from the interaction between amodal symbols, modal symbols, the body, and the world. Therefore, this perspective does not rule out 'classical' amodal symbols, such as those typical of verbal language or numerical reasoning, but rather considers that these interact with imagination, perception, and action situated in the world.
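As a purely illustrative data-structure sketch (the class and field names are assumptions, not Barsalou's formalism), a concept can be represented as an amodal label paired with modal symbols tied to specific sensory channels, which can be re-activated ('simulated') on demand.

```python
# Illustrative sketch only: amodal label + modal symbols for one concept.
from dataclasses import dataclass, field

@dataclass
class Concept:
    label: str                                          # amodal symbol, e.g. the word "dog"
    modal_symbols: dict = field(default_factory=dict)   # modality -> stored experience

    def simulate(self, modality):
        """Re-activate a stored modal representation (a 'simulation')."""
        return self.modal_symbols.get(modality)

dog = Concept("dog", {"visual": "image of a dog", "auditory": "sound of barking"})
print(dog.simulate("auditory"))  # -> "sound of barking"
```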
