
Saturday, February 13, 2021

Cognitive biology

From Wikipedia, the free encyclopedia

Cognitive biology is an emerging science that regards natural cognition as a biological function. It is based on the theoretical assumption that every organism—whether a single cell or multicellular—is continually engaged in systematic acts of cognition coupled with intentional behaviors, i.e., a sensory-motor coupling. That is to say, if an organism can sense stimuli in its environment and respond accordingly, it is cognitive. Any explanation of how natural cognition may manifest in an organism is constrained by the biological conditions in which its genes survive from one generation to the next. And since, by Darwinian theory, every species has evolved from a common root, three further elements of cognitive biology are required: (i) the study of cognition in one species of organism is useful, through contrast and comparison, to the study of another species’ cognitive abilities; (ii) it is useful to proceed from organisms with simpler to those with more complex cognitive systems; and (iii) the greater the number and variety of species studied in this regard, the more we understand the nature of cognition.

Overview

While cognitive science endeavors to explain human thought and the conscious mind, the work of cognitive biology is focused on the most fundamental process of cognition for any organism. In the past several decades, biologists have investigated cognition in organisms large and small, both plant and animal. “Mounting evidence suggests that even bacteria grapple with problems long familiar to cognitive scientists, including: integrating information from multiple sensory channels to marshal an effective response to fluctuating conditions; making decisions under conditions of uncertainty; communicating with conspecifics and others (honestly and deceptively); and coordinating collective behaviour to increase the chances of survival.” Without thinking or perceiving as humans would have it, an act of basic cognition is arguably a simple step-by-step process in which an organism senses a stimulus, finds an appropriate response in its repertoire, and enacts that response. However, the biological details of such basic cognition have neither been delineated for a great many species nor sufficiently generalized to stimulate further investigation. This lack of detail is due to the lack of a science dedicated to elucidating the cognitive ability common to all biological organisms. That is to say, a science of cognitive biology has yet to be established. A prolegomenon to such a science was presented in 2007, and several authors have published their thoughts on the subject since the late 1970s. Yet, as the examples in the next section suggest, there is neither consensus on the theory nor widespread application in practice.

Although the two terms are sometimes used synonymously, cognitive biology should not be confused with the biology of cognition in the sense used by adherents of the Chilean School of Biology of Cognition. Also known as the Santiago School, the biology of cognition is based on the work of Francisco Varela and Humberto Maturana, who crafted the doctrine of autopoiesis. Their work began in 1970, whereas the first mention of cognitive biology, by Brian Goodwin in 1977 (discussed below), came from a different perspective.

History

'Cognitive biology' first appeared in the literature as the title of a 1977 paper by Brian C. Goodwin. There and in several related publications, Goodwin explained the advantage of cognitive biology in the context of his work on morphogenesis. He subsequently moved on to other issues of structure, form, and complexity, with little further mention of cognitive biology. Without an advocate, Goodwin's concept of cognitive biology has yet to gain widespread acceptance.

Aside from an essay regarding Goodwin's conception by Margaret Boden in 1980, the next appearance of ‘cognitive biology’ as a phrase in the literature came in 1986 from a professor of biochemistry, Ladislav Kováč. His conception, based on natural principles grounded in bioenergetics and molecular biology, is briefly discussed below. Kováč's continued advocacy has had a greater influence in his homeland, Slovakia, than elsewhere, partly because several of his most important papers were written and published only in Slovak.

By the 1990s, breakthroughs in molecular, cell, evolutionary, and developmental biology had generated a cornucopia of data-based theory relevant to cognition. Yet aside from the theorists already mentioned, no one but Kováč was addressing cognitive biology.

Kováč’s cognitive biology

Ladislav Kováč's “Introduction to cognitive biology” (Kováč, 1986a) lists ten ‘Principles of Cognitive Biology.’ A closely related thirty-page paper was published the following year: “Overview: Bioenergetics between chemistry, genetics and physics” (Kováč, 1987). Over the following decades, Kováč elaborated, updated, and expanded these themes in frequent publications, including "Fundamental principles of cognitive biology" (Kováč, 2000), “Life, chemistry, and cognition” (Kováč, 2006a), "Information and Knowledge in Biology: Time for Reappraisal" (Kováč, 2007), and "Bioenergetics: A key to brain and mind" (Kováč, 2008).

Academic usage

University seminar

The concept of cognitive biology is exemplified by this seminar description:

Cognitive science has focused primarily on human cognitive activities. These include perceiving, remembering and learning, evaluating and deciding, planning actions, etc. But humans are not the only organisms that engage in these activities. Indeed, virtually all organisms need to be able to procure information both about their own condition and their environment and regulate their activities in ways appropriate to this information. In some cases species have developed distinctive ways of performing cognitive tasks. But in many cases these mechanisms have been conserved and modified in other species. This course will focus on a variety of organisms not usually considered in cognitive science such as bacteria, planaria, leeches, fruit flies, bees, birds and various rodents, asking about the sorts of cognitive activities these organisms perform, the mechanisms they employ to perform them, and what lessons about cognition more generally we might acquire from studying them.

University workgroup

The University of Adelaide has established a "Cognitive Biology" workgroup using this operating concept:

Cognition is, first and foremost, a natural biological phenomenon — regardless of how the engineering of artificial intelligence proceeds. As such, it makes sense to approach cognition like other biological phenomena. This means first assuming a meaningful degree of continuity among different types of organisms (an assumption borne out more and more by comparative biology, especially genomics), then studying simple model systems (e.g., microbes, worms, flies) to understand the basics, and finally scaling up to more complex examples, such as mammals and primates, including humans.

Members of the group study the biological literature on simple organisms (e.g., the nematode) with regard to cognitive processes and look for homologues in more complex organisms (e.g., the crow) that are already well studied. This comparative approach is expected to yield simple cognitive concepts common to all organisms. “It is hoped a theoretically well-grounded toolkit of basic cognitive concepts will facilitate the use and discussion of research carried out in different fields to increase understanding of two foundational issues: what cognition is and what cognition does in the biological context.” (Bold letters from original text.)

The group's choice of name, as they explain on a separate webpage, might have been ‘embodied cognition’ or ‘biological cognitive science.’ But the group chose ‘cognitive biology’ for the sake of (i) emphasis and (ii) method. For the sake of emphasis, (i) “We want to keep the focus on biology because for too long cognition was considered a function that could be almost entirely divorced from its physical instantiation, to the extent that whatever could be said of cognition almost by definition had to be applicable to both organisms and machines.” (ii) The method is to “assume (if only for the sake of enquiry) that cognition is a biological function similar to other biological functions—such as respiration, nutrient circulation, waste elimination, and so on.”

The method supposes that the genesis of cognition is biological, i.e., the method is biogenic. The host of the group's website has said elsewhere that cognitive biology requires a biogenic approach, having identified ten principles of biogenesis in an earlier work. The first four biogenic principles are quoted here to illustrate the depth at which the foundations have been set at the Adelaide school of cognitive biology:

  1. “Complex cognitive capacities have evolved from simpler forms of cognition. There is a continuous line of meaningful descent.”
  2. “Cognition directly or indirectly modulates the physico-chemical-electrical processes that constitute an organism.”
  3. “Cognition enables the establishment of reciprocal causal relations with an environment, leading to exchanges of matter and energy that are essential to the organism’s continued persistence, well-being or replication.”
  4. “Cognition relates to the (more or less) continuous assessment of system needs relative to prevailing circumstances, the potential for interaction, and whether the current interaction is working or not.”

Other universities

  • As another example, the Department für Kognitionsbiologie at the University of Vienna declares in its mission statement a strong commitment “to experimental evaluation of multiple, testable hypotheses” regarding cognition in terms of evolutionary and developmental history as well as adaptive function and mechanism, whether the mechanism is cognitive, neural, and/or hormonal. “The approach is strongly comparative: multiple species are studied, and compared within a rigorous phylogenetic framework, to understand the evolutionary history and adaptive function of cognitive mechanisms (‘cognitive phylogenetics’).” Their website offers a sample of their work: “Social Cognition and the Evolution of Language: Constructing Cognitive Phylogenies.”
  • A more restricted example can be found with the Cognitive Biology Group, Institute of Biology, Faculty of Science, Otto-von-Guericke University (OVGU) in Magdeburg, Germany. The group offers courses titled “Neurobiology of Consciousness” and “Cognitive Neurobiology.” Its website lists the papers generated from its lab work, focusing on the neural correlates of perceptual experience and visual attention. The group's current work is aimed at detailing a dynamic known as ‘multistable perception.’ The phenomenon, described in a sentence: “Certain visual displays are not perceived in a stable way but, from time to time and seemingly spontaneously, their appearance wavers and settles in a distinctly different form.”
  • A final example of university commitment to cognitive biology can be found at Comenius University in Bratislava, Slovakia. There, in the Faculty of Natural Sciences, the Bratislava Biocenter is presented as a consortium of research teams working in the biomedical sciences. Their website lists the Center for Cognitive Biology in the Department of Biochemistry at the top of the page, followed by five lab groups, each at a separate department of bioscience. The webpage for the Center for Cognitive Biology offers a link to "Foundations of Cognitive Biology," a page that simply contains a quotation from a paper authored by Ladislav Kováč, the site's founder. His perspective is briefly discussed above.

Cognitive biology as a category

The words ‘cognitive’ and ‘biology’ are also used together as the name of a category. The category of cognitive biology has no fixed content but, rather, the content varies with the user. If the content can only be recruited from cognitive science, then cognitive biology would seem limited to a selection of items in the main set of sciences included by the interdisciplinary concept—cognitive psychology, artificial intelligence, linguistics, philosophy, neuroscience, and cognitive anthropology. These six separate sciences were allied “to bridge the gap between brain and mind” with an interdisciplinary approach in the mid-1970s. Participating scientists were concerned only with human cognition. As it gained momentum, the growth of cognitive science in subsequent decades seemed to offer a big tent to a variety of researchers. Some, for example, considered evolutionary epistemology a fellow-traveler. Others appropriated the keyword, as for example Donald Griffin in 1978, when he advocated the establishment of cognitive ethology.

Meanwhile, breakthroughs in molecular, cell, evolutionary, and developmental biology generated a cornucopia of data-based theory relevant to cognition. Categorical assignments were problematic. For example, the decision to append cognitive to a body of biological research on neurons, e.g., the cognitive biology of neuroscience, is separate from the decision to put such a body of research in a category named cognitive sciences. No less difficult a decision need be made—between the computational and constructivist approaches to cognition, and the concomitant issue of simulated vs. embodied cognitive models—before appending biology to a body of cognitive research, e.g., the cognitive science of artificial life.

One solution is to consider cognitive biology only as a subset of cognitive science. For example, a major publisher's website displays links to material in a dozen domains of major scientific endeavor, one of which is described thus: “Cognitive science is the study of how the mind works, addressing cognitive functions such as perception and action, memory and learning, reasoning and problem solving, decision-making and consciousness.” Once selected from the display, the Cognitive Science page offers these topics in nearly alphabetical order: Cognitive Biology, Computer Science, Economics, Linguistics, Psychology, Philosophy, and Neuroscience. Selecting Cognitive Biology from that list leads to a page offering a selection of reviews and articles with biological content ranging from cognitive ethology through evolutionary epistemology; cognition and art; evo-devo and cognitive science; animal learning; genes and cognition; cognition and animal welfare; etc.

A different application of the cognitive biology category is manifest in the 2009 publication of papers presented at a three-day interdisciplinary workshop on “The New Cognitive Sciences” held at the Konrad Lorenz Institute for Evolution and Cognition Research in 2006. The papers were listed under four headings, each representing a different domain of requisite cognitive ability: (i) space, (ii) qualities and objects, (iii) numbers and probabilities, and (iv) social entities. The workshop papers examined topics ranging from “Animals as Natural Geometers” and “Color Generalization by Birds” through “Evolutionary Biology of Limited Attention” and “A Comparative Perspective on the Origin of Numerical Thinking” as well as “Neuroethology of Attention in Primates” and ten more with less colorful titles. “[O]n the last day of the workshop the participants agreed [that] the title ‘Cognitive Biology’ sounded like a potential candidate to capture the merging of the cognitive and the life sciences that the workshop aimed at representing.” Thus the publication of Tommasi et al. (2009), Cognitive Biology: Evolutionary and Developmental Perspectives on Mind, Brain and Behavior.

A final example of categorical use comes from an author’s introduction to his 2011 publication on the subject, Cognitive Biology: Dealing with Information from Bacteria to Minds. After discussing the differences between the cognitive and biological sciences, as well as the value of one to the other, the author concludes: “Thus, the object of this book should be considered as an attempt at building a new discipline, that of cognitive biology, which endeavors to bridge these two domains.” There follows a detailed methodology illustrated by examples in biology anchored by concepts from cybernetics (e.g., self-regulatory systems) and quantum information theory (regarding probabilistic changes of state) with an invitation "to consider system theory together with information theory as the formal tools that may ground biology and cognition as traditional mathematics grounds physics.”

Cognition

From Wikipedia, the free encyclopedia

A cognitive model, as illustrated by Robert Fludd (1619)

Cognition (/kɒɡˈnɪʃ(ə)n/) refers to "the mental action or process of acquiring knowledge and understanding through thought, experience, and the senses". It encompasses many aspects of intellectual functions and processes, such as attention, the formation of knowledge, memory and working memory, judgment and evaluation, reasoning and "computation", problem solving and decision making, and the comprehension and production of language. Cognitive processes use existing knowledge and generate new knowledge.

Cognitive processes are analyzed from different perspectives within different contexts, notably in the fields of linguistics, anesthesia, neuroscience, psychiatry, psychology, education, philosophy, anthropology, biology, systemics, logic, and computer science. These and other approaches to the analysis of cognition are synthesised in the developing field of cognitive science, a progressively autonomous academic discipline.

Etymology

The word cognition dates back to the 15th century, when it meant "thinking and awareness". The term comes from the Latin noun cognitio ('examination', 'learning', or 'knowledge'), derived from the verb cognosco, a compound of con ('with') and gnōscō ('know'). The latter half, gnōscō, is itself cognate with the Greek verb gi(g)nόsko (γι(γ)νώσκω, 'I know' or 'perceive').

Early studies

Despite the word cognitive itself dating back to the 15th century, attention to cognitive processes came about more than eighteen centuries earlier, beginning with Aristotle (384–322 BC) and his interest in the inner workings of the mind and how they affect the human experience. Aristotle focused on cognitive areas pertaining to memory, perception, and mental imagery. He placed great importance on ensuring that his studies were based on empirical evidence, that is, scientific information that is gathered through observation and conscientious experimentation. Two millennia later, the groundwork for modern concepts of cognition was laid during the Enlightenment by thinkers such as John Locke and Dugald Stewart who sought to develop a model of the mind in which ideas were acquired, remembered and manipulated.

During the early nineteenth century cognitive models were developed both in philosophy—particularly by authors writing about the philosophy of mind—and within medicine, especially by physicians seeking to understand how to cure madness. In Britain, these models were studied in the academy by scholars such as James Sully at University College London, and they were even used by politicians when considering the national Elementary Education Act of 1870.

As psychology emerged as a burgeoning field of study in Europe, whilst also gaining a following in America, scientists such as Wilhelm Wundt, Hermann Ebbinghaus, Mary Whiton Calkins, and William James would offer their contributions to the study of human cognition.

Early theorists

Wilhelm Wundt (1832–1920) emphasized the notion of what he called introspection: examining the inner feelings of an individual. With introspection, the subject had to be careful to describe their feelings in the most objective manner possible in order for Wundt to find the information scientific. Though Wundt's contributions are by no means minimal, modern psychologists find his methods to be quite subjective and choose to rely on more objective procedures of experimentation to make conclusions about the human cognitive process.

Hermann Ebbinghaus (1850–1909) conducted cognitive studies that mainly examined the function and capacity of human memory. Ebbinghaus developed his own experiment in which he constructed over 2,000 nonsense syllables, for instance EAS. He then examined his own personal ability to learn these non-words. He purposely chose non-words as opposed to real words to control for the influence of pre-existing experience of what the words might symbolize, which could have made them easier to recall. Ebbinghaus observed and hypothesized a number of variables that may have affected his ability to learn and recall the non-words he created. One, he concluded, was the amount of time between the presentation of the list of stimuli and its recitation or recall. Ebbinghaus was the first to record and plot a "learning curve" and a "forgetting curve". His work heavily influenced the study of serial position and its effect on memory (discussed further below).

Mary Whiton Calkins (1863–1930) was an influential American pioneer in the realm of psychology. Her work also focused on human memory capacity. A common theory, called the recency effect, can be attributed to the studies that she conducted. The recency effect, also discussed in the subsequent experiment section, is the tendency for individuals to be able to accurately recollect the final items presented in a sequence of stimuli. Calkins' theory is closely related to the aforementioned study and conclusions of the memory experiments conducted by Hermann Ebbinghaus.

William James (1842–1910) is another pivotal figure in the history of cognitive science. James was quite discontent with Wundt's emphasis on introspection and Ebbinghaus' use of nonsense stimuli. He instead chose to focus on the human learning experience in everyday life and its importance to the study of cognition. James' most significant contribution to the study and theory of cognition was his textbook Principles of Psychology that preliminarily examines aspects of cognition such as perception, memory, reasoning, and attention.

René Descartes (1596–1650) was a seventeenth-century philosopher who coined the phrase "Cogito, ergo sum" ("I think, therefore I am"). He took a philosophical approach to the study of cognition and the mind; with his Meditations he wanted people to meditate along with him, arriving at the same conclusions he did, but by their own free cognition.

Psychology

When the mind makes a generalization such as the concept of tree, it extracts similarities from numerous examples; the simplification enables higher-level thinking (abstract thinking).

In psychology, the term "cognition" is usually used within an information-processing view of an individual's psychological functions, as it is in cognitive engineering. In the study of social cognition, a branch of social psychology, the term is used to explain attitudes, attribution, and group dynamics.

Human cognition is conscious and unconscious, concrete or abstract, as well as intuitive (like knowledge of a language) and conceptual (like a model of a language). It encompasses processes such as memory, association, concept formation, pattern recognition, language, attention, perception, action, problem solving, and mental imagery. Traditionally, emotion was not thought of as a cognitive process, but now much research is being undertaken to examine the cognitive psychology of emotion; research is also focused on one's awareness of one's own strategies and methods of cognition, which is called metacognition.

While few people would deny that cognitive processes are a function of the brain, a cognitive theory will not necessarily make reference to the brain or to biological processes (cf. neurocognitive). It may purely describe behavior in terms of information flow or function. Relatively recent fields of study such as neuropsychology aim to bridge this gap, using cognitive paradigms to understand how the brain implements the information-processing functions (cf. cognitive neuroscience), or to understand how pure information-processing systems (e.g., computers) can simulate human cognition (cf. artificial intelligence). The branch of psychology that studies brain injury to infer normal cognitive function is called cognitive neuropsychology. The links of cognition to evolutionary demands are studied through the investigation of animal cognition.

Piaget's theory of cognitive development

For years, sociologists and psychologists have conducted studies on cognitive development, i.e. the construction of human thought or mental processes.

Jean Piaget was one of the most important and influential people in the field of developmental psychology. He believed that humans are unique in comparison to animals because we have the capacity to do "abstract symbolic reasoning". His work can be compared to that of Lev Vygotsky, Sigmund Freud, and Erik Erikson, who were also great contributors to the field of developmental psychology. Today, Piaget is known for studying the cognitive development of children, having studied his own three children and their intellectual development, from which he came to a theory of cognitive development that describes the developmental stages of childhood, summarized below.

  • Sensorimotor stage (infancy, 0–2 years): Intelligence is present; motor activity but no symbols; knowledge is developing yet limited; knowledge is based on experiences and interactions; mobility allows the child to learn new things; some language skills are developed at the end of this stage. The goal is to develop object permanence, achieving a basic understanding of causality, time, and space.
  • Preoperational stage (toddlerhood and early childhood, 2–7 years): Symbols or language skills are present; memory and imagination are developed; non-reversible and non-logical thinking; shows intuitive problem solving; begins to perceive relationships; grasps the concept of conservation of numbers; predominantly egocentric thinking.
  • Concrete operational stage (elementary and early adolescence, 7–12 years): Logical and systematic form of intelligence; manipulation of symbols related to concrete objects; thinking is now characterized by reversibility and the ability to take the role of another; grasps concepts of the conservation of mass, length, weight, and volume; predominantly operational thinking.
  • Formal operational stage (adolescence and adulthood, 12 years on): Logical use of symbols related to abstract concepts; acquires flexibility in thinking as well as the capacities for abstract thinking and mental hypothesis testing; can consider possible alternatives in complex reasoning and problem solving.

Common types of tests on human cognition

Serial position

The serial position experiment is meant to test a theory of memory which states that when information is presented serially, we tend to remember information at the beginning of the sequence, called the primacy effect, and information at the end of the sequence, called the recency effect. Consequently, information given in the middle of the sequence is typically forgotten, or not recalled as easily. This theory predicts that the recency effect is stronger than the primacy effect, because the information learned most recently is still in working memory when recall is requested, whereas information learned first still has to go through a retrieval process. This experiment focuses on human memory processes.
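To make the analysis concrete, here is a minimal Python sketch (the data and function are invented for illustration, not taken from any study) that tallies recall rates by presentation position; over many trials, such tallies trace out the familiar U-shaped serial position curve:

    # Hypothetical sketch: tally recall rates by serial position.
    # `trials` pairs each presented list with the set of items recalled;
    # the example data below are invented for illustration only.
    def recall_by_position(trials):
        n_positions = len(trials[0][0])
        hits = [0] * n_positions
        for presented, recalled in trials:
            for i, item in enumerate(presented):
                if item in recalled:
                    hits[i] += 1
        return [h / len(trials) for h in hits]

    trials = [
        (["dog", "sun", "cup", "map", "key"], {"dog", "key"}),
        (["pen", "car", "hat", "box", "jar"], {"pen", "jar", "box"}),
    ]
    print(recall_by_position(trials))  # higher rates expected at both ends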

Word superiority

The word superiority experiment presents a subject with a word, or a letter by itself, for a brief period of time (e.g., 40 ms); the subject is then asked to recall the letter that was in a particular location in the word. In theory, the subject should be better able to correctly recall the letter when it was presented in a word than when it was presented in isolation. This experiment focuses on human speech and language.

Brown-Peterson

In the Brown-Peterson experiment, participants are briefly presented with a trigram and, in one particular version of the experiment, are then given a distractor task, asking them to identify whether each item in a sequence of letter strings is a word or a non-word (due to being misspelled, etc.). After the distractor task, they are asked to recall the trigram presented before it. In theory, the longer the distractor task, the harder it will be for participants to correctly recall the trigram. This experiment focuses on human short-term memory.

Memory span

During the memory span experiment, each subject is presented with a sequence of stimuli of the same kind: words depicting objects, numbers, letters that sound similar, or letters that sound dissimilar. After being presented with the stimuli, the subject is asked to recall the sequence in the exact order in which it was given. In one particular version of the experiment, if the subject recalled a list correctly, the list length was increased by one for that type of material, and decreased by one if it was recalled incorrectly. The theory is that people have a memory span of about seven items for numbers, and the same for letters that sound dissimilar and for short words. The memory span is projected to be shorter for letters that sound similar and for longer words.
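The adaptive rule in that version of the experiment amounts to a simple staircase procedure. Below is a minimal Python sketch (the "true span" of the simulated subject is an invented parameter, not an empirical value) showing how the list length homes in on a subject's span:

    # Hypothetical sketch of the adaptive list-length rule: lengthen the list
    # after a correct recall, shorten it after an error.
    import random

    def next_length(length, recalled_correctly):
        return length + 1 if recalled_correctly else max(1, length - 1)

    true_span = 7  # invented subject parameter for this demonstration
    length = 3
    for trial in range(20):
        # Recall succeeds when the list fits within the span, with a small
        # chance of a lucky guess on longer lists.
        recalled = length <= true_span or random.random() < 0.1
        length = next_length(length, recalled)
    print("list length after 20 trials:", length)  # hovers near the true span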

Visual search

In one version of the visual search experiment, a participant is presented with a window that displays circles and squares scattered across it, and is asked to identify whether a green circle is present. In the feature search, the trial windows contain blue squares or circles and either one green circle or no green circle at all. In the conjunctive search, the trial windows contain blue circles or green squares and either a present or an absent green circle. The expectation is that in feature searches, reaction time, that is, the time it takes a participant to identify whether the green circle is present or not, should not change as the number of distractors increases. Conjunctive searches where the target is absent should have a longer reaction time than conjunctive searches where the target is present. The theory is that in feature searches, the target (or its absence) is easy to spot because of the difference in color between the target and the distractors, whereas in conjunctive searches where the target is absent, reaction time increases because the subject has to look at each shape to determine whether it is the target, since some of the distractors, if not all of them, are the same color as the target. Conjunctive searches where the target is present take less time because the search stops as soon as the target is found.
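The standard way to quantify these predictions is to regress reaction time on display size: feature search should yield a near-flat slope, conjunctive search a positive one. A minimal Python sketch follows, with reaction times invented for illustration:

    # Hypothetical sketch: slope of reaction time (ms) versus display size.
    import numpy as np

    set_sizes = np.array([4, 8, 16, 32])
    rt_feature = np.array([420, 425, 422, 428])       # invented: roughly flat
    rt_conjunctive = np.array([450, 540, 720, 1080])  # invented: grows with size

    for name, rts in [("feature", rt_feature), ("conjunctive", rt_conjunctive)]:
        slope, intercept = np.polyfit(set_sizes, rts, 1)  # linear fit
        print(f"{name} search: {slope:.1f} ms per additional item")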

Knowledge representation

The semantic network of knowledge representation systems has been studied in various paradigms. One of the oldest is the leveling and sharpening of stories as they are repeated from memory, studied by Bartlett. The semantic differential used factor analysis to determine the main meanings of words, finding that the value or "goodness" of words is the first factor. More controlled experiments examine the categorical relationships of words in free recall. The hierarchical structure of words has been explicitly mapped in George Miller's WordNet. More dynamic models of semantic networks have been created and tested with neural network experiments based on computational systems such as latent semantic analysis (LSA), Bayesian analysis, and multidimensional factor analysis. The semantics (meaning) of words is studied by all the disciplines of cognitive science.
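As a concrete illustration of one of the models mentioned, latent semantic analysis reduces a word-by-document count matrix with a truncated singular value decomposition, so that documents about similar topics end up near each other in the reduced space. The Python sketch below (the four-document corpus is a toy example invented for illustration) uses scikit-learn:

    # Hypothetical sketch of latent semantic analysis (LSA) on a toy corpus.
    from sklearn.decomposition import TruncatedSVD
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    docs = [
        "the cat sat on the mat",
        "a cat and a kitten purred",
        "stocks fell as markets closed",
        "investors sold stocks and bonds",
    ]
    counts = CountVectorizer().fit_transform(docs)      # word-by-document counts
    lsa = TruncatedSVD(n_components=2, random_state=0)  # keep 2 latent dimensions
    vectors = lsa.fit_transform(counts)
    # The two animal documents should be more similar to each other than to
    # the two finance documents, and vice versa.
    print(cosine_similarity(vectors))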

Metacognition

Metacognition is "cognition about cognition", "thinking about thinking", "knowing about knowing", becoming "aware of one's awareness" and higher-order thinking skills. The term comes from the root word meta, meaning "beyond", or "on top of". Metacognition can take many forms; it includes knowledge about when and how to use particular strategies for learning or problem-solving. There are generally two components of metacognition: (1) knowledge about cognition and (2) regulation of cognition.

Metamemory, defined as knowing about memory and mnemonic strategies, is an especially important form of metacognition. Academic research on metacognitive processing across cultures is in the early stages, but there are indications that further work may provide better outcomes in cross-cultural learning between teachers and students.

Writings on metacognition date back at least as far as two works by the Greek philosopher Aristotle (384–322 BC): On the Soul and the Parva Naturalia.

Improving cognition

Physical exercise

Aerobic and anaerobic exercise have been studied with respect to cognitive improvement. Some studies show short-term increases in attention span and in verbal and visual memory. However, the effects are transient and diminish over time and after cessation of the physical activity.

Dietary supplements

Studies evaluating phytoestrogen, blueberry supplementation, and antioxidants showed minor increases in cognitive function after supplementation compared to before, but no significant effects compared to placebo.

Pleasurable social stimulation

Exposing individuals with cognitive impairment (e.g., dementia) to daily activities designed to stimulate thinking and memory in a social setting seems to improve cognition. Although the studies are small, and larger studies need to confirm the results, the effect of social cognitive stimulation seems to be larger than the effects of some drug treatments.

Other methods

Transcranial magnetic stimulation (TMS) has been shown to improve cognition in individuals without dementia one month after a treatment session compared to before treatment, although the effect was not significantly larger than placebo. Computerized cognitive training, utilising a computer-based training regimen for different cognitive functions, has been examined in clinical settings, but no lasting effects have been shown.

Neuroimaging

From Wikipedia, the free encyclopedia

Parasagittal MRI of the head in a patient with benign familial macrocephaly.
Purpose: directly or indirectly image the structure, function, or pharmacology of the nervous system.

Neuroimaging or brain imaging is the use of various techniques to either directly or indirectly image the structure, function, or pharmacology of the nervous system. It is a relatively new discipline within medicine, neuroscience, and psychology. Physicians who specialize in the performance and interpretation of neuroimaging in the clinical setting are neuroradiologists. Neuroimaging falls into two broad categories:

  • Structural imaging, which deals with the structure of the nervous system and the diagnosis of gross (large-scale) intracranial disease, such as a tumor, or injury.
  • Functional imaging, which enables, for example, the processing of information by centers in the brain to be visualized directly. Such processing causes the involved area of the brain to increase metabolism and "light up" on the scan. One of the more controversial uses of neuroimaging has been researching "thought identification" or mind-reading.

History

Functional magnetic resonance imaging (fMRI) of a head, from top to base of the skull

The first chapter of the history of neuroimaging traces back to the Italian neuroscientist Angelo Mosso, who invented the 'human circulation balance', which could non-invasively measure the redistribution of blood during emotional and intellectual activity.

In 1918, the American neurosurgeon Walter Dandy introduced the technique of ventriculography. X-ray images of the ventricular system within the brain were obtained by injection of filtered air directly into one or both lateral ventricles of the brain. Dandy also observed that air introduced into the subarachnoid space via lumbar spinal puncture could enter the cerebral ventricles and also demonstrate the cerebrospinal fluid compartments around the base of the brain and over its surface. This technique was called pneumoencephalography.

In 1927, Egas Moniz introduced cerebral angiography, whereby both normal and abnormal blood vessels in and around the brain could be visualized with great precision.

In the early 1970s, Allan McLeod Cormack and Godfrey Newbold Hounsfield introduced computerized axial tomography (CAT or CT scanning), and ever more detailed anatomic images of the brain became available for diagnostic and research purposes. Cormack and Hounsfield won the 1979 Nobel Prize for Physiology or Medicine for their work. Soon after the introduction of CAT, the development of radioligands in the early 1980s allowed single photon emission computed tomography (SPECT) and positron emission tomography (PET) of the brain.

More or less concurrently, magnetic resonance imaging (MRI or MR scanning) was developed by researchers including Peter Mansfield and Paul Lauterbur, who were awarded the Nobel Prize for Physiology or Medicine in 2003. In the early 1980s MRI was introduced clinically, and during the 1980s a veritable explosion of technical refinements and diagnostic MR applications took place. Scientists soon learned that the large blood flow changes measured by PET could also be imaged by the correct type of MRI. Functional magnetic resonance imaging (fMRI) was born, and since the 1990s, fMRI has come to dominate the brain mapping field due to its low invasiveness, lack of radiation exposure, and relatively wide availability.

In the early 2000s, the field of neuroimaging reached the stage where limited practical applications of functional brain imaging had become feasible. The main application area is crude forms of brain-computer interface.

Indications

Neuroimaging follows a neurological examination in which a physician has found cause to more deeply investigate a patient who has or may have a neurological disorder.

One of the more common neurological problems which a person may experience is simple syncope. In cases of simple syncope in which the patient's history does not suggest other neurological symptoms, the diagnosis includes a neurological examination but routine neurological imaging is not indicated because the likelihood of finding a cause in the central nervous system is extremely low and the patient is unlikely to benefit from the procedure.

Neuroimaging is not indicated for patients with stable headaches which are diagnosed as migraine. Studies indicate that presence of migraine does not increase a patient's risk for intracranial disease. A diagnosis of migraine which notes the absence of other problems, such as papilledema, would not indicate a need for neuroimaging. In the course of conducting a careful diagnosis, the physician should consider whether the headache has a cause other than the migraine and might require neuroimaging.

Another indication for neuroimaging is CT-, MRI- and PET-guided stereotactic surgery or radiosurgery for treatment of intracranial tumors, arteriovenous malformations and other surgically treatable conditions.

Brain imaging techniques

Computed axial tomography

Computed tomography (CT) or Computed Axial Tomography (CAT) scanning uses a series of x-rays of the head taken from many different directions. Typically used for quickly viewing brain injuries, CT scanning uses a computer program that performs a numerical integral calculation (the inverse Radon transform) on the measured x-ray series to estimate how much of an x-ray beam is absorbed in a small volume of the brain. Typically the information is presented as cross-sections of the brain.
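The reconstruction step can be sketched with scikit-image's Radon-transform utilities; this is an illustration of the general technique (filtered back-projection on a synthetic phantom), not of any clinical scanner's software:

    # Sketch of CT-style reconstruction: simulate projections with the Radon
    # transform, then invert them by filtered back-projection.
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon

    image = shepp_logan_phantom()                   # standard synthetic "head"
    theta = np.linspace(0.0, 180.0, max(image.shape), endpoint=False)
    sinogram = radon(image, theta=theta)            # simulated x-ray projections
    reconstruction = iradon(sinogram, theta=theta)  # approximate inverse Radon
    print("RMS error:", np.sqrt(np.mean((reconstruction - image) ** 2)))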

Diffuse optical imaging

Diffuse optical imaging (DOI) or diffuse optical tomography (DOT) is a medical imaging modality which uses near infrared light to generate images of the body. The technique measures the optical absorption of haemoglobin, and relies on the absorption spectrum of haemoglobin varying with its oxygenation status. High-density diffuse optical tomography (HD-DOT) has been compared directly to fMRI using response to visual stimulation in subjects studied with both techniques, with reassuringly similar results. HD-DOT has also been compared to fMRI in terms of language tasks and resting state functional connectivity.

Event-related optical signal

Event-related optical signal (EROS) is a brain-scanning technique which uses infrared light through optical fibers to measure changes in optical properties of active areas of the cerebral cortex. Whereas techniques such as diffuse optical imaging (DOT) and near-infrared spectroscopy (NIRS) measure optical absorption of haemoglobin, and thus are based on blood flow, EROS takes advantage of the scattering properties of the neurons themselves and thus provides a much more direct measure of cellular activity. EROS can pinpoint activity in the brain within millimeters (spatially) and within milliseconds (temporally). Its biggest downside is the inability to detect activity more than a few centimeters deep. EROS is a new, relatively inexpensive technique that is non-invasive to the test subject. It was developed at the University of Illinois at Urbana-Champaign where it is now used in the Cognitive Neuroimaging Laboratory of Dr. Gabriele Gratton and Dr. Monica Fabiani.

Magnetic resonance imaging

Sagittal MRI slice at the midline.

Magnetic resonance imaging (MRI) uses magnetic fields and radio waves to produce high quality two- or three-dimensional images of brain structures without the use of ionizing radiation (X-rays) or radioactive tracers.

The record for the highest spatial resolution of a whole intact (postmortem) brain is 100 microns, from Massachusetts General Hospital; the data were published in Nature on 30 October 2019.

Functional magnetic resonance imaging

Axial MRI slice at the level of the basal ganglia, showing fMRI BOLD signal changes overlaid in red (increase) and blue (decrease) tones.

Functional magnetic resonance imaging (fMRI) and arterial spin labeling (ASL) rely on the paramagnetic properties of oxygenated and deoxygenated hemoglobin to produce images of changing blood flow in the brain associated with neural activity. This allows images to be generated that reflect which brain structures are activated (and how) during the performance of different tasks or at resting state. According to the oxygenation hypothesis, changes in oxygen usage in regional cerebral blood flow during cognitive or behavioral activity can be associated with the regional neurons directly related to the cognitive or behavioral task being attended.

Most fMRI scanners allow subjects to be presented with different visual images, sounds and touch stimuli, and to make different actions such as pressing a button or moving a joystick. Consequently, fMRI can be used to reveal brain structures and processes associated with perception, thought and action. The resolution of fMRI is about 2-3 millimeters at present, limited by the spatial spread of the hemodynamic response to neural activity. It has largely superseded PET for the study of brain activation patterns. PET, however, retains the significant advantage of being able to identify specific brain receptors (or transporters) associated with particular neurotransmitters through its ability to image radiolabelled receptor "ligands" (receptor ligands are any chemicals that stick to receptors).
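The hemodynamic response that limits fMRI's spatial resolution also unfolds slowly in time, which is why analysis pipelines commonly model the expected signal as a stimulus time course convolved with a hemodynamic response function (HRF). Here is a minimal Python sketch, assuming a common double-gamma HRF shape and an invented block design:

    # Sketch: predict a BOLD time course by convolving a stimulus design with
    # a canonical double-gamma hemodynamic response function (HRF).
    import numpy as np
    from scipy.stats import gamma

    t = np.arange(0, 30, 1.0)                     # 30 s of HRF at 1 s sampling
    hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6  # early peak, late undershoot
    hrf /= hrf.sum()                              # normalize to unit area

    stimulus = np.zeros(120)               # invented 120-volume block design
    stimulus[10:15] = stimulus[60:65] = 1  # two 5-volume task blocks
    predicted_bold = np.convolve(stimulus, hrf)[:len(stimulus)]
    print(predicted_bold.round(2)[:25])    # delayed, smoothed response to block 1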

As well as research on healthy subjects, fMRI is increasingly used for the medical diagnosis of disease. Because fMRI is exquisitely sensitive to oxygen usage in blood flow, it is extremely sensitive to early changes in the brain resulting from ischemia (abnormally low blood flow), such as the changes which follow stroke. Early diagnosis of certain types of stroke is increasingly important in neurology, since substances which dissolve blood clots may be used in the first few hours after certain types of stroke occur, but are dangerous to use afterward. Brain changes seen on fMRI may help to make the decision to treat with these agents. With between 72% and 90% accuracy, where chance would achieve 0.8%, fMRI techniques can determine which of a set of known images the subject is viewing.
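Decoding of that kind is usually framed as pattern classification: a classifier is trained on voxel activity patterns labeled by the image shown, then scored on held-out trials. The Python sketch below uses synthetic data (not the data behind the accuracy figures above) to show the structure of such an analysis:

    # Sketch: decode which of four "images" was viewed from synthetic voxel
    # patterns, scoring a classifier by cross-validation.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_voxels, n_images = 120, 50, 4
    labels = rng.integers(0, n_images, n_trials)      # which image was shown
    patterns = rng.normal(size=(n_trials, n_voxels))  # background noise
    patterns[np.arange(n_trials), labels] += 2.0      # image-specific signal

    clf = LogisticRegression(max_iter=1000)
    scores = cross_val_score(clf, patterns, labels, cv=5)
    print(f"decoding accuracy: {scores.mean():.2f} (chance: {1 / n_images:.2f})")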

Magnetoencephalography

Magnetoencephalography (MEG) is an imaging technique used to measure the magnetic fields produced by electrical activity in the brain via extremely sensitive devices such as superconducting quantum interference devices (SQUIDs) or spin exchange relaxation-free (SERF) magnetometers. MEG offers a very direct measurement of neural electrical activity (compared to fMRI for example) with very high temporal resolution but relatively low spatial resolution. The advantage of measuring the magnetic fields produced by neural activity is that they are likely to be less distorted by surrounding tissue (particularly the skull and scalp) compared to the electric fields measured by electroencephalography (EEG). Specifically, it can be shown that magnetic fields produced by electrical activity are not affected by the surrounding head tissue, when the head is modeled as a set of concentric spherical shells, each being an isotropic homogeneous conductor. Real heads are non-spherical and have largely anisotropic conductivities (particularly white matter and skull). While skull anisotropy has a negligible effect on MEG (unlike EEG), white matter anisotropy strongly affects MEG measurements for radial and deep sources. Note, however, that the skull was assumed to be uniformly anisotropic in this study, which is not true for a real head: the absolute and relative thicknesses of diploë and tables layers vary among and within the skull bones. This makes it likely that MEG is also affected by the skull anisotropy, although probably not to the same degree as EEG.
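The spherical-shell result can be stated concretely. In the spherically symmetric conductor model, the volume currents contribute nothing to the radial field outside the head, so that component depends only on the primary (dipole) current; a standard form of the result, included here for illustration, is

    $$B_r(\mathbf{r}) = \frac{\mu_0}{4\pi}\,\frac{(\mathbf{r}_Q \times \mathbf{Q})\cdot\hat{\mathbf{r}}}{|\mathbf{r}-\mathbf{r}_Q|^{3}},$$

where $\mathbf{Q}$ is a current dipole located at $\mathbf{r}_Q$ inside the conductor and $\hat{\mathbf{r}}$ is the outward radial unit vector. One consequence, consistent with the remarks above about radial sources: a purely radial dipole ($\mathbf{Q} \parallel \mathbf{r}_Q$) makes the numerator vanish and is invisible to MEG in the spherical model.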

There are many uses for MEG, including assisting surgeons in localizing a pathology, assisting researchers in determining the function of various parts of the brain, neurofeedback, and others.

Positron emission tomography

Positron emission tomography (PET), including brain PET, measures emissions from radioactively labeled, metabolically active chemicals that have been injected into the bloodstream. The emission data are computer-processed to produce two- or three-dimensional images of the distribution of the chemicals throughout the brain. The positron-emitting radioisotopes used are produced by a cyclotron, and chemicals are labeled with these radioactive atoms. The labeled compound, called a radiotracer, is injected into the bloodstream and eventually makes its way to the brain. Sensors in the PET scanner detect the radioactivity as the compound accumulates in various regions of the brain. A computer uses the data gathered by the sensors to create multicolored two- or three-dimensional images that show where the compound acts in the brain. Especially useful is the wide array of ligands used to map different aspects of neurotransmitter activity; by far the most commonly used PET tracer is a labeled form of glucose.

The greatest benefit of PET scanning is that different compounds can show blood flow and oxygen and glucose metabolism in the tissues of the working brain. These measurements reflect the amount of brain activity in the various regions of the brain and make it possible to learn more about how the brain works. PET scans were superior to all other metabolic imaging methods in terms of resolution and speed of completion (as little as 30 seconds) when they first became available. The improved resolution permitted better study of the brain areas activated by a particular task. The biggest drawback of PET scanning is that because the radioactivity decays rapidly, it is limited to monitoring short tasks. Before fMRI technology came online, PET scanning was the preferred method of functional (as opposed to structural) brain imaging, and it continues to make large contributions to neuroscience.

PET scanning is also used for diagnosis of brain disease, most notably because brain tumors, strokes, and neuron-damaging diseases which cause dementia (such as Alzheimer's disease) all cause great changes in brain metabolism, which in turn causes easily detectable changes in PET scans. PET is probably most useful in early cases of certain dementias (classic examples being Alzheimer's disease and Pick's disease), where the early damage is too diffuse, and makes too little difference in brain volume and gross structure, to change CT and standard MRI images enough to reliably differentiate it from the "normal" range of cortical atrophy which occurs with aging (in many, but not all, persons) and which does not cause clinical dementia.

Single-photon emission computed tomography

Single-photon emission computed tomography (SPECT) is similar to PET and uses gamma ray-emitting radioisotopes and a gamma camera to record data that a computer uses to construct two- or three-dimensional images of active brain regions. SPECT relies on an injection of radioactive tracer, or "SPECT agent," which is rapidly taken up by the brain but does not redistribute. Uptake of SPECT agent is nearly 100% complete within 30 to 60 seconds, reflecting cerebral blood flow (CBF) at the time of injection. These properties of SPECT make it particularly well-suited for epilepsy imaging, which is usually made difficult by problems with patient movement and variable seizure types. SPECT provides a "snapshot" of cerebral blood flow, since scans can be acquired after seizure termination (so long as the radioactive tracer was injected at the time of the seizure). A significant limitation of SPECT is its poor resolution (about 1 cm) compared to that of MRI. Today, SPECT machines with dual detector heads are commonly used, although machines with triple detector heads are available in the marketplace. Tomographic reconstruction (mainly used for functional "snapshots" of the brain) requires multiple projections from detector heads that rotate around the human skull, so some researchers have developed 6- and 11-detector-head SPECT machines to cut imaging time and give higher resolution.

Like PET, SPECT also can be used to differentiate different kinds of disease processes which produce dementia, and it is increasingly used for this purpose. Neuro-PET has a disadvantage of requiring the use of tracers with half-lives of at most 110 minutes, such as FDG. These must be made in a cyclotron, and are expensive or even unavailable if necessary transport times are prolonged more than a few half-lives. SPECT, however, is able to make use of tracers with much longer half-lives, such as technetium-99m, and as a result, is far more widely available.

Cranial ultrasound

Cranial ultrasound is usually only used in babies, whose open fontanelles provide acoustic windows allowing ultrasound imaging of the brain. Advantages include the absence of ionising radiation and the possibility of bedside scanning, but the lack of soft-tissue detail means MRI is preferred for some conditions.

Functional ultrasound imaging

Functional ultrasound imaging (fUS) is a medical ultrasound imaging technique for detecting or measuring changes in neural activity or metabolism, for example the loci of brain activity, typically through measuring blood flow or hemodynamic changes. Functional ultrasound relies on ultrasensitive Doppler and ultrafast ultrasound imaging, which allow high-sensitivity blood-flow imaging.

Advantages and concerns of neuroimaging techniques

Functional Magnetic Resonance Imaging (fMRI)

fMRI is commonly classified as minimal-to-moderate risk due to its non-invasiveness compared with other imaging methods. fMRI uses blood oxygenation level dependent (BOLD) contrast to produce its form of imaging. Because BOLD contrast arises from a naturally occurring process in the body, fMRI is often preferred over imaging methods that require radioactive markers to produce similar images. A concern in the use of fMRI is its use in individuals with medical implants or devices and metallic items in the body. The magnetic fields generated by the equipment can cause failure of medical devices and attract metallic objects in the body if they are not properly screened for. Currently, the FDA classifies medical implants and devices into three categories, depending on MR compatibility: MR-safe (safe in all MR environments), MR-unsafe (unsafe in any MR environment), and MR-conditional (MR-compatible in certain environments, requiring further information).

Computed Tomography (CT) Scan

The CT scan was introduced in the 1970s and quickly became one of the most widely used methods of imaging. A CT scan can be performed in under a second and produce rapid results for clinicians; its ease of use led to an increase in CT scans performed in the United States from 3 million in 1980 to 62 million in 2007. Clinicians oftentimes order multiple scans, with 30% of individuals undergoing at least 3 scans in one study of CT scan usage. CT scans can expose patients to levels of radiation 100–500 times higher than traditional x-rays, with higher radiation doses producing better-resolution imaging.

While CT scans are easy to use, their increasing use, especially in asymptomatic patients, is a topic of concern, since patients are exposed to significantly high levels of radiation.

Positron Emission Tomography (PET)

In PET scans, imaging relies not on intrinsic biological processes but on a foreign substance injected into the bloodstream that travels to the brain. Patients are injected with radioisotopes that are metabolized in the brain and emit positrons to produce a visualization of brain activity. The amount of radiation a patient is exposed to in a PET scan is relatively small, comparable to the amount of environmental radiation an individual is exposed to across a year. PET radioisotopes have limited exposure time in the body, as they commonly have very short half-lives (~2 hours) and decay rapidly. Currently, fMRI is a preferred method of imaging brain activity compared to PET, since it does not involve radiation, has higher temporal resolution than PET, and is more readily available in most medical settings.

Magnetoencephalography (MEG) and Electroencephalography (EEG)

The high temporal resolution of MEG and EEG allows these methods to measure brain activity down to the millisecond. Neither MEG nor EEG requires exposing the patient to radiation. EEG electrodes detect electrical signals produced by neurons to measure brain activity, while MEG uses oscillations in the magnetic field produced by these electrical currents. A barrier to the widespread use of MEG is its price, as MEG systems can cost millions of dollars. EEG is a much more widely used method of achieving such temporal resolution, as EEG systems cost far less than MEG systems. A disadvantage of both EEG and MEG is their poor spatial resolution compared to fMRI.

Criticism and cautions

Some scientists have criticized the brain image-based claims made in scientific journals and the popular press, such as the discovery of "the part of the brain responsible" for functions like talents, specific memories, or generating emotions such as love. Many mapping techniques have a relatively low resolution, with hundreds of thousands of neurons in a single voxel. Many functions also involve multiple parts of the brain, meaning that this type of claim is probably both unverifiable with the equipment used and generally based on an incorrect assumption about how brain functions are divided. It may be that most brain functions will only be described correctly after being measured with much more fine-grained measurements that look not at large regions but instead at a very large number of tiny individual brain circuits. Many of these studies also have technical problems, such as small sample size or poor equipment calibration, which means they cannot be reproduced; such considerations are sometimes ignored to produce a sensational journal article or news headline. In some cases the brain mapping techniques are used for commercial purposes, lie detection, or medical diagnosis in ways which have not been scientifically validated.

 

Friday, February 12, 2021

Neurolinguistics

From Wikipedia, the free encyclopedia

Surface of the human brain, with Brodmann areas numbered
 
An image of neural pathways in the brain taken using diffusion tensor imaging

Neurolinguistics is the study of the neural mechanisms in the human brain that control the comprehension, production, and acquisition of language. As an interdisciplinary field, neurolinguistics draws methods and theories from fields such as neuroscience, linguistics, cognitive science, communication disorders and neuropsychology. Researchers are drawn to the field from a variety of backgrounds, bringing along a variety of experimental techniques as well as widely varying theoretical perspectives. Much work in neurolinguistics is informed by models in psycholinguistics and theoretical linguistics, and is focused on investigating how the brain can implement the processes that theoretical linguistics and psycholinguistics propose are necessary in producing and comprehending language. Neurolinguists study the physiological mechanisms by which the brain processes information related to language, and evaluate linguistic and psycholinguistic theories, using aphasiology, brain imaging, electrophysiology, and computer modeling.

History

Neurolinguistics is historically rooted in the development in the 19th century of aphasiology, the study of linguistic deficits (aphasias) occurring as the result of brain damage. Aphasiology attempts to correlate structure to function by analyzing the effect of brain injuries on language processing. One of the first people to draw a connection between a particular brain area and language processing was Paul Broca, a French surgeon who conducted autopsies on numerous individuals who had speaking deficiencies, and found that most of them had brain damage (or lesions) on the left frontal lobe, in an area now known as Broca's area. Phrenologists had made the claim in the early 19th century that different brain regions carried out different functions and that language was mostly controlled by the frontal regions of the brain, but Broca's research was possibly the first to offer empirical evidence for such a relationship, and has been described as "epoch-making" and "pivotal" to the fields of neurolinguistics and cognitive science. Later, Carl Wernicke, after whom Wernicke's area is named, proposed that different areas of the brain were specialized for different linguistic tasks, with Broca's area handling the motor production of speech, and Wernicke's area handling auditory speech comprehension.

The work of Broca and Wernicke established the field of aphasiology and the idea that language can be studied through examining physical characteristics of the brain. Early work in aphasiology also benefited from the early twentieth-century work of Korbinian Brodmann, who "mapped" the surface of the brain, dividing it up into numbered areas based on each area's cytoarchitecture (cell structure) and function; these areas, known as Brodmann areas, are still widely used in neuroscience today.

The coining of the term "neurolinguistics" is attributed to Edith Crowell Trager, Henri Hecaen, and Alexandr Luria in the late 1940s and 1950s; Luria's book "Problems in Neurolinguistics" is likely the first book with "neurolinguistics" in the title. Harry Whitaker popularized neurolinguistics in the United States in the 1970s, founding the journal "Brain and Language" in 1974.

Although aphasiology is the historical core of neurolinguistics, in recent years the field has broadened considerably, thanks in part to the emergence of new brain imaging technologies (such as PET and fMRI) and time-sensitive electrophysiological techniques (EEG and MEG), which can highlight patterns of brain activation as people engage in various language tasks; electrophysiological techniques, in particular, emerged as a viable method for the study of language in 1980 with the discovery of the N400, a brain response shown to be sensitive to semantic issues in language comprehension. The N400 was the first language-relevant event-related potential to be identified, and since its discovery EEG and MEG have become increasingly widely used for conducting language research.

Interaction with other fields

Neurolinguistics is closely related to the field of psycholinguistics, which seeks to elucidate the cognitive mechanisms of language by employing the traditional techniques of experimental psychology; today, psycholinguistic and neurolinguistic theories often inform one another, and there is much collaboration between the two fields.

Much work in neurolinguistics involves testing and evaluating theories put forth by psycholinguists and theoretical linguists. In general, theoretical linguists propose models to explain the structure of language and how language information is organized, psycholinguists propose models and algorithms to explain how language information is processed in the mind, and neurolinguists analyze brain activity to infer how biological structures (populations and networks of neurons) carry out those psycholinguistic processing algorithms. For example, experiments in sentence processing have used the ELAN, N400, and P600 brain responses to examine how physiological brain responses reflect the different predictions of sentence processing models put forth by psycholinguists, such as Janet Fodor and Lyn Frazier's "serial" model, and Theo Vosse and Gerard Kempen's "unification model". Neurolinguists can also make new predictions about the structure and organization of language based on insights about the physiology of the brain, by "generalizing from the knowledge of neurological structures to language structure".

Neurolinguistics research is carried out in all the major areas of linguistics; the main linguistic subfields, and how neurolinguistics addresses them, are given in the table below.

Subfield | Description | Research questions in neurolinguistics
Phonetics | the study of speech sounds | how the brain extracts speech sounds from an acoustic signal; how the brain separates speech sounds from background noise
Phonology | the study of how sounds are organized in a language | how the phonological system of a particular language is represented in the brain
Morphology and lexicology | the study of how words are structured and stored in the mental lexicon | how the brain stores and accesses words that a person knows
Syntax | the study of how multiple-word utterances are constructed | how the brain combines words into constituents and sentences; how structural and semantic information is used in understanding sentences
Semantics | the study of how meaning is encoded in language |

Topics considered

Neurolinguistics research investigates several topics, including where language information is processed, how language processing unfolds over time, how brain structures are related to language acquisition and learning, and how neurophysiology can contribute to speech and language pathology.

Localizations of language processes

Much work in neurolinguistics has, like Broca's and Wernicke's early studies, investigated the locations of specific language "modules" within the brain. Research questions include what course language information follows through the brain as it is processed, whether or not particular areas specialize in processing particular sorts of information, how different brain regions interact with one another in language processing, and how the locations of brain activation differ when a subject is producing or perceiving a language other than his or her first language.

Time course of language processes

Another area of neurolinguistics literature involves the use of electrophysiological techniques to analyze the rapid processing of language in time. The temporal ordering of specific patterns of brain activity may reflect discrete computational processes that the brain undergoes during language processing; for example, one neurolinguistic theory of sentence parsing proposes that three brain responses (the ELAN, N400, and P600) are products of three different steps in syntactic and semantic processing.
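
As a rough illustration of how such time-locked components are recovered from noisy recordings, here is a minimal trial-averaging sketch in Python; all shapes, latencies, and noise levels are invented for the example.

```python
import numpy as np

# Minimal sketch of ERP averaging: single-trial EEG is noisy, so many
# trials time-locked to the same event are averaged, leaving components
# such as the ELAN, N400, or P600. All values here are simulated.
rng = np.random.default_rng(0)
n_trials, n_samples = 100, 800          # 800 ms of data at 1000 Hz
times_ms = np.arange(n_samples)         # 1 sample = 1 ms

# Simulate a negative deflection around 400 ms (an "N400") buried in noise.
n400 = -3.0 * np.exp(-((times_ms - 400) ** 2) / (2 * 50 ** 2))
epochs = n400 + rng.normal(0, 10, size=(n_trials, n_samples))

erp = epochs.mean(axis=0)  # averaging cancels the noise, the component remains
print(f"most negative point at ~{times_ms[erp.argmin()]} ms")
```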

Language acquisition

Another topic is the relationship between brain structures and language acquisition. Research in first language acquisition has already established that infants from all linguistic environments go through similar and predictable stages (such as babbling), and some neurolinguistics research attempts to find correlations between stages of language development and stages of brain development, while other research investigates the physical changes (known as neuroplasticity) that the brain undergoes during second language acquisition, when adults learn a new language. Neuroplasticity is observed during both second language acquisition and language learning experience more broadly; such language exposure has been associated with increases in gray and white matter in children, young adults, and the elderly.

Ping Li, Jennifer Legault, and Kaitlyn A. Litcofsky (2014). "Neuroplasticity as a function of second language learning: Anatomical changes in the human brain." Cortex: A Journal Devoted to the Study of the Nervous System & Behavior. doi:10.1016/j.cortex.2014.05.001. PMID 24996640.

Language pathology

Neurolinguistic techniques are also used to study disorders and breakdowns in language, such as aphasia and dyslexia, and how they relate to physical characteristics of the brain.

Technology used

Images of the brain recorded with PET (top) and fMRI (bottom). In the PET image, the red areas are the most active. In the fMRI image, the yellowest areas are the areas that show the greatest difference in activation between two tasks (watching a moving stimulus, versus watching a black screen).

Since one of the focuses of this field is the testing of linguistic and psycholinguistic models, the technology used for experiments is highly relevant to the study of neurolinguistics. Modern brain imaging techniques have contributed greatly to a growing understanding of the anatomical organization of linguistic functions. Brain imaging methods used in neurolinguistics may be classified into hemodynamic methods, electrophysiological methods, and methods that stimulate the cortex directly.

Hemodynamic

Hemodynamic techniques take advantage of the fact that when an area of the brain works at a task, blood is sent to supply that area with oxygen (in what is known as the Blood Oxygen Level-Dependent, or BOLD, response). Such techniques include PET and fMRI. These techniques provide high spatial resolution, allowing researchers to pinpoint the location of activity within the brain; temporal resolution (or information about the timing of brain activity), on the other hand, is poor, since the BOLD response happens much more slowly than language processing. In addition to demonstrating which parts of the brain may subserve specific language tasks or computations, hemodynamic methods have also been used to demonstrate how the structure of the brain's language architecture and the distribution of language-related activation may change over time, as a function of linguistic exposure.
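
The sluggishness of the BOLD response can be made concrete with a sketch of a canonical "double-gamma" hemodynamic response function; the parameters below follow widely used SPM-style defaults and are assumed here for illustration.

```python
import numpy as np
from scipy.stats import gamma

# Sketch of a canonical "double-gamma" hemodynamic response function,
# the shape commonly used to model the BOLD response. Parameters are
# assumed SPM-style defaults, for illustration only.
t = np.arange(0, 30, 0.1)                      # seconds after a stimulus
hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6   # main peak minus undershoot

print(f"BOLD peaks ~{t[hrf.argmax()]:.1f} s after the stimulus")  # ~5 s
# A response peaking seconds after the event cannot resolve language
# processes that unfold over tens of milliseconds.
```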

In addition to PET and fMRI, which show which areas of the brain are activated by certain tasks, researchers also use diffusion tensor imaging (DTI), which shows the neural pathways that connect different brain areas, thus providing insight into how different areas interact. Functional near-infrared spectroscopy (fNIRS) is another hemodynamic method used in language tasks.

Electrophysiological

Brain waves recorded using EEG

Electrophysiological techniques take advantage of the fact that when a group of neurons in the brain fire together, they create an electric dipole or current. The technique of EEG measures this electric current using sensors on the scalp, while MEG measures the magnetic fields that are generated by these currents. In addition to these non-invasive methods, electrocorticography has also been used to study language processing. These techniques are able to measure brain activity from one millisecond to the next, providing excellent temporal resolution, which is important in studying processes that take place as quickly as language comprehension and production. On the other hand, the location of brain activity can be difficult to identify in EEG; consequently, this technique is used primarily to investigate how language processes are carried out, rather than where. Research using EEG and MEG generally focuses on event-related potentials (ERPs), which are distinct brain responses (generally realized as negative or positive peaks on a graph of neural activity) elicited in response to a particular stimulus. Studies using ERPs may focus on each component's latency (how long after the stimulus the ERP begins or peaks), amplitude (how high or low the peak is), or topography (where on the scalp the ERP response is picked up by sensors). Some important and common ERP components include the N400 (a negativity occurring at a latency of about 400 milliseconds), the mismatch negativity, the early left anterior negativity (a negativity occurring at an early latency with a front-left topography), the P600, and the lateralized readiness potential.
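
A minimal sketch of how a component's amplitude and latency might be measured from an averaged waveform; the toy waveform and the 300-500 ms search window (typical for the N400) are assumptions for illustration.

```python
import numpy as np

# Sketch: measure an ERP component's amplitude and latency from an
# averaged waveform sampled at 1 kHz. Waveform and window are invented.
times_ms = np.arange(800)
erp = -3.0 * np.exp(-((times_ms - 412) ** 2) / (2 * 50 ** 2))  # toy ERP, in uV

window = (times_ms >= 300) & (times_ms <= 500)    # assumed N400 search window
idx = np.where(window)[0][erp[window].argmin()]   # most negative point
print(f"N400: amplitude {erp[idx]:.2f} uV, latency {times_ms[idx]} ms")
```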

Experimental design

Experimental techniques

Neurolinguists employ a variety of experimental techniques in order to use brain imaging to draw conclusions about how language is represented and processed in the brain. These techniques include the subtraction paradigm, mismatch design, violation-based studies, various forms of priming, and direct stimulation of the brain.

Subtraction

Many language studies, particularly in fMRI, use the subtraction paradigm, in which brain activation in a task thought to involve some aspect of language processing is compared against activation in a baseline task thought to involve similar non-linguistic processes but not to involve the linguistic process. For example, activations while participants read words may be compared to baseline activations while participants read strings of random letters (in an attempt to isolate activation related to lexical processing, the processing of real words), or activations while participants read syntactically complex sentences may be compared to baseline activations while participants read simpler sentences.
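
A minimal sketch of the subtraction logic for a single voxel or region, using invented activation values and a paired t-test across subjects:

```python
import numpy as np
from scipy.stats import ttest_rel

# Sketch of the subtraction paradigm for one voxel or region: compare
# activation in the language task against the baseline task across
# subjects. All values are invented for illustration.
words   = np.array([2.1, 1.8, 2.4, 2.0, 1.9, 2.3])  # % BOLD change, word reading
letters = np.array([1.2, 1.1, 1.6, 1.0, 1.3, 1.4])  # baseline: random letters

contrast = words - letters          # the "subtraction" itself
t, p = ttest_rel(words, letters)    # is the difference reliable?
print(f"mean difference {contrast.mean():.2f}, t = {t:.2f}, p = {p:.4f}")
```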

Mismatch paradigm

The mismatch negativity (MMN) is a rigorously documented ERP component frequently used in neurolinguistic experiments. It is an electrophysiological response that occurs in the brain when a subject hears a "deviant" stimulus in a set of perceptually identical "standards" (as in the sequence s s s s s s s d d s s s s s s d s s s s s d). Since the MMN is elicited only in response to a rare "oddball" stimulus in a set of other stimuli that are perceived to be the same, it has been used to test how speakers perceive sounds and organize stimuli categorically. For example, a landmark study by Colin Phillips and colleagues used the mismatch negativity as evidence that subjects, when presented with a series of speech sounds with varying acoustic parameters, perceived all the sounds as either /t/ or /d/ in spite of the acoustic variability, suggesting that the human brain has representations of abstract phonemes; in other words, the subjects were "hearing" not the specific acoustic features, but only the abstract phonemes. In addition, the mismatch negativity has been used to study syntactic processing and the recognition of word category.
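
The oddball logic of the mismatch paradigm, and the computation of the MMN as a deviant-minus-standard difference wave, can be sketched as follows; all signals here are simulated for illustration.

```python
import numpy as np

# Sketch of the mismatch (oddball) paradigm: rare "deviants" among
# repeated "standards", with the MMN computed as the difference between
# the averaged responses. All data are simulated for illustration.
rng = np.random.default_rng(1)
sequence = rng.choice(["s", "d"], size=400, p=[0.85, 0.15])  # s s s d s ...

n = 600                                   # 600 ms epochs at 1 kHz
t = np.arange(n)
mmn_shape = -2.0 * np.exp(-((t - 180) ** 2) / (2 * 40 ** 2))  # extra negativity

def epoch(kind):
    """One simulated epoch; deviants carry an extra negativity ~180 ms."""
    return rng.normal(0, 5, n) + (mmn_shape if kind == "d" else 0)

standard = np.mean([epoch(k) for k in sequence if k == "s"], axis=0)
deviant  = np.mean([epoch(k) for k in sequence if k == "d"], axis=0)
mmn = deviant - standard   # difference wave: a negativity around 150-250 ms
print(f"MMN minimum at ~{t[mmn.argmin()]} ms")
```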

Violation-based

Many studies in neurolinguistics take advantage of anomalies or violations of syntactic or semantic rules in experimental stimuli, analyzing the brain responses elicited when a subject encounters these violations. For example, sentences beginning with phrases such as *the garden was on the worked, which violates an English phrase structure rule, often elicit a brain response called the early left anterior negativity (ELAN). Violation techniques have been in use since at least 1980, when Kutas and Hillyard first reported ERP evidence that semantic violations elicited an N400 effect. Using similar methods, in 1992, Lee Osterhout first reported the P600 response to syntactic anomalies. Violation designs have also been used for hemodynamic studies (fMRI and PET): Embick and colleagues, for example, used grammatical and spelling violations to investigate the location of syntactic processing in the brain using fMRI. Another common use of violation designs is to combine two kinds of violations in the same sentence and thus make predictions about how different language processes interact with one another; this type of crossing-violation study has been used extensively to investigate how syntactic and semantic processes interact while people read or hear sentences.

Priming

In psycholinguistics and neurolinguistics, priming refers to the phenomenon whereby a subject can recognize a word more quickly if he or she has recently been presented with a word that is similar in meaning or morphological makeup (i.e., composed of similar parts). If a subject is presented with a "prime" word such as doctor and then a "target" word such as nurse, and the subject responds to nurse faster than usual, the experimenter may infer that the word nurse had already been activated in the brain when the word doctor was accessed. Priming is used to investigate a wide variety of questions about how words are stored and retrieved in the brain and how structurally complex sentences are processed.
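
A minimal sketch of how a priming effect might be quantified from response times; the values and the paired t-test are illustrative assumptions, not data from any particular study.

```python
import numpy as np
from scipy.stats import ttest_rel

# Sketch of quantifying a priming effect: response times (ms) to the same
# targets after related vs. unrelated primes. All values are invented.
related   = np.array([512, 498, 530, 505, 521, 490])  # doctor -> nurse
unrelated = np.array([561, 543, 570, 549, 566, 538])  # table  -> nurse

priming_effect = unrelated.mean() - related.mean()    # facilitation in ms
t, p = ttest_rel(unrelated, related)
print(f"priming effect {priming_effect:.0f} ms, t = {t:.2f}, p = {p:.4f}")
```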

Stimulation

Transcranial magnetic stimulation (TMS), a relatively new noninvasive technique for studying brain activity, uses powerful magnetic fields applied to the brain from outside the head. It is a method of exciting or interrupting brain activity in a specific and controlled location, and thus is able to imitate aphasic symptoms while giving the researcher more control over exactly which parts of the brain will be examined. As such, it is a less invasive alternative to direct cortical stimulation, which can be used for similar types of research but requires that the subject's skull be opened to expose the cortex, and is thus only used on individuals who are already undergoing a major brain operation (such as individuals undergoing surgery for epilepsy). The logic behind TMS and direct cortical stimulation is similar to the logic behind aphasiology: if a particular language function is impaired when a specific region of the brain is knocked out, then that region must be somehow implicated in that language function. Few neurolinguistic studies to date have used TMS; direct cortical stimulation and cortical recording (recording brain activity using electrodes placed directly on the brain) have been used with macaque monkeys to make predictions about the behavior of human brains.

Subject tasks

In many neurolinguistics experiments, subjects do not simply sit and listen to or watch stimuli, but also are instructed to perform some sort of task in response to the stimuli. Subjects perform these tasks while recordings (electrophysiological or hemodynamic) are being taken, usually in order to ensure that they are paying attention to the stimuli. At least one study has suggested that the task the subject does has an effect on the brain responses and the results of the experiment.

Lexical decision

The lexical decision task involves subjects seeing or hearing an isolated word and answering whether or not it is a real word. It is frequently used in priming studies, since subjects are known to make a lexical decision more quickly if a word has been primed by a related word (as in "doctor" priming "nurse").

Grammaticality judgment, acceptability judgment

Many studies, especially violation-based studies, have subjects make a decision about the "acceptability" (usually grammatical acceptability or semantic acceptability) of stimuli. Such a task is often used to "ensure that subjects [are] reading the sentences attentively and that they [distinguish] acceptable from unacceptable sentences in the way the [experimenter] expect[s] them to do."

Experimental evidence has shown that the instructions given to subjects in an acceptability judgment task can influence the subjects' brain responses to stimuli. One experiment showed that when subjects were instructed to judge the "acceptability" of sentences they did not show an N400 brain response (a response commonly associated with semantic processing), but that they did show that response when instructed to ignore grammatical acceptability and only judge whether or not the sentences "made sense".

Probe verification

Some studies use a "probe verification" task rather than an overt acceptability judgment; in this paradigm, each experimental sentence is followed by a "probe word", and subjects must answer whether or not the probe word had appeared in the sentence. This task, like the acceptability judgment task, ensures that subjects are reading or listening attentively, but may avoid some of the additional processing demands of acceptability judgments, and may be used no matter what type of violation is being presented in the study.

Truth-value judgment

Subjects may be instructed to judge not whether the sentence is grammatically acceptable or logical, but whether the proposition it expresses is true or false. This task is commonly used in psycholinguistic studies of child language.

Active distraction and double-task

Some experiments give subjects a "distractor" task to ensure that subjects are not consciously paying attention to the experimental stimuli; this may be done to test whether a certain computation in the brain is carried out automatically, regardless of whether the subject devotes attentional resources to it. For example, one study had subjects listen to non-linguistic tones (long beeps and buzzes) in one ear and speech in the other ear, and instructed subjects to press a button when they perceived a change in the tone; this supposedly caused subjects not to pay explicit attention to grammatical violations in the speech stimuli. The subjects showed a mismatch response (MMN) anyway, suggesting that the processing of the grammatical errors was happening automatically, regardless of attention—or at least that subjects were unable to consciously separate their attention from the speech stimuli.

Another related form of experiment is the double-task experiment, in which a subject must perform an extra task (such as sequential finger-tapping or articulating nonsense syllables) while responding to linguistic stimuli; this kind of experiment has been used to investigate the use of working memory in language processing.
