Saturday, April 3, 2021

Lobotomy

From Wikipedia, the free encyclopedia

"Dr. Walter Freeman, left, and Dr. James W. Watts study an X ray before a psychosurgical operation. Psychosurgery is cutting into the brain to form new patterns and rid a patient of delusions, obsessions, nervous tensions and the like." Waldemar Kaempffert, "Turning the Mind Inside Out", Saturday Evening Post, 24 May 1941.
Other names: Leukotomy, leucotomy
ICD-9-CM: 01.32
MeSH: D011612

A lobotomy, or leucotomy, is a form of psychosurgery, a neurosurgical treatment of mental disorder in which most of the connections to and from the prefrontal cortex, the anterior part of the brain's frontal lobes, are severed.[2] For more than two decades it was used as a mainstream procedure in some Western countries for treating mental disorders and occasionally other conditions, despite general recognition of frequent and serious side effects. Some patients improved in some ways after the operation, but complications and impairments – sometimes severe – were frequent. The procedure was controversial from its initial use, in part because of the balance between benefits and risks. Today, lobotomy has become a disparaged procedure, a byword for medical barbarism and an exemplary instance of the medical trampling of patients' rights.

The originator of the procedure, Portuguese neurologist António Egas Moniz, shared the Nobel Prize for Physiology or Medicine of 1949 for the "discovery of the therapeutic value of leucotomy in certain psychoses", although the awarding of the prize has been subject to controversy.

The use of the procedure increased dramatically from the early 1940s into the 1950s; by 1951, almost 20,000 lobotomies had been performed in the United States, and proportionally more in the United Kingdom. The majority of lobotomies were performed on women: a 1951 study of American hospitals found that nearly 60% of lobotomy patients were women, and limited data show that 74% of lobotomies performed in Ontario from 1948 to 1952 were on women. From the 1950s onward, lobotomy began to be abandoned, first in the Soviet Union and then in Europe. The term is derived from Greek: λοβός lobos "lobe" and τομή tomē "cut, slice".

Effects

I fully realize that this operation will have little effect on her mental condition but am willing to have it done in the hope that she will be more comfortable and easier to care for.

— Comments added to the consent form for a lobotomy operation on "Helaine Strauss", the pseudonym used for "a patient at an elite private hospital".

Historically, patients of lobotomy were, immediately following surgery, often stuporous, confused, and incontinent. Some developed an enormous appetite and gained considerable weight. Seizures were another common complication of surgery. Emphasis was put on the training of patients in the weeks and months following surgery.

The purpose of the operation was to reduce the symptoms of mental disorders, and it was recognized that this was accomplished at the expense of a person's personality and intellect. British psychiatrist Maurice Partridge, who conducted a follow-up study of 300 patients, said that the treatment achieved its effects by "reducing the complexity of psychic life". Following the operation, spontaneity, responsiveness, self-awareness, and self-control were reduced. Activity was replaced by inertia, and people were left emotionally blunted and restricted in their intellectual range.

The consequences of the operation have been described as "mixed". Some patients died as a result of the operation and others later committed suicide. Some were left severely brain damaged. Others were able to leave the hospital, or became more manageable within the hospital. A few people managed to return to responsible work, while at the other extreme, people were left with severe and disabling impairments. Most people fell into an intermediate group, left with some improvement of their symptoms but also with emotional and intellectual deficits to which they made a better or worse adjustment. On average, there was a mortality rate of approximately 5% during the 1940s.

The lobotomy procedure could have severe negative effects on a patient's personality and ability to function independently. Lobotomy patients often show a marked reduction in initiative and inhibition. They may also exhibit difficulty putting themselves in the position of others because of decreased cognition and detachment from society.

Walter Freeman coined the term "surgically induced childhood" and used it constantly to refer to the results of lobotomy. The operation left people with an "infantile personality"; a period of maturation would then, according to Freeman, lead to recovery. In an unpublished memoir, he described how the "personality of the patient was changed in some way in the hope of rendering him more amenable to the social pressures under which he is supposed to exist." He described one 29-year-old woman as being, following lobotomy, a "smiling, lazy and satisfactory patient with the personality of an oyster" who could not remember Freeman's name and endlessly poured coffee from an empty pot. When her parents had difficulty dealing with her behavior, Freeman advised a system of rewards (ice cream) and punishment (smacks).

History

Insulin shock therapy administered in Helsinki in the 1950s

In the early 20th century, the number of patients residing in mental hospitals increased significantly while little in the way of effective medical treatment was available. Lobotomy was one of a series of radical and invasive physical therapies developed in Europe at this time that signaled a break with a psychiatric culture of therapeutic nihilism that had prevailed since the late nineteenth century. The new "heroic" physical therapies devised during this experimental era, including malarial therapy for general paresis of the insane (1917), deep sleep therapy (1920), insulin shock therapy (1933), cardiazol shock therapy (1934), and electroconvulsive therapy (1938), helped to imbue the then therapeutically moribund and demoralised psychiatric profession with a renewed sense of optimism in the curability of insanity and the potency of their craft. The success of the shock therapies, despite the considerable risk they posed to patients, also helped to accommodate psychiatrists to ever more drastic forms of medical intervention, including lobotomy.

The clinician-historian Joel Braslow argues that from malarial therapy onward to lobotomy, physical psychiatric therapies "spiral closer and closer to the interior of the brain", with this organ increasingly taking "center stage as a source of disease and site of cure". For Roy Porter, once the doyen of medical history, the often violent and invasive psychiatric interventions developed during the 1930s and 1940s are indicative of both the well-intentioned desire of psychiatrists to find some medical means of alleviating the suffering of the vast number of patients then in psychiatric hospitals and also the relative lack of social power of those same patients to resist the increasingly radical and even reckless interventions of asylum doctors. Many doctors, patients and family members of the period believed that, despite potentially catastrophic consequences, the results of lobotomy were seemingly positive in many instances or, at least, were deemed as such when measured against the apparent alternative of long-term institutionalisation. Lobotomy has always been controversial, but for a period it was part of the medical mainstream, even feted and regarded as a legitimate last-resort remedy for categories of patients who were otherwise regarded as hopeless.

Early psychosurgery

Gottlieb Burckhardt (1836–1907)

Before the 1930s, individual doctors had infrequently experimented with novel surgical operations on the brains of those deemed insane. Most notably in 1888, the Swiss psychiatrist Gottlieb Burckhardt initiated what is commonly considered the first systematic attempt at modern human psychosurgery. He operated on six chronic patients under his care at the Swiss Préfargier Asylum, removing sections of their cerebral cortex. Burckhardt's decision to operate was informed by three pervasive views on the nature of mental illness and its relationship to the brain. First, the belief that mental illness was organic in nature, and reflected an underlying brain pathology; next, that the nervous system was organized according to an associationist model comprising an input or afferent system (a sensory center), a connecting system where information processing took place (an association center), and an output or efferent system (a motor center); and, finally, a modular conception of the brain whereby discrete mental faculties were connected to specific regions of the brain. Burckhardt's hypothesis was that by deliberately creating lesions in regions of the brain identified as association centers a transformation in behavior might ensue. According to his model, those mentally ill might experience "excitations abnormal in quality, quantity and intensity" in the sensory regions of the brain and this abnormal stimulation would then be transmitted to the motor regions giving rise to mental pathology. He reasoned, however, that removing material from either of the sensory or motor zones could give rise to "grave functional disturbance". Instead, by targeting the association centers and creating a "ditch" around the motor region of the temporal lobe, he hoped to break their lines of communication and thus alleviate both mental symptoms and the experience of mental distress.

Ludvig Puusepp c. 1920

Intending to ameliorate symptoms in those with violent and intractable conditions rather than effect a cure, Burckhardt began operating on patients in December 1888, but both his surgical methods and instruments were crude and the results of the procedure were mixed at best. He operated on six patients in total and, according to his own assessment, two experienced no change, two patients became quieter, one patient experienced epileptic convulsions and died a few days after the operation, and one patient improved. Complications included motor weakness, epilepsy, sensory aphasia and "word deafness". Claiming a success rate of 50 percent, he presented the results at the Berlin Medical Congress and published a report, but the response from his medical peers was hostile and he did no further operations.

In 1912, two physicians based in Saint Petersburg, the leading Russian neurologist Vladimir Bekhterev and his younger Estonian colleague, the neurosurgeon Ludvig Puusepp, published a paper reviewing a range of surgical interventions that had been performed on the mentally ill. While generally treating these endeavours favorably, in their consideration of psychosurgery they reserved unremitting scorn for Burckhardt's surgical experiments of 1888 and opined that it was extraordinary that a trained medical doctor could undertake such an unsound procedure.

We have quoted this data to show not only how groundless but also how dangerous these operations were. We are unable to explain how their author, holder of a degree in medicine, could bring himself to carry them out ...

The authors neglected to mention, however, that in 1910 Puusepp himself had performed surgery on the brains of three mentally ill patients, sectioning the cortex between the frontal and parietal lobes. He had abandoned these attempts because of unsatisfactory results and this experience probably inspired the invective that was directed at Burckhardt in the 1912 article. By 1937, Puusepp, despite his earlier criticism of Burckhardt, was increasingly persuaded that psychosurgery could be a valid medical intervention for the mentally disturbed. In the late 1930s, he worked closely with the neurosurgical team of the Racconigi Hospital near Turin to establish it as an early and influential centre for the adoption of leucotomy in Italy.

Development

Egas Moniz

Leucotomy was first undertaken in 1935 under the direction of the Portuguese neurologist (and inventor of the term psychosurgery) António Egas Moniz. First developing an interest in psychiatric conditions and their somatic treatment in the early 1930s, Moniz apparently conceived a new opportunity for recognition in the development of a surgical intervention on the brain as a treatment for mental illness.

Frontal lobes

The source of inspiration for Moniz's decision to hazard psychosurgery has been clouded by contradictory statements made on the subject by Moniz and others both contemporaneously and retrospectively. The traditional narrative addresses the question of why Moniz targeted the frontal lobes by reference to the work of the Yale neuroscientist John Fulton and, most dramatically, to a presentation Fulton made with his junior colleague Carlyle Jacobsen at the Second International Congress of Neurology held in London in 1935. Fulton's primary area of research was the cortical function of primates, and he had established America's first primate neurophysiology laboratory at Yale in the early 1930s. At the 1935 congress, with Moniz in attendance, Fulton and Jacobsen presented two chimpanzees, named Becky and Lucy, who had had frontal lobectomies and subsequent changes in behaviour and intellectual function. According to Fulton's account of the congress, they explained that before surgery both animals, and especially Becky, the more emotional of the two, exhibited "frustrational behaviour" – that is, they could have tantrums that included rolling on the floor and defecating – if, because of their poor performance in a set of experimental tasks, they were not rewarded. Following the surgical removal of their frontal lobes, the behaviour of both primates changed markedly, and Becky was pacified to such a degree that Jacobsen apparently stated it was as if she had joined a "happiness cult". During the question and answer section of the paper, Moniz, it is alleged, "startled" Fulton by inquiring whether this procedure might be extended to human subjects suffering from mental illness. Fulton stated that he replied that while possible in theory it was surely "too formidable" an intervention for use on humans.

Brain animation: left frontal lobe highlighted in red. Moniz targeted the frontal lobes in the leucotomy procedure he first conceived in 1933.

That Moniz began his experiments with leucotomy just three months after the congress has reinforced the apparent cause and effect relationship between the Fulton and Jacobsen presentation and the Portuguese neurologist's resolve to operate on the frontal lobes. As the author of this account Fulton, who has sometimes been claimed as the father of lobotomy, was later able to record that the technique had its true origination in his laboratory. Endorsing this version of events, in 1949, the Harvard neurologist Stanley Cobb remarked during his presidential address to the American Neurological Association that "seldom in the history of medicine has a laboratory observation been so quickly and dramatically translated into a therapeutic procedure". Fulton's report, penned ten years after the events described, is, however, without corroboration in the historical record and bears little resemblance to an earlier unpublished account he wrote of the congress. In this previous narrative he mentioned an incidental, private exchange with Moniz, but it is likely that the official version of their public conversation he promulgated is without foundation. In fact, Moniz stated that he had conceived of the operation some time before his journey to London in 1935, having told in confidence his junior colleague, the young neurosurgeon Pedro Almeida Lima, as early as 1933 of his psychosurgical idea. The traditional account exaggerates the importance of Fulton and Jacobsen to Moniz's decision to initiate frontal lobe surgery, and omits the fact that a detailed body of neurological research that emerged at this time suggested to Moniz and other neurologists and neurosurgeons that surgery on this part of the brain might yield significant personality changes in the mentally ill.

As the frontal lobes had been the object of scientific inquiry and speculation since the late 19th century, Fulton's contribution, while it may have functioned as source of intellectual support, is of itself unnecessary and inadequate as an explanation of Moniz's resolution to operate on this section of the brain. Under an evolutionary and hierarchical model of brain development it had been hypothesized that those regions associated with more recent development, such as the mammalian brain and, most especially, the frontal lobes, were responsible for more complex cognitive functions. However, this theoretical formulation found little laboratory support, as 19th-century experimentation found no significant change in animal behaviour following surgical removal or electrical stimulation of the frontal lobes. This picture of the so-called "silent lobe" changed in the period after World War I with the production of clinical reports of ex-servicemen who had suffered brain trauma. The refinement of neurosurgical techniques also facilitated increasing attempts to remove brain tumours, treat focal epilepsy in humans and led to more precise experimental neurosurgery in animal studies. Cases were reported where mental symptoms were alleviated following the surgical removal of diseased or damaged brain tissue. The accumulation of medical case studies on behavioural changes following damage to the frontal lobes led to the formulation of the concept of Witzelsucht, which designated a neurological condition characterised by a certain hilarity and childishness in the afflicted. The picture of frontal lobe function that emerged from these studies was complicated by the observation that neurological deficits attendant on damage to a single lobe might be compensated for if the opposite lobe remained intact. 
In 1922, the Italian neurologist Leonardo Bianchi published a detailed report on the results of bilateral lobectomies in animals that supported the contention that the frontal lobes were both integral to intellectual function and that their removal led to the disintegration of the subject's personality. This work, while influential, was not without its critics due to deficiencies in experimental design.

The first bilateral lobectomy of a human subject was performed by the American neurosurgeon Walter Dandy in 1930. The neurologist Richard Brickner reported on this case in 1932, relating that the recipient, known as "Patient A", while experiencing a blunting of affect, had suffered no apparent decrease in intellectual function and seemed, at least to the casual observer, perfectly normal. Brickner concluded from this evidence that "the frontal lobes are not 'centers' for the intellect". These clinical results were replicated in a similar operation undertaken in 1934 by the neurosurgeon Roy Glenwood Spurling and reported on by the neuropsychiatrist Spafford Ackerly. By the mid-1930s, interest in the function of the frontal lobes reached a high-water mark. This was reflected in the 1935 neurological congress in London, which hosted as part of its deliberations, "a remarkable symposium ... on the functions of the frontal lobes". The panel was chaired by Henri Claude, a French neuropsychiatrist, who commenced the session by reviewing the state of research on the frontal lobes, and concluded that "altering the frontal lobes profoundly modifies the personality of subjects". This parallel symposium contained numerous papers by neurologists, neurosurgeons and psychologists; amongst these was one by Brickner, which impressed Moniz greatly, that again detailed the case of "Patient A". Fulton and Jacobsen's paper, presented in another session of the conference on experimental physiology, was notable in linking animal and human studies on the function of the frontal lobes. Thus, at the time of the 1935 Congress, Moniz had available to him an increasing body of research on the role of the frontal lobes that extended well beyond the observations of Fulton and Jacobsen.

Nor was Moniz the only medical practitioner in the 1930s to have contemplated procedures directly targeting the frontal lobes. Although ultimately discounting brain surgery as carrying too much risk, physicians and neurologists such as William Mayo, Thierry de Martel, Richard Brickner, and Leo Davidoff had, before 1935, entertained the proposition. Inspired by Julius Wagner-Jauregg's development of malarial therapy for the treatment of general paresis of the insane, the French physician Maurice Ducosté reported in 1932 that he had injected 5 ml of malarial blood directly into the frontal lobes of over 100 paretic patients through holes drilled into the skull. He claimed that the injected paretics showed signs of "uncontestable mental and physical amelioration" and that the results for psychotic patients undergoing the procedure were also "encouraging". The experimental injection of fever-inducing malarial blood into the frontal lobes was also replicated during the 1930s in the work of Ettore Mariotti and M. Sciutti in Italy and Ferdière Coulloudon in France. In Switzerland, almost simultaneously with the commencement of Moniz's leucotomy programme, the neurosurgeon François Ody had removed the entire right frontal lobe of a catatonic schizophrenic patient. In Romania, Ody's procedure was adopted by Dimitri Bagdasar and Constantinesco, working out of the Central Hospital in Bucharest. Ody, who delayed publishing his own results for several years, later rebuked Moniz for claiming to have cured patients through leucotomy without waiting to determine whether there had been a "lasting remission".

Neurological model

The theoretical underpinnings of Moniz's psychosurgery were largely commensurate with the nineteenth-century ones that had informed Burckhardt's decision to excise matter from the brains of his patients. Although in his later writings Moniz referenced both the neuron theory of Ramón y Cajal and the conditioned reflex of Ivan Pavlov, in essence he simply interpreted this new neurological research in terms of the old psychological theory of associationism. He differed significantly from Burckhardt, however, in that he did not think there was any organic pathology in the brains of the mentally ill, but rather that their neural pathways were caught in fixed and destructive circuits leading to "predominant, obsessive ideas". As Moniz wrote in 1936:

[The] mental troubles must have ... a relation with the formation of cellulo-connective groupings, which become more or less fixed. The cellular bodies may remain altogether normal, their cylinders will not have any anatomical alterations; but their multiple liaisons, very variable in normal people, may have arrangements more or less fixed, which will have a relation with persistent ideas and deliria in certain morbid psychic states.

For Moniz, "to cure these patients," it was necessary to "destroy the more or less fixed arrangements of cellular connections that exist in the brain, and particularly those which are related to the frontal lobes", thus removing their fixed pathological brain circuits. Moniz believed the brain would functionally adapt to such injury. Unlike the position adopted by Burckhardt, it was unfalsifiable according to the knowledge and technology of the time as the absence of a known correlation between physical brain pathology and mental illness could not disprove his thesis.

First leucotomies

The hypotheses underlying the procedure might be called into question; the surgical intervention might be considered very audacious; but such arguments occupy a secondary position because it can be affirmed now that these operations are not prejudicial to either physical or psychic life of the patient, and also that recovery or improvement may be obtained frequently in this way.

Egas Moniz (1937)

On 12 November 1935 at the Hospital Santa Marta in Lisbon, Moniz initiated the first of a series of operations on the brains of the mentally ill. The initial patients selected for the operation were provided by the medical director of Lisbon's Miguel Bombarda Mental Hospital, José de Matos Sobral Cid. As Moniz lacked training in neurosurgery and his hands were crippled from gout, the procedure was performed under general anaesthetic by Pedro Almeida Lima, who had previously assisted Moniz with his research on cerebral angiography. The intention was to remove some of the long fibres that connected the frontal lobes to other major brain centres. To this end, it was decided that Lima would trephine into the side of the skull and then inject ethanol into the "subcortical white matter of the prefrontal area" so as to destroy the connecting fibres, or association tracts, and create what Moniz termed a "frontal barrier". After the first operation was complete, Moniz considered it a success and, observing that the patient's depression had been relieved, he declared her "cured" although she was never, in fact, discharged from the mental hospital. Moniz and Lima persisted with this method of injecting alcohol into the frontal lobes for the next seven patients but, after having to inject some patients on numerous occasions to elicit what they considered a favourable result, they modified the means by which they would section the frontal lobes. For the ninth patient they introduced a surgical instrument called a leucotome; this was a cannula that was 11 centimetres (4.3 in) in length and 2 centimetres (0.79 in) in diameter. It had a retractable wire loop at one end that, when rotated, produced a 1 centimetre (0.39 in) diameter circular lesion in the white matter of the frontal lobe. 
Typically, six lesions were cut into each lobe, but, if they were dissatisfied by the results, Lima might perform several procedures, each producing multiple lesions in the left and right frontal lobes.

By the conclusion of this first run of leucotomies in February 1936, Moniz and Lima had operated on twenty patients, with an average period of one week between each procedure; Moniz published his findings with great haste in March of the same year. The patients were aged between 27 and 62; twelve were female and eight were male. Nine of the patients were diagnosed as suffering from depression, six from schizophrenia, two from panic disorder, and one each from mania, catatonia and manic-depression, with the most prominent symptoms being anxiety and agitation. The duration of the illness before the procedure varied from as little as four weeks to as much as 22 years, although all but four patients had been ill for at least one year. Patients were normally operated on the day they arrived at Moniz's clinic and returned within ten days to the Miguel Bombarda Mental Hospital. A perfunctory post-operative follow-up assessment took place anywhere from one to ten weeks after surgery. Complications were observed in each of the leucotomy patients and included: "increased temperature, vomiting, bladder and bowel incontinence, diarrhea, and ocular affections such as ptosis and nystagmus, as well as psychological effects such as apathy, akinesia, lethargy, timing and local disorientation, kleptomania, and abnormal sensations of hunger". Moniz asserted that these effects were transitory and, according to his published assessment, the outcome for these first twenty patients was that 35%, or seven cases, improved significantly, another 35% were somewhat improved, and the remaining 30% (six cases) were unchanged. There were no deaths, and he did not consider that any patients had deteriorated following leucotomy.

Reception

Moniz rapidly disseminated his results through articles in the medical press and a monograph in 1936. Initially, however, the medical community appeared hostile to the new procedure. On 26 July 1936, one of his assistants, Diogo Furtado, gave a presentation at the Parisian meeting of the Société Médico-Psychologique on the results of the second cohort of patients leucotomised by Lima. Sobral Cid, who had supplied Moniz with the first set of patients for leucotomy from his own hospital in Lisbon, attended the meeting and denounced the technique, declaring that the patients who had been returned to his care post-operatively were "diminished" and had suffered a "degradation of personality". He also claimed that the changes Moniz observed in patients were more properly attributed to shock and brain trauma, and he derided the theoretical architecture that Moniz had constructed to support the new procedure as "cerebral mythology." At the same meeting the Parisian psychiatrist, Paul Courbon, stated he could not endorse a surgical technique that was solely supported by theoretical considerations rather than clinical observations. He also opined that the mutilation of an organ could not improve its function and that such cerebral wounds as were occasioned by leucotomy risked the later development of meningitis, epilepsy and brain abscesses. Nonetheless, Moniz's reported successful surgical treatment of 14 out of 20 patients led to the rapid adoption of the procedure on an experimental basis by individual clinicians in countries such as Brazil, Cuba, Italy, Romania and the United States during the 1930s.

Italian leucotomy

In the present state of affairs if some are critical about lack of caution in therapy, it is, on the other hand, deplorable and inexcusable to remain apathetic, with folded hands, content with learned lucubrations upon symptomatologic minutiae or upon psychopathic curiosities, or even worse, not even doing that.

Amarro Fiamberti

Throughout the remainder of the 1930s the number of leucotomies performed in most countries where the technique was adopted remained quite low. In Britain, which was later a major centre for leucotomy, only six operations had been undertaken before 1942. Generally, medical practitioners who attempted the procedure adopted a cautious approach and few patients were leucotomised before the 1940s. Italian neuropsychiatrists, who were typically early and enthusiastic adopters of leucotomy, were exceptional in eschewing such a gradualist course.

Leucotomy was first reported in the Italian medical press in 1936, and Moniz published an article in Italian on the technique the following year. In 1937, he was invited to Italy to demonstrate the procedure and for a two-week period in June of that year he visited medical centres in Trieste, Ferrara, and one close to Turin – the Racconigi Hospital – where he instructed his Italian neuropsychiatric colleagues on leucotomy and also oversaw several operations. Leucotomy was featured at two Italian psychiatric conferences in 1937, and over the next two years a score of medical articles on Moniz's psychosurgery were published by Italian clinicians based in medical institutions located in Racconigi, Trieste, Naples, Genoa, Milan, Pisa, Catania and Rovigo. The major centre for leucotomy in Italy was the Racconigi Hospital, where the experienced neurosurgeon Ludvig Puusepp provided a guiding hand. Under the medical directorship of Emilio Rizzatti, the medical personnel at this hospital had completed at least 200 leucotomies by 1939. Reports from clinicians based at other Italian institutions detailed significantly smaller numbers of leucotomy operations.

Experimental modifications of Moniz's operation were introduced with little delay by Italian medical practitioners. Most notably, in 1937 Amarro Fiamberti, the medical director of a psychiatric institution in Varese, first devised the transorbital procedure whereby the frontal lobes were accessed through the eye sockets. Fiamberti's method was to puncture the thin layer of orbital bone at the top of the socket and then inject alcohol or formalin into the white matter of the frontal lobes through this aperture. Using this method, while sometimes substituting a leucotome for a hypodermic needle, it is estimated that he leucotomised about 100 patients in the period up to the outbreak of World War II. Fiamberti's innovation of Moniz's method would later prove inspirational for Walter Freeman's development of transorbital lobotomy.

American leucotomy

Site of borehole for the standard pre-frontal lobotomy/leucotomy operation as developed by Freeman and Watts

The first prefrontal leucotomy in the United States was performed at the George Washington University Hospital on 14 September 1936 by the neurologist Walter Freeman and his friend and colleague, the neurosurgeon James W. Watts. Freeman had first encountered Moniz at the London-hosted Second International Congress of Neurology in 1935, where Moniz had presented a poster exhibit of his work on cerebral angiography. Fortuitously occupying a booth next to Moniz, Freeman, delighted by their chance meeting, formed a highly favourable impression of him, later remarking upon his "sheer genius". According to Freeman, if they had not met in person it is highly unlikely that he would have ventured into the domain of frontal lobe psychosurgery. Freeman's interest in psychiatry was the natural outgrowth of his appointment in 1924 as the medical director of the Research Laboratories of the Government Hospital for the Insane in Washington, known colloquially as St Elizabeth's. Ambitious and a prodigious researcher, Freeman, who favoured an organic model of mental illness causation, spent the next several years exhaustively, yet ultimately fruitlessly, investigating a neuropathological basis for insanity. Chancing upon a preliminary communication by Moniz on leucotomy in the spring of 1936, Freeman initiated a correspondence in May of that year. Writing that he had been considering psychiatric brain surgery previously, he informed Moniz that, "having your authority I expect to go ahead". Moniz, in return, promised to send him a copy of his forthcoming monograph on leucotomy and urged him to purchase a leucotome from a French supplier.

Upon receipt of Moniz's monograph, Freeman reviewed it anonymously for the Archives of Neurology and Psychiatry. Praising the text as one whose "importance can scarcely be overestimated", he summarised Moniz's rationale for the procedure: while no physical abnormality of cerebral cell bodies was observable in the mentally ill, their cellular interconnections might harbour a "fixation of certain patterns of relationship among various groups of cells", and this resulted in obsessions, delusions and mental morbidity. While recognising that Moniz's thesis was inadequate, for Freeman it had the advantage of circumventing the search for diseased brain tissue in the mentally ill by instead suggesting that the problem was a functional one of the brain's internal wiring, where relief might be obtained by severing problematic mental circuits.

In 1937 Freeman and Watts adapted Lima and Moniz's surgical procedure, and created the Freeman-Watts technique, also known as the Freeman-Watts standard prefrontal lobotomy, which they styled the "precision method".

Transorbital lobotomy

The Freeman-Watts prefrontal lobotomy still required drilling holes in the skull, so surgery had to be performed in an operating room by trained neurosurgeons. Walter Freeman believed this surgery would be unavailable to those he saw as needing it most: patients in state mental hospitals that had no operating rooms, surgeons, or anesthesia and limited budgets. Freeman wanted to simplify the procedure so that it could be carried out by psychiatrists in psychiatric hospitals.

Inspired by the work of Italian psychiatrist Amarro Fiamberti, Freeman at some point conceived of approaching the frontal lobes through the eye sockets instead of through drilled holes in the skull. In 1945 he took an icepick from his own kitchen and began testing the idea on grapefruit and cadavers. This new "transorbital" lobotomy involved lifting the upper eyelid and placing the point of a thin surgical instrument (often called an orbitoclast or leucotome, although quite different from the wire loop leucotome described above) under the eyelid and against the top of the eyesocket. A mallet was used to drive the orbitoclast through the thin layer of bone and into the brain along the plane of the bridge of the nose, around 15 degrees toward the interhemispheric fissure. The orbitoclast was malleted 5 centimeters (2 in) into the frontal lobe, and then pivoted 40 degrees at the orbit perforation so the tip cut toward the opposite side of the head (toward the nose). The instrument was returned to the neutral position and sent a further 2 centimeters (4⁄5 in) into the brain, before being pivoted around 28 degrees each side, to cut outwards and again inwards. (In a more radical variation at the end of the last cut described, the butt of the orbitoclast was forced upwards so the tool cut vertically down the side of the cortex of the interhemispheric fissure; the "Deep Frontal Cut".) All cuts were designed to transect the white fibrous matter connecting the cortical tissue of the prefrontal cortex to the thalamus. The leucotome was then withdrawn and the procedure repeated on the other side.

Freeman performed the first transorbital lobotomy on a live patient in 1946. Its simplicity suggested the possibility of carrying it out in mental hospitals lacking the surgical facilities required for the earlier, more complex procedure. (Freeman suggested that, where conventional anesthesia was unavailable, electroconvulsive therapy be used to render the patient unconscious.) In 1947, the Freeman and Watts partnership ended, as the latter was disgusted by Freeman's modification of the lobotomy from a surgical operation into a simple "office" procedure. Between 1940 and 1944, 684 lobotomies were performed in the United States. However, because of the fervent promotion of the technique by Freeman and Watts, those numbers increased sharply towards the end of the decade. In 1949, the peak year for lobotomies in the US, 5,074 procedures were undertaken, and by 1951, 18,608 individuals had been lobotomized in the US.

Prevalence

In the United States, approximately 40,000 people were lobotomized. In England, 17,000 lobotomies were performed, and the three Nordic countries of Denmark, Norway, and Sweden had a combined figure of approximately 9,300 lobotomies. Scandinavian hospitals lobotomized 2.5 times as many people per capita as hospitals in the US. Sweden lobotomized at least 4,500 people between 1944 and 1966, mainly women. This figure includes young children. In Norway, there were 2,005 known lobotomies. In Denmark, there were 4,500 known lobotomies. In Japan, the majority of lobotomies were performed on children with behavior problems. The Soviet Union banned the practice in 1950 on moral grounds. In Germany, it was performed only a few times. By the late 1970s, the practice of lobotomy had generally ceased, although it continued as late as the 1980s in France.
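As a back-of-the-envelope check, the per-capita comparison above can be reproduced from the totals quoted in this section. The 1950 population figures used below are approximations assumed for the illustration; they do not come from the article:

```python
# Rough per-capita comparison of lobotomy rates, using the totals quoted
# above. The 1950 population figures are approximations assumed for this
# illustration; they are not taken from the article.
nordic_lobotomies = 9_300      # Denmark + Norway + Sweden combined (article)
us_lobotomies = 40_000         # United States (article)

nordic_population = 4.3e6 + 3.3e6 + 7.0e6   # Denmark, Norway, Sweden c. 1950 (assumed)
us_population = 150.7e6                      # US census, 1950 (assumed)

nordic_rate = nordic_lobotomies / nordic_population
us_rate = us_lobotomies / us_population
ratio = nordic_rate / us_rate

print(f"Nordic-to-US per-capita ratio: {ratio:.1f}")  # prints about 2.4
```

With these assumed populations the ratio works out to roughly 2.4, consistent with the "2.5 times as many people per capita" figure quoted above.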

Criticism

As early as 1944 an author in the Journal of Nervous and Mental Disease remarked: "The history of prefrontal lobotomy has been brief and stormy. Its course has been dotted with both violent opposition and with slavish, unquestioning acceptance." Beginning in 1947 Swedish psychiatrist Snorre Wohlfahrt evaluated early trials, reporting that it is "distinctly hazardous to leucotomize schizophrenics" and that lobotomy was "still too imperfect to enable us, with its aid, to venture on a general offensive against chronic cases of mental disorder", stating further that "Psychosurgery has as yet failed to discover its precise indications and contraindications and the methods must unfortunately still be regarded as rather crude and hazardous in many respects." In 1948 Norbert Wiener, the author of Cybernetics: Or the Control and Communication in the Animal and the Machine, said: "[P]refrontal lobotomy ... has recently been having a certain vogue, probably not unconnected with the fact that it makes the custodial care of many patients easier. Let me remark in passing that killing them makes their custodial care still easier."

Concerns about lobotomy steadily grew. Soviet psychiatrist Vasily Gilyarovsky criticized lobotomy and the mechanistic brain localization assumption used to carry out lobotomy:

It is assumed that the transection of white substance of the frontal lobes impairs their connection with the thalamus and eliminates the possibility to receive from it stimuli which lead to irritation and on the whole derange mental functions. This explanation is mechanistic and goes back to the narrow localizationism characteristic of psychiatrists of America, from where leucotomy was imported to us.

The USSR officially banned the procedure in 1950 on the initiative of Gilyarovsky. Doctors in the Soviet Union concluded that the procedure was "contrary to the principles of humanity" and "'through lobotomy' an insane person is changed into an idiot". By the 1970s, numerous countries had banned the procedure, as had several US states.

In 1977 the US Congress, during the presidency of Jimmy Carter, created the National Committee for the Protection of Human Subjects of Biomedical and Behavioral Research to investigate allegations that psychosurgery—including lobotomy techniques—was used to control minorities and restrain individual rights. The committee concluded that some extremely limited and properly performed psychosurgery could have positive effects.

In the early 21st century there were calls for the Nobel Foundation to rescind the prize it awarded to Moniz for developing lobotomy, on the grounds that the award was an astounding error of judgment at the time and one that psychiatry might still need to learn from; the Foundation, however, declined to take action and has continued to host an article defending the results of the procedure.

Notable cases

  • Rosemary Kennedy, sister of President John F. Kennedy, underwent a lobotomy in 1941 that left her incapacitated and institutionalized for the rest of her life.
  • Howard Dully wrote a memoir of his late-life discovery that he had been lobotomized in 1960 at age 12.
  • New Zealand author and poet Janet Frame received a literary award in 1951 the day before a scheduled lobotomy was to take place, and it was never performed.
  • Josef Hassid, a Polish violinist and composer, was diagnosed with schizophrenia and died at the age of 26 following a lobotomy.
  • Swedish modernist painter Sigrid Hjertén died following a lobotomy in 1948.
  • American playwright Tennessee Williams' older sister Rose received a lobotomy that left her incapacitated for life; the episode is said to have inspired characters and motifs in certain works of his.
  • It is often said that when an iron rod was accidentally driven through the head of Phineas Gage in 1848, this constituted an "accidental lobotomy", or that this event somehow inspired the development of surgical lobotomy a century later. According to the only book-length study of Gage, careful inquiry turns up no such link.
  • In 2011, Daniel Nijensohn, an Argentine-born neurosurgeon at Yale, examined X-rays of Eva Perón and concluded that she underwent a lobotomy for the treatment of pain and anxiety in the last months of her life.

Literary and cinematic portrayals

Lobotomies have been featured in several literary and cinematic presentations that both reflected society's attitude towards the procedure and, at times, changed it. Writers and film-makers have played a pivotal role in turning public sentiment against the procedure.

  • Robert Penn Warren's 1946 novel All the King's Men describes a lobotomy as making "a Comanche brave look like a tyro with a scalping knife", and portrays the surgeon as a repressed man who cannot change others with love, so he instead resorts to "high-grade carpentry work".
  • Tennessee Williams criticized lobotomy in his play Suddenly, Last Summer (1958) because it was sometimes inflicted on homosexuals—to render them "morally sane". In the play a wealthy matriarch offers the local mental hospital a substantial donation if the hospital will give her niece a lobotomy, which she hopes will stop the niece's shocking revelations about the matriarch's son. Warned that a lobotomy might not stop her niece's "babbling", she responds, "That may be, maybe not, but after the operation who would believe her, Doctor?"
  • In Ken Kesey's 1962 novel One Flew Over the Cuckoo's Nest and its 1975 film adaptation, lobotomy is described as "frontal-lobe castration", a form of punishment and control after which "There's nothin' in the face. Just like one of those store dummies." In one patient, "You can see by his eyes how they burned him out over there; his eyes are all smoked up and gray and deserted inside."
  • In Sylvia Plath's 1963 novel The Bell Jar, the protagonist reacts with horror to the "perpetual marble calm" of a lobotomized young woman.
  • Elliott Baker's 1964 novel and 1966 film version, A Fine Madness, portrays the dehumanizing lobotomy of a womanizing, quarrelsome poet who, afterwards, is just as aggressive as ever. The surgeon is depicted as an inhumane crackpot.
  • The 1982 biopic film Frances depicts actress Frances Farmer (the subject of the film) undergoing transorbital lobotomy (though the idea that a lobotomy was performed on Farmer, and that Freeman performed it, has been criticized as having little or no factual foundation).
  • The 2018 film The Mountain centers on lobotomization, its cultural significance in the context of 1950s America, and mid-century attitudes surrounding mental health in general. The film interrogates the ethical and social implications of the practice through the experiences of its protagonist, a young man whose late mother had been lobotomized. The protagonist takes a job as a medical photographer for the fictional Dr. Wallace Fiennes, portrayed by Jeff Goldblum. Fiennes is loosely based on Freeman.

Prefrontal cortex

From Wikipedia, the free encyclopedia
 
Prefrontal cortex
Brodmann areas 8, 9, 10, 11, 12, 13, 14, 24, 25, 32, 44, 45, 46, and 47 are all in the prefrontal cortex.
Part of: Frontal lobe
Parts: Superior frontal gyrus, Middle frontal gyrus, Inferior frontal gyrus
Artery: Anterior cerebral, Middle cerebral
Vein: Superior sagittal sinus
Identifiers: Latin Cortex praefrontalis; MeSH D017397; NeuroNames 2429; NeuroLex ID nlx_anat_090801; FMA 224850

In mammalian brain anatomy, the prefrontal cortex (PFC) is the cerebral cortex which covers the front part of the frontal lobe. The PFC contains the Brodmann areas BA8, BA9, BA10, BA11, BA12, BA13, BA14, BA24, BA25, BA32, BA44, BA45, BA46, and BA47.

Many authors have indicated an integral link between a person's will to live, personality, and the functions of the prefrontal cortex. This brain region has been implicated in executive functions, such as planning, decision making, short-term memory, personality expression, moderating social behavior and controlling certain aspects of speech and language. The basic activity of this brain region is considered to be orchestration of thoughts and actions in accordance with internal goals.

Executive function relates to the abilities to differentiate among conflicting thoughts; determine good and bad, better and best, and same and different; anticipate the future consequences of current activities; work toward a defined goal; predict outcomes; form expectations based on actions; and exercise social "control" (the ability to suppress urges that, if not suppressed, could lead to socially unacceptable outcomes).

The frontal cortex supports concrete rule learning. More anterior regions along the rostro-caudal axis of frontal cortex support rule learning at higher levels of abstraction.

Structure

Definition

There are three possible ways to define the prefrontal cortex:

  • as the granular frontal cortex
  • as the projection zone of the medial dorsal nucleus of the thalamus
  • as that part of the frontal cortex whose electrical stimulation does not evoke movements

Granular frontal cortex

The prefrontal cortex has been defined based on cytoarchitectonics by the presence of a cortical granular layer IV. It is not entirely clear who first used this criterion. Many of the early cytoarchitectonic researchers restricted the use of the term prefrontal to a much smaller region of cortex including the gyrus rectus and the gyrus rostralis (Campbell, 1905; G. E. Smith, 1907; Brodmann, 1909; von Economo and Koskinas, 1925). In 1935, however, Jacobsen used the term prefrontal to distinguish granular prefrontal areas from agranular motor and premotor areas. In terms of Brodmann areas, the prefrontal cortex traditionally includes areas 8, 9, 10, 11, 12, 13, 14, 24, 25, 32, 44, 45, 46, and 47; however, not all of these areas are strictly granular: 44 is dysgranular, and caudal 11 and orbital 47 are agranular. The main problem with this definition is that it works well only in primates, since nonprimates lack a granular layer IV.

Projection zone

Defining the prefrontal cortex as the projection zone of the mediodorsal nucleus of the thalamus builds on the work of Rose and Woolsey, who showed that this nucleus projects to anterior and ventral parts of the brain in nonprimates; Rose and Woolsey, however, termed this projection zone "orbitofrontal". It seems to have been Akert who, in 1964, first explicitly suggested that this criterion could be used to define homologues of the prefrontal cortex in primates and nonprimates. This allowed the establishment of homologies despite the lack of a granular frontal cortex in nonprimates.

The projection zone definition is still widely accepted today (e.g. Fuster), although its usefulness has been questioned. Modern tract tracing studies have shown that projections of the mediodorsal nucleus of the thalamus are not restricted to the granular frontal cortex in primates. As a result, it was suggested to define the prefrontal cortex as the region of cortex that has stronger reciprocal connections with the mediodorsal nucleus than with any other thalamic nucleus. Uylings et al. acknowledge, however, that even with the application of this criterion, it might be rather difficult to define the prefrontal cortex unequivocally.

Electrically silent area of frontal cortex

A third definition of the prefrontal cortex is the area of frontal cortex whose electrical stimulation does not lead to observable movements. For example, in 1890 David Ferrier used the term in this sense. One complication with this definition is that the electrically "silent" frontal cortex includes both granular and non-granular areas.

Subdivisions


According to Striedter the PFC of humans can be delineated into two functionally, morphologically, and evolutionarily different regions: the ventromedial PFC (vmPFC) consisting of the ventral prefrontal cortex and the medial prefrontal cortex present in all mammals, and the lateral prefrontal cortex (LPFC), consisting of the dorsolateral prefrontal cortex and the ventrolateral prefrontal cortex, present only in primates.

The LPFC contains the Brodmann areas BA8, BA9, BA10, BA45, BA46, and BA47. Some researchers also include BA44. The vmPFC contains the Brodmann areas BA12, BA25, BA32, BA33, BA24, BA11, BA13, and BA14.

The table below shows different ways to subdivide parts of the human prefrontal cortex based upon Brodmann areas.

  • lateral
      • dorsolateral: areas 8, 9, 10, 46
      • ventrolateral: areas 45, 47, 44
  • ventromedial
      • medial: areas 12, 25, 32, 33, 24
      • ventral: areas 11, 13, 14
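For readers who want to work with the lateral/ventromedial groupings described above programmatically, they can be encoded as a small lookup table. This is only an illustrative sketch: the nested-dictionary layout and helper function are mine, while the area groupings come from the article.

```python
# Subdivisions of the human prefrontal cortex by Brodmann area,
# following the lateral vs. ventromedial scheme described above.
PFC_SUBDIVISIONS = {
    "lateral": {
        "dorsolateral": [8, 9, 10, 46],
        "ventrolateral": [45, 47, 44],
    },
    "ventromedial": {
        "medial": [12, 25, 32, 33, 24],
        "ventral": [11, 13, 14],
    },
}

def areas(region):
    """Return all Brodmann areas under a top-level region, sorted."""
    return sorted(a for sub in PFC_SUBDIVISIONS[region].values() for a in sub)

print(areas("lateral"))  # [8, 9, 10, 44, 45, 46, 47]
```

Note that this encodes one particular scheme; as the surrounding text explains, researchers disagree, for example, over whether BA44 belongs to the lateral prefrontal cortex at all.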

Interconnections

The prefrontal cortex is highly interconnected with much of the brain, including extensive connections with other cortical, subcortical and brain stem sites. The dorsal prefrontal cortex is especially interconnected with brain regions involved with attention, cognition and action, while the ventral prefrontal cortex interconnects with brain regions involved with emotion. The prefrontal cortex also receives inputs from the brainstem arousal systems, and its function is particularly dependent on its neurochemical environment. Thus, there is coordination between our state of arousal and our mental state. The interplay between the prefrontal cortex and socioemotional system of the brain is relevant for adolescent development, as proposed by the Dual Systems Model.

The medial prefrontal cortex has been implicated in the generation of slow-wave sleep (SWS), and prefrontal atrophy has been linked to decreases in SWS. Prefrontal atrophy occurs naturally as individuals age, and it has been demonstrated that older adults experience impairments in memory consolidation as their medial prefrontal cortices degrade. In monkeys, significant atrophy has been found as a result of neuroleptic or antipsychotic psychiatric medication. In older adults, instead of being transferred and stored in the neocortex during SWS, memories start to remain in the hippocampus where they were encoded. This is evidenced by increased hippocampal activation, compared to younger adults, during recall tasks in which subjects learned word associations, slept, and were then asked to recall the learned words.

The ventrolateral prefrontal cortex (VLPFC) has been implicated in various aspects of speech production and language comprehension. The VLPFC is richly connected to various regions of the brain, including the lateral and medial temporal lobe, the superior temporal cortex, the inferotemporal cortex, the perirhinal cortex, and the parahippocampal cortex. These brain areas are implicated in memory retrieval and consolidation, language processing, and the association of emotions. These connections allow the VLPFC to mediate explicit and implicit memory retrieval and integrate it with language stimuli to help plan coherent speech. In other words, choosing the correct words and staying "on topic" during conversation come from the VLPFC.

Function

Executive function

The original studies of Fuster and of Goldman-Rakic emphasized the fundamental ability of the prefrontal cortex to represent information not currently in the environment, and the central role of this function in creating the "mental sketch pad". Goldman-Rakic spoke of how this representational knowledge was used to intelligently guide thought, action, and emotion, including the inhibition of inappropriate thoughts, distractions, actions, and feelings. In this way, working memory can be seen as fundamental to attention and behavioral inhibition. Fuster speaks of how this prefrontal ability allows the wedding of past to future, allowing both cross-temporal and cross-modal associations in the creation of goal-directed, perception-action cycles. This ability to represent underlies all other higher executive functions.

Shimamura proposed Dynamic Filtering Theory to describe the role of the prefrontal cortex in executive functions. The prefrontal cortex is presumed to act as a high-level gating or filtering mechanism that enhances goal-directed activations and inhibits irrelevant activations. This filtering mechanism enables executive control at various levels of processing, including selecting, maintaining, updating, and rerouting activations. It has also been used to explain emotional regulation.

Miller and Cohen proposed an Integrative Theory of Prefrontal Cortex Function, that arises from the original work of Goldman-Rakic and Fuster. The two theorize that “cognitive control stems from the active maintenance of patterns of activity in the prefrontal cortex that represents goals and means to achieve them. They provide bias signals to other brain structures whose net effect is to guide the flow of activity along neural pathways that establish the proper mappings between inputs, internal states, and outputs needed to perform a given task”. In essence, the two theorize that the prefrontal cortex guides the inputs and connections, which allows for cognitive control of our actions.

The prefrontal cortex is of significant importance when top-down processing is needed. Top-down processing is, by definition, processing in which behavior is guided by internal states or intentions. According to the two, “The PFC is critical in situations when the mappings between sensory inputs, thoughts, and actions either are weakly established relative to other existing ones or are rapidly changing”. An example of this is seen in the Wisconsin Card Sorting Test (WCST). Subjects engaging in this task are instructed to sort cards according to the shape, color, or number of symbols appearing on them. The thought is that any given card can be associated with a number of actions and no single stimulus-response mapping will work. Human subjects with PFC damage are able to sort the cards in the initial simple tasks, but unable to do so as the rules of classification change.

Miller and Cohen conclude that the implications of their theory can explain how much of a role the PFC has in guiding control of cognitive actions. In the researchers' own words, they claim that, “depending on their target of influence, representations in the PFC can function variously as attentional templates, rules, or goals by providing top-down bias signals to other parts of the brain that guide the flow of activity along the pathways needed to perform a task”.

Experimental data indicate a role for the prefrontal cortex in mediating normal sleep physiology, dreaming and sleep-deprivation phenomena.

When analyzing and thinking about attributes of other individuals, the medial prefrontal cortex is activated; however, it is not activated when contemplating the characteristics of inanimate objects.

Studies using fMRI have shown that the medial prefrontal cortex (mPFC), specifically the anterior medial prefrontal cortex (amPFC), may modulate mimicry behavior. Neuroscientists are suggesting that social priming influences activity and processing in the amPFC, and that this area of the prefrontal cortex modulates mimicry responses and behavior.

More recently, researchers using neuroimaging techniques have found that, along with the basal ganglia, the prefrontal cortex is involved in learning exemplars. Exemplar theory, one of the three main accounts of how the mind categorizes things, states that we make category judgements by comparing a new item to similar past experiences within our stored memories.

A 2014 meta-analysis by Professor Nicole P. Yuan from the University of Arizona found that larger prefrontal cortex volume and greater PFC cortical thickness were associated with better executive performance.

Attention and memory

Lebedev et al. experiment that dissociated representation of spatial attention from representation of spatial memory in prefrontal cortex 

A widely accepted theory regarding the function of the brain's prefrontal cortex is that it serves as a store of short-term memory. This idea was first formulated by Jacobsen, who reported in 1936 that damage to the primate prefrontal cortex caused short-term memory deficits. Karl Pribram and colleagues (1952) identified the part of the prefrontal cortex responsible for this deficit as area 46, also known as the dorsolateral prefrontal cortex (dlPFC). More recently, Goldman-Rakic and colleagues (1993) evoked short-term memory loss in localized regions of space by temporary inactivation of portions of the dlPFC. Once the concept of working memory was established in contemporary neuroscience by Alan Baddeley (1986), these neuropsychological findings contributed to the theory that the prefrontal cortex implements working memory and, in some extreme formulations, only working memory. In the 1990s this theory developed a wide following, and it became the predominant theory of PF function, especially for nonhuman primates. The concept of working memory used by proponents of this theory focused mostly on the short-term maintenance of information, and rather less on the manipulation or monitoring of such information or on the use of that information for decisions. Consistent with the idea that the prefrontal cortex functions predominantly in maintenance memory, delay-period activity in the PF has often been interpreted as a memory trace. (The phrase "delay-period activity" applies to neuronal activity that follows the transient presentation of an instruction cue and persists until a subsequent "go" or "trigger" signal.)

To explore alternative interpretations of delay-period activity in the prefrontal cortex, Lebedev et al. (2004) investigated the discharge rates of single prefrontal neurons as monkeys attended to a stimulus marking one location while remembering a different, unmarked location. Both locations served as potential targets of a saccadic eye movement. Although the task made intensive demands on short-term memory, the largest proportion of prefrontal neurons represented attended locations, not remembered ones. These findings showed that short-term memory functions cannot account for all, or even most, delay-period activity in the part of the prefrontal cortex explored. The authors suggested that prefrontal activity during the delay-period contributes more to the process of attentional selection (and selective attention) than to memory storage.

Speech production and language

Various areas of the prefrontal cortex have been implicated in a multitude of critical functions regarding speech production, language comprehension, and response planning before speaking. Cognitive neuroscience has shown that the left ventrolateral prefrontal cortex is vital in the processing of words and sentences.

The right prefrontal cortex has been found to be responsible for coordinating the retrieval of explicit memory for use in speech, whereas deactivation of the left is responsible for mediating implicit memory retrieval to be used in verb generation. Recollection of nouns (explicit memory) is impaired in some amnesic patients with damaged left prefrontal cortices, but verb generation remains intact because of its reliance on left prefrontal deactivation.

Many researchers now include BA45 in the prefrontal cortex because, together with BA44, it makes up an area of the frontal lobe called Broca's area. Broca's area is widely considered the output area of the language production pathway in the brain (as opposed to Wernicke's area in the temporal lobe, which is seen as the language input area). BA45 has been shown to be implicated in the retrieval of relevant semantic knowledge to be used in conversation and speech. The right lateral prefrontal cortex (RLPFC) is implicated in the planning of complex behavior, and together with bilateral BA45 it acts to maintain focus and coherence during speech production. However, left BA45 has been shown to be activated significantly while maintaining speech coherence in young people, whereas older people have been shown to recruit the right BA45 more than their younger counterparts. This aligns with the evidence of decreased lateralization in other brain systems during aging.

In addition, this increase in BA45 and RLPFC activity, in combination with BA47, in older patients has been shown to contribute to “off-topic utterances”. The BA47 area in the prefrontal cortex is implicated in “stimulus-driven” retrieval of less-salient knowledge than is required to contribute to a conversation. In other words, elevated activation of BA47, together with altered activity in BA45 and the broader RLPFC, has been shown to contribute to the inclusion of less relevant information and irrelevant tangential speech patterns in the conversation of older subjects.

Clinical significance

In the last few decades, brain imaging systems have been used to determine brain region volumes and nerve linkages. Several studies have indicated that reduced volume and interconnections of the frontal lobes with other brain regions are observed in patients diagnosed with mental disorders and prescribed potent antipsychotics; those subjected to repeated stressors; those who excessively consume sexually explicit materials; suicides; those incarcerated; criminals; sociopaths; those affected by lead poisoning; and daily male cannabis users (in a study of only 13 people). It is believed that at least some of the human abilities to feel guilt or remorse, and to interpret reality, are dependent on a well-functioning prefrontal cortex. It is also widely believed that the size and number of connections in the prefrontal cortex relate directly to sentience, as the prefrontal cortex in humans occupies a far larger percentage of the brain than in any other animal. And it is theorized that, as the brain has tripled in size over five million years of human evolution, the prefrontal cortex has increased in size sixfold.

A review of executive functions in healthy exercising individuals noted that the left and right halves of the prefrontal cortex, which are divided by the medial longitudinal fissure, appear to become more interconnected in response to consistent aerobic exercise. Two reviews of structural neuroimaging research indicate that marked improvements in prefrontal and hippocampal gray matter volume occur in healthy adults who engage in medium-intensity exercise for several months.

A functional neuroimaging review of meditation-based practices suggested that practicing mindfulness enhances prefrontal activation, which was noted to be correlated with increased well-being and reduced anxiety; however, the review noted the need for cohort studies in future research to better establish this.

Treatments with anti-cancer drugs often are toxic to the cells of the brain, leading to memory loss and cognitive dysfunction that can persist long after the period of exposure. Such a condition is referred to as chemo brain. To determine the basis of this condition, mice were treated with the chemotherapeutic agent mitomycin C. In the prefrontal cortex, this treatment resulted in an increase of the oxidative DNA damage 8-oxodG, a decrease in the enzyme OGG1 that ordinarily repairs such damage, and epigenetic alterations.

Chronic intake of alcohol leads to persistent alterations in brain function including altered decision making ability. The prefrontal cortex of chronic alcoholics has been shown to be vulnerable to oxidative DNA damage and neuronal cell death.

History

Perhaps the seminal case in prefrontal cortex function is that of Phineas Gage, whose left frontal lobe was destroyed when a large iron rod was driven through his head in an 1848 accident. The standard presentation is that, although Gage retained normal memory, speech, and motor skills, his personality changed radically: he became irritable, quick-tempered, and impatient—characteristics he did not previously display—so that friends described him as "no longer Gage"; and, whereas he had previously been a capable and efficient worker, afterward he was unable to complete tasks. However, careful analysis of the primary evidence shows that descriptions of Gage's psychological changes are usually exaggerated relative to the account given by Gage's doctor; most strikingly, the changes described years after Gage's death are far more dramatic than anything reported while he was alive.

Subsequent studies on patients with prefrontal injuries have shown that the patients verbalized what the most appropriate social responses would be under certain circumstances. Yet, when actually performing, they instead pursued behavior aimed at immediate gratification, despite knowing the longer-term results would be self-defeating.

These data indicate that the prefrontal cortex not only harbors the skills of comparison and of understanding eventual outcomes, but also (when functioning correctly) controls the mental option to delay immediate gratification in favor of a better or more rewarding longer-term outcome. This ability to wait for a reward is one of the key elements of optimal executive function in the human brain.

There is much current research devoted to understanding the role of the prefrontal cortex in neurological disorders. Clinical trials have begun on certain drugs that have been shown to improve prefrontal cortex function, including guanfacine, which acts through the alpha-2A adrenergic receptor. A downstream target of this drug, the HCN channel, is one of the most recent areas of exploration in prefrontal cortex pharmacology.

Etymology

The term "prefrontal" as describing a part of the brain appears to have been introduced by Richard Owen in 1868. For him, the prefrontal area was restricted to the anterior-most part of the frontal lobe (approximately corresponding to the frontal pole). It has been hypothesized that his choice of the term was based on the prefrontal bone present in most amphibians and reptiles.

Limbic system

 
Limbic system
 
Cross section of the human brain showing parts of the limbic system from below.
Traité d'Anatomie et de Physiologie (1786)
 
The limbic system largely consists of what was previously known as the limbic lobe.
 
Details
Identifiers
Latin: Systema limbicum
MeSH: D008032
NeuroNames: 2055
FMA: 242000
Anatomical terms of neuroanatomy

The limbic system, also known as the paleomammalian cortex, is a set of brain structures located on both sides of the thalamus, immediately beneath the medial temporal lobe of the cerebrum, primarily in the forebrain.

It supports a variety of functions including emotion, behavior, long-term memory, and olfaction. Emotional life is largely housed in the limbic system, and it critically aids the formation of memories.

With a primordial structure, the limbic system is involved in lower order emotional processing of input from sensory systems and consists of the amygdaloid nuclear complex (amygdala), mammillary bodies, stria medullaris, central gray and dorsal and ventral nuclei of Gudden. This processed information is often relayed to a collection of structures from the telencephalon, diencephalon, and mesencephalon, including the prefrontal cortex, cingulate gyrus, limbic thalamus, hippocampus including the parahippocampal gyrus and subiculum, nucleus accumbens (limbic striatum), anterior hypothalamus, ventral tegmental area, midbrain raphe nuclei, habenular commissure, entorhinal cortex, and olfactory bulbs.

Structure

Anatomical components of the limbic system

The limbic system was originally defined by Paul D. MacLean as a series of cortical structures surrounding the boundary between the cerebral hemispheres and the brainstem. The name "limbic" comes from the Latin word for the border, limbus, and these structures were known together as the limbic lobe. Further studies began to associate these areas with emotional and motivational processes and linked them to subcortical components that were then grouped into the limbic system.

Currently, it is not considered an isolated entity responsible for the neurological regulation of emotion, but rather one of the many parts of the brain that regulate visceral autonomic processes. Therefore, the set of anatomical structures considered part of the limbic system is controversial. The following structures are, or have been, considered part of the limbic system:

Function

The structures and interacting areas of the limbic system are involved in motivation, emotion, learning, and memory. The limbic system is where the subcortical structures meet the cerebral cortex. The limbic system operates by influencing the endocrine system and the autonomic nervous system. It is highly interconnected with the nucleus accumbens, which plays a role in sexual arousal and the "high" derived from certain recreational drugs. These responses are heavily modulated by dopaminergic projections from the limbic system. In 1954, Olds and Milner found that rats with metal electrodes implanted into their nucleus accumbens, as well as their septal nuclei, repeatedly pressed a lever activating this region.

The limbic system also interacts with the basal ganglia, a set of subcortical structures that direct intentional movements. The basal ganglia are located near the thalamus and hypothalamus. They receive input from the cerebral cortex and send output to the motor centers in the brain stem. A part of the basal ganglia called the striatum controls posture and movement. Recent studies indicate that an inadequate supply of dopamine in the striatum can lead to the symptoms of Parkinson's disease.

The limbic system is also tightly connected to the prefrontal cortex. Some scientists contend that this connection is related to the pleasure obtained from solving problems. To treat severe emotional disorders, this connection was sometimes surgically severed, a psychosurgical procedure called a prefrontal lobotomy (actually a misnomer). Patients who underwent this procedure often became passive and lacked all motivation.

The limbic system is often incorrectly classified as a cerebral structure; it merely interacts heavily with the cerebral cortex. These interactions are closely linked to olfaction, emotions, drives, autonomic regulation, and memory, and pathologically to encephalopathy, epilepsy, psychotic symptoms, and cognitive defects. The limbic system has proven to serve many different functions, such as affect/emotion, memory, sensory processing, time perception, attention, consciousness, instincts, autonomic/vegetative control, and motor behavior. Among the disorders associated with the limbic system and its interacting components are epilepsy and schizophrenia.

Hippocampus

Location and basic anatomy of the hippocampus, as a coronal section

The hippocampus is involved with various processes relating to cognition and is one of the best-understood and most heavily involved limbic interacting structures.

Spatial memory

The first and most widely researched area concerns memory, particularly spatial memory. Several sub-regions of the hippocampus were found to be involved in spatial memory, such as the dentate gyrus (DG) in the dorsal hippocampus, the left hippocampus, and the parahippocampal region. The dorsal hippocampus was found to be an important component in the generation of new neurons, called adult-born granule cells (GC), in adolescence and adulthood. These new neurons contribute to pattern separation in spatial memory, increase firing in cell networks, and overall lead to stronger memory formation. This is thought to integrate spatial and episodic memories with the limbic system via a feedback loop that provides the emotional context of a particular sensory input.

While the dorsal hippocampus is involved in spatial memory formation, the left hippocampus participates in the recall of these spatial memories. Eichenbaum and his team found, when studying hippocampal lesions in rats, that the left hippocampus is “critical for effectively combining the ‘what,’ ‘when,’ and ‘where’ qualities of each experience to compose the retrieved memory.” This makes the left hippocampus a key component in the retrieval of spatial memory. However, Spreng found that the left hippocampus is a general region for binding together bits and pieces of memory composed not only by the hippocampus but also by other areas of the brain, to be recalled at a later time. Eichenbaum's research in 2007 also demonstrates that the parahippocampal area of the hippocampus is, like the left hippocampus, another region specialized for the retrieval of memories.

Learning

Over the decades, the hippocampus has also been found to have a large impact on learning. Curlik and Shors examined neurogenesis in the hippocampus and its effects on learning. They employed many different types of mental and physical training on their subjects, and found that the hippocampus is highly responsive to such tasks. They discovered an upsurge of new neurons and neural circuits in the hippocampus as a result of the training, causing an overall improvement in learning the task. This neurogenesis contributes to the creation of adult-born granule cells (GC), cells also described by Eichenbaum in his own research on neurogenesis and its contributions to learning. The creation of these cells exhibited "enhanced excitability" in the dentate gyrus (DG) of the dorsal hippocampus, influencing the hippocampus's contribution to the learning process.

Hippocampus damage

Damage to the hippocampal region of the brain has been reported to have broad effects on overall cognitive functioning, particularly on memory such as spatial memory. As previously mentioned, spatial memory is a cognitive function greatly intertwined with the hippocampus. While damage to the hippocampus may result from a brain injury or other insults, researchers have particularly investigated the effects of high emotional arousal and certain types of drugs on recall in this specific memory type. In a study by Packard, rats were given the task of correctly making their way through a maze. In the first condition, rats were stressed by shock or restraint, causing high emotional arousal. These rats showed impaired hippocampal-dependent memory on the maze task when compared to the control group. In a second condition, a group of rats was injected with anxiogenic drugs; the results were similar, in that hippocampal-dependent memory was again impaired. Studies such as these reinforce the impact the hippocampus has on memory processing, in particular the recall function of spatial memory. Furthermore, impairment to the hippocampus can occur from prolonged exposure to stress hormones such as glucocorticoids (GCs), which target the hippocampus and disrupt explicit memory.

In an attempt to curtail life-threatening epileptic seizures, 27-year-old Henry Gustav Molaison underwent bilateral removal of almost all of his hippocampus in 1953. Over the course of fifty years he participated in thousands of tests and research projects that provided specific information on exactly what he had lost. Semantic and episodic events faded within minutes, having never reached his long term memory, yet emotions, unconnected from the details of causation, were often retained. Dr. Suzanne Corkin, who worked with him for 46 years until his death, described the contribution of this tragic "experiment" in her 2013 book.

Amygdala

Episodic-autobiographical memory (EAM) networks

Another integrative part of the limbic system, the amygdala, the deepest structure of the limbic system, is involved in many cognitive processes and is largely considered its most primordial and vital part. Like the hippocampus, the amygdala appears to influence memory; however, it is not spatial memory as in the hippocampus but the semantic division of episodic-autobiographical memory (EAM) networks. Markowitsch's amygdala research shows that it encodes, stores, and retrieves EAM memories. Delving deeper into these processes, Markowitsch and his team provided extensive evidence that the "amygdala's main function is to charge cues so that mnemonic events of a specific emotional significance can be successfully searched within the appropriate neural nets and re-activated." These emotional cues created by the amygdala encompass the EAM networks previously mentioned.

Attentional and emotional processes

Besides memory, the amygdala also seems to be an important brain region for attentional and emotional processes. In cognitive terms, attention is the ability to focus on some stimuli while ignoring others, and the amygdala seems to be an important structure in this ability. Historically, the structure was thought to be linked to fear, allowing the individual to take action in response to that fear. As time has gone by, researchers such as Pessoa generalized this concept with the help of evidence from EEG recordings and concluded that the amygdala helps an organism define a stimulus and respond accordingly. The early association of the amygdala with fear also opened the way for research into its role in emotional processes. Kheirbek demonstrated that the amygdala is involved in emotional processes, in particular via the ventral hippocampus, which he described as having a role in neurogenesis and the creation of adult-born granule cells (GC). These cells were not only a crucial part of neurogenesis and the strengthening of spatial memory and learning in the hippocampus but also appear to be an essential component of amygdala function. A deficit of these cells, as Pessoa (2009) predicted in his studies, would result in low emotional functioning, leading to a high retention rate of mental diseases such as anxiety disorders.

Social processing

Social processing, specifically the evaluation of faces, is an area of cognition specific to the amygdala. In a study by Todorov, fMRI tasks were performed with participants to evaluate whether the amygdala was involved in the general evaluation of faces. From his fMRI results, Todorov concluded that the amygdala did indeed play a key role in the general evaluation of faces. In a study by Koscik and his team, the trait of trustworthiness was particularly examined in the evaluation of faces. Koscik and his team demonstrated that the amygdala is involved in evaluating the trustworthiness of an individual. They investigated how brain damage to the amygdala affected judgments of trustworthiness, and found that individuals who suffered such damage tended to confuse trust and betrayal, and thus placed trust in those who had done them wrong. Furthermore, Rule and his colleagues expanded on the amygdala's role in evaluating trustworthiness with a 2009 study examining the amygdala's role in evaluating general first impressions and relating them to real-world outcomes; their study involved first impressions of CEOs. Rule demonstrated that while the amygdala did play a role in the evaluation of trustworthiness, as Koscik later observed in his own research in 2011, it also played a generalized role in the overall evaluation of first impressions of faces. This conclusion, along with Todorov's study on the amygdala's role in general evaluations of faces and Koscik's research on trustworthiness and the amygdala, further solidified the evidence that the amygdala plays a role in overall social processing.

Klüver–Bucy syndrome

In experiments on monkeys, destruction of the temporal cortex almost always damaged the amygdala as well. This damage led the physiologists Klüver and Bucy to pinpoint major changes in the monkeys' behavior. The monkeys demonstrated the following changes:

  1. The monkeys were not afraid of anything.
  2. The animals had extreme curiosity about everything.
  3. The animals forgot rapidly.
  4. The animals had a tendency to place everything in their mouths.
  5. The animals often had a sexual drive so strong that they attempted to copulate with immature animals, animals of the wrong sex, or even animals of a different species.

This set of behavioral changes came to be known as the Klüver–Bucy syndrome.

Evolution

Paul D. MacLean, as part of his triune brain theory, hypothesized that the limbic system is older than other parts of the forebrain, and that it developed to manage the circuitry attributed to the fight-or-flight response, first identified by Hans Selye in his 1936 report on the general adaptation syndrome. It may be considered a part of survival adaptation in reptiles as well as mammals (including humans). MacLean postulated that the human brain comprises three components that evolved successively, with the more recent components developing at the top/front. These components are, respectively:

  1. The archipallium or primitive ("reptilian") brain, comprising the structures of the brain stem – medulla, pons, cerebellum, mesencephalon, the oldest basal nuclei – the globus pallidus and the olfactory bulbs.
  2. The paleopallium or intermediate ("old mammalian") brain, comprising the structures of the limbic system.
  3. The neopallium, also known as the superior or rational ("new mammalian") brain, comprises almost the whole of the hemispheres (made up of a more recent type of cortex, called neocortex) and some subcortical neuronal groups. It corresponds to the brain of the superior mammals, thus including the primates and, as a consequence, the human species. Similar development of the neocortex in mammalian species unrelated to humans and primates has also occurred, for example in cetaceans and elephants; thus the designation of "superior mammals" is not an evolutionary one, as it has occurred independently in different species. The evolution of higher degrees of intelligence is an example of convergent evolution, and is also seen in non-mammals such as birds.

According to MacLean, each of the components, although connected with the others, retained "their peculiar types of intelligence, subjectivity, sense of time and space, memory, mobility and other less specific functions".

However, while the categorization into structures is reasonable, recent studies of the limbic system in tetrapods, both living and extinct, have challenged several aspects of this hypothesis, notably the accuracy of the terms "reptilian" and "old mammalian". The common ancestors of reptiles and mammals had a well-developed limbic system in which the basic subdivisions and connections of the amygdalar nuclei were already established. Furthermore, birds, which evolved from the dinosaurs (a lineage that evolved separately from, but around the same time as, the mammals), have a well-developed limbic system. While the anatomical structures of the limbic system differ in birds and mammals, there are functional equivalents.

History

Etymology and history

The term limbic comes from the Latin limbus, meaning "border" or "edge", or, particularly in medical terminology, the border of an anatomical component. Paul Broca coined the term based on its physical location in the brain, sandwiched between two functionally different components.

The term "limbic system" was introduced in 1949 by the American physician and neuroscientist Paul D. MacLean. The French physician Paul Broca had first called this part of the brain le grand lobe limbique in 1878, examining the differentiation between the deeply recessed cortical tissue and the underlying subcortical nuclei. However, most of its putative role in emotion was developed only in 1937, when the American physician James Papez described his anatomical model of emotion, the Papez circuit.

The first evidence that the limbic system was responsible for the cortical representation of emotions was discovered in 1939 by Heinrich Klüver and Paul Bucy. After much research, Klüver and Bucy demonstrated that bilateral removal of the temporal lobes in monkeys created an extreme behavioral syndrome. After the temporal lobectomy, the monkeys showed a decrease in aggression and a reduced threshold to visual stimuli, and were thus unable to recognize objects that were once familiar. MacLean expanded these ideas to include additional structures in a more dispersed "limbic system", along the lines of the system described above. MacLean developed the theory of the "triune brain" to explain its evolution and to try to reconcile rational human behavior with its more primal and violent side. He became interested in the brain's control of emotion and behavior. After initial studies of brain activity in epileptic patients, he turned to cats, monkeys, and other models, using electrodes to stimulate different parts of the brain in conscious animals and recording their responses.

In the 1950s, he began to trace individual behaviors like aggression and sexual arousal to their physiological sources. He analyzed the brain's center of emotions, the limbic system, and described an area that includes structures called the hippocampus and amygdala. Developing observations made by Papez, he determined that the limbic system had evolved in early mammals to control fight-or-flight responses and react to both emotionally pleasurable and painful sensations. The concept is now broadly accepted in neuroscience. Additionally, MacLean said that the idea of the limbic system leads to a recognition that its presence "represents the history of the evolution of mammals and their distinctive family way of life."

In the 1960s, Dr. MacLean enlarged his theory to address the human brain's overall structure and divided its evolution into three parts, an idea that he termed the triune brain. In addition to identifying the limbic system, he pointed to a more primitive brain called the R-complex, related to reptiles, which controls basic functions like muscle movement and breathing. The third part, the neocortex, controls speech and reasoning and is the most recent evolutionary arrival. The concept of the limbic system has since been further expanded and developed by Walle Nauta, Lennart Heimer, and others.

Academic dispute

There is controversy over the use of the term limbic system, with scientists such as LeDoux arguing that the term be considered obsolete and abandoned. Originally, the limbic system was believed to be the emotional center of the brain, with cognition being the business of the neocortex. However, cognition depends on acquisition and retention of memories, in which the hippocampus, a primary limbic interacting structure, is involved: hippocampus damage causes severe cognitive (memory) deficits. More important, the "boundaries" of the limbic system have been repeatedly redefined because of advances in neuroscience. Therefore, while it is true that limbic interacting structures are more closely related to emotion, the limbic system itself is best thought of as a component of a larger emotional processing plant. It is essentially responsible for sifting through and organizing lower order processing, and relaying sensory information to other brain areas for higher order emotional processing.

Entropy (statistical thermodynamics)
