
Tuesday, May 2, 2023

Lobotomy

From Wikipedia, the free encyclopedia
"Dr. Walter Freeman, left, and Dr. James W. Watts right, study an X-ray before a psychosurgical operation. Psychosurgery is cutting into the brain to form new patterns and rid a patient of delusions, obsessions, nervous tensions and the like." Waldemar Kaempffert, "Turning the Mind Inside Out", Saturday Evening Post, 24 May 1941.
Other names: leukotomy, leucotomy
ICD-9-CM: 01.32
MeSH: D011612

A lobotomy (from Greek λοβός (lobos) 'lobe', and τομή (tomē) 'cut, slice') or leucotomy is a form of neurosurgical treatment for psychiatric or neurological disorders (e.g. epilepsy) in which most of the connections to and from the prefrontal cortex, the anterior part of the frontal lobes of the brain, are severed.

In the past, this treatment was used as a mainstream procedure for treating psychiatric disorders in some countries. The procedure was controversial from its initial use, in part owing to a lack of recognition of the severity and chronicity of severe and enduring psychiatric illness, and it came to be regarded as an inappropriate treatment.

The originator of the procedure, Portuguese neurologist António Egas Moniz, shared the Nobel Prize for Physiology or Medicine of 1949 for the "discovery of the therapeutic value of leucotomy in certain psychoses", although the awarding of the prize has been subject to controversy.

The use of the procedure increased dramatically from the early 1940s and into the 1950s; by 1951, almost 20,000 lobotomies had been performed in the United States and proportionally more in the United Kingdom. More lobotomies were performed on women than on men: a 1951 study found that nearly 60% of American lobotomy patients were women, and limited data shows that 74% of lobotomies in Ontario from 1948 to 1952 were performed on female patients. From the 1950s onward, lobotomy began to be abandoned, first in the Soviet Union and Europe.

Effects

I fully realize that this operation will have little effect on her mental condition but am willing to have it done in the hope that she will be more comfortable and easier to care for.

— Comments added to the consent form for a lobotomy operation on "Helaine Strauss", the pseudonym used for "a patient at an elite private hospital".

Historically, patients of lobotomy were, immediately following surgery, often stuporous, confused, and incontinent. Some developed an enormous appetite and gained considerable weight. Seizures were another common complication of surgery. Emphasis was put on the training of patients in the weeks and months following surgery.

The purpose of the operation was to reduce the symptoms of mental disorders, and it was recognized that this was accomplished at the expense of a person's personality and intellect. British psychiatrist Maurice Partridge, who conducted a follow-up study of 300 patients, said that the treatment achieved its effects by "reducing the complexity of psychic life". Following the operation, spontaneity, responsiveness, self-awareness, and self-control were reduced. Activity was replaced by inertia, and people were mostly left emotionally blunted and restricted in their intellectual range.

The consequences of the operation have been described as "mixed". Some patients died as a result of the operation and others later committed suicide. Some were left severely brain damaged. Others were able to leave the hospital, or became more manageable within the hospital. A few people managed to return to responsible work, while at the other extreme, people were left with severe and disabling impairments. Most people fell into an intermediate group, left with some improvement of their symptoms but also with emotional and intellectual deficits to which they made a better or worse adjustment. The mortality rate was approximately 5% during the 1940s.

The lobotomy procedure could have severe negative effects on a patient's personality and ability to function independently. Lobotomy patients often show a marked reduction in initiative and inhibition. They may also exhibit difficulty imagining themselves in the position of others because of decreased cognition and detachment from society.

Walter Freeman coined the term "surgically induced childhood" and used it constantly to refer to the results of lobotomy. The operation left people with an "infantile personality"; a period of maturation would then, according to Freeman, lead to recovery. In an unpublished memoir, he described how the "personality of the patient was changed in some way in the hope of rendering him more amenable to the social pressures under which he is supposed to exist." He described one 29-year-old woman as being, following lobotomy, a "smiling, lazy and satisfactory patient with the personality of an oyster" who could not remember Freeman's name and endlessly poured coffee from an empty pot. When her parents had difficulty dealing with her behaviour, Freeman advised a system of rewards (ice-cream) and punishment (smacks).

History

Insulin shock therapy administered in Helsinki in the 1950s.

In the early 20th century, the number of patients residing in mental hospitals increased significantly while little in the way of effective medical treatment was available. Lobotomy was one of a series of radical and invasive physical therapies developed in Europe at this time that signaled a break with a psychiatric culture of therapeutic nihilism that had prevailed since the late nineteenth century. The new "heroic" physical therapies devised during this experimental era, including malarial therapy for general paresis of the insane (1917), deep sleep therapy (1920), insulin shock therapy (1933), cardiazol shock therapy (1934), and electroconvulsive therapy (1938), helped to imbue the then therapeutically moribund and demoralised psychiatric profession with a renewed sense of optimism in the curability of insanity and the potency of their craft. The success of the shock therapies, despite the considerable risk they posed to patients, also helped to accommodate psychiatrists to ever more drastic forms of medical intervention, including lobotomy.

The clinician-historian Joel Braslow argues that from malarial therapy onward to lobotomy, physical psychiatric therapies "spiral closer and closer to the interior of the brain" with this organ increasingly taking "center stage as a source of disease and site of cure". For Roy Porter, once the doyen of medical history, the often violent and invasive psychiatric interventions developed during the 1930s and 1940s are indicative both of the well-intentioned desire of psychiatrists to find some medical means of alleviating the suffering of the vast number of patients then in psychiatric hospitals and of the relative lack of social power of those same patients to resist the increasingly radical and even reckless interventions of asylum doctors. Many doctors, patients and family members of the period believed that, despite potentially catastrophic consequences, the results of lobotomy were seemingly positive in many instances, or were at least deemed as such when measured against the apparent alternative of long-term institutionalisation. Lobotomy has always been controversial, but for a period it was part of the medical mainstream, even feted and regarded as a legitimate last-resort remedy for categories of patients who were otherwise regarded as hopeless. Today, lobotomy has become a disparaged procedure, a byword for medical barbarism and an exemplary instance of the medical trampling of patients' rights.

Early psychosurgery

The Swiss psychiatrist Gottlieb Burckhardt (1836–1907)

Before the 1930s, individual doctors had infrequently experimented with novel surgical operations on the brains of those deemed insane. Most notably in 1888, the Swiss psychiatrist Gottlieb Burckhardt initiated what is commonly considered the first systematic attempt at modern human psychosurgery. He operated on six chronic patients under his care at the Swiss Préfargier Asylum, removing sections of their cerebral cortex. Burckhardt's decision to operate was informed by three pervasive views on the nature of mental illness and its relationship to the brain. First, the belief that mental illness was organic in nature, and reflected an underlying brain pathology; next, that the nervous system was organized according to an associationist model comprising an input or afferent system (a sensory center), a connecting system where information processing took place (an association center), and an output or efferent system (a motor center); and, finally, a modular conception of the brain whereby discrete mental faculties were connected to specific regions of the brain. Burckhardt's hypothesis was that by deliberately creating lesions in regions of the brain identified as association centers a transformation in behaviour might ensue. According to his model, those mentally ill might experience "excitations abnormal in quality, quantity and intensity" in the sensory regions of the brain and this abnormal stimulation would then be transmitted to the motor regions giving rise to mental pathology. He reasoned, however, that removing material from either of the sensory or motor zones could give rise to "grave functional disturbance". Instead, by targeting the association centers and creating a "ditch" around the motor region of the temporal lobe, he hoped to break their lines of communication and thus alleviate both mental symptoms and the experience of mental distress.

The Estonian neurosurgeon Ludvig Puusepp c. 1920

Intending to ameliorate symptoms in those with violent and intractable conditions rather than effect a cure, Burckhardt began operating on patients in December 1888, but both his surgical methods and instruments were crude and the results of the procedure were mixed at best. He operated on six patients in total and, according to his own assessment, two experienced no change, two patients became quieter, one patient experienced epileptic convulsions and died a few days after the operation, and one patient improved. Complications included motor weakness, epilepsy, sensory aphasia and "word deafness". Claiming a success rate of 50 percent, he presented the results at the Berlin Medical Congress and published a report, but the response from his medical peers was hostile and he did no further operations.

In 1912, two physicians based in Saint Petersburg, the leading Russian neurologist Vladimir Bekhterev and his younger Estonian colleague, the neurosurgeon Ludvig Puusepp, published a paper reviewing a range of surgical interventions that had been performed on the mentally ill. While generally treating these endeavours favorably, in their consideration of psychosurgery they reserved unremitting scorn for Burckhardt's surgical experiments of 1888 and opined that it was extraordinary that a trained medical doctor could undertake such an unsound procedure.

We have quoted this data to show not only how groundless but also how dangerous these operations were. We are unable to explain how their author, holder of a degree in medicine, could bring himself to carry them out ...

The authors neglected to mention, however, that in 1910 Puusepp himself had performed surgery on the brains of three mentally ill patients, sectioning the cortex between the frontal and parietal lobes. He had abandoned these attempts because of unsatisfactory results and this experience probably inspired the invective that was directed at Burckhardt in the 1912 article. By 1937, Puusepp, despite his earlier criticism of Burckhardt, was increasingly persuaded that psychosurgery could be a valid medical intervention for the mentally disturbed. In the late 1930s, he worked closely with the neurosurgical team of the Racconigi Hospital near Turin to establish it as an early and influential centre for the adoption of leucotomy in Italy.

Development

The pioneer of lobotomies, the Portuguese neurologist and Nobel Laureate António Egas Moniz

Leucotomy was first undertaken in 1935 under the direction of the Portuguese neurologist (and inventor of the term psychosurgery) António Egas Moniz. First developing an interest in psychiatric conditions and their somatic treatment in the early 1930s, Moniz apparently conceived a new opportunity for recognition in the development of a surgical intervention on the brain as a treatment for mental illness.

Frontal lobes

The source of inspiration for Moniz's decision to hazard psychosurgery has been clouded by contradictory statements made on the subject by Moniz and others both contemporaneously and retrospectively. The traditional narrative addresses the question of why Moniz targeted the frontal lobes by way of reference to the work of the Yale neuroscientist John Fulton and, most dramatically, to a presentation Fulton made with his junior colleague Carlyle Jacobsen at the Second International Congress of Neurology held in London in 1935. Fulton's primary area of research was the cortical function of primates, and he had established America's first primate neurophysiology laboratory at Yale in the early 1930s. At the 1935 Congress, with Moniz in attendance, Fulton and Jacobsen presented two chimpanzees, named Becky and Lucy, who had had frontal lobectomies and subsequent changes in behaviour and intellectual function. According to Fulton's account of the congress, they explained that before surgery both animals, and especially Becky, the more emotional of the two, exhibited "frustrational behaviour" – that is, they threw tantrums that could include rolling on the floor and defecating – if, because of their poor performance in a set of experimental tasks, they were not rewarded. Following the surgical removal of their frontal lobes, the behaviour of both primates changed markedly and Becky was pacified to such a degree that Jacobsen apparently stated it was as if she had joined a "happiness cult". During the question and answer section of the paper, Moniz, it is alleged, "startled" Fulton by inquiring whether this procedure might be extended to human subjects suffering from mental illness. Fulton stated that he replied that while possible in theory it was surely "too formidable" an intervention for use on humans.

Brain animation: left frontal lobe highlighted in red. Moniz targeted the frontal lobes in the leucotomy procedure he first conceived in 1933.

That Moniz began his experiments with leucotomy just three months after the congress has reinforced the apparent cause-and-effect relationship between the Fulton and Jacobsen presentation and the Portuguese neurologist's resolve to operate on the frontal lobes. As the author of this account, Fulton, who has sometimes been claimed as the father of lobotomy, was later able to record that the technique had its true origination in his laboratory. Endorsing this version of events, in 1949, the Harvard neurologist Stanley Cobb remarked during his presidential address to the American Neurological Association that "seldom in the history of medicine has a laboratory observation been so quickly and dramatically translated into a therapeutic procedure". Fulton's report, penned ten years after the events described, is, however, without corroboration in the historical record and bears little resemblance to an earlier unpublished account he wrote of the congress. In this previous narrative he mentioned an incidental, private exchange with Moniz, but it is likely that the official version of their public conversation he promulgated is without foundation. In fact, Moniz stated that he had conceived of the operation some time before his journey to London in 1935, having told in confidence his junior colleague, the young neurosurgeon Pedro Almeida Lima, as early as 1933 of his psychosurgical idea. The traditional account exaggerates the importance of Fulton and Jacobsen to Moniz's decision to initiate frontal lobe surgery, and omits the fact that a detailed body of neurological research that emerged at this time suggested to Moniz and other neurologists and neurosurgeons that surgery on this part of the brain might yield significant personality changes in the mentally ill.

The frontal lobes had been the object of scientific inquiry and speculation since the late 19th century. Fulton's contribution, while it may have functioned as source of intellectual support, is of itself unnecessary and inadequate as an explanation of Moniz's resolution to operate on this section of the brain. Under an evolutionary and hierarchical model of brain development it had been hypothesized that those regions associated with more recent development, such as the mammalian brain and, most especially, the frontal lobes, were responsible for more complex cognitive functions. However, this theoretical formulation found little laboratory support, as 19th-century experimentation found no significant change in animal behaviour following surgical removal or electrical stimulation of the frontal lobes. This picture of the so-called "silent lobe" changed in the period after World War I with the production of clinical reports of ex-servicemen with brain trauma. The refinement of neurosurgical techniques also facilitated increasing attempts to remove brain tumours, treat focal epilepsy in humans and led to more precise experimental neurosurgery in animal studies. Cases were reported where mental symptoms were alleviated following the surgical removal of diseased or damaged brain tissue. The accumulation of medical case studies on behavioural changes following damage to the frontal lobes led to the formulation of the concept of Witzelsucht, which designated a neurological condition characterised by a certain hilarity and childishness in those with the condition. The picture of frontal lobe function that emerged from these studies was complicated by the observation that neurological deficits attendant on damage to a single lobe might be compensated for if the opposite lobe remained intact. In 1922, the Italian neurologist Leonardo Bianchi published a detailed report on the results of bilateral lobectomies in animals that supported the contention that the frontal lobes were both integral to intellectual function and that their removal led to the disintegration of the subject's personality. This work, while influential, was not without its critics due to deficiencies in experimental design.

The first bilateral lobectomy of a human subject was performed by the American neurosurgeon Walter Dandy in 1930. The neurologist Richard Brickner reported on this case in 1932, relating that the recipient, known as "Patient A", while experiencing a blunting of affect, had no apparent decrease in intellectual function and seemed, at least to the casual observer, perfectly normal. Brickner concluded from this evidence that "the frontal lobes are not 'centers' for the intellect". These clinical results were replicated in a similar operation undertaken in 1934 by the neurosurgeon Roy Glenwood Spurling and reported on by the neuropsychiatrist Spafford Ackerly. By the mid-1930s, interest in the function of the frontal lobes reached a high-water mark. This was reflected in the 1935 neurological congress in London, which hosted as part of its deliberations, "a remarkable symposium ... on the functions of the frontal lobes". The panel was chaired by Henri Claude, a French neuropsychiatrist, who commenced the session by reviewing the state of research on the frontal lobes, and concluded that "altering the frontal lobes profoundly modifies the personality of subjects". This parallel symposium contained numerous papers by neurologists, neurosurgeons and psychologists; amongst these was one by Brickner, which impressed Moniz greatly, that again detailed the case of "Patient A". Fulton and Jacobsen's paper, presented in another session of the conference on experimental physiology, was notable in linking animal and human studies on the function of the frontal lobes. Thus, at the time of the 1935 Congress, Moniz had available to him an increasing body of research on the role of the frontal lobes that extended well beyond the observations of Fulton and Jacobsen.

Nor was Moniz the only medical practitioner in the 1930s to have contemplated procedures directly targeting the frontal lobes. Although ultimately discounting brain surgery as carrying too much risk, physicians and neurologists such as William Mayo, Thierry de Martel, Richard Brickner, and Leo Davidoff had, before 1935, entertained the proposition. Inspired by Julius Wagner-Jauregg's development of malarial therapy for the treatment of general paresis of the insane, the French physician Maurice Ducosté reported in 1932 that he had injected 5 ml of malarial blood directly into the frontal lobes of over 100 paretic patients through holes drilled into the skull. He claimed that the injected paretics showed signs of "uncontestable mental and physical amelioration" and that the results for psychotic patients undergoing the procedure were also "encouraging". The experimental injection of fever-inducing malarial blood into the frontal lobes was also replicated during the 1930s in the work of Ettore Mariotti and M. Sciutti in Italy and Ferdière Coulloudon in France. In Switzerland, almost simultaneously with the commencement of Moniz's leucotomy programme, the neurosurgeon François Ody had removed the entire right frontal lobe of a catatonic schizophrenic patient. In Romania, Ody's procedure was adopted by Dimitri Bagdasar and Constantinesco working out of the Central Hospital in Bucharest. Ody, who delayed publishing his own results for several years, later rebuked Moniz for claiming to have cured patients through leucotomy without waiting to determine if there had been a "lasting remission".

Neurological model

The theoretical underpinnings of Moniz's psychosurgery were largely commensurate with the nineteenth-century ones that had informed Burckhardt's decision to excise matter from the brains of his patients. Although in his later writings Moniz referenced both the neuron theory of Ramón y Cajal and the conditioned reflex of Ivan Pavlov, in essence he simply interpreted this new neurological research in terms of the old psychological theory of associationism. He differed significantly from Burckhardt, however, in that he did not think there was any organic pathology in the brains of the mentally ill, but rather that their neural pathways were caught in fixed and destructive circuits leading to "predominant, obsessive ideas". As Moniz wrote in 1936:

[The] mental troubles must have ... a relation with the formation of cellulo-connective groupings, which become more or less fixed. The cellular bodies may remain altogether normal, their cylinders will not have any anatomical alterations; but their multiple liaisons, very variable in normal people, may have arrangements more or less fixed, which will have a relation with persistent ideas and deliria in certain morbid psychic states.

For Moniz, "to cure these patients", it was necessary to "destroy the more or less fixed arrangements of cellular connections that exist in the brain, and particularly those which are related to the frontal lobes", thus removing their fixed pathological brain circuits. Moniz believed the brain would functionally adapt to such injury. Unlike the position adopted by Burckhardt, it was unfalsifiable according to the knowledge and technology of the time as the absence of a known correlation between physical brain pathology and mental illness could not disprove his thesis.

First leucotomies

The hypotheses underlying the procedure might be called into question; the surgical intervention might be considered very audacious; but such arguments occupy a secondary position because it can be affirmed now that these operations are not prejudicial to either physical or psychic life of the patient, and also that recovery or improvement may be obtained frequently in this way.

Egas Moniz (1937)

On 12 November 1935 at the Hospital Santa Marta in Lisbon, Moniz initiated the first of a series of operations on the brains of people with mental illnesses. The initial patients selected for the operation were provided by the medical director of Lisbon's Miguel Bombarda Mental Hospital, José de Matos Sobral Cid. As Moniz lacked training in neurosurgery and his hands were impaired by gout, the procedure was performed under general anaesthetic by Pedro Almeida Lima, who had previously assisted Moniz with his research on cerebral angiography. The intention was to remove some of the long fibres that connected the frontal lobes to other major brain centres. To this end, it was decided that Lima would trephine into the side of the skull and then inject ethanol into the "subcortical white matter of the prefrontal area" so as to destroy the connecting fibres, or association tracts, and create what Moniz termed a "frontal barrier". After the first operation was complete, Moniz considered it a success and, observing that the patient's depression had been relieved, he declared her "cured" although she was never, in fact, discharged from the mental hospital. Moniz and Lima persisted with this method of injecting alcohol into the frontal lobes for the next seven patients but, after having to inject some patients on numerous occasions to elicit what they considered a favourable result, they modified the means by which they would section the frontal lobes. For the ninth patient they introduced a surgical instrument called a leucotome; this was a cannula that was 11 centimetres (4.3 in) in length and 2 centimetres (0.79 in) in diameter. It had a retractable wire loop at one end that, when rotated, produced a 1 centimetre (0.39 in) diameter circular lesion in the white matter of the frontal lobe. Typically, six lesions were cut into each lobe, but, if they were dissatisfied by the results, Lima might perform several procedures, each producing multiple lesions in the left and right frontal lobes.

By the conclusion of this first run of leucotomies in February 1936, Moniz and Lima had operated on twenty patients with an average period of one week between each procedure; Moniz published his findings with great haste in March of the same year. The patients were aged between 27 and 62; twelve were female and eight were male. Nine of the patients were diagnosed with depression, six with schizophrenia, two with panic disorder, and one each with mania, catatonia and manic-depression. Their most prominent symptoms were anxiety and agitation. The duration of their illness before the procedure varied from as little as four weeks to as much as 22 years, although all but four had been ill for at least one year. Patients were normally operated on the day they arrived at Moniz's clinic and returned within ten days to the Miguel Bombarda Mental Hospital. A perfunctory post-operative follow-up assessment took place anywhere from one to ten weeks following surgery. Complications were observed in each of the leucotomy patients and included: "increased temperature, vomiting, bladder and bowel incontinence, diarrhea, and ocular affections such as ptosis and nystagmus, as well as psychological effects such as apathy, akinesia, lethargy, timing and local disorientation, kleptomania, and abnormal sensations of hunger". Moniz asserted that these effects were transitory and, according to his published assessment, the outcome for these first twenty patients was that 35%, or seven cases, improved significantly, another 35% were somewhat improved and the remaining 30% (six cases) were unchanged. There were no deaths and he did not consider that any patients had deteriorated following leucotomy.

Reception

Moniz rapidly disseminated his results through articles in the medical press and a monograph in 1936. Initially, however, the medical community appeared hostile to the new procedure. On 26 July 1936, one of his assistants, Diogo Furtado, gave a presentation at the Parisian meeting of the Société Médico-Psychologique on the results of the second cohort of patients leucotomised by Lima. Sobral Cid, who had supplied Moniz with the first set of patients for leucotomy from his own hospital in Lisbon, attended the meeting and denounced the technique, declaring that the patients who had been returned to his care post-operatively were "diminished" and had experienced a "degradation of personality". He also claimed that the changes Moniz observed in patients were more properly attributed to shock and brain trauma, and he derided the theoretical architecture that Moniz had constructed to support the new procedure as "cerebral mythology." At the same meeting the Parisian psychiatrist, Paul Courbon, stated he could not endorse a surgical technique that was solely supported by theoretical considerations rather than clinical observations. He also opined that the mutilation of an organ could not improve its function and that such cerebral wounds as were occasioned by leucotomy risked the later development of meningitis, epilepsy and brain abscesses. Nonetheless, Moniz's reported successful surgical treatment of 14 out of 20 patients led to the rapid adoption of the procedure on an experimental basis by individual clinicians in countries such as Brazil, Cuba, Italy, Romania and the United States during the 1930s.

Italian leucotomy

In the present state of affairs if some are critical about lack of caution in therapy, it is, on the other hand, deplorable and inexcusable to remain apathetic, with folded hands, content with learned lucubrations upon symptomatologic minutiae or upon psychopathic curiosities, or even worse, not even doing that.

Amarro Fiamberti

Throughout the remainder of the 1930s the number of leucotomies performed in most countries where the technique was adopted remained quite low. In Britain, which was later a major centre for leucotomy, only six operations had been undertaken before 1942. Generally, medical practitioners who attempted the procedure adopted a cautious approach and few patients were leucotomised before the 1940s. Italian neuropsychiatrists, who were typically early and enthusiastic adopters of leucotomy, were exceptional in eschewing such a gradualist course.

Leucotomy was first reported in the Italian medical press in 1936 and Moniz published an article in Italian on the technique in the following year. In 1937, he was invited to Italy to demonstrate the procedure and for a two-week period in June of that year he visited medical centres in Trieste, Ferrara, and one close to Turin – the Racconigi Hospital – where he instructed his Italian neuropsychiatric colleagues on leucotomy and also oversaw several operations. Leucotomy was featured at two Italian psychiatric conferences in 1937 and over the next two years a score of medical articles on Moniz's psychosurgery was published by Italian clinicians based in medical institutions located in Racconigi, Trieste, Naples, Genoa, Milan, Pisa, Catania and Rovigo. The major centre for leucotomy in Italy was the Racconigi Hospital, where the experienced neurosurgeon Ludvig Puusepp provided a guiding hand. Under the medical directorship of Emilio Rizzatti, the medical personnel at this hospital had completed at least 200 leucotomies by 1939. Reports from clinicians based at other Italian institutions detailed significantly smaller numbers of leucotomy operations.

Experimental modifications of Moniz's operation were introduced with little delay by Italian medical practitioners. Most notably, in 1937 Amarro Fiamberti, the medical director of a psychiatric institution in Varese, first devised the transorbital procedure whereby the frontal lobes were accessed through the eye sockets. Fiamberti's method was to puncture the thin layer of orbital bone at the top of the socket and then inject alcohol or formalin into the white matter of the frontal lobes through this aperture. Using this method, while sometimes substituting a leucotome for a hypodermic needle, it is estimated that he leucotomised about 100 patients in the period up to the outbreak of World War II. Fiamberti's innovation of Moniz's method would later prove inspirational for Walter Freeman's development of transorbital lobotomy.

American leucotomy

Site of borehole for the standard pre-frontal lobotomy/leucotomy operation as developed by Freeman and Watts

The first prefrontal leucotomy in the United States was performed at the George Washington University Hospital on 14 September 1936 by the neurologist Walter Freeman and his friend and colleague, the neurosurgeon, James W. Watts. Freeman had first encountered Moniz at the London-hosted Second International Congress of Neurology in 1935 where he had presented a poster exhibit of the Portuguese neurologist's work on cerebral angiography. Fortuitously occupying a booth next to Moniz, Freeman, delighted by their chance meeting, formed a highly favourable impression of Moniz, later remarking upon his "sheer genius". According to Freeman, if they had not met in person it is highly unlikely that he would have ventured into the domain of frontal lobe psychosurgery. Freeman's interest in psychiatry was the natural outgrowth of his appointment in 1924 as the medical director of the Research Laboratories of the Government Hospital for the Insane in Washington, known colloquially as St Elizabeth's. Ambitious and a prodigious researcher, Freeman, who favoured an organic model of mental illness causation, spent the next several years exhaustively, yet ultimately fruitlessly, investigating a neuropathological basis for insanity. Chancing upon a preliminary communication by Moniz on leucotomy in the spring of 1936, Freeman initiated a correspondence in May of that year. Writing that he had been considering psychiatric brain surgery previously, he informed Moniz that, "having your authority I expect to go ahead". Moniz, in return, promised to send him a copy of his forthcoming monograph on leucotomy and urged him to purchase a leucotome from a French supplier.

Upon receipt of Moniz's monograph, Freeman reviewed it anonymously for the Archives of Neurology and Psychiatry. Praising the text as one whose "importance can scarcely be overestimated", he summarised Moniz's rationale for the procedure as based on the fact that while no physical abnormality of cerebral cell bodies was observable in the mentally ill, their cellular interconnections may harbour a "fixation of certain patterns of relationship among various groups of cells" and that this resulted in obsessions, delusions and mental morbidity. While recognising that Moniz's thesis was inadequate, for Freeman it had the advantage of circumventing the search for diseased brain tissue in the mentally ill by instead suggesting that the problem was a functional one of the brain's internal wiring where relief might be obtained by severing problematic mental circuits.

In 1937 Freeman and Watts adapted Lima and Moniz's surgical procedure, and created the Freeman-Watts technique, also known as the Freeman-Watts standard prefrontal lobotomy, which they styled the "precision method".

Transorbital lobotomy

The Freeman–Watts prefrontal lobotomy still required drilling holes in the skull, so surgery had to be performed in an operating room by trained neurosurgeons. Walter Freeman believed this surgery would be unavailable to those he saw as needing it most: patients in state mental hospitals that had no operating rooms, surgeons, or anesthesia and limited budgets. Freeman wanted to simplify the procedure so that it could be carried out by psychiatrists in psychiatric hospitals.

Inspired by the work of Italian psychiatrist Amarro Fiamberti, Freeman at some point conceived of approaching the frontal lobes through the eye sockets instead of through drilled holes in the skull. In 1945 he took an icepick from his own kitchen and began testing the idea on grapefruit and cadavers. This new "transorbital" lobotomy involved lifting the upper eyelid and placing the point of a thin surgical instrument (often called an orbitoclast or leucotome, although quite different from the wire loop leucotome described above) under the eyelid and against the top of the eyesocket. A mallet was used to drive the orbitoclast through the thin layer of bone and into the brain along the plane of the bridge of the nose, around 15 degrees toward the interhemispherical fissure. The orbitoclast was malleted 5 centimeters (2 in) into the frontal lobe, and then pivoted 40 degrees at the orbit perforation so the tip cut toward the opposite side of the head (toward the nose). The instrument was returned to the neutral position and sent a further 2 centimeters (0.8 in) into the brain, before being pivoted around 28 degrees each side, to cut outward and again inward. (In a more radical variation at the end of the last cut described, the butt of the orbitoclast was forced upward so the tool cut vertically down the side of the cortex of the interhemispheric fissure; the "Deep Frontal Cut".) All cuts were designed to transect the white fibrous matter connecting the cortical tissue of the prefrontal cortex to the thalamus. The leucotome was then withdrawn and the procedure repeated on the other side.

Freeman performed the first transorbital lobotomy on a live patient in 1946. Its simplicity suggested the possibility of carrying it out in mental hospitals lacking the surgical facilities required for the earlier, more complex procedure. (Freeman suggested that, where conventional anesthesia was unavailable, electroconvulsive therapy be used to render the patient unconscious.) In 1947, the Freeman and Watts partnership ended, as the latter was disgusted by Freeman's modification of the lobotomy from a surgical operation into a simple "office" procedure. Between 1940 and 1944, 684 lobotomies were performed in the United States. However, because of the fervent promotion of the technique by Freeman and Watts, those numbers increased sharply toward the end of the decade. In 1949, the peak year for lobotomies in the US, 5,074 procedures were undertaken, and by 1951 over 18,608 individuals had been lobotomized in the US.

Prevalence

Lobotomy (by Lennart Nilsson) underway at Södersjukhuset, Stockholm, in 1949

In the United States, approximately 40,000 people were lobotomized, and in England 17,000 lobotomies were performed. According to one estimate, approximately 9,300 lobotomies were performed in the three Nordic countries of Denmark, Norway, and Sweden combined. Scandinavian hospitals lobotomized 2.5 times as many people per capita as hospitals in the US. According to another estimate, Sweden lobotomized at least 4,500 people between 1944 and 1966, mainly women; this figure includes young children. In Norway, there were 2,005 known lobotomies, and in Denmark 4,500. In Japan, the majority of lobotomies were performed on children with behaviour problems. The Soviet Union banned the practice in 1950 on moral grounds. In Germany, it was performed only a few times. By the late 1970s, the practice of lobotomy had generally ceased, although it continued as late as the 1980s in France.

Criticism

As early as 1944, an author in the Journal of Nervous and Mental Disease remarked: "The history of prefrontal lobotomy has been brief and stormy. Its course has been dotted with both violent opposition and with slavish, unquestioning acceptance." Beginning in 1947 Swedish psychiatrist Snorre Wohlfahrt evaluated early trials, reporting that it is "distinctly hazardous to leucotomize schizophrenics" and that lobotomy was "still too imperfect to enable us, with its aid, to venture on a general offensive against chronic cases of mental disorder", stating further that "Psychosurgery has as yet failed to discover its precise indications and contraindications and the methods must unfortunately still be regarded as rather crude and hazardous in many respects." In 1948 Norbert Wiener, the author of Cybernetics: Or the Control and Communication in the Animal and the Machine, said: "[P]refrontal lobotomy ... has recently been having a certain vogue, probably not unconnected with the fact that it makes the custodial care of many patients easier. Let me remark in passing that killing them makes their custodial care still easier."

Concerns about lobotomy steadily grew. Soviet psychiatrist Vasily Gilyarovsky criticized lobotomy and the mechanistic brain localization assumption used to carry out lobotomy:

It is assumed that the transection of white substance of the frontal lobes impairs their connection with the thalamus and eliminates the possibility to receive from it stimuli which lead to irritation and on the whole derange mental functions. This explanation is mechanistic and goes back to the narrow localizationism characteristic of psychiatrists of America, from where leucotomy was imported to us.

The Soviet Union officially banned the procedure in 1950 on the initiative of Gilyarovsky. Doctors in the Soviet Union concluded that the procedure was "contrary to the principles of humanity" and "'through lobotomy' an insane person is changed into an idiot". By the 1970s, numerous countries had banned the procedure, as had several US states.

In 1977, the US Congress, during the presidency of Jimmy Carter, created the National Committee for the Protection of Human Subjects of Biomedical and Behavioral Research to investigate allegations that psychosurgery – including lobotomy techniques – was used to control minorities and restrain individual rights. The committee concluded that some extremely limited and properly performed psychosurgery could have positive effects.

Torsten Wiesel has called the award of the Nobel Prize to Moniz an "astounding [error] of judgment ... a terrible mistake", and there have been calls for the Nobel Foundation to rescind the award; the Foundation has not done so, and its website still hosts an article defending lobotomy.

Notable cases

  • Rosemary Kennedy, sister of US President John F. Kennedy, underwent a lobotomy in 1941 that left her incapacitated and institutionalized for the rest of her life.
  • Howard Dully wrote a memoir of his late-life discovery that he had been lobotomized in 1960 at age 12.
  • New Zealand author and poet Janet Frame received a literary award in 1951 the day before a scheduled lobotomy was to take place, and it was never performed.
  • Josef Hassid, a Polish violinist and composer, was diagnosed with schizophrenia and died at the age of 26 following a lobotomy performed on him in England.
  • Swedish modernist painter Sigrid Hjertén died following a lobotomy in 1948.
  • American playwright Tennessee Williams' older sister Rose received a lobotomy that left her incapacitated for life; the episode is said to have inspired characters and motifs in some of his works.
  • It is often said that when an iron rod was accidentally driven through the head of Phineas Gage in 1848, this constituted an "accidental lobotomy", or that this event somehow inspired the development of surgical lobotomy a century later. According to the only book-length study of Gage, careful inquiry turns up no such link.
  • In 2011, Daniel Nijensohn, an Argentine-born neurosurgeon at Yale, examined X-rays of Eva Perón and concluded that she underwent a lobotomy for the treatment of pain and anxiety in the last months of her life.

Literary and cinematic portrayals

Lobotomies have been featured in several literary and cinematic presentations that both reflected society's attitude toward the procedure and, at times, changed it. Writers and film-makers have played a pivotal role in turning public sentiment against the procedure.

  • Robert Penn Warren's 1946 novel All the King's Men describes a lobotomy as making "a Comanche brave look like a tyro with a scalping knife", and portrays the surgeon as a repressed man who cannot change others with love, so he instead resorts to "high-grade carpentry work".
  • Tennessee Williams criticized lobotomy in his play Suddenly, Last Summer (1958) because it was sometimes inflicted on homosexuals – to render them "morally sane". In the play, a wealthy matriarch offers the local mental hospital a substantial donation if the hospital will give her niece a lobotomy, which she hopes will stop the niece's shocking revelations about the matriarch's son. Warned that a lobotomy might not stop her niece's "babbling", she responds, "That may be, maybe not, but after the operation, who would believe her, Doctor?".
  • In Ken Kesey's 1962 novel One Flew Over the Cuckoo's Nest and its 1975 film adaptation, lobotomy is described as "frontal-lobe castration", a form of punishment and control after which "There's nothin' in the face. Just like one of those store dummies." In one patient, "You can see by his eyes how they burned him out over there; his eyes are all smoked up and gray and deserted inside."
  • In Sylvia Plath's 1963 novel The Bell Jar, the protagonist reacts with horror to the "perpetual marble calm" of a lobotomized young woman.
  • Elliott Baker's 1964 novel and 1966 film version, A Fine Madness, portrays the dehumanizing lobotomy of a womanizing, quarrelsome poet who, afterward, is just as aggressive as ever. The surgeon is depicted as an inhumane crackpot.
  • The 1982 biopic film Frances depicts actress Frances Farmer (the subject of the film) undergoing transorbital lobotomy (though the idea that a lobotomy was performed on Farmer, and that Freeman performed it, has been criticized as having little or no factual foundation).
  • The 2018 film The Mountain centers on lobotomization, its cultural significance in the context of 1950s America, and mid-century attitudes surrounding mental health in general. The film interrogates the ethical and social implications of the practice through the experiences of its protagonist, a young man whose late mother had been lobotomized. The protagonist takes a job as a medical photographer for the fictional Dr. Wallace Fiennes, portrayed by Jeff Goldblum. Fiennes is loosely based on Freeman.

Behavior

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Behavior

Behavior (American English) or behaviour (British English) is the range of actions and mannerisms made by individuals, organisms, systems or artificial entities in some environment. These systems can include other systems or organisms as well as the inanimate physical environment. It is the computed response of the system or organism to various stimuli or inputs, whether internal or external, conscious or subconscious, overt or covert, and voluntary or involuntary.

Taking a behavior informatics perspective, a behavior consists of actor, operation, interactions, and their properties. This can be represented as a behavior vector.
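
As a rough sketch of that representation (written in Python, with invented field names and an invented encoding; it is not taken from any particular behavior-informatics library), a single behavior can be stored as an actor, an operation, its interactions and their properties, and then encoded as a simple vector over a fixed vocabulary:

    # Minimal, illustrative representation of a behavior record as described above.
    from dataclasses import dataclass, field
    from typing import Any

    @dataclass
    class Behavior:
        actor: str                                               # who or what performs the behavior
        operation: str                                           # the action performed
        interactions: list[str] = field(default_factory=list)    # other entities involved
        properties: dict[str, Any] = field(default_factory=dict) # e.g. time, place, amount

    def behavior_vector(b: Behavior, vocabulary: list[str]) -> list[int]:
        # One-hot style encoding over a fixed vocabulary; only one of many
        # possible ways a "behavior vector" could be constructed.
        tokens = {b.operation, *b.interactions}
        return [1 if term in tokens else 0 for term in vocabulary]

    # Example usage with hypothetical data:
    purchase = Behavior(actor="customer_42", operation="purchase",
                        interactions=["online_store"], properties={"amount": 19.99})
    print(behavior_vector(purchase, ["browse", "purchase", "return", "online_store"]))
    # -> [0, 1, 0, 1]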

Models

Biology

Although disagreement exists as to how to precisely define behavior in a biological context, one common interpretation based on a meta-analysis of scientific literature states that "behavior is the internally coordinated responses (actions or inactions) of whole living organisms (individuals or groups) to internal or external stimuli".

A broader definition of behavior, applicable to plants and other organisms, is similar to the concept of phenotypic plasticity. It describes behavior as a response to an event or environment change during the course of the lifetime of an individual, differing from other physiological or biochemical changes that occur more rapidly, and excluding changes that are a result of development (ontogeny).

Behaviors can be either innate or learned from the environment.

Behavior can be regarded as any action of an organism that changes its relationship to its environment. Behavior provides outputs from the organism to the environment.

Human behavior

The endocrine system and the nervous system likely influence human behavior. Complexity in the behavior of an organism may be correlated to the complexity of its nervous system. Generally, organisms with more complex nervous systems have a greater capacity to learn new responses and thus adjust their behavior.

Animal behavior

Ethology is the scientific and objective study of animal behavior, usually with a focus on behavior under natural conditions, and viewing behavior as an evolutionarily adaptive trait. Behaviorism is a term that also describes the scientific and objective study of animal behavior, usually referring to measured responses to stimuli or trained behavioral responses in a laboratory context, without a particular emphasis on evolutionary adaptivity.

Consumer behavior


Consumer behavior involves the processes consumers go through and the reactions they have towards products or services. It concerns consumption and the processes consumers follow when purchasing and consuming goods and services. Consumers recognise needs or wants and go through a process to satisfy them. Consumer behavior is the process they go through as customers, which includes the types of products purchased, the amount spent, the frequency of purchases, and what influences them to make or not make a purchase decision.

Circumstances that influence consumer behavior are varied, with contributions from both internal and external factors. Internal factors include attitudes, needs, motives, preferences and perceptual processes, whilst external factors include marketing activities, social and economic factors, and cultural aspects. Dr. Lars Perner of the University of Southern California claims that there are also physical factors that influence consumer behavior; for example, if a consumer is hungry, this physical feeling of hunger will influence them to purchase a sandwich to satisfy it.

Consumer decision making

There is a model described by Lars Perner which illustrates the decision-making process with regard to consumer behavior. It begins with recognition of a problem: the consumer recognises a need or want which has not been satisfied. This leads the consumer to search for information. If it is a low-involvement product, the search will be internal, identifying alternatives purely from memory; if it is a high-involvement product, the search will be more thorough, such as reading reviews or reports or asking friends.

The consumer will then evaluate the alternatives, comparing price and quality, making trade-offs between products, and narrowing down the choice by eliminating the less appealing products until one is left. After this has been identified, the consumer will purchase the product.

Finally, the consumer will evaluate the purchase decision and the purchased product, bringing in factors such as value for money, quality of goods and the purchase experience. However, this logical process does not always happen this way; people are emotional and irrational creatures. According to the psychologist Robert Cialdini, people make decisions with emotion and then justify them with logic.
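
As an illustration only, the following Python sketch walks through the five stages just described: problem recognition, information search, evaluation of alternatives, purchase, and post-purchase evaluation. The products, scoring rule and numbers are invented for the example and are not part of Perner's model:

    def choose_product(need: str, candidates: list[dict], high_involvement: bool) -> dict:
        # 1. Problem recognition: the consumer has an unmet need (`need`).
        # 2. Information search: internal (memory) for low-involvement products,
        #    broader and external (reviews, friends) for high-involvement ones.
        considered = candidates if high_involvement else candidates[:2]

        # 3. Evaluation of alternatives: trade price off against quality and
        #    keep the most appealing option.
        def score(p: dict) -> float:
            return p["quality"] - 0.5 * p["price"]

        # 4. Purchase the alternative that survives the evaluation.
        return max(considered, key=score)

    def evaluate_purchase(product: dict, experienced_quality: float) -> str:
        # 5. Post-purchase evaluation: compare the experience with expectations.
        return "satisfied" if experienced_quality >= product["quality"] else "dissatisfied"

    # Hypothetical example:
    laptops = [
        {"name": "A", "price": 6.0, "quality": 8.5},
        {"name": "B", "price": 4.0, "quality": 7.0},
        {"name": "C", "price": 9.0, "quality": 9.5},
    ]
    chosen = choose_product("need a laptop", laptops, high_involvement=True)
    print(chosen["name"], evaluate_purchase(chosen, experienced_quality=9.0))
    # -> A satisfied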

How the 4 Ps influence consumer behavior

The marketing mix (the 4 Ps) is a marketing tool and stands for Price, Promotion, Product and Placement.

Because consumer behavior is influenced greatly by business-to-consumer marketing, the 4 Ps will have an effect on consumers' behavior. The price of a good or service is largely determined by the market, as businesses will set their prices to be similar to those of other businesses so as to remain competitive whilst making a profit. When market prices for a product are high, consumers will purchase less and use purchased goods for longer periods of time, meaning they purchase the product less often. Alternatively, when market prices for a product are low, consumers are more likely to purchase more of the product, and more often.

The way that promotion influences consumer behavior has changed over time. In the past, large promotional campaigns and heavy advertising would convert into sales for a business, but nowadays businesses can have success with products that receive little or no advertising. This is due to the Internet, and in particular social media. Such businesses rely on word of mouth from consumers using social media, and as products trend online, sales increase because the products effectively promote themselves. Thus, promotion by businesses does not necessarily result in consumer behavior trending towards purchasing products.

The way that product influences consumer behavior is through consumer willingness to pay and consumer preferences. Even if a company has a long history of products in the market, consumers will still pick a cheaper product over the company in question's product if it means they pay less for something very similar; this is due to consumer willingness to pay, or their willingness to part with the money they have earned. Product also influences consumer behavior through customer preferences. Take Pepsi vs Coca-Cola, for example: a Pepsi drinker is less likely to purchase Coca-Cola, even if it is cheaper and more convenient. This is due to the preference of the consumer, and no matter how hard the opposing company tries, it will not be able to force the customer to change their mind.

Product placement in the modern era has little influence on consumer behavior, due to the availability of goods online. If a customer can purchase a good from the comfort of their home instead of purchasing in-store, then the placement of products is not going to influence their purchase decision.

In management

Behavior outside of psychology includes:

Organizational

In management, behaviors are associated with desired or undesired focuses. Managers generally note what the desired outcome is, but behavioral patterns can take over. These patterns indicate how often the desired behavior actually occurs. Before a behavior occurs, antecedents focus on the stimuli that influence the behavior that is about to happen. After the behavior occurs, consequences follow; these consist of rewards or punishments.

Social behavior

Social behavior is behavior among two or more organisms within the same species, and encompasses any behavior in which one member affects the other. This is due to an interaction among those members. Social behavior can be seen as similar to an exchange of goods, with the expectation that when one gives, one will receive the same. This behavior can be affected by both the qualities of the individual and the environmental (situational) factors. Therefore, social behavior arises as a result of an interaction between the two—the organism and its environment. This means that, in regards to humans, social behavior can be determined by both the individual characteristics of the person, and the situation they are in.

Behavior informatics

Behavior informatics, also called behavior computing, explores behavior intelligence and behavior insights from the informatics and computing perspectives.

Different from applied behavior analysis from the psychological perspective, BI builds computational theories, systems and tools to qualitatively and quantitatively model, represent, analyze, and manage behaviors of individuals, groups and/or organizations.

Health

Health behavior refers to a person's beliefs and actions regarding their health and well-being. Health behaviors are direct factors in maintaining a healthy lifestyle. Health behaviors are influenced by the social, cultural, and physical environments in which we live. They are shaped by individual choices and external constraints. Positive behaviors help promote health and prevent disease, while the opposite is true for risk behaviors. Health behaviors are early indicators of population health. Because of the time lag that often occurs between certain behaviors and the development of disease, these indicators may foreshadow the future burdens and benefits of health-risk and health-promoting behaviors.

Correlates

A variety of studies have examined the relationship between health behaviors and health outcomes (e.g., Blaxter 1990) and have demonstrated their role in both morbidity and mortality.

These studies have identified seven features of lifestyle which were associated with lower morbidity and higher subsequent long-term survival (Belloc and Breslow 1972):

  • Avoiding snacks
  • Eating breakfast regularly
  • Exercising regularly
  • Maintaining a desirable body weight
  • Moderate alcohol intake
  • Not smoking
  • Sleeping 7–8h per night

Health behaviors impact upon individuals' quality of life by delaying the onset of chronic disease and extending active lifespan. Smoking, alcohol consumption, diet, gaps in primary care services and low screening uptake are all significant determinants of poor health, and changing such behaviors should lead to improved health. For example, in the US, Healthy People 2000, published by the United States Department of Health and Human Services, lists increased physical activity, changes in nutrition and reductions in tobacco, alcohol and drug use as important for health promotion and disease prevention.

Treatment approach

Any interventions are matched to the needs of each individual in an ethical and respectful manner. The health belief model encourages increasing individuals' perceived susceptibility to negative health outcomes and making individuals aware of the severity of such outcomes, for example through health promotion messages. In addition, the health belief model suggests focusing on the benefits of health behaviors and on the fact that barriers to action can be overcome. The theory of planned behavior suggests using persuasive messages to tackle behavioral beliefs in order to increase readiness to perform a behavior, called intention. The theory of planned behavior also advocates tackling normative beliefs and control beliefs in any attempt to change behavior. Challenging normative beliefs alone is not enough; following through on the intention with self-efficacy, built from an individual's mastery of problem solving and task completion, is important for bringing about positive change. Self-efficacy is often cemented through standard persuasive techniques.
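The theory of planned behavior is commonly operationalized as a weighted combination of attitude toward the behavior, subjective norm, and perceived behavioral control predicting intention. A minimal sketch of that kind of formulation follows; the weights and scores are invented for illustration and are not estimates from any study.

    def behavioral_intention(attitude: float,
                             subjective_norm: float,
                             perceived_control: float,
                             weights=(0.4, 0.3, 0.3)) -> float:
        """Weighted-sum form often used to operationalize the theory of planned behavior.

        attitude          -- evaluation of the behavior (behavioral beliefs)
        subjective_norm   -- perceived social pressure (normative beliefs)
        perceived_control -- perceived ease or difficulty (control beliefs, akin to self-efficacy)
        The weights are illustrative; in practice they are estimated empirically (e.g. by regression).
        """
        w1, w2, w3 = weights
        return w1 * attitude + w2 * subjective_norm + w3 * perceived_control

    # Example: scores on a 1-7 scale after a persuasive health-promotion message.
    print(behavioral_intention(attitude=6.0, subjective_norm=5.0, perceived_control=4.0))  # 5.1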

Cognitive revolution

From Wikipedia, the free encyclopedia

The cognitive revolution was an intellectual movement that began in the 1950s as an interdisciplinary study of the mind and its processes. It later became known collectively as cognitive science. The relevant areas of interchange were between the fields of psychology, linguistics, computer science, anthropology, neuroscience, and philosophy. The approaches used were developed within the then-nascent fields of artificial intelligence, computer science, and neuroscience. In the 1960s, the Harvard Center for Cognitive Studies and the Center for Human Information Processing at the University of California, San Diego were influential in developing the academic study of cognitive science. By the early 1970s, the cognitive movement had surpassed behaviorism as a psychological paradigm. Furthermore, by the early 1980s the cognitive approach had become the dominant line of research inquiry across most branches in the field of psychology.

A key goal of early cognitive psychology was to apply the scientific method to the study of human cognition. Some of the main ideas and developments from the cognitive revolution were the use of the scientific method in cognitive science research, the necessity of mental systems to process sensory input, the innateness of these systems, and the modularity of the mind. Important publications in triggering the cognitive revolution include psychologist George Miller's 1956 article "The Magical Number Seven, Plus or Minus Two" (one of the most frequently cited papers in psychology), linguist Noam Chomsky's Syntactic Structures (1957) and "Review of B. F. Skinner's Verbal Behavior" (1959), and foundational works in the field of artificial intelligence by John McCarthy, Marvin Minsky, Allen Newell, and Herbert Simon, such as the 1958 article "Elements of a Theory of Human Problem Solving". Ulric Neisser's 1967 book Cognitive Psychology was also a landmark contribution.

Historical background

Prior to the cognitive revolution, behaviorism was the dominant trend in psychology in the United States. Behaviorists were interested in "learning," which was seen as "the novel association of stimuli with responses." Animal experiments played a significant role in behaviorist research, and prominent behaviorist J. B. Watson, interested in describing the responses of humans and animals as one group, stated that there was no need to distinguish between the two. Watson hoped to learn to predict and control behavior through his research. The popular Hull-Spence stimulus-response approach was, according to George Mandler, impossible to use to research topics that held the interest of cognitive scientists, like memory and thought, because both the stimulus and the response were thought of as completely physical events. Behaviorists typically did not research these subjects. B. F. Skinner, a functionalist behaviorist, criticized certain mental concepts like instinct as "explanatory fiction(s)," ideas that assume more than humans actually know about a mental concept. Various types of behaviorists had different views on the exact role (if any) that consciousness and cognition played in behavior. Although behaviorism was popular in the United States, Europe was not particularly influenced by it, and research on cognition could easily be found in Europe during this time.

Noam Chomsky has framed the cognitive and behaviorist positions as rationalist and empiricist, respectively, which are philosophical positions that arose long before behaviorism became popular and the cognitive revolution occurred. Empiricists believe that humans acquire knowledge only through sensory input, while rationalists believe that there is something beyond sensory experience that contributes to human knowledge. However, whether Chomsky's position on language fits into the traditional rationalist approach has been questioned by philosopher John Cottingham.

George Miller, one of the scientists involved in the cognitive revolution, sets the date of its beginning as September 11, 1956, when several researchers from fields like experimental psychology, computer science, and theoretical linguistics presented their work on cognitive science-related topics at a meeting of the ‘Special Interest Group in Information Theory’ at the Massachusetts Institute of Technology. This interdisciplinary cooperation went by several names like cognitive studies and information-processing psychology but eventually came to be known as cognitive science. Grants from the Alfred P. Sloan Foundation in the 1970s advanced interdisciplinary understanding in the relevant fields and supported the research that led to the field of cognitive neuroscience.

Main ideas

George Miller states that six fields participated in the development of cognitive science: psychology, linguistics, computer science, anthropology, neuroscience, and philosophy, with the first three playing the main roles.

The Scientific Method

A key goal of early cognitive psychology was to apply the scientific method to the study of human cognition. This was done by designing experiments that used computational models of artificial intelligence to systematically test theories about human mental processes in a controlled laboratory setting.

Mediation and information processing

When defining the "Cognitive Approach," Ulric Neisser says that humans can only interact with the "real world" through intermediary systems that process information like sensory input. As understood by a cognitive scientist, the study of cognition is the study of these systems and the ways they process information from the input. The processing includes not just the initial structuring and interpretation of the input but also the storage and later use.

Steven Pinker claims that the cognitive revolution bridged the gap between the physical world and the world of ideas, concepts, meanings and intentions. It unified the two worlds with a theory that mental life can be explained in terms of information, computation and feedback.

Innateness

In his 1975 book Reflections on Language, Noam Chomsky questions how humans can know so much, despite relatively limited input. He argues that they must have some kind of innate learning mechanism that processes input, and that this mechanism must be domain-specific. Chomsky observes that physical organs do not develop based on their experience, but based on some inherent genetic coding, and writes that the mind should be treated the same way. He says that there is no question that there is some kind of innate structure in the mind, but it is less agreed upon whether the same structure is used by all organisms for different types of learning. He compares humans to rats in the task of maze running to show that the same learning theory cannot be used for different species: if it could, they would be equally good at what they are learning, which is not the case. He also says that even within humans, using the same learning theory for multiple types of learning could be possible, but there is no solid evidence to suggest it. He proposes a hypothesis that there is a biologically based language faculty that organizes the linguistic information in the input and constrains human language to a set of particular types of grammars. He introduces universal grammar, a set of inherent rules and principles that all humans have to govern language, and says that the components of universal grammar are biological. To support this, he points out that children seem to know that language has a hierarchical structure, and they never make mistakes that one would expect from a hypothesis that language is linear.

Steven Pinker has also written on this subject from the perspective of modern-day cognitive science. He says that modern cognitive scientists, like figures in the past such as Gottfried Wilhelm Leibniz (1646–1716), do not believe that the mind starts as a "blank slate." Though they dispute how much nature and nurture each contribute, they all believe that learning is based on something innate to humans; without this innateness, there would be no learning process. He points out that human actions are open-ended, even though basic biological functions are finite. An example of this from linguistics is the fact that humans can produce infinite sentences, most of which are brand new to the speaker themselves, even though the words and phrases they have heard are not infinite.

Pinker, who agrees with Chomsky's idea of innate universal grammar, claims that although humans speak around six thousand mutually unintelligible languages, the grammatical programs in their minds differ far less than the actual speech. Many different languages can be used to convey the same concepts or ideas, which suggests there may be a common ground for all the languages.

Modularity of the mind

Pinker claims another important idea from the cognitive revolution was that the mind is modular, with many parts cooperating to generate a train of thought or an organized action. It has different distinct systems for different specific missions. Behaviors can vary across cultures, but the mental programs that generate the behaviors don't need to be varied.

Criticism

There have been criticisms of the typical characterization of the shift from behaviorism to cognitivism.

Henry L. Roediger III argues that the common narrative most people believe about the cognitive revolution is inaccurate. The narrative he describes states that psychology started out well but lost its way and fell into behaviorism, but this was corrected by the Cognitive Revolution, which essentially put an end to behaviorism. He claims that behavior analysis is actually still an active area of research that produces successful results in psychology and points to the Association for Behavior Analysis International as evidence. He claims that behaviorist research is responsible for successful treatments of autism, stuttering, and aphasia, and that most psychologists actually study observable behavior, even if they interpret their results cognitively. He believes that the change from behaviorism to cognitivism was gradual, slowly evolving by building on behaviorism.

Lachman and Butterfield were among the first to imply that cognitive psychology has a revolutionary origin. Thomas H. Leahey has criticized the idea that the introduction of behaviorism and the cognitive revolution were actually revolutions and proposed an alternative history of American psychology as "a narrative of research traditions."

Other authors criticize behaviorism, but they also criticize the cognitive revolution for having adopted new forms of anti-mentalism.

Cognitive psychologist Jerome Bruner criticized the adoption of the computational theory of mind and the exclusion of meaning from cognitive science, and he characterized one of the primary objects of the cognitive revolution as changing the study of psychology so that meaning was its core.

His understanding of the cognitive revolution revolves entirely around "meaning-making" and the hermeneutic description of how people go about this. He believes that the cognitive revolution steered psychology away from behaviorism and this was good, but then another form of anti-mentalism took its place: computationalism. Bruner states that the cognitive revolution should replace behaviorism rather than only modify it.

Neuroscientist Gerald Edelman argues in his book Bright Air, Brilliant Fire (1991) that a positive result of the emergence of "cognitive science" was the departure from "simplistic behaviorism". However, he adds, a negative result was the growing popularity of a total misconception of the nature of thought: the computational theory of mind, or cognitivism, which asserts that the brain is a computer that processes symbols whose meanings are entities of the objective world. In this view, the symbols of the mind correspond exactly to entities or categories in the world defined by criteria of necessary and sufficient conditions, that is, classical categories. The representations would be manipulated according to certain rules that constitute a syntax.

Edelman rejects the idea that objects of the world come in classical categories, and also rejects the idea that the brain/mind is a computer. The author rejects behaviorism (a point he also makes in his 2006 book Second Nature: Brain Science and Human Knowledge), but also cognitivism (the computational-representational theory of the mind), since the latter conceptualizes the mind as a computer and meaning as objective correspondence. Furthermore, Edelman criticizes "functionalism", the idea that the formal and abstract functional properties of the mind can be analyzed without making direct reference to the brain and its processes.

Edelman asserts that most of those who work in the field of cognitive psychology and cognitive science seem to adhere to this computational view, but he mentions some important exceptions, including John Searle, Jerome Bruner, George Lakoff, Ronald Langacker, Alan Gauld, Benny Shanon, and Claes von Hofsten. Edelman aligns himself with the critical and dissenting approaches of these authors.

Perceptual symbols, imagery and the cognitive neuroscience revolution

In their paper “The cognitive neuroscience revolution”, Gualtiero Piccinini and Worth Boone argue that cognitive neuroscience emerged as a discipline in the late 1980s. Prior to that time, cognitive science and neuroscience had largely developed in isolation. Cognitive science developed between the 1950s and 1970s as an interdisciplinary field composed primarily of aspects of psychology, linguistics, and computer science. However, both classical symbolic computational theories and connectionist models developed largely independently of biological considerations. The authors argue that connectionist models were closer to symbolic models than to neurobiology.

Piccinini and Boone state that a revolutionary change is currently taking place: the move from cognitive science (autonomous from neuroscience) to cognitive neuroscience. The authors point out that many researchers who previously carried out psychological and behavioral studies now give properly cognitive neuroscientific explanations. They mention the example of Stephen Kosslyn, who postulated his theory of the pictorial format of mental images in the 1980s based on behavioral studies. Later, with the advent of magnetic resonance imaging technology, Kosslyn was able to show that when people imagine, the visual cortex is activated. This lent strong neuroscientific evidence to his theory of the pictorial format, refuting speculations about a supposed non-pictorial format of mental images.

According to Canales Johnson et al. (2021):

Many studies using imaging and neurophysiological techniques have shown several similarities in brain activity between visual imagery and visual perception, and have identified frontoparietal, occipital and temporal neural components of visual imagery.

— Canales Johnson et al.

Neuroscientist Joseph LeDoux, in his book The Emotional Brain, argues that cognitive science emerged around the middle of the 20th century and is often described as "the new science of the mind." In fact, however, cognitive science is a science of only one part of the mind, the part that has to do with thinking, reasoning, and intellect; it leaves emotions out. "And minds without emotions are not really minds at all…"

Psychologist Lawrence Barsalou argues that human cognitive processing involves the simulation of perceptual, motor, and emotional states. The classical and "intellectualist" view of cognition considers that it is essentially the processing of propositional information of a verbal or numerical type. Barsalou's theory, by contrast, explains human conceptual processing by the activation of regions of the sensory cortices of different modalities, as well as of the motor cortex, and by the simulation of embodied experiences (visual, auditory, emotional, motor) that ground meaning in experience situated in the world.

Modal symbols are those analogical mental representations linked to a specific sensory channel: for example, the representation of 'dog' through a visual image similar to a dog or through an auditory image of the barking of dogs, based on the memory of the experiences of seeing a dog or hearing its barking. Lawrence Barsalou's 'perceptual symbols' theory asserts that mental processes operate with modal symbols that maintain the sensory properties of perceptual experiences.

According to Barsalou (2020), the "grounded cognition" perspective in which his theory is framed asserts that cognition emerges from the interaction between amodal symbols, modal symbols, the body, and the world. Therefore, this perspective does not rule out "classical" symbols (amodal ones, such as those typical of verbal language or numerical reasoning) but rather considers that these interact with imagination, perception, and action situated in the world.

God Is Not Great

From Wikipedia, the free encyclopedia
 
Cover of the U.S. hard-cover edition
Author: Christopher Hitchens
Country: United States
Language: English
Subject: Criticism of religion
Publisher: Twelve Books, an imprint of Hachette Book Group USA
Publication date: May 1, 2007
Media type: Print (hardcover and paperback) and audiobook
Pages: 307
ISBN: 978-0-446-57980-3
OCLC: 70630426
Dewey Decimal: 200 22
LC Class: BL2775.3 .H58 2007

God Is Not Great (sometimes stylized as god is not Great) is a 2007 book by British-American author and journalist Christopher Hitchens, in which he makes a case against organized religion. It was originally published in the United Kingdom by Atlantic Books as God Is Not Great: The Case Against Religion and in the United States by Twelve as God Is Not Great: How Religion Poisons Everything, but was republished by Atlantic Books in 2017 with no subtitle.

Hitchens posited that organized religion is "violent, irrational, intolerant, allied to racism, tribalism, and bigotry, invested in ignorance and hostile to free inquiry, contemptuous of women and coercive toward children" and sectarian, and that accordingly it "ought to have a great deal on its conscience". He supports his position with a mixture of personal stories, documented historical anecdotes and critical analysis of religious texts. His commentary focuses mainly on the Abrahamic religions, although it also touches on other religions, such as Eastern religions. The book received mixed reviews and sold well.

Summary

Chapter One: Putting It Mildly

Hitchens writes that, at the age of nine, he began to question the teachings of his Bible instructor, and began to see critical flaws in apologetic arguments, most notably the argument from design. He discusses people who become atheists, describing some as people who have never believed, and others as those who have separately discarded religious traditions. He asserts that atheists who disagree with each other will eventually side together on whatever the evidence most strongly supports. He discusses why human beings have a tendency towards being "faithful" and argues that religion will remain entrenched in the human consciousness as long as human beings cannot overcome their primitive fears, particularly that of their own mortality. He concludes by saying that he would not want to eradicate religion if the faithful would "leave him alone", but ultimately they are incapable of this.

Chapter Two: Religion Kills

Hitchens lays out his central thesis for this chapter: religion is not content with claims about the next world and must seek to interfere with the lives of nonbelievers.

In this vein, Hitchens addresses a hypothetical question that he was asked while on a panel with radio host Dennis Prager: if he were alone in an unfamiliar city at night, and a group of strangers began to approach him, would he feel safer, or less safe, knowing that these men had just come from a prayer meeting? Hitchens answers,

Just to stay within the letter 'B', I have actually had that experience in Belfast, Beirut, Bombay, Belgrade, Bethlehem and Baghdad. In each case ... I would feel immediately threatened if I thought that the group of men approaching me in the dusk were coming from a religious observance.

He gives detailed descriptions of the tense social and political situations within these cities, which he personally experienced and attributes to religion. He has thus "not found it a prudent rule to seek help as the prayer meeting breaks up".

He discusses the 1989 fatwa issued on author and friend Salman Rushdie by the Ayatollah Khomeini because of the contents of Rushdie's book The Satanic Verses. He criticises several public figures for laying the blame for the incident on Rushdie himself. He also writes about the events following the September 11 attacks, describing how religion, particularly major religious figures, allowed matters to "deteriorate in the interval between the removal of the Taliban and the overthrow of Saddam Hussein".

Chapter Three: A Short Digression On The Pig; or, Why Heaven Hates Ham

Hitchens discusses the prohibition on eating pigs ("porcophobia" as Hitchens calls it) in Judaism, also adopted by Islam. He says that this proscription is not just Biblical or dietary. He reports that even today, Muslim zealots demand that the Three Little Pigs, Miss Piggy, Piglet from Winnie-the-Pooh and other traditional pets and characters be "removed from the innocent gaze of their children". Hitchens suggests that the pork prohibition found in Semitic religions may be based in the proscription of human sacrifice, extended to pigs because of the similarities in appearance and flavor between pork and human flesh.

Chapter Four: A Note On Health, To Which Religion May Be Hazardous

Hitchens explains how some religions can be hostile to disease treatment. He writes that many Muslims saw the polio vaccine as a conspiracy, and thus allowed polio to spread. He discusses the Catholic Church's response to the spread of HIV in Africa, telling people that condoms are ineffective, which, he argues, contributed to the death toll. He notes with examples that some in both the Catholic and the Muslim communities believe irrationally that HIV and HPV are punishment for sexual sin—particularly homosexuality. He describes religious leaders as "faith healers", and opines that they are hostile to medicine because it undermines their position of power.

He criticises the Jewish ritual of circumcision that would have him "take a baby boy's penis in my hand, cut around the prepuce, and complete the action by taking his penis in my mouth, sucking off the foreskin, and spitting out the amputated flap along with a mouthful of blood and saliva", and denounces the traditional African practice of female genital mutilation. He concludes the chapter writing of the religious "wish for obliteration"—for a death in the form of the day of the Apocalypse.

Chapter Five: The Metaphysical Claims of Religion Are False

Hitchens begins by saying that the strong faith that could stand up to any form of reason is long gone. He compares the popular knowledge of the world in Thomas Aquinas's time to what we now know about the world. He uses the example of Laplace—"It works well enough without that [God] hypothesis"—to demonstrate that we do not need God to explain things; he claims that religion becomes obsolete as an explanation when it becomes optional or one among many different beliefs. He concludes that the leap of faith is not just one leap; it is a leap repeatedly made, and a leap that becomes more difficult to take the more it is taken: which is why so many religionists now feel the need to move beyond mere faith and cite evidence for their beliefs.

Chapter Six: Arguments From Design

Hitchens says that Abrahamic religions are used to making people feel like lowly sinners, encouraging low self-esteem, while at the same time leading them to believe that their creator genuinely cares for them, thus inflating their sense of self-importance. He says that superstition to some extent has a "natural advantage", being that it was contrived many centuries before the modern age of human reason and scientific understanding, and discusses a few examples as well as so-called miracles.

He discusses the design arguments, using examples such as the human body wearing out in old age as bad design. He writes that if evolution had taken a slightly different course, there would be no guarantee at all that organisms remotely like humans would ever have existed.

Chapter Seven: The Nightmare Of The Old Testament

Hitchens lists anachronisms and inconsistencies in the Old Testament, stating that many of the "gruesome, disordered events ... never took place". He says the Pentateuch is "an ill-carpentered fiction, bolted into place well after the non-events that it fails to describe convincingly or even plausibly". He points out that when Moses orders parents to have their children stoned to death (see also List of capital crimes in the Torah) for indiscipline (citing Deuteronomy) it is probably a violation of at least one of the very commandments which Moses received from God. He notes that Moses "continually makes demented pronouncements ('He that is wounded in the stones, or hath his privy member cut off, shall not enter into the congregation of the Lord')."

Chapter Eight: The "New" Testament Exceeds The Evil Of The "Old" One

On the subject of a mythical Jesus and the possibility of a historical Jesus in the Gospels, a number of sources on the Internet attribute the controversial quote "Jesus is Santa Claus for adults" to Hitchens and God Is Not Great, but those words do not appear in this chapter or this book. Hitchens does argue that the "multiple authors—none of whom published anything until many decades after the Crucifixion—cannot agree on anything of importance", "the gospels are most certainly not literal truth", and there is "little or no evidence for the life of Jesus". To Hitchens, the best argument for the "highly questionable existence of Jesus", however, is biblical inconsistency, explaining the "very attempts to bend and stretch the story may be inverse proof that someone of later significance was indeed born".

Hitchens first connects the Book of Isaiah in the Old Testament with its prediction that "a virgin shall conceive, and bear a son" (see Isaiah 7:14), pointing out where the stories converge, Old Testament to New. Comparing the Testaments, he considers the New Testament "also a work of crude carpentry, hammered together long after its purported events, and full of improvised attempts to make things come out right". He points out that, while H. L. Mencken considered some of the New Testament events to be historically verifiable, Mencken maintained that "most of them ... show unmistakable signs of having been tampered with".

Hitchens also outlines the inaccuracy in Luke's attempt to triangulate three world events of the time with Jesus's birth: the census ordered by Augustus of the entire Roman world, the reign of King Herod in Judea and that of Quirinius as governor of Syria (see the Census of Quirinius). He says that there is no record by any Roman historian of any Augustan census, and that, although "the Jewish chronicler Josephus mentions one that did occur—without the onerous requirement for people to return to their places of birth", it was undertaken "six years after the birth of Jesus is supposed to have taken place". He also notes that Herod died in 4 BC, and that Quirinius was not governor of Syria during his tenure.

Hitchens refers to The Passion of the Christ as "a soap-opera film about the death of Jesus ... produced by an Australian fascist and ham actor named Mel Gibson", who "adheres to a crackpot and schismatic Catholic sect". In Hitchens's view, the film attempts tirelessly to blame the death of Jesus on the Jews. He claims that Gibson did not realize that the four Gospels were not at all historical records, and that they had multiple authors, all being written many decades after the crucifixion—and, moreover, that they do not agree on anything "of importance" (e.g., the virgin birth and the genealogy of Jesus). He cites many contradictions of this type.

He further contends that the many "contradictions and illiteracies" of the New Testament, while extensively covered by other authors, have never been explained except as "metaphor" and "a Christ of faith". He states that the "feebleness" of the Bible is a result of the fact that until recently, Christians faced with arguments against the logic or factualness of the Bible "could simply burn or silence anybody who asked any inconvenient questions".

Hitchens points out the problematic implications of the scriptural proclamation "he that is without sin among you, let him cast a first stone" with regard to the practical legislation of retributive justice: "if only the non-sinners have the right to punish, then how could an imperfect society ever determine how to prosecute offenders?" Of the adulterous woman whom Jesus saved from stoning, the author contends that Jesus thus forgives her of sheer sexual promiscuity, and, if this be the case, that the lesson has ever since been completely misunderstood. Closing the chapter, he suggests that advocates of religion have faith alone to rely on—nothing else—and calls on them to "be brave enough" to admit it.

Chapter Nine: The Koran Is Borrowed From Both Jewish and Christian Myths

Chapter nine assesses the religion of Islam, and examines the origin of its holy book, the Quran. Hitchens asserts that there is no evidence for any of the "miraculous" claims about Muhammad, and that the Koran's origin was not supernatural. He contends that the religion was fabricated by Muhammad or his followers and that it was borrowed from other religious texts, and the hadith was taken from common maxims and sayings which developed throughout Arabia and Persia at the time. He identifies similarities between Islam and Christianity, and notes several plagiarisms of the Jewish faith.

Chapter Ten: The Tawdriness Of The Miraculous And The Decline Of Hell

Chapter ten discusses miracles. Hitchens says that no supernatural miracles occur, nor have occurred in history. He says that evidence of miracles is fabricated, or based on the unreliable testimony of people who are mistaken or biased. He notes that no verifiable miracle has been documented since cameras have become commonplace. Hitchens uses a specific purported miracle by Mother Teresa to show how miracles can become perceived as true, when in fact they are based on myth or falsehood.

Chapter Eleven: Religion's Corrupt Beginnings

Chapter eleven discusses how religions form, and claims that most religions are founded by corrupt, immoral individuals. The chapter specifically discusses cargo cults, Pentecostal minister Marjoe Gortner, and Mormonism. Hitchens discusses Joseph Smith, the founder of Mormonism, citing a March 1826 Bainbridge, New York court examination accusing him of being a "disorderly person and impostor" who Hitchens claims admitted there that he had supernatural powers and was "defrauding citizens". Four years later Smith claimed to obtain gold tablets containing the Book of Mormon. When the neighbor's skeptical wife buried 116 pages of the translation and challenged Smith to reproduce it, Smith claimed God, knowing this would happen, told him to instead translate a different section of the same plates.

Chapter Twelve: A Coda: How Religions End

Chapter twelve discusses the termination of several religions, to illustrate that some religions are not everlasting, as they claim. The religions addressed include Millerism and Sabbatai Sevi.

Chapter Thirteen: Does Religion Make People Behave Better?

Hitchens addresses the question of whether religious people behave more virtuously than non-religious people (atheists, agnostics, or freethinkers). He uses the battle against slavery in the United States, and Abraham Lincoln, to support his claim that non-religious people battle for moral causes with as much vigor and effect as religious advocates.

Chapter Fourteen: There Is No "Eastern" Solution

In chapter fourteen, which focuses on maladaptive and immiserating Hindu and Buddhist feudalism and violence in Tibet and Sri Lanka, Hitchens dismisses the idea of seeking enlightenment through nirvana as a conceit that asks adherents to "put their reason to sleep, and to discard their minds along with their sandals". The chapter touches on the lucrative careers of Chandra Mohan Jain and Sathyanarayana Raju, and details the "brisk fleecing" and unstable devotees Hitchens observed during a staged pilgrimage to an ashram in Pune, undertaken as part of a BBC documentary. He suggests that the BBC no longer has a "standard of fairness". He suggests that the image of "imperial-way Buddhism" is not that of the original Gautama Buddha, and looks at the Japanese Buddhists who joined the Axis forces in World War II.

Hitchens seeks to answer the question "How might one easily prove that 'Eastern' faith was identical with the unverifiable assumptions of 'Western' religion?" He concludes:

It ought to be possible for me to pursue my studies and researches in one house, and for the Buddhist to spin his wheel in another. But contempt for the intellect has a strange way of not being passive. One of two things may happen: those who are innocently credulous may become easy prey for those who are less scrupulous and who seek to "lead" and "inspire" them. Or those whose credulity has led their own society into stagnation may seek a solution, not in true self-examination, but in blaming others for their backwardness. Both these things happened in the most consecratedly "spiritual" society of them all.

Chapter Fifteen: Religion As An Original Sin

Chapter 15 discusses five aspects of religion that Hitchens maintains are "positively immoral".

Chapter Sixteen: Is Religion Child Abuse?

Hitchens discusses how religion has been used to cause harm to children. He cites examples such as genital mutilation or circumcision, and imposition of fear of healthy sexual activities such as masturbation. He criticizes the way that adults use religion to terrorize children.

Chapter Seventeen: An Objection Anticipated

Chapter seventeen addresses the most common counter-argument that Hitchens says he hears, namely that the most immoral acts in human history were performed by atheists like Joseph Stalin. He says "it is interesting that people of faith now seek defensively to say they are no worse than fascists or Nazis or Stalinists". Hitchens begins his rebuttal by tracing the understanding of Nazism and Stalinism to the concept of totalitarianism, a term probably first used by Victor Serge and later popularized by Hannah Arendt. He distinguishes totalitarianism from despotism, the former being an absolutist system that demands total surrender of the private lives and personalities of its subjects. On this definition of totalitarianism, Hitchens finds the totalitarian principle present in many non-secular states and regimes.

He analyzes those examples of immorality, and shows that although the individual leaders may have been atheist or agnostic, that religion played a key role in these events, and religious people and religious leaders fully participated in the wars and crimes.

Chapter Eighteen: A Finer Tradition: The Resistance Of The Rational

Chapter eighteen discusses several important intellectuals, including Socrates, Albert Einstein, Voltaire, Spinoza, Thomas Paine, Charles Darwin, and Isaac Newton. Hitchens claims that many of these people were atheists, agnostics, or pantheists, except for Socrates and Newton. He says that religious advocates have attempted to misrepresent some of these icons as religious, and describes how some of these individuals fought against the negative influences of religion.

Chapter Nineteen: In Conclusion: The Need for a New Enlightenment

Hitchens argues that the human race no longer needs religion to the extent it has in the past. He says the time has come for science and reason to take a more prominent role in the life of individuals and larger cultures; that de-emphasizing religion will improve the quality of life of individuals, and assist the progress of civilization. It is in effect a rallying call to atheists to fight the theocratic encroachment on free society.

Critical reception

Positive critique

Michael Kinsley, in The New York Times Book Review, lauded Hitchens's "logical flourishes and conundrums, many of them entertaining to the nonbeliever". He concluded that "Hitchens has outfoxed the Hitchens watchers by writing a serious and deeply felt book, totally consistent with his beliefs of a lifetime".

Bruce DeSilva considered the book to be the best piece of atheist writing since Bertrand Russell's Why I Am Not a Christian (1927), with Hitchens using "elegant yet biting prose". He concludes that "Hitchens has nothing new to say, although it must be acknowledged that he says it exceptionally well".

The book was praised in Kirkus Reviews as a "pleasingly intemperate assault on organized religion" that "like-minded readers will enjoy".

In The Sydney Morning Herald, Matt Buchanan dubbed it "a thundering 300-page cannonade; a thrillingly fearless, impressively wide-ranging, thoroughly bilious and angry book against the idea of God"; Buchanan found the work to be "easily the most impressive of the present crop of atheistic and anti-theistic books: clever, broad, witty and brilliantly argued".

Jason Cowley in the Financial Times called the book "elegant but derivative".

Negative critique

David Bentley Hart, reviewing the book in the Christian journal First Things, interpreted the book as a "rollicking burlesque, without so much as a pretense of logical order or scholarly rigor". Hart says "On matters of simple historical and textual fact, moreover, Hitchens' book is so extraordinarily crowded with errors that one soon gives up counting them." Hart claims that Hitchens conflates the histories of the 1st and 4th crusades, restates the discredited assertion that the early church destroyed ancient pagan texts, and asserts that Myles Coverdale and John Wycliffe were burned alive when both men died of old age.

Stephen Prothero of The Washington Post considered Hitchens correct on many points but found the book "maddeningly dogmatic" and criticized Hitchens's condemnation of religion altogether, writing that "If this is religion, then by all means we should have less of it. But the only people who believe that religion is about believing blindly in a God who blesses and curses on demand and sees science and reason as spawns of Satan are unlettered fundamentalists and their atheistic doppelgangers."

Responding to Hitchens's claim that "all attempts to reconcile faith with science and reason are consigned to failure and ridicule", Peter Berkowitz of the Hoover Institution quotes paleontologist Stephen Jay Gould. Referencing a number of scientists with religious faith, Gould wrote, "Either half my colleagues are enormously stupid, or else the science of Darwinism is fully compatible with conventional religious beliefs—and equally compatible with atheism."

William J. Hamblin of the FARMS Review criticized Hitchens for implying unanimity among biblical scholars on controversial points and overlooking alternative scholarly positions, and felt that Hitchens's understanding of biblical studies was "flawed at best." "[F]or Hitchens, it is sufficient to dismiss the most extreme, literalistic, and inerrantist interpretations of the Bible to demonstrate not only that the Bible itself is thoroughly flawed, false, and poisonous but that God does not exist." Hamblin felt that he misrepresented the Bible "at the level of a confused undergraduate", failing to contextualise it. Hamblin concluded that the book "should certainly not be seen as reasonable grounds for rejecting belief in God".

Daniel C. Peterson attacked the accuracy of Hitchens's claims in a lengthy essay, describing it as "crammed to the bursting point with errors, and the striking thing about this is that the errors are always, always, in Hitchens’s favor. [...] In many cases, Hitchens is 180 degrees wrong. He is so far wrong that, if he moved at all, he would be coming back toward right."

Curtis White criticized the book as "intellectually shameful" due to its alleged lack of intellectual rigor. White, an atheist critic of religion, asserts that "one enormous problem with Hitchens’s book is that it reduces religion to a series of criminal anecdotes. In the process, however, virtually all of the real history of religious thought, as well as historical and textual scholarship, is simply ignored as if it never existed."

Sales history

The book was published on May 1, 2007, and within a week had reached No. 2 on the Amazon bestsellers list (behind Harry Potter and the Deathly Hallows), and reached No. 1 on the New York Times Bestseller list in its third week.
