Wednesday, November 15, 2023

Dissection

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Dissection

Dissection of a pregnant rat in a biology class

Ginkgo seed in dissection, showing embryo and gametophyte

Identifiers: MeSH D004210

Dissection (from Latin dissecare "to cut to pieces"; also called anatomization) is the dismembering of the body of a deceased animal or plant to study its anatomical structure. Autopsy is used in pathology and forensic medicine to determine the cause of death in humans. Less extensive dissection of plants and smaller animals preserved in a formaldehyde solution is typically carried out or demonstrated in biology and natural science classes in middle school and high school, while extensive dissections of cadavers of adults and children, both fresh and preserved, are carried out by medical students in medical schools as part of the teaching of subjects such as anatomy, pathology and forensic medicine. Consequently, dissection is typically conducted in a morgue or in an anatomy lab.

Dissection has been used for centuries to explore anatomy. Objections to the use of cadavers have led to the use of alternatives including virtual dissection of computer models.

In the field of surgery, the term "dissection" or "dissecting" refers more specifically to the practice of separating an anatomical structure (an organ, nerve or blood vessel) from its surrounding connective tissue in order to minimize unwanted damage during a surgical procedure.

Overview

Plant and animal bodies are dissected to analyze the structure and function of their components. Dissection is practised by students in courses of biology, botany, zoology, and veterinary science, and sometimes in arts studies. In medical schools, students dissect human cadavers to learn anatomy. Zoötomy is sometimes used to describe the "dissection of an animal".

Human dissection

A key principle in the dissection of human cadavers (sometimes called androtomy) is the prevention of disease transmission to the dissector. Prevention of transmission includes wearing protective gear, keeping the environment clean, careful dissection technique, and pre-dissection testing of specimens for the presence of HIV and hepatitis viruses. Specimens are dissected in morgues or anatomy labs. When provided, they are evaluated for use as a "fresh" or "prepared" specimen. A "fresh" specimen may be dissected within a few days, retaining the characteristics of a living specimen, for the purposes of training. A "prepared" specimen may be preserved in solutions such as formalin and pre-dissected by an experienced anatomist, sometimes with the help of a diener. This preparation is sometimes called prosection.

Dissection tools. Left to right: scalpels with No. 20 and No. 12 blades, two forceps and scissors

Most dissection involves the careful isolation and removal of individual organs, called the Virchow technique. An alternative, more cumbersome technique involves the removal of the entire organ block, called the Letulle technique. This technique allows a body to be sent to a funeral director without waiting for the sometimes time-consuming dissection of individual organs. The Rokitansky method involves an in situ dissection of the organ block, and the technique of Ghon involves dissection of three separate blocks of organs: the thoracic and cervical organs, the gastrointestinal and abdominal organs, and the urogenital organs. Dissection of individual organs involves accessing the area in which the organ is situated and systematically removing the anatomical connections of that organ to its surroundings. For example, when removing the heart, connections such as the superior vena cava and inferior vena cava are separated. If pathological connections exist, such as a fibrous pericardium, these may be deliberately dissected along with the organ.

Autopsy and necropsy

Dissection is used to help to determine the cause of death in autopsy (called necropsy in other animals) and is an intrinsic part of forensic medicine.

History

Galen (129–c.200 AD), Opera omnia, dissection of a pig. Engraving made in Venice, 1565

Classical antiquity

Human dissections were carried out by the Greek physicians Herophilus of Chalcedon and Erasistratus of Chios in the early part of the third century BC. During this period, the first systematic explorations of full human anatomy were performed, rather than anatomical knowledge being gained piecemeal in the course of solving particular medical problems. While there was a deep taboo in Greek culture concerning human dissection, there was at the time a strong push by the Ptolemaic government to build Alexandria into a hub of scientific study. For a time, Roman law forbade dissection and autopsy of the human body, so anatomists relied on the cadavers of animals or made observations of human anatomy from injuries of the living. Galen, for example, dissected the Barbary macaque and other primates, assuming their anatomy was basically the same as that of humans, and supplemented these observations with knowledge of human anatomy which he acquired while tending to wounded gladiators.

Celsus wrote in On Medicine I Proem 23, "Herophilus and Erasistratus proceeded in by far the best way: they cut open living men – criminals they obtained out of prison from the kings – and they observed, while their subjects still breathed, parts that nature had previously hidden, their position, color, shape, size, arrangement, hardness, softness, smoothness, points of contact, and finally the processes and recesses of each and whether any part is inserted into another or receives the part of another into itself."

Galen was another such writer who was familiar with the studies of Herophilus and Erasistratus.

India

The Ayurvedic Man, c. 18th century

The ancient societies rooted in India left behind artwork showing how to kill animals during a hunt. These images, which show how to kill most effectively depending on the game being hunted, convey an intimate knowledge of both external and internal anatomy as well as the relative importance of organs. The knowledge was mostly gained by hunters preparing recently captured prey. Once the roaming lifestyle was no longer necessary, it was replaced in part by the civilization that formed in the Indus Valley. Unfortunately, little remains from this time to indicate whether or not dissection occurred, as the civilization was lost with the migration of the Aryan peoples.

Early in the history of India (2nd to 3rd century), the Arthashastra described four ways in which death can occur and their signs: drowning, hanging, strangling, and asphyxiation. According to that source, an autopsy should be performed in any case of untimely demise.

The practice of dissection flourished during the 7th and 8th centuries, when medical education was standardized. This created a need to better understand human anatomy, so as to have educated surgeons. Dissection was limited by the religious taboo on cutting the human body, which shaped the approach taken to accomplish the goal. The process involved loosening the tissues in streams of water before the outer layers were sloughed off with soft implements to reach the musculature. To perfect the technique of slicing, prospective students practised on gourds and squash. These techniques of dissection gave rise to an advanced understanding of anatomy and enabled practitioners to perform procedures still used today, such as rhinoplasty.

During medieval times the anatomical teachings from India spread throughout the known world; however, the practice of dissection was stunted by Islam. Dissection at a university level was not seen again until 1827, when it was performed by the student Pandit Madhusudan Gupta. Through the 1800s, university teachers had to continually push against the social taboos surrounding dissection, until around 1850, when the universities decided that it was more cost-effective to train Indian doctors than to bring them in from Britain. Indian medical schools were, however, training female doctors well before those in England.

The current state of dissection in India is deteriorating. The number of hours spent in dissection labs during medical school has decreased substantially over the last twenty years. The future of anatomy education will probably be an elegant mix of traditional methods and integrative computer learning. The use of dissection in the early stages of medical training has been shown to be more effective for retention of the intended information than simulated alternatives. However, computer-generated experiences remain useful for review in the later stages. The combination of these methods is intended to strengthen students' understanding of and confidence in anatomy, a subject that is infamously difficult to master. There is a growing need for anatomists to continue the long tradition of anatomy education, since most anatomy labs are currently taught by graduates hoping to complete degrees in anatomy.

Islamic world

Page from a 1531 Latin translation by Peter Argellata of Al-Zahrawi's c. 1000 treatise on surgical and medical instruments

From the beginning of the Islamic faith in 610 A.D., Shari'ah law has applied to a greater or lesser extent within Muslim countries, supported by Islamic scholars such as Al-Ghazali. Islamic physicians such as Ibn Zuhr (Avenzoar) (1091–1161) in Al-Andalus, Saladin's physician Ibn Jumay during the 12th century, Abd el-Latif in Egypt c. 1200, and Ibn al-Nafis in Syria and Egypt in the 13th century may have practiced dissection, but it remains ambiguous whether or not human dissection was practiced. Ibn al-Nafis, a physician and Muslim jurist, suggested that the "precepts of Islamic law have discouraged us from the practice of dissection, along with whatever compassion is in our temperament", indicating that while there was no law against it, it was nevertheless uncommon. Islam dictates that the body be buried as soon as possible, barring religious holidays, and that there be no other means of disposal such as cremation. Prior to the 10th century, dissection was not performed on human cadavers. The book Al-Tasrif, written by Al-Zahrawi in 1000 A.D., details surgical procedures that differed from the previous standards. The book was an educational text of medicine and surgery which included detailed illustrations. It was later translated and took the place of Avicenna's The Canon of Medicine as the primary teaching tool in Europe from the 12th century to the 17th century. There were some who were willing to dissect humans up to the 12th century, for the sake of learning, after which it was forbidden. This attitude remained constant until 1952, when the Islamic School of Jurisprudence in Egypt ruled that "necessity permits the forbidden". This decision allowed for the investigation of questionable deaths by autopsy. In 1982, a fatwa determined that if it serves justice, autopsy is worth its disadvantages. Though Islam now approves of autopsy, the Islamic public still disapproves. Autopsy is prevalent in most Muslim countries for medical and judicial purposes. In Egypt it holds an important place within the judicial structure, and is taught at all the country's medical universities. In Saudi Arabia, whose law is completely dictated by Shari'ah, autopsy is viewed poorly by the population but can be compelled in criminal cases; human dissection is sometimes found at university level. Autopsy is performed for judicial purposes in Qatar and Tunisia. Human dissection is present in the modern-day Islamic world, but is rarely published on due to the religious and social stigma.

Tibet

Tibetan medicine developed a rather sophisticated knowledge of anatomy, acquired from long-standing experience with human dissection. Tibetans had adopted the practice of sky burial because of the country's hard ground, frozen for most of the year, and the lack of wood for cremation. A sky burial begins with a ritual dissection of the deceased, and is followed by the feeding of the parts to vultures on the hill tops. Over time, Tibetan anatomical knowledge found its way into Ayurveda and to a lesser extent into Chinese medicine.

Christian Europe

A dissection in Realdo Colombo's De Re Anatomica, 1559

Throughout the history of Christian Europe, the dissection of human cadavers for medical education has experienced various cycles of legalization and proscription in different countries. Dissection was rare during the Middle Ages, but it was practised, with evidence from at least as early as the 13th century. The practice of autopsy in Medieval Western Europe is "very poorly known" as few surgical texts or conserved human dissections have survived. A modern Jesuit scholar has claimed that Christian theology contributed significantly to the revival of human dissection and autopsy by providing a new socio-religious and cultural context in which the human cadaver was no longer seen as sacrosanct.

An edict of the 1163 Council of Tours and an early 14th-century decree of Pope Boniface VIII have mistakenly been identified as prohibiting dissection and autopsy; misunderstanding of, or extrapolation from, these edicts may have contributed to a reluctance to perform such procedures. The Middle Ages nevertheless witnessed a revival of interest in medical studies, including human dissection and autopsy.

Mondino de Luzzi's Anathomia, 1541

Frederick II (1194–1250), the Holy Roman Emperor, ruled that anyone studying to be a physician or a surgeon must attend a human dissection, which would be held at least once every five years. Some European countries began legalizing the dissection of executed criminals for educational purposes in the late 13th and early 14th centuries. Mondino de Luzzi carried out the first recorded public dissection around 1315. At this time, autopsies were carried out by a team consisting of a Lector, who lectured; a Sector, who performed the dissection; and an Ostensor, who pointed to features of interest.

The Italian Galeazzo di Santa Sofia made the first public dissection north of the Alps in Vienna in 1404.

Vesalius with a dissected cadaver in his De humani corporis fabrica, 1543

Vesalius in the 16th century carried out numerous dissections in his extensive anatomical investigations. He was attacked frequently for his disagreement with Galen's opinions on human anatomy. Vesalius was the first to lecture and dissect the cadaver simultaneously.

The Catholic Church is known to have ordered an autopsy on the conjoined twins Joana and Melchiora Ballestero in Hispaniola in 1533 to determine whether they shared a soul. The examiners found two distinct hearts, and hence two souls, following the ancient Greek philosopher Empedocles, who believed the soul resided in the heart.

Renaissance artists such as Antonio del Pollaiuolo studied anatomy to improve their artwork, as seen in this figurine of Hercules, 1470

Human dissection was also practised by Renaissance artists. Though most chose to focus on the external surfaces of the body, some, like Michelangelo Buonarroti, Antonio del Pollaiuolo, Baccio Bandinelli, and Leonardo da Vinci, sought a deeper understanding. However, there were no provisions for artists to obtain cadavers, so they had to resort to unauthorised means, such as grave robbing, body snatching, and murder, as indeed anatomists sometimes did.

Anatomization was sometimes ordered as a form of punishment, as, for example, in 1806 to James Halligan and Dominic Daley after their public hanging in Northampton, Massachusetts.

In modern Europe, dissection is routinely practised in biological research and education, in medical schools, and to determine the cause of death in autopsy. It is generally considered a necessary part of learning and is thus accepted culturally. It sometimes attracts controversy, as when Odense Zoo decided to dissect lion cadavers in public before a "self-selected audience".

Britain

Body snatching headstone of an 1823 grave in Stirling

In Britain, dissection remained entirely prohibited from the end of the Roman conquest and through the Middle Ages to the 16th century, when a series of royal edicts gave specific groups of physicians and surgeons some limited rights to dissect cadavers. The permission was quite limited: by the mid-18th century, the Royal College of Physicians and Company of Barber-Surgeons were the only two groups permitted to carry out dissections, and had an annual quota of ten cadavers between them. As a result of pressure from anatomists, especially in the rapidly growing medical schools, the Murder Act 1752 allowed the bodies of executed murderers to be dissected for anatomical research and education. By the 19th century this supply of cadavers proved insufficient, as the public medical schools were growing, and the private medical schools lacked legal access to cadavers. A thriving black market arose in cadavers and body parts, leading to the creation of the profession of body snatching, and the infamous Burke and Hare murders in 1828, when 16 people were murdered for their cadavers, to be sold to anatomists. The resulting public outcry led to the passage of the Anatomy Act 1832, which increased the legal supply of cadavers for dissection.

By the 21st century, the availability of interactive computer programs and changing public sentiment led to renewed debate on the use of cadavers in medical education. The Peninsula College of Medicine and Dentistry in the UK, founded in 2000, became the first modern medical school to carry out its anatomy education without dissection.

United States

A teenage school pupil dissecting an eye

In the United States, dissection of frogs became common in college biology classes from the 1920s and was gradually introduced at earlier stages of education. By 1988, some 75 to 80 percent of American high school biology students were participating in a frog dissection, with a trend towards introduction in elementary schools. The frogs are most commonly from the genus Rana. Other popular animals for high-school dissection at the time of that survey were, among vertebrates, fetal pigs, perch, and cats; and among invertebrates, earthworms, grasshoppers, crayfish, and starfish. About six million animals are dissected each year in United States high schools (2016), not counting medical training and research. Most of these are purchased already dead from slaughterhouses and farms.

Dissection in U.S. high schools became prominent in 1987, when a California student, Jenifer Graham, sued to require her school to let her complete an alternative project. The court ruled that mandatory dissections were permissible, but that Graham could ask to dissect a frog that had died of natural causes rather than one that was killed for the purposes of dissection; the practical impossibility of procuring a frog that had died of natural causes in effect let Graham opt out of the required dissection. The suit gave publicity to anti-dissection advocates. Graham appeared in a 1987 Apple Computer commercial for the virtual-dissection software Operation Frog. The state of California passed a Student's Rights Bill in 1988 requiring that objecting students be allowed to complete alternative projects. Opting out of dissection increased through the 1990s.

In the United States, 17 states along with Washington, D.C. have enacted dissection-choice laws or policies that allow students in primary and secondary education to opt out of dissection. Other states including Arizona, Hawaii, Minnesota, Texas, and Utah have more general policies on opting out on moral, religious, or ethical grounds. To overcome these concerns, J. W. Mitchell High School in New Port Richey, Florida, in 2019 became the first US high school to use synthetic frogs for dissection in its science classes, instead of preserved real frogs.

As for the dissection of cadavers in undergraduate and medical school, traditional dissection is supported by professors and students, though some opposition limits its availability. Upper-level students who have experienced this method, along with their professors, agree that "Studying human anatomy with colorful charts is one thing. Using a scalpel and an actual, recently-living person is an entirely different matter."

Acquisition of cadavers

The way in which cadaveric specimens are obtained differs greatly according to country. In the UK, donation of a cadaver is wholly voluntary. Involuntary donation plays a role in about 20 percent of specimens in the US and almost all specimens donated in some countries such as South Africa and Zimbabwe. Countries that practice involuntary donation may make available the bodies of dead criminals or unclaimed or unidentified bodies for the purposes of dissection. Such practices may lead to a greater proportion of the poor, homeless and social outcasts being involuntarily donated. Cadavers donated in one jurisdiction may also be used for the purposes of dissection in another, whether across states in the US or imported from other countries, such as Libya. As an example of how a cadaver is donated voluntarily, a funeral home working with a voluntary donation program identifies a body as belonging to a registered donor. After the subject has been broached with relatives in a diplomatic fashion, the body is transported to a registered facility. The body is tested for the presence of HIV and hepatitis viruses. It is then evaluated for use as a "fresh" or "prepared" specimen.

Disposal of specimens

Cadaveric specimens for dissection are, in general, disposed of by cremation. The deceased may then be interred at a local cemetery. If the family wishes, the ashes of the deceased are then returned to the family. Many institutes have local policies to engage, support and celebrate the donors. This may include the setting up of local monuments at the cemetery.

Use in education

Cadaveric dissection at Siriraj Medical School, Thailand

Human cadavers are often used in medicine to teach anatomy and for surgical instruction. Cadavers are selected according to their anatomy and availability. They may be used as part of dissection courses involving a "fresh" specimen so as to be as realistic as possible—for example, when training surgeons. Cadavers may also be pre-dissected by trained instructors. This form of dissection involves the preparation and preservation of specimens for a longer time period and is generally used for the teaching of anatomy.

Alternatives

Some alternatives to dissection may present educational advantages over the use of animal cadavers, while eliminating perceived ethical issues. These alternatives include computer programs, lectures, three-dimensional models, films, and other forms of technology. Concern for animal welfare is often at the root of objections to animal dissection. Studies show that some students reluctantly participate in animal dissection out of fear of real or perceived punishment or ostracism by their teachers and peers, and many do not speak up about their ethical objections.

One alternative to the use of cadavers is computer technology. At Stanford Medical School, software combines X-ray, ultrasound and MRI imaging for display on a screen as large as a body on a table. In a variant of this, a "virtual anatomy" approach being developed at New York University, students wear three-dimensional glasses and can use a pointing device to "[swoop] through the virtual body, its sections as brightly colored as living tissue." This method is claimed to be "as dynamic as Imax [cinema]".

Advantages and disadvantages

Proponents of animal-free teaching methodologies argue that alternatives to animal dissection can benefit educators by increasing teaching efficiency and lowering instruction costs while affording teachers an enhanced potential for the customization and repeatability of teaching exercises. Those in favor of dissection alternatives point to studies which have shown that computer-based teaching methods "saved academic and nonacademic staff time … were considered to be less expensive and an effective and enjoyable mode of student learning [and] … contributed to a significant reduction in animal use" because there is no set-up or clean-up time, no obligatory safety lessons, and no monitoring of misbehavior with animal cadavers, scissors, and scalpels.

With software and other non-animal methods, there is also no expensive disposal of equipment or hazardous material removal. Some programs also allow educators to customize lessons and include built-in test and quiz modules that can track student performance. Furthermore, animals (whether dead or alive) can be used only once, while non-animal resources can be used for many years—an added benefit that could result in significant cost savings for teachers, school districts, and state educational systems.

Several peer-reviewed comparative studies examining information retention and performance of students who dissected animals and those who used an alternative instruction method have concluded that the educational outcomes of students who are taught basic and advanced biomedical concepts and skills using non-animal methods are equivalent or superior to those of their peers who use animal-based laboratories such as animal dissection.

Some reports state that students' confidence, satisfaction, and ability to retrieve and communicate information was much higher for those who participated in alternative activities compared to dissection. Three separate studies at universities across the United States found that students who modeled body systems out of clay were significantly better at identifying the constituent parts of human anatomy than their classmates who performed animal dissection.

Another study found that students preferred using clay modeling over animal dissection and performed just as well as their cohorts who dissected animals.

In 2008, the National Association of Biology Teachers (NABT) affirmed its support for classroom animal dissection stating that they "Encourage the presence of live animals in the classroom with appropriate consideration to the age and maturity level of the students …NABT urges teachers to be aware that alternatives to dissection have their limitations. NABT supports the use of these materials as adjuncts to the educational process but not as exclusive replacements for the use of actual organisms."

The National Science Teachers Association (NSTA) "supports including live animals as part of instruction in the K-12 science classroom because observing and working with animals firsthand can spark students' interest in science as well as a general respect for life while reinforcing key concepts" of biological sciences. NSTA also supports offering dissection alternatives to students who object to the practice.

The NORINA database lists over 3,000 products which may be used as alternatives or supplements to animal use in education and training. These include alternatives to dissection in schools. InterNICHE has a similar database and a loans system.

Vivisection

From Wikipedia, the free encyclopedia
Mice are the most commonly used mammal species for live animal research. Such research is sometimes described as vivisection.

Vivisection (from Latin vivus 'alive', and sectio 'cutting') is surgery conducted for experimental purposes on a living organism, typically animals with a central nervous system, to view living internal structure. The word is, more broadly, used as a pejorative catch-all term for experimentation on live animals by organizations opposed to animal experimentation, but the term is rarely used by practising scientists. Human vivisection, such as live organ harvesting, has been perpetrated as a form of torture.

Animal vivisection

An anesthetized pig used for training a surgeon

Research requiring vivisection techniques that cannot be met through other means is often subject to an external ethics review in conception and implementation, and in many jurisdictions use of anesthesia is legally mandated for any surgery likely to cause pain to any vertebrate.

In the United States, the Animal Welfare Act explicitly requires that any procedure that may cause pain use "tranquilizers, analgesics, and anesthetics" with exceptions when "scientifically necessary". The act does not define "scientific necessity" or regulate specific scientific procedures, but approval or rejection of individual techniques in each federally funded lab is determined on a case-by-case basis by the Institutional Animal Care and Use Committee, which contains at least one veterinarian, one scientist, one non-scientist, and one other individual from outside the university.

In the United Kingdom, any experiment involving vivisection must be licensed by the Home Secretary. The Animals (Scientific Procedures) Act 1986 "expressly directs that, in determining whether to grant a licence for an experimental project, 'the Secretary of State shall weigh the likely adverse effects on the animals concerned against the benefit likely to accrue.'"

In Australia, the Code of Practice "requires that all experiments must be approved by an Animal Experimentation Ethics Committee" that includes a "person with an interest in animal welfare who is not employed by the institution conducting the experiment, and an additional independent person not involved in animal experimentation."

Anti-vivisectionists have played roles in the emergence of the animal welfare and animal rights movements, arguing that animals and humans have the same natural rights as living creatures, and that it is inherently immoral to inflict pain or injury on another living creature, regardless of the purpose or potential benefit to mankind.

Vivisection and anti-vivisection in the 19th century

At the turn of the 19th century, medicine was undergoing a transformation. The emergence of hospitals and the development of more advanced medical tools such as the stethoscope are but a few of the changes in the medical field. There was also an increased recognition that medical practices needed to be improved, as many of the therapies then in use were based on unproven, traditional theories that may or may not have helped the patient recover. The demand for more effective treatment shifted emphasis to research with the goal of understanding disease mechanisms and anatomy. This shift had a few effects, one of which was a rise in patient experimentation, leading to moral questions about what was acceptable in clinical trials and what was not. An easy solution to the moral problem was to use animals in vivisection experiments, so as not to endanger human patients. This, however, had its own set of moral obstacles, leading to the anti-vivisection movement.

François Magendie (1783–1855)

Pro-vivisection cartoon in 1911

One polarizing figure in the anti-vivisection movement was François Magendie. Magendie was a physiologist at the Académie Royale de Médecine in France, established in the first half of the 19th century. Magendie made several groundbreaking medical discoveries, but was far more aggressive than many of his contemporaries in his use of animal experimentation. For example, the discovery of the different functions of the dorsal and ventral spinal nerve roots was achieved both by Magendie and by the Scottish anatomist Charles Bell. Bell used an unconscious rabbit because of "the protracted cruelty of the dissection", which caused him to miss that the dorsal roots were also responsible for sensory information. Magendie, on the other hand, used conscious, six-week-old puppies for his own experiments. While Magendie's approach was more of an infringement on what would today be referred to as animal rights, both Bell and Magendie used the same rationalization for vivisection: the cost of animal lives and experimentation was well worth it for the benefit of humanity.

Many viewed Magendie's work as cruel and unnecessarily torturous. Magendie carried out many of his experiments before the advent of anesthesia, but even after ether was discovered it was not used in any of his experiments or classes. Even during the period before anesthesia, other physiologists expressed their disgust with how he conducted his work. One visiting American physiologist described the animals as "victims" and the apparent sadism that Magendie displayed when teaching his classes. The cruelty of such experiments even gave Magendie a role in animal-rights legislation: his experiments were cited in the drafting of the British Cruelty to Animals Act 1876 and the Cruel Treatment of Cattle Act 1822, otherwise known as Martin's Act after its sponsor, the Irish MP and well-known anti-cruelty campaigner Richard Martin. Martin described Magendie as a "disgrace to Society" after one of Magendie's public vivisections, widely commented on at the time, which reportedly involved the dissection of a greyhound over as long as two days in what Martin called "anatomical theatres". Magendie faced widespread opposition in British society, among the general public but also among his contemporaries, including William Sharpey, who described his experiments as not only cruel but "purposeless" and "without sufficient object", a feeling he claimed was shared by other physiologists.

David Ferrier and the Cruelty to Animals Act 1876

Prior to vivisection for educational purposes, chloroform was administered as an anesthetic to this common sand frog.

The Cruelty to Animals Act 1876 in Britain determined that one could only conduct vivisection on animals with the appropriate license from the state, and that the work the physiologist was doing had to be original and absolutely necessary. The stage was set for such legislation by the physiologist David Ferrier. Ferrier was a pioneer in understanding the brain, and in 1873 used animals to show that certain locales of the brain corresponded to bodily movement elsewhere in the body. He put these animals to sleep and caused them to move unconsciously with a probe. Ferrier was successful, but many decried his use of animals in his experiments. Some of these arguments came from a religious standpoint. Some were concerned that Ferrier's experiments would separate God from the mind of man in the name of science. Some of the anti-vivisection movement in England had its roots in Evangelicalism and Quakerism. These religions already had a distrust of science, which was only intensified by the recent publication of Darwin's theory of evolution in 1859.

Neither side was pleased with how the Cruelty to Animals Act 1876 was passed. The scientific community felt that the new regulations restricted its ability to compete with the quickly advancing science of France and Germany. The anti-vivisection movement was also unhappy, because it believed the Act was a concession to scientists, allowing vivisection to continue at all. Ferrier would continue to vex the anti-vivisection movement in Britain with his experiments when he debated his German opponent, Friedrich Goltz. They would effectively enter the vivisection arena, with Ferrier presenting a monkey and Goltz presenting a dog, both of which had already been operated on. Ferrier won the debate, but did not have a license, leading the anti-vivisection movement to sue him in 1881. Ferrier was not found guilty, as his assistant was the one operating, and his assistant did have a license. Ferrier and his practices gained public support, leaving the anti-vivisection movement scrambling. They made the moral argument that, given recent developments, scientists would venture into more extreme practices, operating on "the cripple, the mute, the idiot, the convict, the pauper, to enhance the 'interest' of [the physiologist's] experiments".

Human vivisection

It is possible that human vivisection was practised by some Greek anatomists in Alexandria in the 3rd century BCE. Celsus in De Medicina states that Herophilos of Alexandria vivisected some criminals sent by the king. The early Christian writer Tertullian states that Herophilos vivisected at least 600 live prisoners, although the accuracy of this claim is disputed by many historians.

In the 12th century CE, Andalusian Arab Ibn Tufail elaborated on human vivisection in his treatise called Hayy ibn Yaqzan. In an extensive article on the subject, Iranian academic Nadia Maftouni believes him to be among the early supporters of autopsy and vivisection.

Unit 731, a biological and chemical warfare research and development unit of the Imperial Japanese Army, undertook lethal human experimentation during the period that comprised both the Second Sino-Japanese War and the Second World War (1937–1945). On the Philippine island of Mindanao, Moro Muslim prisoners of war were subjected to various forms of vivisection by the Japanese, in many cases without anesthesia.

Nazi human experimentation involved many medical experiments on live subjects, such as vivisections by Josef Mengele, usually without anesthesia.

Informed consent

From Wikipedia, the free encyclopedia

Within the US, definitions of informed consent vary, and the standard required is generally determined by the state. Informed consent requires a clear appreciation and understanding of the facts, implications, and consequences of an action. To give informed consent, the individual concerned must have adequate reasoning faculties and possess all relevant facts. Impairments to reasoning and judgment that may preclude informed consent include intellectual or emotional immaturity, high levels of stress such as post-traumatic stress disorder or a severe intellectual disability, severe mental disorder, intoxication, severe sleep deprivation, dementia, or coma.

Obtaining informed consent is not always required. If an individual is considered unable to give informed consent, another person is generally authorized to give consent on their behalf—for example, the parents or legal guardians of a child (though in this circumstance the child may be required to provide informed assent) and conservators for the mentally disordered. Alternatively, the doctrine of implied consent permits treatment in limited cases, for example when an unconscious person will die without immediate intervention. Cases in which an individual is provided insufficient information to form a reasoned decision raise serious ethical issues. When these issues occur, or are anticipated to occur, in a clinical trial, they are subject to review by an ethics committee or institutional review board.

Informed consent is codified in both national and international law. 'Free consent' is a cognate term in the International Covenant on Civil and Political Rights, adopted in 1966 by the United Nations and in force from 23 March 1976. Article 7 of the covenant prohibits experiments conducted without the "free consent to medical or scientific experimentation" of the subject. As of September 2019, the covenant has 173 parties and six more signatories without ratification.

Assessment

Informed consent can be complex to evaluate, because neither expressions of consent nor expressions of understanding of implications necessarily mean that full adult consent was in fact given, nor that full comprehension of the relevant issues has been internally digested. Consent may be implied within the usual subtleties of human communication, rather than explicitly negotiated verbally or in writing. For example, if a doctor asks to take a patient's blood pressure, the patient may demonstrate consent by offering their arm for a blood pressure cuff. In some cases consent is not legally possible, even if the person protests that they do indeed understand and wish to proceed. There are also structured instruments for evaluating capacity to give informed consent, although no ideal instrument presently exists.

Thus, there is always a degree to which informed consent must be assumed or inferred based upon observation, knowledge, or legal reliance. This is especially the case in sexual or relational issues. In medical or formal circumstances, explicit agreement by means of a signature, which is normally relied on legally regardless of actual consent, is the norm. This is the case with certain procedures, such as a "do not resuscitate" directive that a patient signed before the onset of their illness.

Brief examples of each of the above:

  1. A person may verbally agree to something from fear, perceived social pressure, or psychological difficulty in asserting true feelings. The person requesting the action may honestly be unaware of this and believe the consent is genuine, and rely on it. Consent is expressed, but not internally given.
  2. A person may claim to understand the implications of some action, as part of consent, but in fact has failed to appreciate the possible consequences fully and may later deny the validity of the consent for this reason. The understanding needed for informed consent appears to be present but is, in fact (through ignorance), not present.
  3. A person signs a legal release form for a medical procedure, and later feels he did not really consent. Unless he can show actual misinformation, the release is usually persuasive or conclusive in law, in that the clinician may rely legally upon it for consent. In formal circumstances, a written consent usually legally overrides later denial of informed consent (unless obtained by misrepresentation).
  4. Informed consent in the U.S. can be overridden in emergency medical situations pursuant to 21 CFR 50.24, which was first brought to the general public's attention via the controversy surrounding the study of Polyheme.

Valid elements

For an individual to give valid informed consent, three components must be present: disclosure, capacity and voluntariness.

  • Disclosure requires the researcher to supply each prospective subject with the information necessary to make an autonomous decision and also to ensure that the subject adequately understands the information provided. This latter requirement implies that the written consent form be composed in lay language suited to the comprehension skills of the subject population, and that the level of understanding be assessed through conversation (to be informed).
  • Capacity pertains to the ability of the subject to both understand the information provided and form a reasonable judgment based on the potential consequences of his/her decision.
  • Voluntariness refers to the subject's right to freely exercise his/her decision making without being subjected to external pressure such as coercion, manipulation, or undue influence.

From children

As children often lack the decision-making ability or legal power (competence) to provide true informed consent for medical decisions, it often falls on parents or legal guardians to provide informed permission for medical decisions. This "consent by proxy" usually works reasonably well, but can lead to ethical dilemmas when the judgment of the parents or guardians and the medical professional differ with regard to what constitutes appropriate decisions "in the best interest of the child". Children who are legally emancipated, unemancipated minors who are deemed to have medical decision-making capacity, and minors making decisions in certain situations such as those regarding sexually transmitted diseases or pregnancy, may be able to provide consent without the need for parental permission, depending on the laws of the jurisdiction the child lives in. The American Academy of Pediatrics encourages medical professionals also to seek the assent of older children and adolescents by providing age-appropriate information to these children to help empower them in the decision-making process.

Research on children has benefited society in many ways. The only effective way to establish normal patterns of growth and metabolism is to do research on infants and young children. When addressing the issue of informed consent with children, the primary response is parental consent. This is valid, although only legal guardians are able to consent for a child, not adult siblings. Additionally, parents may not order the termination of a treatment that is required to keep a child alive, even if they feel it is in the child's best interest. Guardians are typically involved in the consent of children; however, a number of doctrines have developed that allow children to receive health treatments without parental consent. For example, emancipated minors may consent to medical treatment, and minors can also consent in an emergency.

Waiver of requirement

Waiver of the consent requirement may be applied in certain circumstances where no foreseeable harm is expected to result from the study, where permitted by law or federal regulations, or where an ethical review committee has approved the non-disclosure of certain information.

Besides studies with minimal risk, waivers of consent may be obtained in a military setting. According to 10 USC 980, the United States Code for the Armed Forces, Limitations on the Use of Humans as Experimental Subjects, a waiver of advance informed consent may be granted by the Secretary of Defense if a research project would:

  1. Directly benefit subjects.
  2. Advance the development of a medical product necessary to the military.
  3. Be carried out under all laws and regulations (i.e., Emergency Research Consent Waiver) including those pertinent to the FDA.

While informed consent is a basic right and should be carried out effectively, if a patient is incapacitated due to injury or illness, it is still important that patients be able to benefit from emergency experimentation. The Food and Drug Administration (FDA) and the Department of Health and Human Services (DHHS) joined to create federal guidelines to permit emergency research without informed consent. However, researchers can only proceed with such research if they obtain a waiver of informed consent (WIC) or an emergency exception from informed consent (EFIC).

21st Century Cures Act

The 21st Century Cures Act enacted by the 114th United States Congress in December 2016 allows researchers to waive the requirement for informed consent when clinical testing "poses no more than minimal risk" and "includes appropriate safeguards to protect the rights, safety, and welfare of the human subject."

Medical sociology

Medical sociologists have studied informed consent, as well as bioethics more generally. Oonagh Corrigan, looking at informed consent for research in patients, argues that much of the conceptualization of informed consent comes from research ethics and bioethics with a focus on patient autonomy, and notes that this aligns with a neoliberal worldview. Corrigan argues that a model based solely around individual decision making does not accurately describe the reality of consent because of social processes, a view that has started to be acknowledged in bioethics. She feels that the liberal principles of informed consent are often in opposition to autocratic medical practices, such that norms, values, and systems of expertise often shape an individual's ability to exercise choice.

Patients who agree to participate in trials often do so because they feel that the trial was suggested by a doctor as the best intervention. Patients may find being asked to consent within a limited time frame a burdensome intrusion on their care when it arises just as they have to deal with a new condition. Patients involved in trials may not be fully aware of the alternative treatments, and being told that there is uncertainty about the best treatment can help make them more aware of this. Corrigan notes that patients generally expect doctors to be acting exclusively in their interest in interactions, and that this, combined with "clinical equipoise", where a healthcare practitioner does not know which treatment is better in a randomized controlled trial, can be harmful to the doctor-patient relationship.

Medical procedures

The doctrine of informed consent relates to professional negligence and establishes a breach of the duty of care owed to the patient (see duty of care, breach of the duty, and respect for persons). The doctrine of informed consent also has significant implications for medical trials of medications, devices, or procedures.

Requirements of the professional

Until 2015 in the United Kingdom, and in countries such as Malaysia and Singapore, informed consent in medical procedures required proof as to the standard of care, namely a recognised standard of acceptable professional practice (the Bolam test), that is, what risks a medical professional would usually disclose in the circumstances (see Loss of right in English law). Arguably, this is "sufficient consent" rather than "informed consent." The UK has since departed from the Bolam test for judging standards of informed consent, due to the landmark ruling in Montgomery v Lanarkshire Health Board. This moves away from the concept of a reasonable physician and instead uses the standard of a reasonable patient, and what risks an individual would attach significance to.

Medicine in the United States, Australia, and Canada takes this patient-centric approach to "informed consent." Informed consent in these jurisdictions requires healthcare providers to disclose significant risks, as well as risks of particular importance to that patient. This approach combines an objective (a hypothetical reasonable patient) and subjective (this particular patient) approach.

The doctrine of informed consent should be contrasted with the general doctrine of medical consent, which applies to assault or battery. The consent standard here is only that the person understands, in general terms, the nature and purpose of the intended intervention. As the higher standard of informed consent applies to negligence, not battery, the other elements of negligence must be made out. Significantly, causation must be shown: that had the individual been made aware of the risk, he would not have proceeded with the operation (or perhaps with that surgeon).

Optimal establishment of informed consent requires adaptation to cultural or other individual factors of the patient. As of 2011, for example, people from Mediterranean and Arab countries appeared to rely more on the context of the delivery of the information, with the information being carried more by who is saying it and where, when, and how it is being said, rather than what is said, which is of relatively more importance in typical "Western" countries.

The informed consent doctrine is generally implemented through good healthcare practice: pre-operation discussions with patients and the use of medical consent forms in hospitals. However, reliance on a signed form should not undermine the basis of the doctrine in giving the patient an opportunity to weigh and respond to the risk. In one British case, a doctor performing routine surgery on a woman noticed that she had cancerous tissue in her womb. He took the initiative to remove the woman's womb; however, as she had not given informed consent for this operation, the doctor was judged by the General Medical Council to have acted negligently. The council stated that the woman should have been informed of her condition, and allowed to make her own decision.

Obtaining informed consents

To document that informed consent has been given for a procedure, healthcare organisations have traditionally used paper-based consent forms on which the procedure and its risks and benefits are noted, and which are signed by both patient and clinician. In a number of healthcare organisations consent forms are scanned and maintained in an electronic document store. The paper consent process has been demonstrated to be associated with significant errors of omission, and therefore increasing numbers of organisations are using digital consent applications where the risk of errors can be minimised, a patient's decision-making and comprehension can be supported by additional lay-friendly and accessible information, consent can be completed remotely, and the process can become paperless. One form of digital consent is dynamic consent, which invites participants to provide consent in a granular way, and makes it easier for them to withdraw consent if they wish.

Electronic consent methods have been used to support indexing and retrieval of consent data, thus enhancing the ability to honor patient intent and identify willing research participants. More recently, Health Sciences South Carolina, a statewide research collaborative focused on transforming healthcare quality, health information systems and patient outcomes, developed an open-source system called Research Permissions Management System (RPMS).
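As a rough illustration of the kind of granular, revocable consent record that dynamic consent and electronic consent systems manage, the following Python sketch models per-purpose consent decisions as an append-only history from which the current state can be retrieved. It is a minimal sketch under stated assumptions: the class and field names are hypothetical and are not drawn from RPMS or any other system named above.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, List

# Hypothetical sketch of a "dynamic" consent record: each research purpose is
# consented to (or withdrawn) independently, and every change is kept as an
# auditable event rather than overwriting the previous state.

@dataclass
class ConsentEvent:
    purpose: str          # e.g. "genetic_analysis", "data_sharing_external"
    granted: bool         # True = consent given, False = consent withdrawn
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class ParticipantConsent:
    participant_id: str
    events: List[ConsentEvent] = field(default_factory=list)

    def record(self, purpose: str, granted: bool) -> None:
        """Append a consent decision instead of editing history."""
        self.events.append(ConsentEvent(purpose, granted))

    def current_state(self) -> Dict[str, bool]:
        """Latest decision per purpose, e.g. to identify willing participants."""
        state: Dict[str, bool] = {}
        for event in self.events:          # events are in chronological order
            state[event.purpose] = event.granted
        return state

# Usage: a participant consents to two purposes, then later withdraws one.
record = ParticipantConsent("P-001")
record.record("genetic_analysis", True)
record.record("data_sharing_external", True)
record.record("data_sharing_external", False)   # withdrawal
print(record.current_state())
# {'genetic_analysis': True, 'data_sharing_external': False}
```

Keeping the full event history, rather than only the latest answer, is one simple way such systems could support both withdrawal of consent and later auditing of what a participant had agreed to at any given time.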

Competency of the patient

The ability to give informed consent is governed by a general requirement of competency. In common law jurisdictions, adults are presumed competent to consent. This presumption can be rebutted, for instance, in circumstances of mental illness or other incompetence. This may be prescribed in legislation or based on a common-law standard of inability to understand the nature of the procedure. In cases of incompetent adults, a health care proxy makes medical decisions. In the absence of a proxy, the medical practitioner is expected to act in the patient's best interests until a proxy can be found.

By contrast, 'minors' (which may be defined differently in different jurisdictions) are generally presumed incompetent to consent, but depending on their age and other factors may be required to provide informed assent. In some jurisdictions (e.g. much of the U.S.), this is a strict standard. In other jurisdictions (e.g. England, Australia, Canada), this presumption may be rebutted through proof that the minor is 'mature' (the 'Gillick standard'). In cases of incompetent minors, informed consent is usually required from the parent (rather than the 'best interests standard'), although a parens patriae order may apply, allowing the court to dispense with parental consent in cases of refusal.

Deception

Research involving deception is controversial given the requirement for informed consent. Deception typically arises in social psychology, when researching a particular psychological process requires that investigators deceive subjects. For example, in the Milgram experiment, researchers wanted to determine the willingness of participants to obey authority figures despite their personal conscientious objections. They had authority figures demand that participants deliver what they thought was an electric shock to another research participant. For the study to succeed, it was necessary to deceive the participants so they believed that the subject was a peer and that their electric shocks caused the peer actual pain.

Nonetheless, research involving deception prevents subjects from exercising their basic right of autonomous informed decision-making and conflicts with the ethical principle of respect for persons.

The Ethical Principles of Psychologists and Code of Conduct set by the American Psychological Association says that psychologists may conduct research that includes a deceptive component only if they can both justify the act by the value and importance of the study's results and show they could not obtain the results in some other way. Moreover, the research should bear no potential harm to the subject as an outcome of the deception, whether physical pain or emotional distress. Finally, the code requires a debriefing session in which the experimenter both tells the subject about the deception and gives the subject the option of withdrawing the data.

Abortion

In some U.S. states, informed consent laws (sometimes called "right to know" laws) require that a woman seeking an elective abortion receive information from the abortion provider about her legal rights, alternatives to abortion (such as adoption), available public and private assistance, and other information specified in the law, before the abortion is performed. Other countries with such laws (e.g. Germany) require that the information giver be properly certified to make sure that no abortion is carried out for the financial gain of the abortion provider and to ensure that the decision to have an abortion is not swayed by any form of incentive.

Some informed consent laws have been criticized for allegedly using "loaded language in an apparently deliberate attempt to 'personify' the fetus," but those critics acknowledge that "most of the information in the [legally mandated] materials about abortion comports with recent scientific findings and the principles of informed consent", although "some content is either misleading or altogether incorrect."

Within research

Informed consent is part of ethical clinical research as well: a human subject voluntarily confirms his or her willingness to participate in a particular clinical trial after having been informed of all aspects of the trial that are relevant to the decision to participate. Informed consent is documented by means of a written, signed, and dated informed consent form. In medical research, the Nuremberg Code, drawn up in 1947 in response to the ethical violations of Nazi human experimentation, set a base international standard that has continued to develop. Nowadays, medical research is overseen by an ethics committee that also oversees the informed consent process.

As the medical guidelines established in the Nuremberg Code were imported into the ethical guidelines for the social sciences, informed consent became a common part of the research procedure. However, while informed consent is the default in medical settings, it is not always required in the social sciences. First, research here often involves low or no risk for participants, unlike in many medical experiments. Second, the mere knowledge that they are participating in a study can cause people to alter their behavior, as in the Hawthorne effect: "In the typical lab experiment, subjects enter an environment in which they are keenly aware that their behavior is being monitored, recorded, and subsequently scrutinized." In such cases, seeking informed consent directly interferes with the ability to conduct the research, because the very act of revealing that a study is being conducted is likely to alter the behavior studied. List exemplifies the potential dilemma that can result: "if one were interested in exploring whether, and to what extent, race or gender influences the prices that buyers pay for used cars, it would be difficult to measure accurately the degree of discrimination among used car dealers who know that they are taking part in an experiment." In cases where such interference is likely, and after careful consideration, a researcher may forgo the informed consent process. This is commonly done after weighing the risk to study participants against the benefit to society, and after considering whether participants take part of their own volition and are treated fairly. Researchers often consult with an ethics committee or institutional review board to render a decision.

The rise of new online media, such as social media, has complicated the idea of informed consent. In an online environment people pay little attention to Terms of Use agreements and can subject themselves to research without thorough knowledge. This issue came to public light following a study conducted by Facebook Inc. in 2014 and published by that company and Cornell University. Facebook conducted a study in which it altered the News Feeds of roughly 700,000 users to reduce either the amount of positive or negative posts they saw for a week, and then analyzed whether the users' status updates changed during the different conditions. The study was published in the Proceedings of the National Academy of Sciences. The lack of informed consent led to outrage among many researchers and users. Many believed that by potentially altering the mood of users through the posts they saw, Facebook put at-risk individuals at greater risk of depression and suicide. Supporters of Facebook, however, point out that Facebook's terms of use state that it has the right to use information for research. Others say the experiment is just part of Facebook's ordinary work, which continually alters News Feed algorithms to keep people interested and coming back to the site. Others pointed out that this specific study is not alone: news organizations constantly try out different headlines, using algorithms to elicit emotions and garner clicks or Facebook shares, so this Facebook study is no different from things people already accept. Still others say that Facebook broke the law by conducting the experiment on users who did not give informed consent. The Facebook study controversy raises numerous questions about informed consent and the differences in the ethical review process between publicly and privately funded research. Some say Facebook was within its limits, while others see the need for more informed consent and/or the establishment of in-house private review boards.

Conflicts of interest

Other, long-standing controversies underscore the role of conflicts of interest among medical school faculty and researchers. For example, 2014 coverage of University of California (UC) medical school faculty members included news of ongoing corporate payments to researchers and practitioners from companies that market and produce the very devices and treatments they recommend to patients. Robert Pedowitz, the former chairman of UCLA's orthopedic surgery department, reported concern that his colleagues' financial conflicts of interest could negatively affect patient care or research into new treatments. In a subsequent lawsuit over whistleblower retaliation, the university provided a $10 million settlement to Pedowitz while acknowledging no wrongdoing. Consumer Watchdog, an oversight group, observed that University of California policies were "either inadequate or unenforced...Patients in UC hospitals deserve the most reliable surgical devices and medication…and they shouldn't be treated as subjects in expensive experiments." Other UC incidents include taking the eggs of women for implantation into other women without consent and injecting live bacteria into human brains, resulting in potentially premature deaths.

History

Informed consent documents in Spanish and English: Walter Reed authored these documents in 1900 for his research on yellow fever

Informed consent is a technical term first used by attorney Paul G. Gebhard in a United States medical malpractice court case in 1957. In tracing its history, some scholars have suggested checking for any of these practices:

  1. A patient agrees to a health intervention based on an understanding of it.
  2. The patient has multiple choices and is not compelled to choose a particular one.
  3. The consent includes giving permission.

These practices are part of what constitutes informed consent, and their history is the history of informed consent. They combine to form the modern concept of informed consent—which rose in response to particular incidents in modern research. Whereas various cultures in various places practiced informed consent, the modern concept of informed consent was developed by people who drew influence from Western tradition.

Medical history

In this Ottoman Empire document from 1539, a father promises not to sue a surgeon in case of death following the removal of his son's urinary stones.

Historians cite a series of medical guidelines to trace the history of informed consent in medical practice.

The Hippocratic Oath, a Greek text dating to 500 B.C.E., was the first set of Western writings giving guidelines for the conduct of medical professionals. Consent by patients, along with several other issues now considered fundamental, is not mentioned. The Hippocratic Corpus advises that physicians conceal most information from patients in order to give them the best care. The rationale is a beneficence model of care: the doctor knows better than the patient, and therefore should direct the patient's care, because the patient is not likely to have better ideas than the doctor.

Henri de Mondeville, a French surgeon, wrote about medical practice in the 14th century. He traced his ideas to the Hippocratic Oath. Among his recommendations was that doctors "promise a cure to every patient" in hopes that a good prognosis would inspire a good outcome to treatment. Mondeville never mentioned obtaining consent, but did emphasize the need for the patient to have confidence in the doctor. He also advised that when deciding therapeutically unimportant details, the doctor should meet the patients' requests "so far as they do not interfere with treatment".

Ottoman Empire records contain an agreement from 1539 in which a father negotiates the details of a surgery, including the fee and a commitment not to sue in case of death. This is the oldest identified written document in which a patient acknowledges the risk of medical treatment and expresses in writing the willingness to proceed.

Benjamin Rush was an 18th-century United States physician who was influenced by the Age of Enlightenment cultural movement. Because of this, he advised that doctors ought to share as much information as possible with patients. He recommended that doctors educate the public and respect a patient's informed decision to accept therapy. There is no evidence that he supported seeking consent from patients. In a lecture titled "On the duties of patients to their physicians", he stated that patients should be strictly obedient to the physician's orders; this was representative of much of his writings. John Gregory, Rush's teacher, expressed similar views, writing that a doctor could best practice beneficence by making decisions for patients without their consent.

Thomas Percival was a British physician who published a book called Medical Ethics in 1803. Percival was a student of the works of Gregory and various earlier Hippocratic physicians. Like all previous works, Percival's Medical Ethics makes no mention of soliciting the consent of patients or respecting their decisions. Percival said that patients have a right to truth, but when the physician could provide better treatment by lying or withholding information, he advised that the physician do as he thought best.

When the American Medical Association was founded in 1847, it produced the first edition of the American Medical Association Code of Medical Ethics. Many sections of this book are verbatim copies of passages from Percival's Medical Ethics. A new concept in this book was the idea that physicians should fully and truthfully disclose all patient details when talking to other physicians, but the text does not apply this idea to disclosing information to patients. Through this text, Percival's ideas became pervasive guidelines throughout the United States as other texts were derived from them.

Worthington Hooker was an American physician who in 1849 published Physician and Patient. This medical ethics book was radical: it demonstrated an understanding of the AMA's guidelines and of Percival's philosophy, yet soundly rejected all directives that a doctor should lie to patients. In Hooker's view, benevolent deception is not fair to the patient, and he lectured widely on this topic. Hooker's ideas were not broadly influential.

The 1972 US case Canterbury v. Spence established the principle of informed consent in US law. Earlier legal cases had created the underpinnings for informed consent, but this judgment gave a detailed and well-reasoned discourse on the matter. The judgment cites cases going back to 1914 as precedent for informed consent.

Research history

Historians cite a series of human subject research experiments to trace the history of informed consent in research.

The U.S. Army Yellow Fever Commission "is considered the first research group in history to use consent forms." In 1900, Major Walter Reed was appointed head of the four-man U.S. Army Yellow Fever Commission in Cuba that determined mosquitoes were the vector for yellow fever transmission. His earliest experiments were probably done without formal documentation of informed consent. In later experiments he obtained support from appropriate military and administrative authorities. He then drafted what is now "one of the oldest series of extant informed consent documents." The three surviving examples are in Spanish with English translations; two have an individual's signature and one is marked with an X.

Tearoom Trade is the name of a book by American psychologist Laud Humphreys. In it he describes his research into male homosexual acts. In conducting this research he never sought consent from his research subjects and other researchers raised concerns that he violated the right to privacy for research participants.

On January 29, 1951, shortly after the birth of her son Joseph, Henrietta Lacks entered Johns Hopkins Hospital in Baltimore with profuse bleeding. She was diagnosed with cervical cancer and was treated with inserts of radium tubes. During her radiation treatments for the tumor, two samples, one of healthy cells and the other of malignant cells, were removed from her cervix without her permission. Later that year, the 31-year-old Henrietta Lacks died from the cancer. Her cells were cultured to create the HeLa cell line, but the family was not informed until 1973, when scientists asked them for DNA samples after finding that HeLa had contaminated other samples. In 2013 researchers published the genome without the Lacks family's consent.

The Milgram experiment is the name of a 1961 experiment conducted by American psychologist Stanley Milgram. In the experiment, Milgram had an authority figure order research participants to commit a disturbing act of harming another person. After the experiment he revealed that he had deceived the participants and that they had not hurt anyone, but the research participants were upset at the experience of having taken part. The experiment raised broad discussion on the ethics of recruiting participants for research without giving them full information about the nature of the research.

Chester M. Southam injected HeLa cells into cancer patients and Ohio State Penitentiary inmates without informed consent to determine whether people could become immune to cancer and whether cancer could be transmitted.

New areas

With the growth of bioethics in the 21st century to include environmental sustainability, some authors, such as Cristina Richie, have proposed a "green consent". This would include information and education about the climate impact of pharmaceuticals (the carbon cost of medications) and climate change health hazards.

Ethics of informed consent

The principle of informed consent rests on the ethical standards of autonomy and self-determination. Effective informed consent includes information disclosure, voluntary choice, and decision-making capacity.

Capacity can be defined as the ability to understand and reason through the decision-making process so that a rational choice can be made about whether or not to consent to treatment.

Operator (computer programming)

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Operator_(computer_programmin...