Saturday, April 11, 2026

Evidence-based medicine

From Wikipedia, the free encyclopedia

Evidence-based medicine (EBM), sometimes known within healthcare as evidence-based practice (EBP), is "the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients. It means integrating individual clinical expertise with the best available external clinical evidence from systematic research." The aim of EBM is to integrate the experience of the clinician, the values of the patient, and the best available scientific information to guide decision-making about clinical management. The term was originally used to describe an approach to teaching the practice of medicine and improving decisions by individual physicians about individual patients.

The EBM Pyramid is a tool that helps in visualizing the hierarchy of evidence in medicine, from least authoritative, like expert opinions, to most authoritative, like systematic reviews.

Adoption of evidence-based medicine is necessary in a human rights-based approach to public health and a precondition for accessing the right to health.

Background, history, and definition

Medicine has a long history of scientific inquiry into the prevention, diagnosis, and treatment of human disease. In the 11th century AD, Avicenna, a Persian physician and philosopher, developed an approach to EBM that was mostly similar to current ideas and practices.

The concept of a controlled clinical trial was first described in 1662 by Jan Baptist van Helmont in reference to the practice of bloodletting. Wrote Van Helmont:

Let us take out of the Hospitals, out of the Camps, or from elsewhere, 200, or 500 poor People, that have fevers or Pleuritis. Let us divide them in Halfes, let us cast lots, that one halfe of them may fall to my share, and the others to yours; I will cure them without blood-letting and sensible evacuation; but you do, as ye know ... we shall see how many Funerals both of us shall have...

The first published report describing the conduct and results of a controlled clinical trial was by James Lind, a Scottish naval surgeon who conducted research on scurvy during his time aboard HMS Salisbury in the Channel Fleet, while patrolling the Bay of Biscay. Lind divided the sailors participating in his experiment into six groups, so that the effects of various treatments could be fairly compared. Lind found improvement in symptoms and signs of scurvy among the group of men treated with lemons or oranges. He published a treatise describing the results of this experiment in 1753.

An early critique of statistical methods in medicine was published in 1835, in Comptes Rendus de l'Académie des Sciences, Paris, by a man referred to as "Mr Civiale".

In 1990, Gordon Guyatt, then a young internal medicine residency coordinator at McMaster University, introduced a teaching method he initially termed "Scientific Medicine." This approach emphasized applying critical appraisal techniques directly to bedside clinical decision-making, building on the work of his mentor, David Sackett. However, the concept met resistance from colleagues, as it implied that existing clinical practices lacked scientific rigor, even though this was likely true. To address this, Guyatt rebranded the approach as "Evidence-Based Medicine", a term first formally introduced in a 1991 editorial in the ACP Journal Club. Although the name was coined in 1991, it took several more years and the concerted efforts of many other teams to define the foundations of the method.

Although more popular in medicine, the concept of "evidence-based" is spreading to other disciplines, such as the humanities, and to languages other than English, albeit at a slower pace.

Clinical decision-making

Alvan Feinstein's publication of Clinical Judgment in 1967 focused attention on the role of clinical reasoning and identified biases that can affect it. In 1972, Archie Cochrane published Effectiveness and Efficiency, which described the lack of controlled trials supporting many practices that had previously been assumed to be effective. In 1973, John Wennberg began to document wide variations in how physicians practiced. Through the 1980s, David M. Eddy described errors in clinical reasoning and gaps in evidence. In the mid-1980s, Alvan Feinstein, David Sackett and others published textbooks on clinical epidemiology, which translated epidemiological methods to physician decision-making. Toward the end of the 1980s, a group at RAND showed that large proportions of procedures performed by physicians were considered inappropriate even by the standards of their own experts.

Evidence-based guidelines and policies

David M. Eddy first began to use the term 'evidence-based' in 1987 in workshops and a manual commissioned by the Council of Medical Specialty Societies to teach formal methods for designing clinical practice guidelines. The manual was eventually published by the American College of Physicians. Eddy first published the term 'evidence-based' in March 1990, in an article in the Journal of the American Medical Association (JAMA) that laid out the principles of evidence-based guidelines and population-level policies, which Eddy described as "explicitly describing the available evidence that pertains to a policy and tying the policy to evidence instead of standard-of-care practices or the beliefs of experts. The pertinent evidence must be identified, described, and analyzed. The policymakers must determine whether the policy is justified by the evidence. A rationale must be written." He discussed evidence-based policies in several other papers published in JAMA in the spring of 1990. Those papers were part of a series of 28 published in JAMA between 1990 and 1997 on formal methods for designing population-level guidelines and policies.

Medical education

The term 'evidence-based medicine' was introduced slightly later, in the context of medical education. In the autumn of 1990, Gordon Guyatt used it in an unpublished description of a program at McMaster University for prospective or new medical students. Guyatt and others first published the term two years later (1992) to describe a new approach to teaching the practice of medicine.

In 1996, David Sackett and colleagues clarified the definition of this tributary of evidence-based medicine as "the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients. ... [It] means integrating individual clinical expertise with the best available external clinical evidence from systematic research." This branch of evidence-based medicine aims to make individual decision making more structured and objective by better reflecting the evidence from research. Population-based data are applied to the care of an individual patient, while respecting the fact that practitioners have clinical expertise reflected in effective and efficient diagnosis and thoughtful identification and compassionate use of individual patients' predicaments, rights, and preferences.

Between 1993 and 2000, the Evidence-Based Medicine Working Group at McMaster University published the methods to a broad physician audience in a series of 25 "Users' Guides to the Medical Literature" in JAMA. In 1995 Rosenberg and Donald defined individual-level, evidence-based medicine as "the process of finding, appraising, and using contemporaneous research findings as the basis for medical decisions." In 2010, Greenhalgh used a definition that emphasized quantitative methods: "the use of mathematical estimates of the risk of benefit and harm, derived from high-quality research on population samples, to inform clinical decision-making in the diagnosis, investigation or management of individual patients."

The two original definitions highlight important differences in how evidence-based medicine is applied to populations versus individuals. When designing guidelines applied to large groups of people in settings with relatively little opportunity for modification by individual physicians, evidence-based policymaking emphasizes that good evidence should exist to document a test's or treatment's effectiveness. In the setting of individual decision-making, practitioners can be given greater latitude in how they interpret research and combine it with their clinical judgment. In 2005, Eddy offered an umbrella definition for the two branches of EBM: "Evidence-based medicine is a set of principles and methods intended to ensure that to the greatest extent possible, medical decisions, guidelines, and other types of policies are based on and consistent with good evidence of effectiveness and benefit."

Progress

In the area of evidence-based guidelines and policies, the explicit insistence on evidence of effectiveness was introduced by the American Cancer Society in 1980. The U.S. Preventive Services Task Force (USPSTF) began issuing guidelines for preventive interventions based on evidence-based principles in 1984. In 1985, the Blue Cross Blue Shield Association applied strict evidence-based criteria for covering new technologies. Beginning in 1987, specialty societies such as the American College of Physicians, and voluntary health organizations such as the American Heart Association, wrote many evidence-based guidelines. In 1991, Kaiser Permanente, a managed care organization in the US, began an evidence-based guidelines program. In 1991, Richard Smith wrote an editorial in the British Medical Journal and introduced the ideas of evidence-based policies in the UK. In 1993, the Cochrane Collaboration created a network of 13 countries to produce systematic reviews and guidelines. In 1997, the US Agency for Healthcare Research and Quality (AHRQ, then known as the Agency for Health Care Policy and Research, or AHCPR) established Evidence-based Practice Centers (EPCs) to produce evidence reports and technology assessments to support the development of guidelines. In the same year, a National Guideline Clearinghouse that followed the principles of evidence-based policies was created by AHRQ, the AMA, and the American Association of Health Plans (now America's Health Insurance Plans). In 1999, the National Institute for Clinical Excellence (NICE) was created in the UK to circulate evidence and guidance on treatments within the NHS.

In the area of medical education, medical schools in Canada, the US, the UK, Australia, and other countries now offer programs that teach evidence-based medicine. A 2009 study of UK programs found that more than half of UK medical schools offered some training in evidence-based medicine, although the methods and content varied considerably, and EBM teaching was restricted by lack of curriculum time, trained tutors and teaching materials. Many programs have been developed to help individual physicians gain better access to evidence. For example, UpToDate was created in the early 1990s. The Cochrane Collaboration began publishing evidence reviews in 1993. In 1995, BMJ Publishing Group launched Clinical Evidence, a 6-monthly periodical that provided brief summaries of the current state of evidence about important clinical questions for clinicians.

Current practice

By 2000, use of the term evidence-based had extended to other levels of the health care system. An example is evidence-based health services, which seek to increase the competence of health service decision makers and the practice of evidence-based medicine at the organizational or institutional level.

The multiple tributaries of evidence-based medicine share an emphasis on the importance of incorporating evidence from formal research in medical policies and decisions. However, because they differ on the extent to which they require good evidence of effectiveness before promoting a guideline or payment policy, a distinction is sometimes made between evidence-based medicine and science-based medicine, which also takes into account factors such as prior plausibility and compatibility with established science (as when medical organizations promote controversial treatments such as acupuncture). Differences also exist regarding the extent to which it is feasible to incorporate individual-level information in decisions. Thus, evidence-based guidelines and policies may not readily "hybridise" with experience-based practices orientated towards ethical clinical judgement, and can lead to contradictions, contest, and unintended crises. The most effective "knowledge leaders" (managers and clinical leaders) use a broad range of management knowledge in their decision making, rather than just formal evidence. Evidence-based guidelines may provide the basis for governmentality in health care, and consequently play a central role in the governance of contemporary health care systems.

Methods

Steps

The steps for designing explicit, evidence-based guidelines were described in the late 1980s:

  • Formulate the question (population, intervention, comparison intervention, outcomes, time horizon, setting).
  • Search the literature to identify studies that inform the question.
  • Interpret each study to determine precisely what it says about the question.
  • If several studies address the question, synthesize their results (meta-analysis).
  • Summarize the evidence in evidence tables.
  • Compare the benefits, harms, and costs in a balance sheet.
  • Draw a conclusion about the preferred practice.
  • Write the guideline.
  • Write the rationale for the guideline.
  • Have others review each of the previous steps.
  • Implement the guideline.

For the purposes of medical education and individual-level decision making, five steps of EBM in practice were described in 1992 and the experience of delegates attending the 2003 Conference of Evidence-Based Health Care Teachers and Developers was summarized into five steps and published in 2005. This five-step process can broadly be categorized as follows:

  1. Translation of uncertainty to an answerable question; includes critical questioning, study design and levels of evidence
  2. Systematic retrieval of the best evidence available
  3. Critical appraisal of evidence for internal validity that can be broken down into aspects regarding:
    • Systematic errors as a result of selection bias, information bias and confounding
    • Quantitative aspects of diagnosis and treatment
    • The effect size and aspects regarding its precision
    • Clinical importance of results
    • External validity or generalizability
  4. Application of results in practice
  5. Evaluation of performance

Evidence reviews

Systematic reviews of published research studies are a major part of the evaluation of particular treatments. The Cochrane Collaboration is one of the best-known organisations that conducts systematic reviews. Like other producers of systematic reviews, it requires authors to provide a detailed study protocol as well as a reproducible plan of their literature search and evaluations of the evidence. After the best evidence is assessed, treatment is categorized as (1) likely to be beneficial, (2) likely to be harmful, or (3) without evidence to support either benefit or harm.

A 2007 analysis of 1,016 systematic reviews from all 50 Cochrane Collaboration Review Groups found that 44% of the reviews concluded that the intervention was likely to be beneficial, 7% concluded that the intervention was likely to be harmful, and 49% concluded that evidence did not support either benefit or harm. 96% recommended further research. In 2017, a study assessed the role of systematic reviews produced by the Cochrane Collaboration in informing US private payers' policymaking; it showed that although the medical policy documents of major US private payers were informed by Cochrane systematic reviews, there was still scope to encourage their further use.

Assessing the quality of evidence

Evidence-based medicine categorizes different types of clinical evidence and rates or grades them according to the strength of their freedom from the various biases that beset medical research. For example, the strongest evidence for therapeutic interventions is provided by systematic review of randomized, well-blinded, placebo-controlled trials with allocation concealment and complete follow-up involving a homogeneous patient population and medical condition. In contrast, patient testimonials, case reports, and even expert opinion have little value as proof because of the placebo effect, the biases inherent in observation and reporting of cases, and difficulties in ascertaining who is an expert (however, some critics have argued that expert opinion "does not belong in the rankings of the quality of empirical evidence because it does not represent a form of empirical evidence" and continue that "expert opinion would seem to be a separate, complex type of knowledge that would not fit into hierarchies otherwise limited to empirical evidence alone.").

Several organizations have developed grading systems for assessing the quality of evidence. For example, in 1989 the U.S. Preventive Services Task Force (USPSTF) put forth the following system:

  • Level I: Evidence obtained from at least one properly designed randomized controlled trial.
  • Level II-1: Evidence obtained from well-designed controlled trials without randomization.
  • Level II-2: Evidence obtained from well-designed cohort studies or case-control studies, preferably from more than one center or research group.
  • Level II-3: Evidence obtained from multiple time series designs with or without the intervention. Dramatic results in uncontrolled trials might also be regarded as this type of evidence.
  • Level III: Opinions of respected authorities, based on clinical experience, descriptive studies, or reports of expert committees.

Another example is the Oxford CEBM Levels of Evidence published by the Centre for Evidence-Based Medicine. First released in September 2000, the Levels of Evidence provide a way to rank evidence for claims about prognosis, diagnosis, treatment benefits, treatment harms, and screening, which most grading schemes do not address. The original CEBM Levels were created for Evidence-Based On Call, to make the process of finding evidence feasible and its results explicit. In 2011, an international team redesigned the Oxford CEBM Levels to make them more understandable and to take into account recent developments in evidence ranking schemes. The Oxford CEBM Levels of Evidence have been used by patients and clinicians, as well as by experts to develop clinical guidelines, such as recommendations for the optimal use of phototherapy and topical therapy in psoriasis and guidelines for the use of the BCLC staging system for diagnosing and monitoring hepatocellular carcinoma in Canada.

In 2000, a system was developed by the Grading of Recommendations Assessment, Development and Evaluation (GRADE) working group. The GRADE system takes into account more dimensions than just the quality of medical research. It requires users who are performing an assessment of the quality of evidence, usually as part of a systematic review, to consider the impact of different factors on their confidence in the results. Authors of GRADE tables assign one of four levels to evaluate the quality of evidence, on the basis of their confidence that the observed effect (a numeric value) is close to the true effect. The confidence value is based on judgments assigned in five different domains in a structured manner. The GRADE working group defines 'quality of evidence' and 'strength of recommendations' as two different concepts that are commonly confused with each other.

Systematic reviews may include randomized controlled trials that have low risk of bias, or observational studies that have high risk of bias. In the case of randomized controlled trials, the quality of evidence is high but can be downgraded in five different domains.

  • Risk of bias: A judgment made on the basis of the chance that bias in included studies has influenced the estimate of effect.
  • Imprecision: A judgment made on the basis of the chance that the observed estimate of effect could change completely.
  • Indirectness: A judgment made on the basis of the differences in characteristics of how the study was conducted and how the results are actually going to be applied.
  • Inconsistency: A judgment made on the basis of the variability of results across the included studies.
  • Publication bias: A judgment made on the basis of the question whether all the research evidence has been taken into account.

In the case of observational studies per GRADE, the quality of evidence starts off lower and may be upgraded in three domains in addition to being subject to downgrading.

  • Large effect: Methodologically strong studies show that the observed effect is so large that the probability of it changing completely is less likely.
  • Plausible confounding would change the effect: Despite the presence of a possible confounding factor that is expected to reduce the observed effect, the effect estimate still shows significant effect.
  • Dose response gradient: The intervention used becomes more effective with increasing dose. This suggests that a further increase will likely bring about more effect.

Meaning of the levels of quality of evidence as per GRADE:

  • High Quality Evidence: The authors are very confident that the presented estimate lies very close to the true value. In other words, the probability is very low that further research will completely change the presented conclusions.
  • Moderate Quality Evidence: The authors are confident that the presented estimate lies close to the true value, but it is also possible that it may be substantially different. In other words, further research may completely change the conclusions.
  • Low Quality Evidence: The authors are not confident in the effect estimate, and the true value may be substantially different. In other words, further research is likely to change the presented conclusions completely.
  • Very Low Quality Evidence: The authors do not have any confidence in the estimate and it is likely that the true value is substantially different from it. In other words, new research will probably change the presented conclusions completely.
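The downgrade-and-upgrade logic described above can be sketched in a few lines of code. This is my own simplification for illustration, not an official GRADE tool: evidence from randomized trials starts "high" and observational evidence starts "low", and the level shifts by the net number of downgrades (risk of bias, imprecision, indirectness, inconsistency, publication bias) and upgrades (large effect, plausible confounding, dose-response gradient):

```python
LEVELS = ["very low", "low", "moderate", "high"]

def grade_quality(randomized: bool, downgrades: int = 0, upgrades: int = 0) -> str:
    """Start at 'high' for RCTs, 'low' for observational studies,
    then shift by the net number of downgrades and upgrades,
    clamped to the defined range of levels."""
    start = 3 if randomized else 1          # index into LEVELS
    index = max(0, min(3, start - downgrades + upgrades))
    return LEVELS[index]

# An RCT downgraded once, e.g. for imprecision:
print(grade_quality(randomized=True, downgrades=1))    # moderate
# An observational study upgraded once, e.g. for a large effect:
print(grade_quality(randomized=False, upgrades=1))     # moderate
```

In real GRADE assessments each domain judgment is qualitative and may move the level by one or two steps, so the simple counter above only captures the overall direction of the scheme.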

Categories of recommendations

In guidelines and other publications, recommendation for a clinical service is classified by the balance of risk versus benefit and the level of evidence on which this information is based. The U.S. Preventive Services Task Force uses the following system:

  • Level A: Good scientific evidence suggests that the benefits of the clinical service substantially outweigh the potential risks. Clinicians should discuss the service with eligible patients.
  • Level B: At least fair scientific evidence suggests that the benefits of the clinical service outweigh the potential risks. Clinicians should discuss the service with eligible patients.
  • Level C: At least fair scientific evidence suggests that the clinical service provides benefits, but the balance between benefits and risks is too close for general recommendations. Clinicians need not offer it unless individual considerations apply.
  • Level D: At least fair scientific evidence suggests that the risks of the clinical service outweigh potential benefits. Clinicians should not routinely offer the service to asymptomatic patients.
  • Level I: Scientific evidence is lacking, of poor quality, or conflicting, such that the risk versus benefit balance cannot be assessed. Clinicians should help patients understand the uncertainty surrounding the clinical service.

GRADE guideline panelists may make strong or weak recommendations on the basis of further criteria. Some of the important criteria are the balance between desirable and undesirable effects (not considering cost), the quality of the evidence, values and preferences, and costs (resource utilization).

Despite the differences between systems, the purposes are the same: to guide users of clinical research information on which studies are likely to be most valid. However, the individual studies still require careful critical appraisal.

Statistical measures

Evidence-based medicine attempts to express clinical benefits of tests and treatments using mathematical methods. Tools used by practitioners of evidence-based medicine include:

  • Likelihood ratio: The pre-test odds of a particular diagnosis, multiplied by the likelihood ratio, determine the post-test odds. (Odds can be calculated from, and then converted to, the [more familiar] probability.) This reflects Bayes' theorem. The differences in likelihood ratio between clinical tests can be used to prioritize clinical tests according to their usefulness in a given clinical situation.
  • AUC-ROC: The area under the receiver operating characteristic curve (AUC-ROC) reflects the relationship between sensitivity and specificity for a given test. High-quality tests will have an AUC-ROC approaching 1, and high-quality publications about clinical tests will provide information about the AUC-ROC. Cutoff values for positive and negative tests can influence specificity and sensitivity, but they do not affect AUC-ROC.
  • Number needed to treat (NNT) / Number needed to harm (NNH): NNT and NNH express the effectiveness and safety, respectively, of interventions in a clinically meaningful way. NNT is the number of people who need to be treated in order to achieve the desired outcome (e.g. survival from cancer) in one additional patient. For example, if a treatment increases the chance of survival by 5 percentage points, then 20 people need to be treated in order for 1 additional patient to survive because of the treatment. The concept can also be applied to diagnostic tests. For example, if 1,339 women aged 50–59 need to be invited for breast cancer screening over a ten-year period in order to prevent one woman from dying of breast cancer, then the NNT for being invited to breast cancer screening is 1,339.
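The arithmetic behind two of these measures is simple enough to sketch directly; the numbers below are illustrative, not drawn from any particular study:

```python
def posttest_probability(pretest_prob: float, likelihood_ratio: float) -> float:
    """Bayes' theorem in odds form: post-test odds = pre-test odds x LR.
    Converts probability to odds, applies the ratio, converts back."""
    pretest_odds = pretest_prob / (1 - pretest_prob)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1 + posttest_odds)

def number_needed_to_treat(control_event_rate: float, treated_event_rate: float) -> float:
    """NNT is the reciprocal of the absolute risk reduction (ARR)."""
    arr = control_event_rate - treated_event_rate
    return 1 / arr

# A test with a positive likelihood ratio of 10, applied where the
# pre-test probability of disease is 20%:
print(round(posttest_probability(0.20, 10), 2))          # 0.71

# A treatment that improves survival from 70% to 75%
# (i.e. reduces mortality from 30% to 25%, a 5-point ARR):
print(round(number_needed_to_treat(0.30, 0.25)))         # 20
```

The first example shows why likelihood ratios are useful at the bedside: a strongly positive test moves a 20% suspicion to roughly 71%, and the same multiplication works for any pre-test probability.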

Quality of clinical trials

Evidence-based medicine attempts to objectively evaluate the quality of clinical research by critically assessing techniques reported by researchers in their publications.

  • Trial design considerations: High-quality studies have clearly defined eligibility criteria and have minimal missing data.
  • Generalizability considerations: Studies may only be applicable to narrowly defined patient populations and may not be generalizable to other clinical contexts.
  • Follow-up: Whether sufficient time is allowed for defined outcomes to occur can influence the observed outcomes of a prospective study and its statistical power to detect differences between a treatment and control arm.
  • Power: A mathematical calculation can determine whether the number of patients is sufficient to detect a difference between treatment arms. A negative study may reflect a lack of benefit, or simply a lack of sufficient quantities of patients to detect a difference.

Limitations and criticism

There are a number of limitations and criticisms of evidence-based medicine. Two widely cited categorization schemes for the various published critiques of EBM include the three-fold division of Straus and McAlister ("limitations universal to the practice of medicine, limitations unique to evidence-based medicine and misperceptions of evidence-based medicine") and the five-point categorization of Cohen, Stavri and Hersh (EBM is a poor philosophic basis for medicine, defines evidence too narrowly, is not evidence-based, is limited in usefulness when applied to individual patients, or reduces the autonomy of the doctor/patient relationship).

In no particular order, some published objections include:

  • Research produced by EBM, such as from randomized controlled trials (RCTs), may not be relevant for all treatment situations. Research tends to focus on specific populations, but individual persons can vary substantially from population norms. Because certain population segments have been historically under-researched (due to reasons such as race, gender, age, and co-morbid diseases), evidence from RCTs may not be generalizable to those populations. Thus, EBM applies to groups of people, but this should not preclude clinicians from using their personal experience in deciding how to treat each patient. One author advises that "the knowledge gained from clinical research does not directly answer the primary clinical question of what is best for the patient at hand" and suggests that evidence-based medicine should not discount the value of clinical experience. Another author stated that "the practice of evidence-based medicine means integrating individual clinical expertise with the best available external clinical evidence from systematic research."
  • Use of evidence-based guidelines often fits poorly for complex, multimorbid patients. This is because the guidelines are usually based on clinical studies focused on single diseases. In reality, the recommended treatments in such circumstances may interact unfavorably with each other and often lead to polypharmacy.
  • The theoretical ideal of EBM (that every narrow clinical question, of which hundreds of thousands can exist, would be answered by meta-analysis and systematic reviews of multiple RCTs) faces the limitation that research (especially the RCTs themselves) is expensive; thus, in reality, for the foreseeable future, the demand for EBM will always be much higher than the supply, and the best humanity can do is to triage the application of scarce resources.
  • Research can be influenced by biases such as political or belief bias, publication bias, and conflict of interest in academic publishing. For example, studies with conflicts due to industry funding are more likely to favor their product. It has been argued that contemporary evidence-based medicine is an illusion, since it has been corrupted by corporate interests, failed regulation, and commercialisation of academia.
  • Systematic review methodologies are susceptible to bias and abuse with respect to (i) choice of inclusion criteria, (ii) choice of outcome measures, comparisons, and analyses, and (iii) the subjectivity that is inevitable in risk-of-bias assessments, even when codified procedures and criteria are observed. An example of all these problems can be seen in a Cochrane Review.
  • A lag exists between when the RCT is conducted and when its results are published.
  • A lag exists between when results are published and when they are properly applied.
  • Hypocognition (the absence of a simple, consolidated mental framework into which new information can be placed) can hinder the application of EBM.
  • Values: while patient values are considered in the original definition of EBM, the importance of values is not commonly emphasized in EBM training, a potential problem under current study.

A 2018 study, "Why all randomised controlled trials produce biased results", assessed the 10 most cited RCTs and argued that trials face a wide range of biases and constraints, from trials only being able to study a small set of questions amenable to randomisation and generally only being able to assess the average treatment effect of a sample, to limitations in extrapolating results to another context, among many others outlined in the study.

Application of evidence in clinical settings

Despite the emphasis on evidence-based medicine, unsafe or ineffective medical practices continue to be applied, because of patient demand for tests or treatments, because of failure to access information about the evidence, or because of the rapid pace of change in the scientific evidence. For example, between 2003 and 2017, the evidence shifted on hundreds of medical practices, including whether hormone replacement therapy was safe, whether babies should be given certain vitamins, and whether antidepressant drugs are effective in people with Alzheimer's disease. Even when the evidence unequivocally shows that a treatment is either not safe or not effective, it may take many years for other treatments to be adopted.

There are many factors that contribute to the lack of uptake or implementation of evidence-based recommendations. These include lack of awareness at the individual clinician or patient (micro) level, lack of institutional support at the organisational (meso) level, and lack of support at the policy (macro) level. In other cases, significant change can require a generation of physicians to retire or die and be replaced by physicians trained with more recent evidence.

Physicians may also reject evidence that conflicts with their anecdotal experience or because of cognitive biases – for example, a vivid memory of a rare but shocking outcome (the availability heuristic), such as a patient dying after refusing treatment. They may overtreat to "do something" or to address a patient's emotional needs. They may worry about malpractice charges based on a discrepancy between what the patient expects and what the evidence recommends. They may also overtreat or provide ineffective treatments because the treatment feels biologically plausible.

It is the responsibility of those developing clinical guidelines to include an implementation plan to facilitate uptake. The implementation process includes analysing the context, identifying barriers and facilitators, and designing strategies to address them.

Education

Training in evidence-based medicine is offered across the continuum of medical education. Educational competencies have been created for the education of health care professionals.

The Berlin questionnaire and the Fresno Test are validated instruments for assessing the effectiveness of education in evidence-based medicine. These questionnaires have been used in diverse settings.

A Campbell systematic review that included 24 trials examined the effectiveness of e-learning in improving evidence-based health care knowledge and practice. It was found that e-learning, compared to no learning, improves evidence-based health care knowledge and skills but not attitudes and behaviour. No difference in outcomes is present when comparing e-learning with face-to-face learning. Combining e-learning and face-to-face learning (blended learning) has a positive impact on evidence-based knowledge, skills, attitude and behavior. As a form of e-learning, some medical school students engage in editing Wikipedia to increase their EBM skills, and some students construct EBM materials to develop their skills in communicating medical knowledge.

Naturalistic fallacy


In metaethics, the naturalistic fallacy is the claim that it is possible to define good in terms of merely described entities, properties, or processes such as pleasant, desirable, or fitness. The term was introduced by British philosopher G. E. Moore in his 1903 book Principia Ethica.

Moore's naturalistic fallacy is closely related to the is–ought problem, which comes from David Hume's A Treatise of Human Nature (1739–40); however, unlike Hume's view of the is–ought problem, Moore (and other proponents of ethical non-naturalism) did not consider the naturalistic fallacy to be at odds with moral realism.

Common uses

The is–ought problem

The term naturalistic fallacy is sometimes used to label the problematic inference of an ought from an is (the is–ought problem). Michael Ridge relevantly elaborates that "[t]he intuitive idea is that evaluative conclusions require at least one evaluative premise—purely factual premises about the naturalistic features of things do not entail or even support evaluative conclusions." This problematic inference usually takes the form of saying that if people generally do something (e.g., eat three times a day, smoke cigarettes, dress warmly in cold weather), then people ought to do that thing. The naturalistic fallacy occurs when the is–ought inference ("People eat three times a day, so it is morally good for people to eat three times a day") is justified by the claim that whatever practice exists is a natural one ("because eating three times a day is pleasant and desirable").

Bentham, in discussing the relations of law and morality, found that when people discuss problems and issues they talk about how they wish things would be, instead of how things actually are. This can be seen in discussions of natural law and positive law. Bentham criticized natural law theory because in his view it was an instance of the naturalistic fallacy, claiming to describe how things are while actually describing how they ought to be.

Moore's discussion

The title page of Principia Ethica

According to G. E. Moore's Principia Ethica, when philosophers try to define good reductively, in terms of natural properties like pleasant or desirable, they are committing the naturalistic fallacy:

... the assumption that because some quality or combination of qualities invariably and necessarily accompanies the quality of goodness, or is invariably and necessarily accompanied by it, or both, this quality or combination of qualities is identical with goodness. If, for example, it is believed that whatever is pleasant is and must be good, or that whatever is good is and must be pleasant, or both, it is committing the naturalistic fallacy to infer from this that goodness and pleasantness are one and the same quality. The naturalistic fallacy is the assumption that because the words 'good' and, say, 'pleasant' necessarily describe the same objects, they must attribute the same quality to them.

— Arthur N. Prior, Logic And The Basis Of Ethics

In defense of ethical non-naturalism against ethical naturalism, Moore's argument is concerned with the semantic and metaphysical underpinnings of ethics. Moore argues that good, in the sense of intrinsic value, is simply ineffable. It cannot be defined because it is not reducible to other properties, being "one of those innumerable objects of thought which are themselves incapable of definition, because they are the ultimate terms by reference to which whatever 'is' capable of definition must be defined".[5] On the other hand, ethical naturalists eschew such principles in favor of a more empirically accessible analysis of what it means to be good: for example, in terms of pleasure in the context of hedonism.

That "pleased" does not mean "having the sensation of red", or anything else whatever, does not prevent us from understanding what it does mean. It is enough for us to know that "pleased" does mean "having the sensation of pleasure", and though pleasure is absolutely indefinable, though pleasure is pleasure and nothing else whatever, yet we feel no difficulty in saying that we are pleased. The reason is, of course, that when I say "I am pleased", I do not mean that "I" am the same thing as "having pleasure". And similarly no difficulty need be found in my saying that "pleasure is good" and yet not meaning that "pleasure" is the same thing as "good", that pleasure means good, and that good means pleasure. If I were to imagine that when I said "I am pleased", I meant that I was exactly the same thing as "pleased", I should not indeed call that a naturalistic fallacy, although it would be the same fallacy as I have called naturalistic with reference to Ethics.

— G. E. Moore, Principia Ethica § 12

In §7, Moore argues that a property is either a complex of simple properties, or else it is irreducibly simple. Complex properties can be defined in terms of their constituent parts but a simple property lacks parts. In addition to good and pleasure, Moore suggests that colour qualia are undefined: if one wants to understand yellow, one must see examples of it. It will do no good to read the dictionary and learn that yellow names the colour of egg yolks and ripe lemons, or that yellow names the primary colour between green and orange on the spectrum, or that the perception of yellow is stimulated by electromagnetic radiation with a wavelength of between 570 and 590 nanometers, because yellow is all that and more, by the open question argument.

Appeal to nature

Some people use the phrase naturalistic fallacy, or appeal to nature, in a different sense: to characterize inferences of the form "Something is natural; therefore, it is morally acceptable" or "This property is unnatural; therefore, this property is undesirable." Such inferences are common in discussions of medicine, homosexuality, environmentalism, and veganism.

The naturalistic fallacy is the idea that what is found in nature is good. It was the basis for social Darwinism, the belief that helping the poor and sick would get in the way of evolution, which depends on the survival of the fittest. Today, biologists denounce the naturalistic fallacy because they want to describe the natural world honestly, without people deriving morals about how we ought to behave (as in: If birds and beasts engage in adultery, infanticide, cannibalism, it must be OK).

Criticism

Bernard Williams called Moore's use of the term naturalistic fallacy a "spectacular misnomer", the matter in question being metaphysical, as opposed to rational.

Some philosophers reject the naturalistic fallacy or suggest solutions for the proposed is–ought problem.

Bound-up functions

Ralph McInerny suggests that ought is already bound up in is, insofar as the very nature of things has ends/goals within it. For example, a clock is a device used to keep time. When one understands the function of a clock, a standard of evaluation is implicit in the very description of the clock, i.e., because it is a clock, it ought to keep the time. Thus, if one cannot pick a good clock from a bad clock, then one does not really know what a clock is. In like manner, if one cannot determine good human action from bad, then one does not really know what the human person is.

Irrationality of anti-naturalistic fallacy

The belief that the naturalistic fallacy is inherently flawed has been criticized as lacking rational bases, and labelled the anti-naturalistic fallacy. For instance, Alex Walter wrote:

"The naturalistic fallacy and Hume's 'law' are frequently appealed to for the purpose of drawing limits around the scope of scientific inquiry into ethics and morality. These two objections are shown to be without force."

That is because said beliefs implicitly assert that there is no connection between the facts and the norms (in particular, between the facts and the mental process that led to adoption of the norms). However, some philosophers argue that these connections are inevitable.

A very basic example is that if people view rescuing people as morally correct, this will shape their beliefs about what constitutes danger and what situations warrant intervention. For a wider-ranging example: if one person believes that a certain ethnic group has a population-level statistical hereditary predisposition to destroy civilization while another person does not, that difference in beliefs about factual matters will lead the first person to conclude that persecution of that ethnic group is an excusable "necessary evil", while the second person will conclude that it is a totally unjustifiable evil.

Similarly, even if two people agree that it is evil to keep people working extremely hard in extreme poverty, they may draw different conclusions about the de facto rights (as opposed to purely semantic rights) of property owners, depending on whether they believe property owners are responsible for that exploitation. One who accepts this premise would conclude that it is necessary to persecute property owners to mitigate the exploitation; one who does not would conclude that such persecution is unnecessary and evil.

Inconsistent application

Some critics of the assumption that is–ought conclusions are fallacies point to observations that people who purport to treat such conclusions as fallacies do not do so consistently. Examples mentioned are evolutionary psychologists who complain about "the naturalistic fallacy" yet draw is–ought conclusions themselves, for instance when alleging that the notion of the blank slate would lead to totalitarian social engineering, or that certain views on sexuality would lead to attempts to convert homosexuals to heterosexuals. Critics point to this as a sign that charges of the naturalistic fallacy are an inconsistent rhetorical tactic rather than the detection of a genuine fallacy.

Universally normative allegations of varied harm

A criticism of the concept of the naturalistic fallacy is that while "descriptive" statements about specific differences in effects can be inverted depending on values, the statement "individual/group X is predisposed to harm whatever values others have" is universally normative against individual/group X. (Here "descriptive" is used in the broad sense of statements that purport to be about facts, whether true or false, as opposed to normative statements.) For example, the statement "people X are predisposed to eating babies" is normative against group X only in the context of protecting children, and the statement "individual or group X is predisposed to emit greenhouse gases" is normative against X only in the context of protecting the environment. By contrast, an allegation that individual/group X detects what other valuing entities are protecting and then destroys it, while having no values of its own, is normative under any set of values. Suppose one philosophy considers eating babies the worst evil and advocates greenhouse-gas-emitting industries to finance a safe short-term environment for children, while another philosophy considers long-term damage to the environment the worst evil and advocates eating babies to reduce overpopulation and, with it, the consumption that emits greenhouse gases. An individual/group X alleged to advocate both eating babies and building autonomous industries to maximize greenhouse gas emissions would make these two otherwise opposed philosophies allies against X as a "common enemy".
This principle, that allegations of an individual or group being predisposed to adapt their harm to damage any values (including the combined harm of apparently opposite values) inevitably carry normative implications regardless of what the specific values are, is argued to extend to any other situation with any other values, because the allegation is of the individual or group adapting their destruction to different values. This is mentioned as an example of at least one type of "descriptive" allegation that is bound to carry universally normative implications. The allegation is also not scientifically self-correcting, since individual or group X is alleged to manipulate others into supporting their all-destructive agenda, which dismisses any scientific criticism of the allegation as "part of the agenda that destroys everything". The objection that some values may condemn some specific ways of persecuting individual/group X is considered irrelevant, since different value systems would each have ways of acting against individuals or groups that they consider acceptable. This is pointed out as a falsifying counterexample to the claim that "no descriptive statement can in itself become normative".

Non-synonymous properties

In 1939, William Frankena critiqued G. E. Moore's conception of the naturalistic fallacy, claiming the concept was an instance of a definist fallacy. Frankena stated that, in arguing that good cannot be defined by natural properties, Moore was trying to avoid a broader confusion caused by attempting to define a term using non-synonymous properties.

Frankena also argued that naturalistic fallacy is a complete misnomer because it is neither limited to naturalistic properties nor necessarily a fallacy. On the first word (naturalistic), he noted that Moore rejected defining good in non-natural as well as natural terms. Frankena rejected the idea that the second word (fallacy) represented an error in reasoning – a fallacy as it is usually recognized – rather than an error in semantics.

In Moore's open-question argument, because questions such as "Is that which is pleasurable good?" have no definitive answer, pleasurable is not synonymous with good. Frankena rejected this argument: the fact that there is always an open question merely reflects that it makes sense to ask whether two things that may be identical in fact are identical. Thus, even if good were identical to pleasurable, it would still make sense to ask whether it is; the answer may be "yes", but the question was legitimate. This seems to contradict Moore's view, which accepts that alternative answers can sometimes be dismissed without argument; Frankena objects, however, that doing so would commit the fallacy of begging the question.

Atomic physics


Atomic physics is the field of physics that studies atoms as an isolated system of electrons and an atomic nucleus. Atomic physics typically refers to the study of atomic structure and the interaction between atoms. It is primarily concerned with the way in which electrons are arranged around the nucleus and the processes by which these arrangements change. This includes ions as well as neutral atoms; unless otherwise stated, the term atom should be taken to include ions.

The term atomic physics can be associated with nuclear power and nuclear weapons, due to the synonymous use of atomic and nuclear in standard English. Physicists distinguish between atomic physics—which deals with the atom as a system consisting of a nucleus and electrons—and nuclear physics, which studies nuclear reactions and special properties of atomic nuclei.

As with many scientific fields, strict delineation can be highly contrived, and atomic physics is often considered in the wider context of atomic, molecular, and optical (AMO) physics; research groups are usually classified accordingly.

Isolated atoms

Atomic physics primarily considers atoms in isolation. Atomic models consist of a single nucleus that may be surrounded by one or more bound electrons. It is not concerned with the formation of molecules (although much of the physics is identical), nor does it examine atoms in a solid state as condensed matter. It is concerned with processes such as ionization and excitation by photons or collisions with atomic particles.

While modelling atoms in isolation may not seem realistic, if one considers atoms in a gas or plasma then the time-scales for atom-atom interactions are huge in comparison to the atomic processes that are generally considered. This means that the individual atoms can be treated as if each were in isolation, as the vast majority of the time they are. By this consideration, atomic physics provides the underlying theory in plasma physics and atmospheric physics, even though both deal with very large numbers of atoms.

Electronic configuration

Electrons form notional shells around the nucleus. These are normally in a ground state but can be excited by the absorption of energy from light (photons), magnetic fields, or interaction with a colliding particle (typically ions or other electrons).

In the Bohr model, the transition of an electron from the n=3 shell to the n=2 shell is shown, where a photon is emitted. An electron from the n=2 shell must have been removed beforehand by ionization.

Electrons that populate a shell are said to be in a bound state. The energy necessary to remove an electron from its shell (taking it to infinity) is called the binding energy. Any quantity of energy absorbed by the electron in excess of this amount is converted to kinetic energy according to the conservation of energy. The atom is said to have undergone the process of ionization.

If the electron absorbs a quantity of energy less than the binding energy, it will be transferred to an excited state. After a certain time, the electron in an excited state will "jump" (undergo a transition) to a lower state. In a neutral atom, the system will emit a photon of the difference in energy, since energy is conserved.
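The emitted photon's energy follows directly from energy conservation. As a minimal numeric sketch, using hydrogen's Bohr-model levels E_n = -13.6 eV / n^2 and an approximate value of h·c (the constants and function names here are illustrative, not from the article):

```python
# Sketch: a downward electronic transition emits a photon carrying
# the energy difference between the two levels (energy conservation).

E1 = 13.6      # hydrogen ground-state binding energy, eV (assumed value)
HC = 1239.84   # h*c in eV*nm (approximate)

def level_energy(n):
    """Energy of the n-th hydrogen level in eV (negative = bound)."""
    return -E1 / n**2

def emitted_wavelength(n_initial, n_final):
    """Wavelength (nm) of the photon emitted in a downward transition."""
    delta_e = level_energy(n_initial) - level_energy(n_final)  # eV released
    return HC / delta_e

# The n=3 -> n=2 transition (Balmer-alpha) comes out near 656 nm,
# the familiar red line of hydrogen.
print(round(emitted_wavelength(3, 2), 1))  # → 656.4
```

The same function with the sign reversed would give the energy an electron must absorb to be excited upward between the same two levels.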

If an inner electron has absorbed more than the binding energy (so that the atom ionizes), then a more outer electron may undergo a transition to fill the inner orbital. In this case, a visible photon or a characteristic X-ray is emitted, or a phenomenon known as the Auger effect may take place, where the released energy is transferred to another bound electron, causing it to go into the continuum. The Auger effect allows one to multiply ionize an atom with a single photon.

There are rather strict selection rules as to the electronic configurations that can be reached by excitation by light; however, there are no such rules for excitation by collision processes.

Bohr model of the atom

The Bohr model, proposed by Niels Bohr in 1913, is a revolutionary theory describing the structure of the hydrogen atom. It introduced the idea of quantized orbits for electrons, combining classical and quantum physics.

Key Postulates of the Bohr Model

  1. Electrons Move in Circular Orbits
    • Electrons revolve around the nucleus in fixed, circular paths called orbits or energy levels.
    • These orbits are stable and do not radiate energy.
  2. Quantization of Angular Momentum:
    • The angular momentum of an electron is quantized and given by m·v·r = n·ħ, where:
      m: electron mass
      v: velocity of the electron
      r: radius of the orbit
      ħ: reduced Planck constant (ħ = h/2π)
      n: principal quantum number, representing the orbit
  3. Energy Levels
    • Each orbit has a specific energy. The total energy of an electron in the nth orbit is E_n = E_1/n^2, where E_1 = −13.6 eV is the ground-state energy of the hydrogen atom.
  4. Emission or Absorption of Energy
    • Electrons can transition between orbits by absorbing or emitting a photon with energy equal to the difference between the energy levels: ΔE = hν = E_i − E_f, where:
      h: the Planck constant.
      ν: frequency of emitted/absorbed radiation.
      E_i, E_f: initial and final energy levels.
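The postulates above can be sketched numerically. In this rough sketch (rounded SI constants; the function names are illustrative), angular-momentum quantization fixes the allowed orbit radii, and the Coulomb energy at those radii reproduces the −13.6 eV/n² energy ladder:

```python
# Bohr model of hydrogen: quantized orbits from m*v*r = n*hbar combined
# with the Coulomb force balance. Constants are rounded CODATA values.

HBAR = 1.054571817e-34      # reduced Planck constant, J*s
M_E = 9.1093837015e-31      # electron mass, kg
E_CHARGE = 1.602176634e-19  # elementary charge, C
K_E = 8.9875517923e9        # Coulomb constant, N*m^2/C^2

def orbit_radius(n):
    """Radius of the n-th Bohr orbit: r_n = n^2 * hbar^2 / (k * e^2 * m)."""
    return n**2 * HBAR**2 / (K_E * E_CHARGE**2 * M_E)

def level_energy_ev(n):
    """Total (kinetic + potential) energy of the n-th orbit in eV:
    E_n = -k * e^2 / (2 * r_n), divided by e to convert J -> eV."""
    return -K_E * E_CHARGE**2 / (2 * orbit_radius(n)) / E_CHARGE

print(orbit_radius(1))     # ≈ 5.29e-11 m (the Bohr radius)
print(level_energy_ev(1))  # ≈ -13.6 eV (ground state)
print(level_energy_ev(2))  # ≈ -3.4 eV
```

Note that r_n grows as n² while E_n shrinks as 1/n², so highly excited orbits are large and weakly bound.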

History and developments

One of the earliest steps towards atomic physics was the recognition that matter was composed of atoms. The idea forms a part of texts written from the 6th century BC to the 2nd century BC, such as those of Democritus or the Vaiśeṣika Sūtra written by Kaṇāda. This theory was later developed in the modern sense of the basic unit of a chemical element by the British chemist and physicist John Dalton in the early 19th century. At this stage, it was not clear what atoms were, although they could be described and classified by their properties (in bulk). The invention of the periodic system of elements by Dmitri Mendeleev was another great step forward.

The true beginning of atomic physics is marked by the discovery of spectral lines and attempts to describe the phenomenon, most notably by Joseph von Fraunhofer. The study of these lines led to the Bohr atom model and to the birth of quantum mechanics. In seeking to explain atomic spectra, an entirely new mathematical model of matter was revealed. As far as atoms and their electron shells were concerned, not only did this yield a better overall description, i.e. the atomic orbital model, but it also provided a new theoretical basis for chemistry (quantum chemistry) and spectroscopy.

Since the Second World War, both theoretical and experimental fields have advanced at a rapid pace. This can be attributed to progress in computing technology, which has allowed larger and more sophisticated models of atomic structure and associated collision processes. Similar technological advances in accelerators, detectors, magnetic field generation and lasers have greatly assisted experimental work.

Beyond the well-known phenomena which can be described with regular quantum mechanics, chaotic processes can occur which need different descriptions.

 

Genetically modified organism

https://en.wikipedia.org/wiki/Genetically_modified_organism

A genetically modified organism (GMO) is any organism whose genetic material has been altered using genetic engineering techniques. The exact definition of a genetically modified organism and what constitutes genetic engineering varies, with the most common being an organism altered in a way that "does not occur naturally by mating and/or natural recombination". A wide variety of organisms have been genetically modified (GM), including animals, plants, and microorganisms.

Genetic modification can include the introduction of new genes or enhancing, altering, or knocking out endogenous genes. In some genetic modifications, genes are transferred within the same species, across species (creating transgenic organisms), and even across kingdoms. Creating a genetically modified organism is a multi-step process. Genetic engineers must isolate the gene they wish to insert into the host organism and combine it with other genetic elements, including a promoter and terminator region and often a selectable marker. A number of techniques are available for inserting the isolated gene into the host genome. Recent advancements using genome editing techniques, notably CRISPR, have made the production of GMOs much simpler. Herbert Boyer and Stanley Cohen made the first genetically modified organism in 1973, a bacterium resistant to the antibiotic kanamycin. The first genetically modified animal, a mouse, was created in 1974 by Rudolf Jaenisch, and the first plant was produced in 1983. In 1994, the Flavr Savr tomato was released, the first commercialized genetically modified food. The first genetically modified animal to be commercialized was the GloFish (2003) and the first genetically modified animal to be approved for food use was the AquAdvantage salmon in 2015.

Bacteria are the easiest organisms to engineer and have been used for research, food production, industrial protein purification (including drugs), agriculture, and art. There is potential to use them for environmental purposes or as medicine. Fungi have been engineered with much the same goals. Viruses play an important role as vectors for inserting genetic information into other organisms. This use is especially relevant to human gene therapy. There are proposals to remove the virulent genes from viruses to create vaccines. Plants have been engineered for scientific research, to create new colors in plants, deliver vaccines, and to create enhanced crops. Genetically modified crops are publicly the most controversial GMOs, in spite of having the most human health and environmental benefits. Animals are generally much harder to transform and the vast majority are still at the research stage. Mammals are the best model organisms for humans. Livestock is modified with the intention of improving economically important traits such as growth rate, quality of meat, milk composition, disease resistance, and survival. Genetically modified fish are used for scientific research, as pets, and as a food source. Genetic engineering has been proposed as a way to control mosquitos, a vector for many deadly diseases. Although human gene therapy is still relatively new, it has been used to treat genetic disorders such as severe combined immunodeficiency and Leber's congenital amaurosis.

Concerns

Many objections have been raised over the development of GMOs, particularly their commercialization. Many of these involve GM crops and whether food produced from them is safe and what impact growing them will have on the environment. Other concerns are the objectivity and rigor of regulatory authorities, contamination of non-genetically modified food, control of the food supply, patenting of life, and the use of intellectual property rights. Although there is a scientific consensus that currently available food derived from GM crops poses no greater risk to human health than conventional food, GM food safety is a leading issue with critics. Gene flow, impact on non-target organisms, and escape are the major environmental concerns. Countries have adopted regulatory measures to deal with these concerns. There are differences in the regulation for the release of GMOs between countries, with some of the most marked differences occurring between the US and Europe. Key issues concerning regulators include whether GM food should be labeled and the status of gene-edited organisms.

Definition

The definition of a genetically modified organism (GMO) is not clear and varies widely between countries, international bodies, and other communities. At its broadest, the definition of a GMO can include anything that has had its genes altered, including by nature. Taking a less broad view, it can encompass every organism that has had its genes altered by humans, which would include all crops and livestock. In 1993, the Encyclopedia Britannica defined genetic engineering as "any of a wide range of techniques ... among them artificial insemination, in vitro fertilization (e.g., 'test-tube' babies), sperm banks, cloning, and gene manipulation." The European Union (EU) included a similarly broad definition in early reviews, specifically mentioning GMOs being produced by "selective breeding and other means of artificial selection". These definitions were promptly adjusted with a number of exceptions added as the result of pressure from scientific and farming communities, as well as developments in science. The EU definition later excluded traditional breeding, in vitro fertilization, induction of polyploidy, mutation breeding, and cell fusion techniques that do not use recombinant nucleic acids or a genetically modified organism in the process.

Another approach was the definition provided by the Food and Agriculture Organization, the World Health Organization, and the European Commission, stating that the organisms must be altered in a way that does "not occur naturally by mating and/or natural recombination".  Progress in science, such as the discovery of horizontal gene transfer being a relatively common natural phenomenon, further added to the confusion on what "occurs naturally", which led to further adjustments and exceptions. There are examples of crops that fit this definition, but are not normally considered GMOs. For example, the grain crop triticale was fully developed in a laboratory in 1930 using various techniques to alter its genome.

Genetically engineered organism (GEO) can be considered a more precise term compared to GMO when describing organisms' genomes that have been directly manipulated with biotechnology. The Cartagena Protocol on Biosafety used the synonym living modified organism (LMO) in 2000 and defined it as "any living organism that possesses a novel combination of genetic material obtained through the use of modern biotechnology." Modern biotechnology is further defined as "In vitro nucleic acid techniques, including recombinant deoxyribonucleic acid (DNA) and direct injection of nucleic acid into cells or organelles, or fusion of cells beyond the taxonomic family."

Originally, the term GMO was not commonly used by scientists to describe genetically engineered organisms until after usage of GMO became common in popular media. The United States Department of Agriculture (USDA) considers GMOs to be plants or animals with heritable changes introduced by genetic engineering or traditional methods, while GEO specifically refers to organisms with genes introduced, eliminated, or rearranged using molecular biology, particularly recombinant DNA techniques, such as transgenesis.

The definitions focus on the process more than the product, which means there could be GMOs and non-GMOs with very similar genotypes and phenotypes. This has led scientists to label it a scientifically meaningless category, saying that it is impossible to group all the different types of GMOs under one common definition. It has also caused issues for organic institutions and groups looking to ban GMOs, and it poses problems as new processes are developed. The current definitions came in before genome editing became popular, and there is some confusion as to whether gene-edited organisms are GMOs. The EU has adjudged that its GMO definition includes "organisms obtained by mutagenesis", but has excluded those "obtained by means of certain mutagenesis techniques, namely those which have conventionally been used in a number of applications and have a long safety record" from regulation. This refers to traditional random mutagenesis (radiation/chemical mutation breeding) and does not exclude "new techniques" (especially those that have emerged since the adoption of the GMO directive) such as gene editing. In contrast, the USDA has ruled that gene-edited organisms are not considered GMOs.

Even greater inconsistency and confusion are associated with the various "Non-GMO" or "GMO-free" labeling schemes in food marketing, where even products such as water or salt, which contain no organic substances or genetic material (and thus cannot be genetically modified by definition), are labeled to create an impression of being "more healthy".

Production

A gene gun uses biolistics to insert DNA into plant tissue.

Creating a genetically modified organism (GMO) is a multi-step process. Genetic engineers must isolate the gene they wish to insert into the host organism. This gene can be taken from a cell or artificially synthesized. If the chosen gene or the donor organism's genome has been well studied it may already be accessible from a genetic library. The gene is then combined with other genetic elements, including a promoter and terminator region and a selectable marker.
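
The assembly step described above can be sketched in code. This is a purely illustrative model of joining the genetic elements in their typical order; the sequences below are made-up placeholders, not real biological sequences:

```python
# Illustrative sketch of assembling a transgene construct: a promoter,
# the gene of interest, a terminator, and a selectable marker are joined
# in order. All sequences here are placeholders, not real elements.

def assemble_construct(promoter, gene, terminator, marker):
    """Concatenate the genetic elements in the order they are typically arranged."""
    return promoter + gene + terminator + marker

promoter   = "TTGACA"      # placeholder promoter sequence
gene       = "ATGAAATAA"   # placeholder gene of interest
terminator = "TTTTTT"      # placeholder terminator
marker     = "ATGCGC"      # placeholder selectable marker (e.g. antibiotic resistance)

construct = assemble_construct(promoter, gene, terminator, marker)
print(construct)  # the full insert ready for delivery into the host
```

In practice this joining is done with molecular cloning techniques rather than string concatenation, but the ordering of elements is the point the sketch captures.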

A number of techniques are available for inserting the isolated gene into the host genome. Bacteria can be induced to take up foreign DNA, usually by heat shock or electroporation. DNA is generally inserted into animal cells using microinjection, where it can be injected through the cell's nuclear envelope directly into the nucleus, or through the use of viral vectors. In plants the DNA is often inserted using Agrobacterium-mediated recombination, biolistics, or electroporation.

As only a single cell is transformed with genetic material, the organism must be regenerated from that single cell. In plants this is accomplished through tissue culture. In animals it is necessary to ensure that the inserted DNA is present in the embryonic stem cells. Further testing using PCR, Southern hybridization, and DNA sequencing is conducted to confirm that an organism contains the new gene.

Traditionally the new genetic material was inserted randomly within the host genome. Gene targeting techniques, which create double-stranded breaks and take advantage of the cell's natural homologous recombination repair systems, have been developed to target insertion to exact locations. Genome editing uses artificially engineered nucleases that create breaks at specific points. There are four families of engineered nucleases: meganucleases, zinc finger nucleases, transcription activator-like effector nucleases (TALENs), and the Cas9-guideRNA system (adapted from CRISPR). TALEN and CRISPR are the two most commonly used, and each has its own advantages: TALENs have greater target specificity, while CRISPR is easier to design and more efficient.
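
The targeting idea behind the Cas9-guideRNA system can be illustrated with a short sketch: Cas9 cuts only where an "NGG" protospacer-adjacent motif (PAM) sits immediately downstream of the roughly 20-base target, so candidate sites can be enumerated by scanning a sequence for that motif. This is a toy illustration, not a guide-design tool:

```python
# Toy illustration of Cas9 target-site scanning: Cas9 requires an "NGG"
# PAM immediately 3' of the ~20-base protospacer, so candidate sites can
# be found by locating NGG motifs with at least 20 bases upstream.

def find_pam_sites(seq, spacer_len=20):
    """Return (protospacer, pam, position) tuples for each NGG PAM in seq."""
    sites = []
    for i in range(spacer_len, len(seq) - 2):
        if seq[i + 1 : i + 3] == "GG":      # N can be any base, so check only GG
            sites.append((seq[i - spacer_len : i], seq[i : i + 3], i))
    return sites

dna = "ATGCATGCATGCATGCATGCAAGGTTTT"  # made-up example sequence
for protospacer, pam, pos in find_pam_sites(dna):
    print(pos, pam, protospacer)
```

Real guide design also weighs off-target matches elsewhere in the genome, which is why dedicated tools exist; the scan above only captures the PAM requirement.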

History

Herbert Boyer (pictured) and Stanley Cohen created the first genetically modified organism in 1973.

Humans have domesticated plants and animals since around 12,000 BCE, using selective breeding or artificial selection (as contrasted with natural selection). The process of selective breeding, in which organisms with desired traits (and thus with the desired genes) are used to breed the next generation and organisms lacking the trait are not bred, is a precursor to the modern concept of genetic modification. Various advancements in genetics allowed humans to directly alter the DNA and therefore genes of organisms. In 1972, Paul Berg created the first recombinant DNA molecule when he combined DNA from a monkey virus with that of the lambda virus.

Herbert Boyer and Stanley Cohen made the first genetically modified organism in 1973. They took a gene from a bacterium that provided resistance to the antibiotic kanamycin, inserted it into a plasmid and then induced other bacteria to incorporate the plasmid. The bacteria that had successfully incorporated the plasmid were then able to survive in the presence of kanamycin. Boyer and Cohen went on to express other genes in bacteria, including genes from the toad Xenopus laevis in 1974, creating the first GMO expressing a gene from an organism of a different kingdom.

In 1974, Rudolf Jaenisch created the first genetically modified animal.

In 1974, Rudolf Jaenisch created a transgenic mouse by introducing foreign DNA into its embryo, making it the world's first transgenic animal. However it took another eight years before transgenic mice were developed that passed the transgene to their offspring. Genetically modified mice were created in 1984 that carried cloned oncogenes, predisposing them to developing cancer. Mice with genes removed (termed a knockout mouse) were created in 1989. The first transgenic livestock were produced in 1985 and the first animal to synthesize transgenic proteins in their milk were mice in 1987. The mice were engineered to produce human tissue plasminogen activator, a protein involved in breaking down blood clots.

In 1983, the first genetically engineered plant was developed by Michael W. Bevan, Richard B. Flavell and Mary-Dell Chilton. They infected tobacco with Agrobacterium transformed with an antibiotic resistance gene and through tissue culture techniques were able to grow a new plant containing the resistance gene. The gene gun was invented in 1987, allowing transformation of plants not susceptible to Agrobacterium infection. In 2000, Vitamin A-enriched golden rice was the first plant developed with increased nutrient value.

In 1976 Genentech, the first genetic engineering company, was founded by Herbert Boyer and Robert Swanson; a year later the company produced a human protein (somatostatin) in E. coli. Genentech announced the production of genetically engineered human insulin in 1978. The insulin produced by bacteria, branded Humulin, was approved for release by the Food and Drug Administration in 1982. In 1988, the first human antibodies were produced in plants. In 1987, a strain of Pseudomonas syringae became the first genetically modified organism to be released into the environment when strawberry and potato fields in California were sprayed with it.

The first genetically modified crop, an antibiotic-resistant tobacco plant, was produced in 1982. China was the first country to commercialize transgenic plants, introducing a virus-resistant tobacco in 1992. In 1994, Calgene attained approval to commercially release the Flavr Savr tomato, the first genetically modified food. Also in 1994, the European Union approved tobacco engineered to be resistant to the herbicide bromoxynil, making it the first genetically engineered crop commercialized in Europe. An insect-resistant potato was approved for release in the US in 1995, and by 1996 approval had been granted to commercially grow 8 transgenic crops and one flower crop (carnation) in 6 countries plus the EU.

In 2010, scientists at the J. Craig Venter Institute announced that they had created the first synthetic bacterial genome. They named it Synthia and it was the world's first synthetic life form.

The first genetically modified animal to be commercialized was the GloFish, a zebrafish with a fluorescent gene added that allows it to glow under ultraviolet light. It was released to the US market in 2003. In 2015, AquAdvantage salmon became the first genetically modified animal to be approved for food use; approval was for fish raised in Panama and sold in the US. The salmon were transformed with a growth hormone-regulating gene from a Pacific Chinook salmon and a promoter from an ocean pout, enabling them to grow year-round instead of only during spring and summer.

Bacteria

Left: Bacteria transformed with pGLO under ambient light
Right: Bacteria transformed with pGLO visualized under ultraviolet light

Bacteria were the first organisms to be genetically modified in the laboratory, due to the relative ease of modifying their chromosomes. This ease made them important tools for the creation of other GMOs. Genes and other genetic information from a wide range of organisms can be added to a plasmid and inserted into bacteria for storage and modification. Bacteria are cheap, easy to grow, clonal, multiply quickly and can be stored at −80 °C almost indefinitely. Once a gene is isolated it can be stored inside the bacteria, providing an unlimited supply for research. A large number of custom plasmids make manipulating DNA extracted from bacteria relatively easy.

Their ease of use has made them great tools for scientists looking to study gene function and evolution. The simplest model organisms come from bacteria, with most of our early understanding of molecular biology coming from studying Escherichia coli. Scientists can easily manipulate and combine genes within the bacteria to create novel or disrupted proteins and observe the effect this has on various molecular systems. Researchers have combined the genes from bacteria and archaea, leading to insights on how these two diverged in the past. In the field of synthetic biology, they have been used to test various synthetic approaches, from synthesizing genomes to creating novel nucleotides.

Bacteria have been used in the production of food for a long time, and specific strains have been developed and selected for that work on an industrial scale. They can be used to produce enzymes, amino acids, flavorings, and other compounds used in food production. With the advent of genetic engineering, new genetic changes can easily be introduced into these bacteria. Most food-producing bacteria are lactic acid bacteria, and this is where the majority of research into genetically engineering food-producing bacteria has gone. The bacteria can be modified to operate more efficiently, reduce toxic byproduct production, increase output, create improved compounds, and remove unnecessary pathways. Food products from genetically modified bacteria include alpha-amylase, which converts starch to simple sugars, chymosin, which clots milk protein for cheese making, and pectinesterase, which improves fruit juice clarity. The majority are produced in the US, and even though regulations are in place to allow production in Europe, as of 2015 no food products derived from bacteria were available there.

Genetically modified bacteria are used to produce large amounts of proteins for industrial use. The bacteria are generally grown to a large volume before the gene encoding the protein is activated. The bacteria are then harvested and the desired protein purified from them. The high cost of extraction and purification has meant that only high value products have been produced at an industrial scale. The majority of these products are human proteins for use in medicine. Many of these proteins are impossible or difficult to obtain via natural methods and they are less likely to be contaminated with pathogens, making them safer. The first medicinal use of GM bacteria was to produce the protein insulin to treat diabetes. Other medicines produced include clotting factors to treat hemophilia, human growth hormone to treat various forms of dwarfism, interferon to treat some cancers, erythropoietin for anemic patients, and tissue plasminogen activator, which dissolves blood clots. Outside of medicine they have been used to produce biofuels. There is interest in developing an extracellular expression system within the bacteria to reduce costs and make the production of more products economical.

With a greater understanding of the role that the microbiome plays in human health, there is potential to treat diseases by genetically altering bacteria to be therapeutic agents themselves. Ideas include altering gut bacteria so they destroy harmful bacteria, or using bacteria to replace or increase deficient enzymes or proteins. One research focus is to modify Lactobacillus, bacteria that naturally provide some protection against HIV, with genes that will further enhance this protection. If the bacteria do not form colonies inside the patient, the person must repeatedly ingest the modified bacteria in order to get the required doses. Enabling the bacteria to form a colony could provide a more long-term solution, but could also raise safety concerns, as interactions between bacteria and the human body are less well understood than with traditional drugs. There are concerns that horizontal gene transfer to other bacteria could have unknown effects. As of 2018, clinical trials are underway testing the efficacy and safety of these treatments.

For over a century, bacteria have been used in agriculture. Crops have been inoculated with Rhizobia (and more recently Azospirillum) to increase their production or to allow them to be grown outside their original habitat. Application of Bacillus thuringiensis (Bt) and other bacteria can help protect crops from insect infestation and plant diseases. With advances in genetic engineering, these bacteria have been manipulated for increased efficiency and expanded host range. Markers have also been added to aid in tracing the spread of the bacteria. The bacteria that naturally colonize certain crops have also been modified, in some cases to express the Bt genes responsible for pest resistance. Pseudomonas strains of bacteria cause frost damage by nucleating water into ice crystals around themselves. This led to the development of ice-minus bacteria, which have the ice-forming genes removed. When applied to crops they can compete with the non-modified bacteria and confer some frost resistance.

This artwork is made with bacteria modified to express 8 different colors of fluorescent proteins.

Other uses for genetically modified bacteria include bioremediation, where the bacteria are used to convert pollutants into a less toxic form. Genetic engineering can increase the levels of the enzymes used to degrade a toxin or make the bacteria more stable under environmental conditions. Bioart has also been created using genetically modified bacteria. In the 1980s artist Joe Davis and geneticist Dana Boyd converted the Germanic symbol for femininity (ᛉ) into binary code and then into a DNA sequence, which was then expressed in Escherichia coli. This was taken a step further in 2012, when a whole book was encoded onto DNA. Paintings have also been produced using bacteria transformed with fluorescent proteins.

Viruses

Viruses are often modified so they can be used as vectors for inserting genetic information into other organisms. This process is called transduction, and if successful the recipient of the introduced DNA becomes a GMO. Different viruses have different efficiencies and capabilities. Researchers can use this to control for various factors, including the target location, insert size, and duration of gene expression. Any dangerous sequences inherent in the virus must be removed, while those that allow the gene to be delivered effectively are retained.

While viral vectors can be used to insert DNA into almost any organism, they are especially relevant for their potential in treating human disease. Although primarily still at trial stages, there have been some successes using gene therapy to replace defective genes. This is most evident in curing patients with severe combined immunodeficiency arising from adenosine deaminase deficiency (ADA-SCID), although the development of leukemia in some ADA-SCID patients, along with the death of Jesse Gelsinger in a 1999 trial, set back the development of this approach for many years. In 2009, another breakthrough was achieved when an eight-year-old boy with Leber's congenital amaurosis regained normal eyesight, and in 2016 GlaxoSmithKline gained approval to commercialize a gene therapy treatment for ADA-SCID. As of 2018, there are a substantial number of clinical trials underway, including treatments for hemophilia, glioblastoma, chronic granulomatous disease, cystic fibrosis and various cancers.

The most common viruses used for gene delivery come from adenoviruses, as they can carry up to 7.5 kb of foreign DNA and infect a relatively broad range of host cells, although they have been known to elicit immune responses in the host and only provide short-term expression. Other common vectors are adeno-associated viruses, which have lower toxicity and longer-term expression, but can only carry about 4 kb of DNA. Herpes simplex viruses make promising vectors, having a carrying capacity of over 30 kb and providing long-term expression, although they are less efficient at gene delivery than other vectors. The best vectors for long-term integration of the gene into the host genome are retroviruses, but their propensity for random integration is problematic. Lentiviruses are a part of the same family as retroviruses, with the advantage of infecting both dividing and non-dividing cells, whereas retroviruses only target dividing cells. Other viruses that have been used as vectors include alphaviruses, flaviviruses, measles viruses, rhabdoviruses, Newcastle disease virus, poxviruses, and picornaviruses.

Most vaccines consist of viruses that have been attenuated, disabled, weakened or killed in some way so that their virulent properties are no longer effective. Genetic engineering could theoretically be used to create viruses with the virulent genes removed. This does not affect the virus's infectivity, invokes a natural immune response, and leaves no chance of the virus regaining its virulence function, which can occur with some other vaccines. As such they are generally considered safer and more efficient than conventional vaccines, although concerns remain over non-target infection, potential side effects and horizontal gene transfer to other viruses. Another potential approach is to use vectors to create novel vaccines for diseases that have no vaccines available or whose existing vaccines do not work effectively, such as AIDS, malaria, and tuberculosis. The most effective vaccine against tuberculosis, the Bacillus Calmette–Guérin (BCG) vaccine, only provides partial protection. A modified vaccine expressing an M. tuberculosis antigen is able to enhance BCG protection. It has been shown to be safe to use in phase II trials, although not as effective as initially hoped. Other vector-based vaccines have already been approved and many more are being developed.

Another potential use of genetically modified viruses is to alter them so they can directly treat diseases. This can be through expression of protective proteins or by directly targeting infected cells. In 2004, researchers reported that a genetically modified virus that exploits the selfish behavior of cancer cells might offer an alternative way of killing tumours. Since then, several researchers have developed genetically modified oncolytic viruses that show promise as treatments for various types of cancer. In 2017, researchers genetically modified a virus to express spinach defensin proteins. The virus was injected into orange trees to combat citrus greening disease that had reduced orange production by 70% since 2005.

Natural viral diseases, such as myxomatosis and rabbit hemorrhagic disease, have been used to help control pest populations. Over time the surviving pests become resistant, leading researchers to look at alternative methods. Genetically modified viruses that make the target animals infertile through immunocontraception have been created in the laboratory as well as others that target the developmental stage of the animal. There are concerns with using this approach regarding virus containment and cross species infection. Sometimes the same virus can be modified for contrasting purposes. Genetic modification of the myxoma virus has been proposed to conserve European wild rabbits in the Iberian peninsula and to help regulate them in Australia. To protect the Iberian species from viral diseases, the myxoma virus was genetically modified to immunize the rabbits, while in Australia the same myxoma virus was genetically modified to lower fertility in the Australian rabbit population.

Outside of biology, scientists have used a genetically modified virus to construct a lithium-ion battery and other nanostructured materials. It is possible to engineer bacteriophages to express modified proteins on their surface and join them up in specific patterns (a technique called phage display). These structures have potential uses for energy storage and generation, biosensing and tissue regeneration, with some new materials currently produced including quantum dots, liquid crystals, nanorings and nanofibres. The battery was made by engineering M13 bacteriophages so they would coat themselves in iron phosphate and then assemble themselves along a carbon nanotube. This created a highly conductive medium for use in a cathode, allowing energy to be transferred quickly. The batteries could be constructed at lower temperatures with non-toxic chemicals, making them more environmentally friendly.

Fungi

Fungi can be used for many of the same processes as bacteria. For industrial applications, yeasts combine the bacterial advantages of being a single-celled organism that is easy to manipulate and grow with the advanced protein modifications found in eukaryotes. They can be used to produce large complex molecules for use in food, pharmaceuticals, hormones, and steroids. Yeast is important for wine production and as of 2016 two genetically modified yeasts involved in the fermentation of wine have been commercialized in the United States and Canada. One has increased malolactic fermentation efficiency, while the other prevents the production of dangerous ethyl carbamate compounds during fermentation. There have also been advances in the production of biofuel from genetically modified fungi.

Fungi, being the most common pathogens of insects, make attractive biopesticides. Unlike bacteria and viruses, they have the advantage of infecting insects by contact alone, although they are outcompeted in efficiency by chemical pesticides. Genetic engineering can improve virulence, usually by adding more virulent proteins, increasing infection rate or enhancing spore persistence. Many disease-carrying vectors are susceptible to entomopathogenic fungi. An attractive target for biological control are mosquitos, vectors for a range of deadly diseases, including malaria, yellow fever and dengue fever. Mosquitos evolve quickly, so it becomes a balancing act of killing them before the Plasmodium they carry becomes infectious, but not so fast that they become resistant to the fungi. By genetically engineering fungi like Metarhizium anisopliae and Beauveria bassiana to delay the development of mosquito infectiousness, the selection pressure to evolve resistance is reduced. Another strategy is to add proteins to the fungi that block transmission of malaria or remove the Plasmodium altogether.

Agaricus bisporus, the common white button mushroom, has been gene edited to resist browning, giving it a longer shelf life. The process used CRISPR to knock out a gene that encodes polyphenol oxidase. As it did not introduce any foreign DNA into the organism, the mushroom was not deemed to be regulated under existing GMO frameworks, making it the first CRISPR-edited organism to be approved for release. This has intensified debates as to whether gene-edited organisms should be considered genetically modified organisms and how they should be regulated.

Plants

Tissue culture used to regenerate Arabidopsis thaliana

Plants have been engineered for scientific research, to display new flower colors, deliver vaccines, and to create enhanced crops. Many plants are pluripotent, meaning that a single cell from a mature plant can be harvested and under the right conditions can develop into a new plant. This ability can be taken advantage of by genetic engineers; by selecting for cells that have been successfully transformed in an adult plant a new plant can then be grown that contains the transgene in every cell through a process known as tissue culture.

Many of the advances in the field of genetic engineering have come from experimentation with tobacco. Major advances in tissue culture and plant cellular mechanisms for a wide range of plants have originated from systems developed in tobacco. It was the first plant to be altered using genetic engineering and is considered a model organism for not only genetic engineering but a range of other fields. As such, the transgenic tools and procedures are well established, making tobacco one of the easiest plants to transform. Another major model organism relevant to genetic engineering is Arabidopsis thaliana. Its small genome and short life cycle make it easy to manipulate, and it contains many homologs to important crop species. It was the first plant sequenced, has a host of online resources available, and can be transformed by simply dipping a flower in a transformed Agrobacterium solution.

In research, plants are engineered to help discover the functions of certain genes. The simplest way to do this is to remove the gene and see what phenotype develops compared to the wild type form. Any differences are possibly the result of the missing gene. Unlike mutagenesis, genetic engineering allows targeted removal without disrupting other genes in the organism. Some genes are only expressed in certain tissues, so reporter genes, like GUS, can be attached to the gene of interest, allowing visualization of its location. Another way to test a gene is to alter it slightly, return it to the plant, and see if it still has the same effect on phenotype. Other strategies include attaching the gene to a strong promoter to see what happens when it is overexpressed, or forcing the gene to be expressed in a different location or at a different developmental stage.

Suntory "blue" rose

Some genetically modified plants are purely ornamental. They are modified for flower color, fragrance, flower shape and plant architecture. The first genetically modified ornamentals to be commercialized had altered color: carnations were released in 1997, and the most popular genetically modified ornamental, a blue rose (actually lavender or mauve), was created in 2004. The roses are sold in Japan, the United States, and Canada. Other genetically modified ornamentals include Chrysanthemum and Petunia. As well as increasing aesthetic value, there are plans to develop ornamentals that use less water or are resistant to the cold, which would allow them to be grown outside their natural environments.

It has been proposed to genetically modify some plant species threatened by extinction to be resistant to invasive plants and diseases, such as the emerald ash borer in North America and the fungal disease Ceratocystis platani in European plane trees. The papaya ringspot virus devastated papaya trees in Hawaii in the twentieth century until transgenic papaya plants were given pathogen-derived resistance. However, genetic modification for conservation in plants remains mainly speculative. A unique concern is that a transgenic species may no longer bear enough resemblance to the original species to truly claim that the original species is being conserved. Instead, the transgenic species may be genetically different enough to be considered a new species, thus diminishing the conservation worth of genetic modification.

Crops

Wild type peanut (top) and transgenic peanut with Bacillus thuringiensis gene added (bottom) exposed to cornstalk borer larva

Genetically modified crops are genetically modified plants that are used in agriculture. The first crops developed were used for animal or human food and provide resistance to certain pests, diseases, environmental conditions, spoilage or chemical treatments (e.g. resistance to a herbicide). The second generation of crops aimed to improve the quality, often by altering the nutrient profile. Third generation genetically modified crops could be used for non-food purposes, including the production of pharmaceutical agents, biofuels, and other industrially useful goods, as well as for bioremediation.

Kenyans examining insect-resistant transgenic Bacillus thuringiensis (Bt) corn

There are three main aims to agricultural advancement: increased production, improved conditions for agricultural workers, and sustainability. GM crops contribute by improving harvests through reducing insect pressure, increasing nutrient value and tolerating different abiotic stresses. Despite this potential, as of 2018 the commercialized crops are limited mostly to cash crops like cotton, soybean, maize and canola, and the vast majority of the introduced traits provide either herbicide tolerance or insect resistance. Soybeans accounted for half of all genetically modified crops planted in 2014. Adoption by farmers has been rapid: between 1996 and 2013, the total surface area of land cultivated with GM crops increased by a factor of 100. Geographically, though, the spread has been uneven, with strong growth in the Americas and parts of Asia and little in Europe and Africa. Its socioeconomic spread has been more even, with approximately 54% of worldwide GM crops grown in developing countries in 2013. Although doubts have been raised, most studies have found growing GM crops to be beneficial to farmers through decreased pesticide use as well as increased crop yield and farm profit.

The majority of GM crops have been modified to be resistant to selected herbicides, usually glyphosate- or glufosinate-based ones. Genetically modified crops engineered to resist herbicides are now more available than conventionally bred resistant varieties; in the USA 93% of soybeans and most of the GM maize grown are glyphosate tolerant. Most currently available genes used to engineer insect resistance come from the Bacillus thuringiensis bacterium and code for delta endotoxins. A few use the genes that encode vegetative insecticidal proteins. The only gene commercially used to provide insect protection that does not originate from B. thuringiensis is the cowpea trypsin inhibitor (CpTI). CpTI was first approved for use in cotton in 1999 and is currently undergoing trials in rice. Less than one percent of GM crops contain other traits, which include providing virus resistance, delaying senescence, and altering the plant's composition.

Golden rice compared to white rice

Golden rice is the best known GM crop aimed at increasing nutrient value. It has been engineered with three genes that biosynthesise beta-carotene, a precursor of vitamin A, in the edible parts of rice. It is intended to produce a fortified food to be grown and consumed in areas with a shortage of dietary vitamin A, a deficiency which each year is estimated to kill 670,000 children under the age of 5 and cause an additional 500,000 cases of irreversible childhood blindness. The original golden rice produced 1.6 μg/g of the carotenoids, with further development increasing this 23-fold. It gained its first approvals for use as food in 2018.

Plants and plant cells have been genetically engineered for production of biopharmaceuticals in bioreactors, a process known as pharming. Work has been done with the duckweed Lemna minor, the alga Chlamydomonas reinhardtii and the moss Physcomitrella patens. Biopharmaceuticals produced include cytokines, hormones, antibodies, enzymes and vaccines, most of which are accumulated in the plant seeds. Many drugs also contain natural plant ingredients, and the pathways that lead to their production have been genetically altered or transferred to other plant species to produce greater volume. Other options for bioreactors are biopolymers and biofuels. Unlike bacteria, plants can modify the proteins post-translationally, allowing them to make more complex molecules. They also pose less risk of being contaminated. Therapeutics have been cultured in transgenic carrot and tobacco cells, including a drug treatment for Gaucher's disease.

Vaccine production and storage has great potential in transgenic plants. Vaccines are expensive to produce, transport, and administer, so having a system that could produce them locally would allow greater access to poorer and developing areas. As well as purifying vaccines expressed in plants, it is also possible to produce edible vaccines in plants. Edible vaccines stimulate the immune system when ingested to protect against certain diseases. Being stored in plants reduces the long-term cost, as they can be disseminated without the need for cold storage, do not need to be purified, and have long-term stability. Being housed within plant cells also provides some protection from gut acids upon digestion. However, the cost of developing, regulating, and containing transgenic plants is high, leading to most current plant-based vaccine development being applied to veterinary medicine, where the controls are not as strict.

Genetically modified crops have been proposed as one of the ways to reduce farming-related CO2 emissions due to higher yield, reduced use of pesticides, reduced use of tractor fuel and no tillage. According to a 2021 study, in the EU alone, widespread adoption of GE crops would reduce greenhouse gas emissions by 33 million tons of CO2 equivalent, or 7.5% of total farming-related emissions.

Animals

The vast majority of genetically modified animals are at the research stage, with the number close to entering the market remaining small. As of 2018, only three genetically modified animals had been approved, all in the US: a goat and a chicken engineered to produce medicines, and a salmon engineered for faster growth. Despite the differences and difficulties in modifying them, the end aims are much the same as for plants. GM animals are created for research purposes, production of industrial or therapeutic products, agricultural uses, or improving their health. There is also a market for creating genetically modified pets.

Mammals

Some chimeras, like the blotched mouse shown, are created through genetic modification techniques like gene targeting.

The process of genetically engineering mammals is slow, tedious, and expensive. However, new technologies are making genetic modifications easier and more precise. The first transgenic mammals were produced by injecting viral DNA into embryos and then implanting the embryos in females. The embryo would develop and it would be hoped that some of the genetic material would be incorporated into the reproductive cells. Then researchers would have to wait until the animal reached breeding age, and its offspring would be screened for the presence of the gene in every cell. The development of the CRISPR-Cas9 gene-editing system provided a cheap and fast way of directly modifying germ cells, effectively halving the time needed to develop genetically modified mammals.

Mammals are the best models for human disease, making genetically engineered ones vital to the discovery and development of cures and treatments for many serious diseases. Knocking out genes responsible for human genetic disorders allows researchers to study the mechanism of the disease and to test possible cures. Genetically modified mice have been the most common mammals used in biomedical research, as they are cheap and easy to manipulate. Pigs are also a good target as they have similar body size, anatomical features, physiology, pathophysiological responses and diet. Nonhuman primates are the most similar model organisms to humans, but there is less public acceptance towards using them as research animals. In 2009, scientists announced that they had successfully transferred a gene into a primate species (marmosets) for the first time. Their first research target for these marmosets was Parkinson's disease, but they were also considering amyotrophic lateral sclerosis and Huntington's disease.

Human proteins expressed in mammals are more likely to be similar to their natural counterparts than those expressed in plants or microorganisms. Stable expression has been accomplished in sheep, pigs, rats and other animals. In 2009, the first human biological drug produced from such an animal, a goat, was approved. The drug, ATryn, is an anticoagulant which reduces the probability of blood clots during surgery or childbirth and is extracted from the goat's milk. Human alpha-1-antitrypsin is another protein that has been produced from goats and is used in treating humans with this deficiency. Another medicinal area is in creating pigs with greater capacity for human organ transplants (xenotransplantation). Pigs have been genetically modified so that their organs can no longer carry retroviruses or have modifications to reduce the chance of rejection. Chimeric pigs could carry fully human organs. The first human transplant of a genetically modified pig heart occurred in 2022, and of a kidney in 2024.

Livestock are modified with the intention of improving economically important traits such as growth rate, quality of meat, milk composition, disease resistance and survival. Animals have been engineered to grow faster, be healthier and resist diseases. Modifications have also improved the wool production of sheep and udder health of cows. Goats have been genetically engineered to produce milk containing strong spiderweb-like silk proteins. A GM pig called Enviropig was created with the capability of digesting plant phosphorus more efficiently than conventional pigs. They could reduce water pollution since they excrete 30 to 70% less phosphorus in manure. Dairy cows have been genetically engineered to produce milk that would be the same as human breast milk. This could potentially benefit mothers who cannot produce breast milk but want their children to have breast milk rather than formula. Researchers have also developed a genetically engineered cow that produces allergy-free milk.

Mice expressing the green fluorescent protein

Scientists have genetically engineered several organisms, including some mammals, to include green fluorescent protein (GFP), for research purposes. GFP and other similar reporter genes allow easy visualization and localization of the products of the genetic modification. Fluorescent pigs have been bred to study human organ transplants, regenerating ocular photoreceptor cells, and other topics. In 2011, green-fluorescent cats were created to help find therapies for HIV/AIDS and other diseases, as feline immunodeficiency virus is related to HIV.

There have been suggestions that genetic engineering could be used to bring animals back from extinction. It involves changing the genome of a close living relative to resemble the extinct one and is currently being attempted with the passenger pigeon. Genes associated with the woolly mammoth have been added to the genome of an African elephant, although the lead researcher says he has no intention of creating live elephants, and transferring all the genes to reverse years of genetic evolution is a long way from being feasible. It is more likely that scientists could use this technology to conserve endangered animals by bringing back lost diversity or transferring evolved genetic advantages from adapted organisms to those that are struggling.

Humans

Gene therapy uses genetically modified viruses to deliver genes which can cure disease in humans. Although gene therapy is still relatively new, it has had some successes. It has been used to treat genetic disorders such as severe combined immunodeficiency and Leber's congenital amaurosis. Treatments are also being developed for a range of other currently incurable diseases, such as cystic fibrosis, sickle cell anemia, Parkinson's disease, cancer, diabetes, heart disease and muscular dystrophy. These treatments only affect somatic cells, meaning any changes would not be inheritable. Germline gene therapy results in any change being inheritable, which has raised concerns within the scientific community.

In 2015, CRISPR was used to edit the DNA of non-viable human embryos. In November 2018, He Jiankui announced that he had edited the genomes of two human embryos, in an attempt to disable the CCR5 gene, which codes for a receptor that HIV uses to enter cells. He said that twin girls, Lulu and Nana, had been born a few weeks earlier and that they carried functional copies of CCR5 along with disabled CCR5 (mosaicism) and were still vulnerable to HIV. The work was widely condemned as unethical, dangerous, and premature.

Fish

Genetically modified fish are used for scientific research, as pets and as a food source. Aquaculture is a growing industry, currently providing over half the consumed fish worldwide. Through genetic engineering it is possible to increase growth rates, reduce food intake, remove allergenic properties, increase cold tolerance and provide disease resistance. Fish can also be used to detect aquatic pollution or function as bioreactors.

Several groups have been developing zebrafish to detect pollution by attaching fluorescent proteins to genes activated by the presence of pollutants. The fish will then glow and can be used as environmental sensors. The GloFish is a brand of genetically modified fluorescent zebrafish with bright red, green, and orange fluorescent color. It was originally developed by one of the groups to detect pollution, but is now part of the ornamental fish trade, becoming the first genetically modified animal to be publicly available as a pet when it was introduced for sale in the US in 2003.

GM fish are widely used in basic research in genetics and development. Two species of fish, zebrafish and medaka, are most commonly modified because they have optically clear chorions (membranes in the egg), rapidly develop, and the one-cell embryo is easy to see and microinject with transgenic DNA. Zebrafish are model organisms for developmental processes, regeneration, genetics, behavior, disease mechanisms and toxicity testing. Their transparency allows researchers to observe developmental stages, intestinal functions and tumour growth. The generation of transgenic protocols (whole organism, cell or tissue specific, tagged with reporter genes) has increased the level of information gained by studying these fish.

GM fish have been developed with promoters driving an over-production of growth hormone for use in the aquaculture industry to increase the speed of development and potentially reduce fishing pressure on wild stocks. This has resulted in dramatic growth enhancement in several species, including salmon, trout and tilapia. AquaBounty Technologies, a biotechnology company, has produced a salmon (called AquAdvantage salmon) that can mature in half the time of wild salmon. It obtained regulatory approval in 2015, making it the first non-plant GMO food to be commercialized. As of August 2017, GMO salmon was being sold in Canada. Sales in the US started in May 2021.

Insects

In biological research, transgenic fruit flies (Drosophila melanogaster) are model organisms used to study the effects of genetic changes on development. Fruit flies are often preferred over other animals due to their short life cycle and low maintenance requirements. They also have a relatively simple genome compared to many vertebrates, with typically only one copy of each gene, making phenotypic analysis easy. Drosophila have been used to study genetics and inheritance, embryonic development, learning, behavior, and aging. The discovery of transposons, in particular the P element, in Drosophila provided an early method to add transgenes to their genome, although this has been superseded by more modern gene-editing techniques.

Due to their significance to human health, scientists are looking at ways to control mosquitoes through genetic engineering. Malaria-resistant mosquitoes have been developed in the laboratory by inserting a gene that reduces the development of the malaria parasite and then using homing endonucleases to rapidly spread that gene throughout the male population (known as a gene drive). This approach has been taken further by using the gene drive to spread a lethal gene. In trials, populations of Aedes aegypti mosquitoes, the single most important carrier of dengue fever and Zika virus, were reduced by 80% to 90%. Another approach is the sterile insect technique, whereby males genetically engineered to be sterile outcompete viable males, reducing population numbers.

Other insect pests that make attractive targets are moths. Diamondback moths cause US$4 to $5 billion of damage each year worldwide. The approach is similar to the sterile technique tested on mosquitoes, where males are transformed with a gene that prevents any females born from reaching maturity. These moths underwent field trials in 2017. Genetically modified moths have previously been released in field trials. In this case a strain of pink bollworm that had been sterilized with radiation was genetically engineered to express a red fluorescent protein, making it easier for researchers to monitor them.

The silkworm, the larval stage of Bombyx mori, is an economically important insect in sericulture. Scientists are developing strategies to enhance silk quality and quantity. There is also potential to use the silk-producing machinery to make other valuable proteins. Proteins currently developed for expression by silkworms include human serum albumin, human collagen α-chain, mouse monoclonal antibody and N-glycanase. Silkworms have been created that produce spider silk, a stronger but extremely difficult to harvest silk, and even novel silks.

Other

Frog expressing green fluorescent protein

Systems have been developed to create transgenic organisms in a wide variety of other animals. Chickens have been genetically modified for a variety of purposes. This includes studying embryo development, preventing the transmission of bird flu and providing evolutionary insights using reverse engineering to recreate dinosaur-like phenotypes. A GM chicken that produces the drug Kanuma, an enzyme that treats a rare condition, in its egg passed US regulatory approval in 2015. Genetically modified frogs, in particular Xenopus laevis and Xenopus tropicalis, are used in developmental biology research. GM frogs can also be used as pollution sensors, especially for endocrine disrupting chemicals. There are proposals to use genetic engineering to control cane toads in Australia.

The nematode Caenorhabditis elegans is one of the major model organisms for research in molecular biology. RNA interference (RNAi) was discovered in C. elegans and can be induced by simply feeding the worms bacteria modified to express double-stranded RNA. It is also relatively easy to produce stable transgenic nematodes, and this, along with RNAi, is the major tool used in studying their genes. The most common use of transgenic nematodes has been studying gene expression and localization by attaching reporter genes. Transgenes can also be combined with RNAi techniques to rescue phenotypes, study gene function, image cell development in real time or control expression for different tissues or developmental stages. Transgenic nematodes have been used to study viruses, toxicology, diseases, and to detect environmental pollutants.

Transgenic Hydra expressing green fluorescent protein

The gene responsible for albinism in sea cucumbers has been found and used to engineer white sea cucumbers, a rare delicacy. The technology also opens the way to investigate the genes responsible for some of the cucumbers' more unusual traits, including hibernating in summer, eviscerating their intestines, and dissolving their bodies upon death. Flatworms have the ability to regenerate themselves from a single cell. Until 2017 there was no effective way to transform them, which hampered research. By using microinjection and radiation, scientists have now created the first genetically modified flatworms. The bristle worm, a marine annelid, has been modified. It is of interest due to its reproductive cycle being synchronized with lunar phases, its regeneration capacity and its slow evolution rate. Cnidaria such as Hydra and the sea anemone Nematostella vectensis are attractive model organisms to study the evolution of immunity and certain developmental processes. Other animals that have been genetically modified include snails, geckos, turtles, crayfish, oysters, shrimp, clams, abalone and sponges.

Regulation

Genetically modified organisms are regulated by government agencies. This applies to research as well as the release of genetically modified organisms, including crops and food. The development of a regulatory framework concerning genetic engineering began in 1975, at Asilomar, California. The Asilomar meeting recommended a set of guidelines regarding the cautious use of recombinant technology and any products resulting from that technology. The Cartagena Protocol on Biosafety was adopted on 29 January 2000 and entered into force on 11 September 2003. It is an international treaty that governs the transfer, handling, and use of genetically modified organisms. One hundred and fifty-seven countries are members of the Protocol and many use it as a reference point for their own regulations.

Universities and research institutes generally have a special committee that is responsible for approving any experiments that involve genetic engineering. Many experiments also need permission from a national regulatory group or legislation. All staff must be trained in the use of GMOs and all laboratories must gain approval from their regulatory agency to work with GMOs. The legislation covering GMOs is often derived from regulations and guidelines in place for the non-GMO version of the organism, although it is more stringent. There is a near-universal system for assessing the relative risks associated with GMOs and other agents to laboratory staff and the community. They are assigned to one of four risk categories based on their virulence, the severity of the disease, the mode of transmission, and the availability of preventive measures or treatments. There are four biosafety levels that a laboratory can fall into, ranging from level 1 (which is suitable for working with agents not associated with disease) to level 4 (working with life-threatening agents). Different countries use different nomenclature to describe the levels and can have different requirements for what can be done at each level.

A label marking this peanut butter as being non-GMO
Detail of a French cheese box declaring "GMO-free" production (i.e., below 0.9%)

There are differences in the regulation for the release of GMOs between countries, with some of the most marked differences occurring between the US and Europe. Regulation varies in a given country depending on the intended use of the products of the genetic engineering. For example, a crop not intended for food use is generally not reviewed by authorities responsible for food safety. Some nations have banned the release of GMOs or restricted their use, and others permit them with widely differing degrees of regulation. As of 2016, thirty-eight countries officially banned or prohibited the cultivation of GMOs and nine (Algeria, Bhutan, Kenya, Kyrgyzstan, Madagascar, Peru, Russia, Venezuela and Zimbabwe) banned their importation. Most countries that do not allow GMO cultivation do permit research using GMOs. Despite regulation, illegal releases have sometimes occurred, due to weakness of enforcement.

The European Union (EU) differentiates between approval for cultivation within the EU and approval for import and processing. While only a few GMOs have been approved for cultivation in the EU, a number of GMOs have been approved for import and processing. The cultivation of GMOs has triggered a debate about the market for GMOs in Europe. Depending on the coexistence regulations, incentives for cultivation of GM crops differ. US policy focuses less on the process than other countries do, looking instead at verifiable scientific risks and using the concept of substantial equivalence. Whether gene-edited organisms should be regulated in the same way as genetically modified organisms is debated. US regulations treat them as separate and do not regulate them under the same conditions, while in Europe a GMO is any organism created using genetic engineering techniques.

One of the key issues concerning regulators is whether GM products should be labeled. The European Commission says that mandatory labeling and traceability are needed to allow for informed choice, avoid potential false advertising and facilitate the withdrawal of products if adverse effects on health or the environment are discovered. The American Medical Association and the American Association for the Advancement of Science say that absent scientific evidence of harm even voluntary labeling is misleading and will falsely alarm consumers. Labeling of GMO products in the marketplace is required in 64 countries. Labeling can be mandatory up to a threshold GM content level (which varies between countries) or voluntary. In the U.S., the National Bioengineered Food Disclosure Standard (Mandatory Compliance Date: January 1, 2022) requires labeling GM foods. In Canada, labeling of GM food is voluntary, while in Europe all food (including processed food) or feed which contains greater than 0.9% of approved GMOs must be labeled. In 2014, sales of products that had been labeled as non-GMO grew 30 percent to $1.1 billion.

Controversy

There is controversy over GMOs, especially with regard to their release outside laboratory environments. The dispute involves consumers, producers, biotechnology companies, governmental regulators, non-governmental organizations, and scientists. Many of these concerns involve GM crops and whether food produced from them is safe and what impact growing them will have on the environment. These controversies have led to litigation, international trade disputes, and protests, and to restrictive regulation of commercial products in some countries. Most concerns are around the health and environmental effects of GMOs. These include whether they may provoke an allergic reaction, whether the transgenes could transfer to human cells, and whether genes not approved for human consumption could outcross into the food supply.

A protester advocating for the labeling of GMOs

There is a scientific consensus that currently available food derived from GM crops poses no greater risk to human health than conventional food, but that each GM food needs to be tested on a case-by-case basis before introduction. Nonetheless, members of the public are much less likely than scientists to perceive GM foods as safe. The legal and regulatory status of GM foods varies by country, with some nations banning or restricting them, and others permitting them with widely differing degrees of regulation.

As late as the 1990s gene flow into wild populations was thought to be unlikely and rare, and if it were to occur, easily eradicated. It was thought that this would add no additional environmental costs or risks – no effects were expected other than those already caused by pesticide applications. However, in the decades since, several such examples have been observed. Gene flow between GM crops and compatible plants, along with increased use of broad-spectrum herbicides, can increase the risk of herbicide resistant weed populations. Debate over the extent and consequences of gene flow intensified in 2001 when a paper was published showing transgenes had been found in landrace maize in Mexico, the crop's center of diversity. Gene flow from GM crops to other organisms has been found to generally be lower than what would occur naturally. In order to address some of these concerns some GMOs have been developed with traits to help control their spread. To prevent the genetically modified salmon inadvertently breeding with wild salmon, all the fish raised for food are females, triploid, 99% are reproductively sterile, and raised in areas where escaped salmon could not survive. Bacteria have also been modified to depend on nutrients that cannot be found in nature, and genetic use restriction technology has been developed, though not yet marketed, that causes the second generation of GM plants to be sterile.

Other environmental and agronomic concerns include a decrease in biodiversity, an increase in secondary pests (non-targeted pests) and evolution of resistant insect pests. In the areas of China and the US with Bt crops the overall biodiversity of insects has increased and the impact of secondary pests has been minimal. Resistance was found to be slow to evolve when best practice strategies were followed. The impact of Bt crops on beneficial non-target organisms became a public issue after a 1999 paper suggested they could be toxic to monarch butterflies. Follow up studies have since shown that the toxicity levels encountered in the field were not high enough to harm the larvae.

Accusations that scientists are "playing God" and other religious issues have been ascribed to the technology from the beginning. With the ability to genetically engineer humans now possible, there are ethical concerns over how far this technology should go, or if it should be used at all. Much debate revolves around where the line between treatment and enhancement lies and whether the modifications should be inheritable. Other concerns include contamination of the non-genetically modified food supply, the rigor of the regulatory process, consolidation of control of the food supply in companies that make and sell GMOs, exaggeration of the benefits of genetic modification, and concerns over the use of glyphosate-based herbicides. Other issues raised include the patenting of life and the use of intellectual property rights.

There are large differences in consumer acceptance of GMOs, with Europeans more likely than North Americans to view GM food negatively. GMOs arrived on the scene when public confidence in food safety was low, owing to recent food scares such as bovine spongiform encephalopathy and other scandals involving government regulation of products in Europe. This, along with campaigns run by various non-governmental organizations (NGOs), has been very successful in blocking or limiting the use of GM crops. NGOs like the Organic Consumers Association, the Union of Concerned Scientists, Greenpeace and other groups have said that risks have not been adequately identified and managed and that there are unanswered questions regarding the potential long-term impact on human health from food derived from GMOs. They propose mandatory labeling or a moratorium on such products.
