
Wednesday, April 14, 2021

Genetic engineering

From Wikipedia, the free encyclopedia

Genetic engineering, also called genetic modification or genetic manipulation, is the direct manipulation of an organism's genes using biotechnology. It is a set of technologies used to change the genetic makeup of cells, including the transfer of genes within and across species boundaries to produce improved or novel organisms. New DNA is obtained by either isolating and copying the genetic material of interest using recombinant DNA methods or by artificially synthesising the DNA. A construct is usually created and used to insert this DNA into the host organism. The first recombinant DNA molecule was made by Paul Berg in 1972 by combining DNA from the monkey virus SV40 with the lambda virus. As well as inserting genes, the process can be used to remove, or "knock out", genes. The new DNA can be inserted randomly, or targeted to a specific part of the genome.

An organism that is generated through genetic engineering is considered to be genetically modified (GM) and the resulting entity is a genetically modified organism (GMO). The first GMO was a bacterium generated by Herbert Boyer and Stanley Cohen in 1973. Rudolf Jaenisch created the first GM animal when he inserted foreign DNA into a mouse in 1974. The first company to focus on genetic engineering, Genentech, was founded in 1976 and started the production of human proteins. Genetically engineered human insulin was produced in 1978 and insulin-producing bacteria were commercialised in 1982. Genetically modified food has been sold since 1994, with the release of the Flavr Savr tomato. The Flavr Savr was engineered to have a longer shelf life, but most current GM crops are modified to increase resistance to insects and herbicides. GloFish, the first GMO designed as a pet, was sold in the United States in December 2003. In 2016 salmon modified with a growth hormone were sold.

Genetic engineering has been applied in numerous fields including research, medicine, industrial biotechnology and agriculture. In research GMOs are used to study gene function and expression through loss of function, gain of function, tracking and expression experiments. By knocking out genes responsible for certain conditions it is possible to create animal model organisms of human diseases. As well as producing hormones, vaccines and other drugs, genetic engineering has the potential to cure genetic diseases through gene therapy. The same techniques that are used to produce drugs can also have industrial applications such as producing enzymes for laundry detergent, cheeses and other products.

The rise of commercialised genetically modified crops has provided economic benefit to farmers in many different countries, but has also been the source of most of the controversy surrounding the technology. This has been present since its early use; the first field trials were destroyed by anti-GM activists. Although there is a scientific consensus that currently available food derived from GM crops poses no greater risk to human health than conventional food, GM food safety is a leading concern with critics. Gene flow, impact on non-target organisms, control of the food supply and intellectual property rights have also been raised as potential issues. These concerns have led to the development of a regulatory framework, which started in 1975. It has led to an international treaty, the Cartagena Protocol on Biosafety, that was adopted in 2000. Individual countries have developed their own regulatory systems regarding GMOs, with the most marked differences occurring between the US and Europe.

IUPAC definition
Genetic engineering: Process of inserting new genetic information into existing cells in order to modify a specific organism for the purpose of changing its characteristics. 

Overview

Comparison of conventional plant breeding with transgenic and cisgenic genetic modification

Genetic engineering is a process that alters the genetic structure of an organism by either removing or introducing DNA. Unlike traditional animal and plant breeding, which involves doing multiple crosses and then selecting for the organism with the desired phenotype, genetic engineering takes the gene directly from one organism and delivers it to the other. This is much faster, can be used to insert any genes from any organism (even ones from different domains) and prevents other undesirable genes from also being added.

Genetic engineering could potentially fix severe genetic disorders in humans by replacing the defective gene with a functioning one. It is an important tool in research that allows the function of specific genes to be studied. Drugs, vaccines and other products have been harvested from organisms engineered to produce them. Crops have been developed that aid food security by increasing yield, nutritional value and tolerance to environmental stresses.

The DNA can be introduced directly into the host organism or into a cell that is then fused or hybridised with the host. This relies on recombinant nucleic acid techniques to form new combinations of heritable genetic material followed by the incorporation of that material either indirectly through a vector system or directly through micro-injection, macro-injection or micro-encapsulation.

Genetic engineering does not normally include traditional breeding, in vitro fertilisation, induction of polyploidy, mutagenesis and cell fusion techniques that do not use recombinant nucleic acids or a genetically modified organism in the process. However, some broad definitions of genetic engineering include selective breeding. Cloning and stem cell research, although not considered genetic engineering, are closely related and genetic engineering can be used within them. Synthetic biology is an emerging discipline that takes genetic engineering a step further by introducing artificially synthesised material into an organism. Synthetic DNA such as the Artificially Expanded Genetic Information System and Hachimoji DNA has been made in this new field.

Plants, animals or microorganisms that have been changed through genetic engineering are termed genetically modified organisms or GMOs. If genetic material from another species is added to the host, the resulting organism is called transgenic. If genetic material from the same species or a species that can naturally breed with the host is used the resulting organism is called cisgenic. If genetic engineering is used to remove genetic material from the target organism the resulting organism is termed a knockout organism. In Europe genetic modification is synonymous with genetic engineering while within the United States of America and Canada genetic modification can also be used to refer to more conventional breeding methods.

History

Humans have altered the genomes of species for thousands of years through selective breeding, or artificial selection as contrasted with natural selection. More recently, mutation breeding has used exposure to chemicals or radiation to produce a high frequency of random mutations, for selective breeding purposes. Genetic engineering as the direct manipulation of DNA by humans outside breeding and mutations has only existed since the 1970s. The term "genetic engineering" was first coined by Jack Williamson in his science fiction novel Dragon's Island, published in 1951 – one year before DNA's role in heredity was confirmed by Alfred Hershey and Martha Chase, and two years before James Watson and Francis Crick showed that the DNA molecule has a double-helix structure – though the general concept of direct genetic manipulation was explored in rudimentary form in Stanley G. Weinbaum's 1936 science fiction story Proteus Island.

In 1974 Rudolf Jaenisch created a genetically modified mouse, the first GM animal.

In 1972, Paul Berg created the first recombinant DNA molecules by combining DNA from the monkey virus SV40 with that of the lambda virus. In 1973 Herbert Boyer and Stanley Cohen created the first transgenic organism by inserting antibiotic resistance genes into the plasmid of an Escherichia coli bacterium. A year later Rudolf Jaenisch created a transgenic mouse by introducing foreign DNA into its embryo, making it the world's first transgenic animal. These achievements led to concerns in the scientific community about potential risks from genetic engineering, which were first discussed in depth at the Asilomar Conference in 1975. One of the main recommendations from this meeting was that government oversight of recombinant DNA research should be established until the technology was deemed safe.

In 1976 Genentech, the first genetic engineering company, was founded by Herbert Boyer and Robert Swanson and a year later the company produced a human protein (somatostatin) in E. coli. Genentech announced the production of genetically engineered human insulin in 1978. In 1980, the U.S. Supreme Court in the Diamond v. Chakrabarty case ruled that genetically altered life could be patented. The insulin produced by bacteria was approved for release by the Food and Drug Administration (FDA) in 1982.

In 1983, a biotech company, Advanced Genetic Sciences (AGS) applied for U.S. government authorisation to perform field tests with the ice-minus strain of Pseudomonas syringae to protect crops from frost, but environmental groups and protestors delayed the field tests for four years with legal challenges. In 1987, the ice-minus strain of P. syringae became the first genetically modified organism (GMO) to be released into the environment when a strawberry field and a potato field in California were sprayed with it. Both test fields were attacked by activist groups the night before the tests occurred: "The world's first trial site attracted the world's first field trasher".

The first field trials of genetically engineered plants occurred in France and the US in 1986, when tobacco plants were engineered to be resistant to herbicides. The People's Republic of China was the first country to commercialise transgenic plants, introducing a virus-resistant tobacco in 1992. In 1994 Calgene attained approval to commercially release the first genetically modified food, the Flavr Savr, a tomato engineered to have a longer shelf life. In 1994, the European Union approved tobacco engineered to be resistant to the herbicide bromoxynil, making it the first genetically engineered crop commercialised in Europe. In 1995, Bt Potato was approved safe by the Environmental Protection Agency, after having been approved by the FDA, making it the first pesticide-producing crop to be approved in the US. In 2009, 11 transgenic crops were grown commercially in 25 countries, the largest of which by area grown were the US, Brazil, Argentina, India, Canada, China, Paraguay and South Africa.

In 2010, scientists at the J. Craig Venter Institute created the first synthetic genome and inserted it into an empty bacterial cell. The resulting bacterium, named Mycoplasma laboratorium, could replicate and produce proteins. Four years later this was taken a step further when a bacterium was developed that replicated a plasmid containing a unique base pair, creating the first organism engineered to use an expanded genetic alphabet. In 2012, Jennifer Doudna and Emmanuelle Charpentier collaborated to develop the CRISPR/Cas9 system, a technique which can be used to easily and specifically alter the genome of almost any organism.

Process

Polymerase chain reaction is a powerful tool used in molecular cloning

Creating a GMO is a multi-step process. Genetic engineers must first choose what gene they wish to insert into the organism. This is driven by what the aim is for the resultant organism and is built on earlier research. Genetic screens can be carried out to determine potential genes and further tests then used to identify the best candidates. The development of microarrays, transcriptomics and genome sequencing has made it much easier to find suitable genes. Luck also plays its part; the Roundup Ready gene was discovered after scientists noticed a bacterium thriving in the presence of the herbicide.

Gene isolation and cloning

The next step is to isolate the candidate gene. The cell containing the gene is opened and the DNA is purified. The gene is separated by using restriction enzymes to cut the DNA into fragments or polymerase chain reaction (PCR) to amplify the gene segment. These segments can then be extracted through gel electrophoresis. If the chosen gene or the donor organism's genome has been well studied it may already be accessible from a genetic library. If the DNA sequence is known, but no copies of the gene are available, it can also be artificially synthesised. Once isolated, the gene is ligated into a plasmid that is then inserted into a bacterium. The plasmid is replicated when the bacteria divide, ensuring unlimited copies of the gene are available.
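The digestion step can be simulated in software. The sketch below uses Biopython's Bio.Restriction module (assuming Biopython is installed) to locate EcoRI sites in an invented donor sequence and split it into fragments; it is a toy model of the wet-lab step, not a protocol.

```python
# In-silico sketch of a restriction digest: find EcoRI sites in a
# donor sequence and cut it into fragments. The sequence is invented.
from Bio.Seq import Seq
from Bio.Restriction import EcoRI

donor = Seq("ATGGAATTCTTGACCGGTACCAAGGAATTCGGA")

cut_positions = EcoRI.search(donor)  # positions of EcoRI cut sites
fragments = EcoRI.catalyse(donor)    # tuple of fragment Seq objects

print("EcoRI cuts at:", cut_positions)
for fragment in fragments:
    print(len(fragment), fragment)
```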

Before the gene is inserted into the target organism it must be combined with other genetic elements. These include a promoter and terminator region, which initiate and end transcription. A selectable marker gene is added, which in most cases confers antibiotic resistance, so researchers can easily determine which cells have been successfully transformed. The gene can also be modified at this stage for better expression or effectiveness. These manipulations are carried out using recombinant DNA techniques, such as restriction digests, ligations and molecular cloning.
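To make the structure of such a construct concrete, the toy sketch below assembles placeholder sequences for each element in the order described; all sequences are invented and far shorter than real biological parts.

```python
# Toy assembly of an expression construct: promoter + gene of interest
# + terminator + selectable marker. All sequences are invented.
promoter   = "TTGACAGCTAGCTCAGTCCTAGG"   # initiates transcription
gene       = "ATGGCTAAAGGCGAAGAACTGTAA"  # gene of interest (placeholder)
terminator = "CCAGGCATCAAATAAAACGAAAGG"  # ends transcription
marker     = "ATGAGCCATATTCAACGGGAATGA"  # e.g. antibiotic-resistance gene

construct = promoter + gene + terminator + marker
print(len(construct), construct)
```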

Inserting DNA into the host genome

A gene gun uses biolistics to insert DNA into plant tissue

There are a number of techniques used to insert genetic material into the host genome. Some bacteria can naturally take up foreign DNA. This ability can be induced in other bacteria via stress (e.g. thermal or electric shock), which increases the cell membrane's permeability to DNA; up-taken DNA can either integrate with the genome or exist as extrachromosomal DNA. DNA is generally inserted into animal cells using microinjection, where it can be injected through the cell's nuclear envelope directly into the nucleus, or through the use of viral vectors.

Plant genomes can be engineered by physical methods or by use of Agrobacterium for the delivery of sequences hosted in T-DNA binary vectors. In plants the DNA is often inserted using Agrobacterium-mediated transformation, which takes advantage of Agrobacterium's T-DNA sequence to insert genetic material naturally into plant cells. Other methods include biolistics, where particles of gold or tungsten are coated with DNA and then shot into young plant cells, and electroporation, which involves using an electric shock to make the cell membrane permeable to plasmid DNA.

As only a single cell is transformed with genetic material, the organism must be regenerated from that single cell. In plants this is accomplished through the use of tissue culture. In animals it is necessary to ensure that the inserted DNA is present in the embryonic stem cells. Bacteria consist of a single cell and reproduce clonally so regeneration is not necessary. Selectable markers are used to easily differentiate transformed from untransformed cells. These markers are usually present in the transgenic organism, although a number of strategies have been developed that can remove the selectable marker from the mature transgenic plant.

A. tumefaciens attaching itself to a carrot cell

Further testing using PCR, Southern hybridization, and DNA sequencing is conducted to confirm that an organism contains the new gene. These tests can also confirm the chromosomal location and copy number of the inserted gene. The presence of the gene does not guarantee it will be expressed at appropriate levels in the target tissue so methods that look for and measure the gene products (RNA and protein) are also used. These include northern hybridisation, quantitative RT-PCR, Western blot, immunofluorescence, ELISA and phenotypic analysis.
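For the expression-measurement step, quantitative RT-PCR results are commonly analysed with the 2^-ΔΔCt method. The sketch below shows that calculation on invented Ct values; the function name and numbers are illustrative only.

```python
# Relative expression by the 2^-ddCt method from qRT-PCR Ct values,
# e.g. comparing a transgenic sample with a control. Values invented.
def relative_expression(ct_target_gm, ct_ref_gm, ct_target_wt, ct_ref_wt):
    """Fold change of the target gene, normalised to a reference gene."""
    ddct = (ct_target_gm - ct_ref_gm) - (ct_target_wt - ct_ref_wt)
    return 2 ** (-ddct)

# GM plant vs wild type: a smaller Ct means more transcript detected.
print(relative_expression(22.1, 18.0, 26.3, 18.2))  # = 16-fold higher
```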

The new genetic material can be inserted randomly within the host genome or targeted to a specific location. The technique of gene targeting uses homologous recombination to make desired changes to a specific endogenous gene. This tends to occur at a relatively low frequency in plants and animals and generally requires the use of selectable markers. The frequency of gene targeting can be greatly enhanced through genome editing. Genome editing uses artificially engineered nucleases that create specific double-stranded breaks at desired locations in the genome, and use the cell's endogenous mechanisms to repair the induced break by the natural processes of homologous recombination and nonhomologous end-joining. There are four families of engineered nucleases: meganucleases, zinc finger nucleases, transcription activator-like effector nucleases (TALENs), and the Cas9-guideRNA system (adapted from CRISPR). TALEN and CRISPR are the two most commonly used and each has its own advantages. TALENs have greater target specificity, while CRISPR is easier to design and more efficient. In addition to enhancing gene targeting, engineered nucleases can be used to introduce mutations at endogenous genes that generate a gene knockout.
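To make the CRISPR targeting rule concrete: SpCas9 requires an "NGG" PAM immediately 3' of a 20-nucleotide protospacer. The sketch below scans a made-up sequence for candidate sites; it only finds sequence matches and ignores everything a real guide-design tool would check (off-targets, strand, GC content).

```python
# Sketch: find candidate SpCas9 target sites, i.e. a 20-nt protospacer
# followed immediately by an NGG PAM. The example sequence is invented.
import re

def find_cas9_sites(seq):
    """Return (position, protospacer, PAM) for each candidate site."""
    pattern = re.compile(r"(?=([ACGT]{20})([ACGT]GG))")  # lookahead: overlapping sites allowed
    return [(m.start(), m.group(1), m.group(2)) for m in pattern.finditer(seq.upper())]

example = "ttgctagctaggctagctagctaggttacggatcgatcgatcgtagctagg"
for pos, protospacer, pam in find_cas9_sites(example):
    print(pos, protospacer, pam)
```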

Applications

Genetic engineering has applications in medicine, research, industry and agriculture and can be used on a wide range of plants, animals and microorganisms. Bacteria, the first organisms to be genetically modified, can have plasmid DNA inserted containing new genes that code for medicines or enzymes that process food and other substrates. Plants have been modified for insect protection, herbicide resistance, virus resistance, enhanced nutrition, tolerance to environmental pressures and the production of edible vaccines. Most commercialised GMOs are insect resistant or herbicide tolerant crop plants. Genetically modified animals have been used for research, model animals and the production of agricultural or pharmaceutical products. The genetically modified animals include animals with genes knocked out, increased susceptibility to disease, hormones for extra growth and the ability to express proteins in their milk.

Medicine

Genetic engineering has many applications to medicine, including the manufacturing of drugs, the creation of model animals that mimic human conditions and gene therapy. One of the earliest uses of genetic engineering was to mass-produce human insulin in bacteria. The approach has since been applied to human growth hormones, follicle stimulating hormones (for treating infertility), human albumin, monoclonal antibodies, antihemophilic factors, vaccines and many other drugs. Mouse hybridomas, cells fused together to create monoclonal antibodies, have been adapted through genetic engineering to create human monoclonal antibodies. In 2017, genetic engineering of chimeric antigen receptors on a patient's own T-cells was approved by the U.S. FDA as a treatment for the cancer acute lymphoblastic leukemia. Genetically engineered viruses are being developed that can still confer immunity, but lack the infectious sequences.

Genetic engineering is also used to create animal models of human diseases. Genetically modified mice are the most common genetically engineered animal model. They have been used to study and model cancer (the oncomouse), obesity, heart disease, diabetes, arthritis, substance abuse, anxiety, aging and Parkinson disease. Potential cures can be tested against these mouse models. Also genetically modified pigs have been bred with the aim of increasing the success of pig to human organ transplantation.

Gene therapy is the genetic engineering of humans, generally by replacing defective genes with effective ones. Clinical research using somatic gene therapy has been conducted with several diseases, including X-linked SCID, chronic lymphocytic leukemia (CLL), and Parkinson's disease. In 2012, Alipogene tiparvovec became the first gene therapy treatment to be approved for clinical use. In 2015, a virus was used to insert a healthy gene into the skin cells of a boy suffering from a rare skin disease, epidermolysis bullosa, in order to grow and then graft healthy skin onto the 80 percent of the boy's body affected by the illness.

Germline gene therapy would result in any change being inheritable, which has raised concerns within the scientific community. In 2015, CRISPR was used to edit the DNA of non-viable human embryos, leading scientists of major world academies to call for a moratorium on inheritable human genome edits. There are also concerns that the technology could be used not just for treatment, but for enhancement, modification or alteration of a human being's appearance, adaptability, intelligence, character or behavior. The distinction between cure and enhancement can also be difficult to establish. In November 2018, He Jiankui announced that he had edited the genomes of two human embryos, to attempt to disable the CCR5 gene, which codes for a receptor that HIV uses to enter cells. He said that twin girls, Lulu and Nana, had been born a few weeks earlier. He said that the girls still carried functional copies of CCR5 along with disabled CCR5 (mosaicism) and were still vulnerable to HIV. The work was widely condemned as unethical, dangerous, and premature. Currently, germline modification is banned in 40 countries. Scientists who do this type of research often let embryos grow for only a few days, without allowing them to develop into a baby.

Researchers are altering the genome of pigs to induce the growth of human organs to be used in transplants. Scientists are creating "gene drives", changing the genomes of mosquitoes to make them immune to malaria, and then looking to spread the genetically altered mosquitoes throughout the mosquito population in the hopes of eliminating the disease.

Research

Human cells in which some proteins are fused with green fluorescent protein to allow them to be visualised

Genetic engineering is an important tool for natural scientists, with the creation of transgenic organisms being one of the most important tools for the analysis of gene function. Genes and other genetic information from a wide range of organisms can be inserted into bacteria for storage and modification, creating genetically modified bacteria in the process. Bacteria are cheap, easy to grow, clonal, multiply quickly, are relatively easy to transform and can be stored at -80 °C almost indefinitely. Once a gene is isolated it can be stored inside the bacteria, providing an unlimited supply for research. Organisms are genetically engineered to discover the functions of certain genes: the effect on the phenotype of the organism, where the gene is expressed, or what other genes it interacts with. These experiments generally involve loss of function, gain of function, tracking and expression.

  • Loss of function experiments, such as in a gene knockout experiment, in which an organism is engineered to lack the activity of one or more genes. In a simple knockout a copy of the desired gene has been altered to make it non-functional. Embryonic stem cells incorporate the altered gene, which replaces the already present functional copy. These stem cells are injected into blastocysts, which are implanted into surrogate mothers. This allows the experimenter to analyse the defects caused by this mutation and thereby determine the role of particular genes. It is used especially frequently in developmental biology. When this is done by creating a library of genes with point mutations at every position in the area of interest, or even every position in the whole gene, this is called "scanning mutagenesis". The simplest method, and the first to be used, is "alanine scanning", where every position in turn is mutated to the unreactive amino acid alanine (a minimal sketch of this follows the list below).
  • Gain of function experiments, the logical counterpart of knockouts. These are sometimes performed in conjunction with knockout experiments to more finely establish the function of the desired gene. The process is much the same as that in knockout engineering, except that the construct is designed to increase the function of the gene, usually by providing extra copies of the gene or inducing synthesis of the protein more frequently. Gain of function is used to tell whether or not a protein is sufficient for a function, but does not always mean it is required, especially when dealing with genetic or functional redundancy.
  • Tracking experiments, which seek to gain information about the localisation and interaction of the desired protein. One way to do this is to replace the wild-type gene with a 'fusion' gene, which is a juxtaposition of the wild-type gene with a reporting element such as green fluorescent protein (GFP) that will allow easy visualisation of the products of the genetic modification. While this is a useful technique, the manipulation can destroy the function of the gene, creating secondary effects and possibly calling into question the results of the experiment. More sophisticated techniques are now in development that can track protein products without mitigating their function, such as the addition of small sequences that will serve as binding motifs to monoclonal antibodies.
  • Expression studies aim to discover where and when specific proteins are produced. In these experiments, the DNA sequence before the DNA that codes for a protein, known as a gene's promoter, is reintroduced into an organism with the protein coding region replaced by a reporter gene such as GFP or an enzyme that catalyses the production of a dye. Thus the time and place where a particular protein is produced can be observed. Expression studies can be taken a step further by altering the promoter to find which pieces are crucial for the proper expression of the gene and are actually bound by transcription factor proteins; this process is known as promoter bashing.
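As a minimal illustration of the alanine-scanning idea mentioned under loss-of-function experiments above, the sketch below generates every single-position alanine mutant of a short, made-up peptide sequence.

```python
# Sketch of in-silico alanine scanning: mutate each residue of a
# (made-up) peptide to alanine, one position at a time.
def alanine_scan(peptide):
    """Yield (position, mutant_sequence) for each non-alanine residue."""
    for i, residue in enumerate(peptide):
        if residue != "A":  # positions that are already Ala are skipped
            yield i + 1, peptide[:i] + "A" + peptide[i + 1:]

for position, mutant in alanine_scan("MKTWQE"):
    print(position, mutant)
```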

Industrial

Products of genetic engineering

Organisms can have their cells transformed with a gene coding for a useful protein, such as an enzyme, so that they will overexpress the desired protein. Mass quantities of the protein can then be manufactured by growing the transformed organism in bioreactor equipment using industrial fermentation, and then purifying the protein. Some genes do not work well in bacteria, so yeast, insect cells or mammalian cells can also be used. These techniques are used to produce medicines such as insulin, human growth hormone, and vaccines, supplements such as tryptophan, and to aid in the production of food (chymosin in cheese making) and fuels. Other applications of genetically engineered bacteria could involve making them perform tasks outside their natural cycle, such as making biofuels, cleaning up oil spills, carbon and other toxic waste, and detecting arsenic in drinking water. Certain genetically modified microbes can also be used in biomining and bioremediation, due to their ability to extract heavy metals from their environment and incorporate them into compounds that are more easily recoverable.

In materials science, a genetically modified virus has been used in a research laboratory as a scaffold for assembling a more environmentally friendly lithium-ion battery. Bacteria have also been engineered to function as sensors by expressing a fluorescent protein under certain environmental conditions.

Agriculture

Bt-toxins present in peanut leaves (bottom image) protect it from extensive damage caused by lesser cornstalk borer larvae (top image).

One of the best-known and controversial applications of genetic engineering is the creation and use of genetically modified crops or genetically modified livestock to produce genetically modified food. Crops have been developed to increase production, increase tolerance to abiotic stresses, alter the composition of the food, or to produce novel products.

The first crops to be released commercially on a large scale provided protection from insect pests or tolerance to herbicides. Fungal and virus resistant crops have also been developed or are in development. This makes the insect and weed management of crops easier and can indirectly increase crop yield. GM crops that directly improve yield by accelerating growth or making the plant more hardy (by improving salt, cold or drought tolerance) are also under development. In 2016, salmon genetically modified with growth hormones to reach normal adult size much faster went on sale.

GMOs have been developed that modify the quality of produce by increasing the nutritional value or providing more industrially useful qualities or quantities. The Amflora potato produces a more industrially useful blend of starches. Soybeans and canola have been genetically modified to produce more healthy oils. The first commercialised GM food was a tomato that had delayed ripening, increasing its shelf life.

Plants and animals have been engineered to produce materials they do not normally make. Pharming uses crops and animals as bioreactors to produce vaccines, drug intermediates, or the drugs themselves; the useful product is purified from the harvest and then used in the standard pharmaceutical production process. Cows and goats have been engineered to express drugs and other proteins in their milk, and in 2009 the FDA approved a drug produced in goat milk.

Other applications

Genetic engineering has potential applications in conservation and natural area management. Gene transfer through viral vectors has been proposed as a means of controlling invasive species as well as vaccinating threatened fauna from disease. Transgenic trees have been suggested as a way to confer resistance to pathogens in wild populations. With the increasing risks of maladaptation in organisms as a result of climate change and other perturbations, facilitated adaptation through gene tweaking could be one solution to reducing extinction risks. Applications of genetic engineering in conservation are thus far mostly theoretical and have yet to be put into practice.

Genetic engineering is also being used to create microbial art. Some bacteria have been genetically engineered to create black and white photographs. Novelty items such as lavender-colored carnations, blue roses, and glowing fish have also been produced through genetic engineering.

Regulation

The regulation of genetic engineering concerns the approaches taken by governments to assess and manage the risks associated with the development and release of GMOs. The development of a regulatory framework began in 1975, at Asilomar, California. The Asilomar meeting recommended a set of voluntary guidelines regarding the use of recombinant technology. As the technology improved, the US established a committee at the Office of Science and Technology Policy, which assigned regulatory approval of GM food to the USDA, FDA and EPA. The Cartagena Protocol on Biosafety, an international treaty that governs the transfer, handling, and use of GMOs, was adopted on 29 January 2000. One hundred and fifty-seven countries are members of the Protocol and many use it as a reference point for their own regulations.

The legal and regulatory status of GM foods varies by country, with some nations banning or restricting them, and others permitting them with widely differing degrees of regulation. Some countries allow the import of GM food with authorisation, but either do not allow its cultivation (Russia, Norway, Israel) or have provisions for cultivation even though no GM products are yet produced (Japan, South Korea). Most countries that do not allow GMO cultivation do permit research. The most marked differences occur between the US and Europe. US policy focuses on the product (not the process), only looks at verifiable scientific risks and uses the concept of substantial equivalence. The European Union, by contrast, has possibly the most stringent GMO regulations in the world. All GMOs, along with irradiated food, are considered "new food" and subject to extensive, case-by-case, science-based food evaluation by the European Food Safety Authority. The criteria for authorisation fall in four broad categories: "safety", "freedom of choice", "labelling", and "traceability". The level of regulation in other countries that cultivate GMOs lies between that of Europe and the United States.

Regulatory agencies by geographical region

  • US: USDA, FDA and EPA.
  • Europe: European Food Safety Authority.
  • Canada: Health Canada and the Canadian Food Inspection Agency; products with novel features are regulated regardless of method of origin.
  • Africa: Common Market for Eastern and Southern Africa; the final decision lies with each individual country.
  • China: Office of Agricultural Genetic Engineering Biosafety Administration.
  • India: Institutional Biosafety Committee, Review Committee on Genetic Manipulation and Genetic Engineering Approval Committee.
  • Argentina: National Agricultural Biotechnology Advisory Committee (environmental impact), the National Service of Health and Agrifood Quality (food safety) and the National Agribusiness Direction (effect on trade); the final decision is made by the Secretariat of Agriculture, Livestock, Fishery and Food.
  • Brazil: National Biosafety Technical Commission (environmental and food safety) and the Council of Ministers (commercial and economic issues).
  • Australia: Office of the Gene Technology Regulator (oversees all GM products), Therapeutic Goods Administration (GM medicines) and Food Standards Australia New Zealand (GM food); individual state governments can then assess the impact of release on markets and trade and apply further legislation to control approved genetically modified products.

One of the key issues concerning regulators is whether GM products should be labeled. The European Commission says that mandatory labeling and traceability are needed to allow for informed choice, avoid potential false advertising and facilitate the withdrawal of products if adverse effects on health or the environment are discovered. The American Medical Association and the American Association for the Advancement of Science say that absent scientific evidence of harm even voluntary labeling is misleading and will falsely alarm consumers. Labeling of GMO products in the marketplace is required in 64 countries. Labeling can be mandatory up to a threshold GM content level (which varies between countries) or voluntary. In Canada and the US labeling of GM food is voluntary, while in Europe all food (including processed food) or feed which contains greater than 0.9% of approved GMOs must be labelled.

Controversy

Critics have objected to the use of genetic engineering on several grounds, including ethical, ecological and economic concerns. Many of these concerns involve GM crops and whether food produced from them is safe and what impact growing them will have on the environment. These controversies have led to litigation, international trade disputes, and protests, and to restrictive regulation of commercial products in some countries.

Accusations that scientists are "playing God" and other religious issues have been ascribed to the technology from the beginning. Other ethical issues raised include the patenting of life, the use of intellectual property rights, the level of labeling on products, control of the food supply and the objectivity of the regulatory process. Although doubts have been raised, most economic studies have found growing GM crops to be beneficial to farmers.

Gene flow between GM crops and compatible plants, along with increased use of selective herbicides, can increase the risk of "superweeds" developing. Other environmental concerns involve potential impacts on non-target organisms, including soil microbes, and an increase in secondary and resistant insect pests. Many of the environmental impacts regarding GM crops may take many years to be understood and are also evident in conventional agriculture practices. With the commercialisation of genetically modified fish there are concerns over what the environmental consequences will be if they escape.

There are three main concerns over the safety of genetically modified food: whether they may provoke an allergic reaction; whether the genes could transfer from the food into human cells; and whether the genes not approved for human consumption could outcross to other crops. There is a scientific consensus that currently available food derived from GM crops poses no greater risk to human health than conventional food, but that each GM food needs to be tested on a case-by-case basis before introduction. Nonetheless, members of the public are less likely than scientists to perceive GM foods as safe.

In popular culture

Genetic engineering features in many science fiction stories. Frank Herbert's novel The White Plague described the deliberate use of genetic engineering to create a pathogen which specifically killed women. Another of Herbert's creations, the Dune series of novels, uses genetic engineering to create the powerful but despised Tleilaxu. Films such as The Island and Blade Runner bring the engineered creature to confront the person who created it or the being it was cloned from. Few films have informed audiences about genetic engineering, with the exception of the 1978 film The Boys from Brazil and the 1993 film Jurassic Park, both of which made use of a lesson, a demonstration, and a clip of scientific film.

Genetic engineering methods are weakly represented in film; Michael Clark, writing for The Wellcome Trust, calls the portrayal of genetic engineering and biotechnology "seriously distorted" in films such as The 6th Day. In Clark's view, the biotechnology is typically "given fantastic but visually arresting forms" while the science is either relegated to the background or fictionalised to suit a young audience.

Drug design

From Wikipedia, the free encyclopedia

Drug discovery cycle schematic

Drug design, often referred to as rational drug design or simply rational design, is the inventive process of finding new medications based on the knowledge of a biological target. The drug is most commonly an organic small molecule that activates or inhibits the function of a biomolecule such as a protein, which in turn results in a therapeutic benefit to the patient. In the most basic sense, drug design involves the design of molecules that are complementary in shape and charge to the biomolecular target with which they interact and therefore will bind to it. Drug design frequently but not necessarily relies on computer modeling techniques. This type of modeling is sometimes referred to as computer-aided drug design. Finally, drug design that relies on the knowledge of the three-dimensional structure of the biomolecular target is known as structure-based drug design. In addition to small molecules, biopharmaceuticals including peptides and especially therapeutic antibodies are an increasingly important class of drugs and computational methods for improving the affinity, selectivity, and stability of these protein-based therapeutics have also been developed.

The phrase "drug design" is to some extent a misnomer. A more accurate term is ligand design (i.e., design of a molecule that will bind tightly to its target). Although design techniques for prediction of binding affinity are reasonably successful, there are many other properties, such as bioavailability, metabolic half-life, side effects, etc., that first must be optimized before a ligand can become a safe and efficacious drug. These other characteristics are often difficult to predict with rational design techniques. Nevertheless, due to high attrition rates, especially during clinical phases of drug development, more attention is being focused early in the drug design process on selecting candidate drugs whose physicochemical properties are predicted to result in fewer complications during development and hence more likely to lead to an approved, marketed drug. Furthermore, in vitro experiments complemented with computation methods are increasingly used in early drug discovery to select compounds with more favorable ADME (absorption, distribution, metabolism, and excretion) and toxicological profiles.

Drug targets

A biomolecular target (most commonly a protein or a nucleic acid) is a key molecule involved in a particular metabolic or signaling pathway that is associated with a specific disease condition or pathology, or with the infectivity or survival of a microbial pathogen. Potential drug targets are not necessarily disease causing but must by definition be disease modifying. In some cases, small molecules will be designed to enhance or inhibit the target function in the specific disease modifying pathway. Small molecules (for example receptor agonists, antagonists, inverse agonists, or modulators; enzyme activators or inhibitors; or ion channel openers or blockers) will be designed that are complementary to the binding site of the target. Small molecules (drugs) can be designed so as not to affect any other important "off-target" molecules (often referred to as antitargets), since drug interactions with off-target molecules may lead to undesirable side effects. Due to similarities in binding sites, closely related targets identified through sequence homology have the highest chance of cross reactivity and hence the highest side effect potential.

Most commonly, drugs are organic small molecules produced through chemical synthesis, but biopolymer-based drugs (also known as biopharmaceuticals) produced through biological processes are becoming increasingly more common. In addition, mRNA-based gene silencing technologies may have therapeutic applications.

Rational drug discovery

In contrast to traditional methods of drug discovery (known as forward pharmacology), which rely on trial-and-error testing of chemical substances on cultured cells or animals, and matching the apparent effects to treatments, rational drug design (also called reverse pharmacology) begins with a hypothesis that modulation of a specific biological target may have therapeutic value. In order for a biomolecule to be selected as a drug target, two essential pieces of information are required. The first is evidence that modulation of the target will be disease modifying. This knowledge may come from, for example, disease linkage studies that show an association between mutations in the biological target and certain disease states. The second is that the target is "druggable". This means that it is capable of binding to a small molecule and that its activity can be modulated by the small molecule.

Once a suitable target has been identified, the target is normally cloned, produced and purified. The purified protein is then used to establish a screening assay. In addition, the three-dimensional structure of the target may be determined.

The search for small molecules that bind to the target is begun by screening libraries of potential drug compounds. This may be done by using the screening assay (a "wet screen"). In addition, if the structure of the target is available, a virtual screen may be performed of candidate drugs. Ideally the candidate drug compounds should be "drug-like", that is, they should possess properties that are predicted to lead to oral bioavailability, adequate chemical and metabolic stability, and minimal toxic effects. Several methods are available to estimate druglikeness, such as Lipinski's Rule of Five, and a range of scoring methods such as lipophilic efficiency. Several methods for predicting drug metabolism have also been proposed in the scientific literature.
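As an illustration of a druglikeness check, the sketch below evaluates Lipinski's Rule of Five with the open-source RDKit toolkit (assuming it is installed); aspirin is used only as a familiar example molecule.

```python
# Count Lipinski rule-of-five violations for a molecule with RDKit.
from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

mol = Chem.MolFromSmiles("CC(=O)OC1=CC=CC=C1C(=O)O")  # aspirin

violations = sum([
    Descriptors.MolWt(mol) > 500,        # molecular weight
    Descriptors.MolLogP(mol) > 5,        # lipophilicity (logP)
    Lipinski.NumHDonors(mol) > 5,        # hydrogen-bond donors
    Lipinski.NumHAcceptors(mol) > 10,    # hydrogen-bond acceptors
])
print("Rule-of-five violations:", violations)  # 0 for aspirin
```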

Due to the large number of drug properties that must be simultaneously optimized during the design process, multi-objective optimization techniques are sometimes employed. Finally, because of the limitations in the current methods for prediction of activity, drug design is still very much reliant on serendipity and bounded rationality.

Computer-aided drug design

The most fundamental goal in drug design is to predict whether a given molecule will bind to a target and if so how strongly. Molecular mechanics or molecular dynamics is most often used to estimate the strength of the intermolecular interaction between the small molecule and its biological target. These methods are also used to predict the conformation of the small molecule and to model conformational changes in the target that may occur when the small molecule binds to it. Semi-empirical, ab initio quantum chemistry methods, or density functional theory are often used to provide optimized parameters for the molecular mechanics calculations and also provide an estimate of the electronic properties (electrostatic potential, polarizability, etc.) of the drug candidate that will influence binding affinity.

Molecular mechanics methods may also be used to provide semi-quantitative predictions of the binding affinity. Also, knowledge-based scoring functions may be used to provide binding affinity estimates. These methods use linear regression, machine learning, neural nets or other statistical techniques to derive predictive binding affinity equations by fitting experimental affinities to computationally derived interaction energies between the small molecule and the target.

Ideally, the computational method will be able to predict affinity before a compound is synthesized and hence in theory only one compound needs to be synthesized, saving enormous time and cost. The reality is that present computational methods are imperfect and provide, at best, only qualitatively accurate estimates of affinity. In practice it still takes several iterations of design, synthesis, and testing before an optimal drug is discovered. Computational methods have accelerated discovery by reducing the number of iterations required and have often provided novel structures.

Drug design with the help of computers may be used at any of the following stages of drug discovery:

  1. hit identification using virtual screening (structure- or ligand-based design)
  2. hit-to-lead optimization of affinity and selectivity (structure-based design, QSAR, etc.)
  3. lead optimization of other pharmaceutical properties while maintaining affinity
Flowchart of a common clustering analysis for structure-based drug design

To compensate for the limited accuracy of the binding affinities predicted by current scoring functions, protein-ligand interaction and compound 3D structure information are used for analysis. For structure-based drug design, several post-screening analyses focusing on protein-ligand interaction have been developed for improving enrichment and effectively mining potential candidates (a minimal consensus-scoring sketch follows this list):

  • Consensus scoring
    • Selecting candidates by voting of multiple scoring functions
    • May lose the relationship between protein-ligand structural information and scoring criterion
  • Cluster analysis
    • Represent and cluster candidates according to protein-ligand 3D information
    • Needs meaningful representation of protein-ligand interactions.
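A minimal sketch of the consensus-scoring idea, assuming only NumPy: each compound is ranked under several (invented) scoring functions and the ranks are averaged, so compounds favoured by all functions rise to the top.

```python
# Consensus scoring sketch: average each compound's rank across
# several scoring functions (all numbers invented; lower = better).
import numpy as np

# rows = candidate compounds, columns = three scoring functions
scores = np.array([
    [-9.1, -7.8, -8.5],
    [-6.2, -8.9, -7.0],
    [-8.7, -8.4, -8.8],
])

ranks = scores.argsort(axis=0).argsort(axis=0)  # rank 0 = best per column
consensus = ranks.mean(axis=1)                  # average rank per compound
print("Consensus ranks:", consensus)
print("Best compound index:", consensus.argmin())
```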

Types

Drug discovery cycle highlighting both ligand-based (indirect) and structure-based (direct) drug design strategies.

There are two major types of drug design. The first is referred to as ligand-based drug design and the second, structure-based drug design.

Ligand-based

Ligand-based drug design (or indirect drug design) relies on knowledge of other molecules that bind to the biological target of interest. These other molecules may be used to derive a pharmacophore model that defines the minimum necessary structural characteristics a molecule must possess in order to bind to the target. In other words, a model of the biological target may be built based on the knowledge of what binds to it, and this model in turn may be used to design new molecular entities that interact with the target. Alternatively, a quantitative structure-activity relationship (QSAR) may be derived, in which a correlation is established between calculated properties of molecules and their experimentally determined biological activity. These QSAR relationships in turn may be used to predict the activity of new analogs.
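A toy QSAR sketch, assuming scikit-learn and entirely invented numbers: a linear model is fitted from two calculated descriptors (say, logP and molecular weight) to measured activities, then used to predict a new analog.

```python
# Toy QSAR: linear regression from calculated descriptors to measured
# biological activity. All descriptor and activity values are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

descriptors = np.array([[1.2, 180.2],   # e.g. [logP, molecular weight]
                        [2.5, 206.3],
                        [0.8, 151.2],
                        [3.1, 244.7]])
activity = np.array([5.1, 6.3, 4.2, 6.9])  # e.g. measured pIC50 values

model = LinearRegression().fit(descriptors, activity)
new_analog = np.array([[2.0, 199.8]])
print("Predicted activity:", model.predict(new_analog)[0])
```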

Structure-based

Structure-based drug design (or direct drug design) relies on knowledge of the three-dimensional structure of the biological target obtained through methods such as X-ray crystallography or NMR spectroscopy. If an experimental structure of a target is not available, it may be possible to create a homology model of the target based on the experimental structure of a related protein. Using the structure of the biological target, candidate drugs that are predicted to bind with high affinity and selectivity to the target may be designed using interactive graphics and the intuition of a medicinal chemist. Alternatively, various automated computational procedures may be used to suggest new drug candidates.

Current methods for structure-based drug design can be divided roughly into three main categories. The first method is identification of new ligands for a given receptor by searching large databases of 3D structures of small molecules to find those fitting the binding pocket of the receptor using fast approximate docking programs. This method is known as virtual screening. A second category is de novo design of new ligands. In this method, ligand molecules are built up within the constraints of the binding pocket by assembling small pieces in a stepwise manner. These pieces can be either individual atoms or molecular fragments. The key advantage of such a method is that novel structures, not contained in any database, can be suggested. A third method is the optimization of known ligands by evaluating proposed analogs within the binding cavity.

Binding site identification

Binding site identification is the first step in structure based design. If the structure of the target or a sufficiently similar homolog is determined in the presence of a bound ligand, then the ligand should be observable in the structure in which case location of the binding site is trivial. However, there may be unoccupied allosteric binding sites that may be of interest. Furthermore, it may be that only apoprotein (protein without ligand) structures are available and the reliable identification of unoccupied sites that have the potential to bind ligands with high affinity is non-trivial. In brief, binding site identification usually relies on identification of concave surfaces on the protein that can accommodate drug sized molecules that also possess appropriate "hot spots" (hydrophobic surfaces, hydrogen bonding sites, etc.) that drive ligand binding.

Scoring functions

Structure-based drug design attempts to use the structure of proteins as a basis for designing new ligands by applying the principles of molecular recognition. Selective high affinity binding to the target is generally desirable since it leads to more efficacious drugs with fewer side effects. Thus, one of the most important principles for designing or obtaining potential new ligands is to predict the binding affinity of a certain ligand to its target (and known antitargets) and use the predicted affinity as a criterion for selection.

One early general-purpose empirical scoring function to describe the binding energy of ligands to receptors was developed by Böhm. This empirical scoring function took the form (a toy numeric sketch follows the component list below):

$$\Delta G_{\text{bind}} = \Delta G_0 + \Delta G_{\text{hb}} \sum_{\text{h-bonds}} f(\Delta R, \Delta \alpha) + \Delta G_{\text{ionic}} \sum_{\text{ionic int.}} f(\Delta R, \Delta \alpha) + \Delta G_{\text{lipo}} \lvert A_{\text{lipo}} \rvert + \Delta G_{\text{rot}} N_{\text{rot}}$$

where:

  • ΔG_0 – empirically derived offset that in part corresponds to the overall loss of translational and rotational entropy of the ligand upon binding
  • ΔG_hb – contribution from hydrogen bonding
  • ΔG_ionic – contribution from ionic interactions
  • ΔG_lipo – contribution from lipophilic interactions, where |A_lipo| is the surface area of lipophilic contact between the ligand and receptor
  • ΔG_rot – entropy penalty for freezing a rotatable bond in the ligand upon binding
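The arithmetic of such an empirical score can be sketched in a few lines. The weights and interaction counts below are invented placeholders (with plausible signs), not Böhm's fitted values, and the geometry penalty f(ΔR, Δα) is simplified to 1.

```python
# Toy Böhm-style empirical score: a weighted sum of interaction terms.
# Weights (kJ/mol) are invented placeholders; f(dR, da) is taken as 1.
def empirical_score(n_hbonds, n_ionic, lipo_area, n_rot,
                    dg0=5.0, dg_hb=-4.5, dg_ionic=-8.0,
                    dg_lipo=-0.2, dg_rot=1.5):
    """Estimated binding energy: more negative = tighter binding."""
    return (dg0 + dg_hb * n_hbonds + dg_ionic * n_ionic
            + dg_lipo * lipo_area + dg_rot * n_rot)

# A hypothetical ligand: 3 H-bonds, 1 salt bridge, 120 A^2 of
# lipophilic contact, 4 rotatable bonds frozen on binding.
print(empirical_score(3, 1, 120.0, 4))  # kJ/mol
```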

A more general thermodynamic "master" equation is as follows:

$$\Delta G_{\text{bind}} = \Delta G_{\text{desolvation}} + \Delta G_{\text{motion}} + \Delta G_{\text{configuration}} + \Delta G_{\text{interaction}}$$

where:

  • desolvation – enthalpic penalty for removing the ligand from solvent
  • motion – entropic penalty for reducing the degrees of freedom when a ligand binds to its receptor
  • configuration – conformational strain energy required to put the ligand in its "active" conformation
  • interaction – enthalpic gain for "resolvating" the ligand with its receptor

The basic idea is that the overall binding free energy can be decomposed into independent components that are known to be important for the binding process. Each component reflects a certain kind of free energy alteration during the binding process between a ligand and its target receptor. The master equation is the linear combination of these components. According to the Gibbs free energy equation, ΔG = RT ln K_d, the dissociation equilibrium constant K_d can be related directly to the sum of these free energy components.

Various computational methods are used to estimate each of the components of the master equation. For example, the change in polar surface area upon ligand binding can be used to estimate the desolvation energy. The number of rotatable bonds frozen upon ligand binding is proportional to the motion term. The configurational or strain energy can be estimated using molecular mechanics calculations. Finally the interaction energy can be estimated using methods such as the change in non polar surface, statistically derived potentials of mean force, the number of hydrogen bonds formed, etc. In practice, the components of the master equation are fit to experimental data using multiple linear regression. This can be done with a diverse training set including many types of ligands and receptors to produce a less accurate but more general "global" model or a more restricted set of ligands and receptors to produce a more accurate but less general "local" model.
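A minimal sketch of that fitting step, using NumPy least squares on invented data: each row holds estimates of the four master-equation components for one ligand-receptor pair, and the weights are fitted to measured binding free energies.

```python
# Fit master-equation component weights to experimental binding free
# energies with multiple linear regression. All values are invented.
import numpy as np

# columns: desolvation, motion, configuration, interaction (kcal/mol)
X = np.array([
    [2.1, 1.5, 0.8, -15.2],
    [3.0, 2.2, 1.1, -18.9],
    [1.4, 0.9, 0.5, -11.3],
    [2.6, 1.8, 0.7, -16.8],
])
y = np.array([-10.2, -12.1, -8.0, -11.5])  # measured dG_bind (kcal/mol)

weights, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)
print("Fitted component weights:", weights)
```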

Examples

A particular example of rational drug design involves the use of three-dimensional information about biomolecules obtained from such techniques as X-ray crystallography and NMR spectroscopy. Computer-aided drug design in particular becomes much more tractable when there is a high-resolution structure of a target protein bound to a potent ligand. This approach to drug discovery is sometimes referred to as structure-based drug design. The first unequivocal example of the application of structure-based drug design leading to an approved drug is the carbonic anhydrase inhibitor dorzolamide, which was approved in 1995.

Another important case study in rational drug design is imatinib, a tyrosine kinase inhibitor designed specifically for the bcr-abl fusion protein that is characteristic for Philadelphia chromosome-positive leukemias (chronic myelogenous leukemia and occasionally acute lymphocytic leukemia). Imatinib is substantially different from previous drugs for cancer, as most agents of chemotherapy simply target rapidly dividing cells, not differentiating between cancer cells and other tissues.

Criticism

It has been argued that the highly rigid and focused nature of rational drug design suppresses serendipity in drug discovery. Because many of the most significant medical discoveries have been inadvertent, the recent focus on rational drug design may limit the progress of drug discovery. Furthermore, the rational design of a drug may be limited by a crude or incomplete understanding of the underlying molecular processes of the disease it is intended to treat.

Environmental degradation

From Wikipedia, the free encyclopedia
 
Eighty-plus years after the abandonment of Wallaroo Mines (Kadina, South Australia), mosses remain the only vegetation at some spots of the site's grounds.

Environmental degradation is the deterioration of the environment through depletion of resources such as air, water and soil; the destruction of ecosystems; habitat destruction; the extinction of wildlife; and pollution. It is defined as any change or disturbance to the environment perceived to be deleterious or undesirable.

Environmental degradation is one of the ten threats officially cautioned by the High-level Panel on Threats, Challenges and Change of the United Nations. The United Nations International Strategy for Disaster Reduction defines environmental degradation as "the reduction of the capacity of the environment to meet social and ecological objectives, and needs". Environmental degradation comes in many types. When natural habitats are destroyed or natural resources are depleted, the environment is degraded. Efforts to counteract this problem include environmental protection and environmental resources management.

Biodiversity loss

Deforestation in the Maranhão state, Brazil, 2016

Scientists assert that human activity has pushed the earth into a sixth mass extinction event. The loss of biodiversity has been attributed in particular to human overpopulation, continued human population growth and overconsumption of natural resources by the world's wealthy. A 2020 report by the World Wildlife Fund found that human activity, specifically overconsumption, population growth and intensive farming, has driven an average 68% decline in monitored vertebrate wildlife populations since 1970. The Global Assessment Report on Biodiversity and Ecosystem Services, published by the United Nations' IPBES in 2019, posits that roughly one million species of plants and animals face extinction from anthropogenic causes, such as expanding human land use for industrial agriculture and livestock rearing, along with overfishing.

Since the establishment of agriculture over 11,000 years ago, humans have altered roughly 70% of the earth's land surface, with the global biomass of vegetation being reduced by half, and terrestrial animal communities seeing a decline in biodiversity greater than 20% on average.

The implications of these losses for human livelihoods and wellbeing have raised serious concerns. With regard to the agriculture sector, for example, The State of the World’s Biodiversity for Food and Agriculture, published by the Food and Agriculture Organization of the United Nations in 2019, states that “countries report that many species that contribute to vital ecosystem services, including pollinators, the natural enemies of pests, soil organisms and wild food species, are in decline as a consequence of the destruction and degradation of habitats, overexploitation, pollution and other threats” and that “key ecosystems that deliver numerous services essential to food and agriculture, including supply of freshwater, protection against hazards and provision of habitat for species such as fish and pollinators, are declining.”

Water degradation

Ethiopia's move to fill the Grand Ethiopian Renaissance Dam's reservoir could reduce Nile flows by as much as 25% and devastate Egyptian farmlands.

One major component of environmental degradation is the depletion of fresh water on Earth. Only about 2.5% of all the water on Earth is fresh water, the rest being salt water. Some 69% of that fresh water is frozen in the ice caps of Antarctica and Greenland, so only about 30% of the 2.5% is available for consumption. Fresh water is an exceptionally important resource, since life on Earth is ultimately dependent on it. Water transports nutrients, minerals, and chemicals within the biosphere to all forms of life, sustains both plants and animals, and shapes the surface of the Earth through the transport and deposition of materials.

The current top three uses of fresh water account for 95% of its consumption: approximately 85% is used for irrigation of farmland, golf courses, and parks; 6% for domestic purposes such as indoor bathing and outdoor garden and lawn use; and 4% for industrial purposes such as processing, washing, and cooling in manufacturing centres. It is estimated that one in three people across the globe already faces water shortages, almost one-fifth of the world's population lives in areas of physical water scarcity, and almost one quarter lives in a developing country that lacks the infrastructure needed to use water from available rivers and aquifers. Water scarcity is a growing problem owing to foreseeable pressures, including population growth, increased urbanization, higher standards of living, and climate change.
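
As a quick back-of-the-envelope check of the percentages quoted above, the following sketch computes the accessible freshwater fraction and confirms that the three dominant uses sum to 95%:

# Arithmetic check of the freshwater figures quoted in the text.
fresh_fraction = 0.025      # ~2.5% of Earth's water is fresh
frozen_fraction = 0.69      # ~69% of that is locked in ice caps
available = fresh_fraction * (1 - frozen_fraction)
print(f"Accessible fresh water: {available:.2%} of all water on Earth")  # roughly 0.8%

uses = {"irrigation": 0.85, "domestic": 0.06, "industrial": 0.04}
print(f"Top three uses together: {sum(uses.values()):.0%} of consumption")  # 95%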

Climate change and temperature

Climate change affects the Earth's water supply in a large number of ways. The mean global temperature is predicted to rise in the coming years due to a number of forces affecting the climate, and the amount of atmospheric carbon dioxide (CO2) will rise with it; both will influence water resources. Evaporation depends strongly on temperature and moisture availability, which can ultimately affect the amount of water available to replenish groundwater supplies.

A rise in atmospheric CO2 can affect transpiration from plants, decreasing their use of water, though it can also raise water use through possible increases in leaf area. Rising temperatures can shorten the winter snow season and intensify snowmelt, leading to earlier and higher peak runoff, which affects soil moisture, flood and drought risks, and storage requirements, depending on the area.

Warmer winter temperatures cause a decrease in snowpack, which can result in diminished water resources during summer. This is especially important at mid-latitudes and in mountain regions that depend on glacial runoff to replenish their river systems and groundwater supplies, making these areas increasingly vulnerable to water shortages over time: an increase in temperature initially produces a rapid rise in summer meltwater from glaciers, followed by glacial retreat and a shrinking melt, and hence a smaller water supply each year as the glaciers dwindle.
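
The snowmelt dynamic sketched above is often simplified in hydrology as a degree-day model, in which daily melt is proportional to how far the mean temperature exceeds a threshold. The degree-day factor and temperature series below are illustrative assumptions, not calibrated values:

# Degree-day snowmelt model: melt = factor * max(T - base, 0).
# Typical degree-day factors for snow are roughly 3-8 mm per degC per day.

def daily_melt(mean_temp_c, degree_day_factor=4.0, base_temp_c=0.0):
    """Snowmelt in mm of water equivalent for one day."""
    return degree_day_factor * max(mean_temp_c - base_temp_c, 0.0)

snowpack_mm = 500.0  # assumed snow water equivalent at the start of the season
for temp in [-2.0, 1.0, 3.5, 6.0, 8.0]:  # hypothetical daily mean temperatures
    melt = min(daily_melt(temp), snowpack_mm)
    snowpack_mm -= melt
    print(f"T={temp:+.1f} C  melt={melt:5.1f} mm  snowpack={snowpack_mm:6.1f} mm")

Warmer winters raise both the number of melt days and the melt per day, which is why peak runoff arrives earlier and the summer reserve shrinks.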

Thermal expansion of water and increased melting of oceanic glaciers from rising temperatures lead to a rise in sea level, which can also affect the fresh water supply of coastal areas: as higher-salinity water at river mouths and deltas is pushed further inland, saltwater intrusion raises the salinity of reservoirs and aquifers. Sea-level rise is also driven by the depletion of groundwater, since climate change affects the hydrologic cycle in a number of ways. Uneven global changes in temperature and precipitation produce local water surpluses and deficits, but the net global decline in groundwater storage implies a contribution to sea-level rise even after meltwater and thermal expansion are accounted for, providing a positive feedback to the problems sea-level rise causes for the fresh-water supply.
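
The thermal-expansion contribution can be roughed out from the expansion coefficient of seawater: a warmed layer expands by approximately alpha * depth * dT. Both input values below are order-of-magnitude assumptions for illustration only:

# Sea-level rise from thermal expansion of an upper-ocean layer.
alpha = 2.0e-4    # thermal expansion coefficient of seawater, per degC (order of magnitude)
depth_m = 700.0   # assumed depth of the layer that warms uniformly
delta_t = 0.5     # assumed warming of that layer, degC

rise_m = alpha * depth_m * delta_t
print(f"Sea-level rise from expansion alone: ~{rise_m * 100:.0f} cm")  # ~7 cm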

A rise in air temperature results in a rise in water temperature, which is also very significant for water degradation, as warmer water is more susceptible to bacterial growth. An increase in water temperature can also greatly affect ecosystems, because many species are sensitive to temperature and because warmer water holds less dissolved oxygen, impairing a body of water's capacity for self-purification.

Climate change and precipitation

A rise in global temperatures is also predicted to correlate with an increase in global precipitation, but a decline in water quality is probable because of increased runoff, floods, soil erosion, and mass movement of land: while the water will carry more nutrients, it will also carry more contaminants. Although most of the attention on climate change is directed at global warming and the greenhouse effect, some of the most severe effects are likely to come from changes in precipitation, evapotranspiration, runoff, and soil moisture. It is generally expected that, on average, global precipitation will increase, with some areas receiving increases and others decreases.

Climate models show that while some regions should expect an increase in precipitation, such as the tropics and higher latitudes, other areas are expected to see a decrease, such as the subtropics; this will ultimately cause a latitudinal variation in water distribution. The areas receiving more precipitation are also expected to receive that increase during winter and to become drier during summer, creating even greater variation in the distribution of precipitation. Naturally, the distribution of precipitation across the planet is already very uneven, causing constant variation in water availability from place to place.

Changes in precipitation affect the timing and magnitude of floods and droughts, shift runoff processes, and alter groundwater recharge rates. Vegetation patterns and growth rates will be directly affected by shifts in precipitation amount and distribution, which will in turn affect agriculture as well as natural ecosystems. Decreased precipitation will deprive areas of water, causing water tables to fall and wetlands, rivers, lakes, and reservoirs to empty. In addition, evaporation and evapotranspiration may increase, depending on the accompanying rise in temperature. Groundwater reserves will be depleted, and the remaining water has a greater chance of being of poor quality from salinity or from contaminants on the land surface.

Population growth

Graph of human population from 10000 BCE to 2000 CE, showing the exponential rise in world population since the end of the seventeenth century.

The human population on Earth is expanding exponentially, and this goes hand in hand with large-scale degradation of the environment. Humanity's growing demands are upsetting the environment's natural equilibrium. Production industries are venting smoke and discharging chemicals that pollute water resources. The smoke emitted into the atmosphere contains detrimental gases such as carbon monoxide and sulphur dioxide, and high levels of these pollutants accumulate in the atmosphere. Organic compounds such as chlorofluorocarbons (CFCs) have created an unwanted opening in the ozone layer, which admits higher levels of ultraviolet radiation, putting the globe at greater risk.
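
For intuition about what exponential growth implies, the sketch below evaluates P(t) = P0 * exp(r * t) and the corresponding doubling time ln(2) / r. The growth rate used is an assumed, illustrative figure, not a measured one:

import math

p0 = 7.8e9  # assumed starting population
r = 0.01    # assumed 1% annual growth rate (illustrative)

doubling_time = math.log(2) / r
print(f"Doubling time at {r:.0%}/yr: {doubling_time:.0f} years")  # about 69 years

for years in (10, 25, 50):
    print(f"After {years} years: {p0 * math.exp(r * years) / 1e9:.1f} billion")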

The available fresh water, already affected by climate, is also being stretched across an ever-increasing global population. It is estimated that almost a quarter of the global population lives in an area that uses more than 20% of its renewable water supply; water use will rise with population, while the water supply is further strained by climate-driven decreases in streamflow and groundwater. Even though some areas may see an increase in freshwater supply from unevenly distributed precipitation increases, greater use of the water supply is expected.

An increased population means increased withdrawals from the water supply for domestic, agricultural, and industrial uses, the largest of these being agriculture, believed to be the major non-climate driver of environmental change and water deterioration. The next 50 years will likely be the last period of rapid agricultural expansion, but the larger and wealthier population over this time will demand more agriculture.

Population growth over the last two decades, at least in the United States, has also been accompanied by a shift from rural to urban areas, which concentrates the demand for water in certain areas and puts stress on the fresh water supply from industrial and human contaminants. Urbanization causes overcrowding and increasingly unsanitary living conditions, especially in developing countries, which in turn exposes an increasing number of people to disease. About 79% of the world's population lives in developing countries, which lack access to sanitary water and sewer systems, giving rise to disease and deaths from contaminated water and increased numbers of disease-carrying insects.

Agriculture

Water pollution due to dairy farming in the Wairarapa in New Zealand

Agriculture is dependent on available soil moisture, which is directly affected by climate dynamics, with precipitation being the input in this system and various processes being the output, such as evapotranspiration, surface runoff, drainage, and percolation into groundwater. Changes in climate, especially the changes in precipitation and evapotranspiration predicted by climate models, will directly affect soil moisture, surface runoff, and groundwater recharge.
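
This input-output system can be sketched as a simple "bucket" water balance, with precipitation filling the soil store and evapotranspiration, runoff, and percolation draining it. All capacities and daily values below are illustrative assumptions:

# "Bucket" soil-moisture balance: soil gains precipitation, loses
# evapotranspiration, and any excess above capacity becomes runoff/recharge.

capacity_mm = 150.0  # assumed soil water-holding capacity
soil_mm = 75.0       # assumed initial soil moisture

def step(precip_mm, et_mm, soil_mm):
    """Advance soil moisture by one day; return (new soil moisture, excess)."""
    soil_mm = max(soil_mm + precip_mm - et_mm, 0.0)
    excess = max(soil_mm - capacity_mm, 0.0)  # runoff plus percolation
    return soil_mm - excess, excess

for precip, et in [(0, 4), (30, 3), (90, 2), (0, 5)]:  # hypothetical daily values, mm
    soil_mm, excess = step(precip, et, soil_mm)
    print(f"P={precip:3d} ET={et}  soil={soil_mm:6.1f} mm  excess={excess:5.1f} mm")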

In areas with decreasing precipitation, as predicted by the climate models, soil moisture may be substantially reduced. Agriculture in most areas already requires irrigation, which depletes fresh water supplies both through the physical use of the water and through the degradation agriculture causes to it. Irrigation increases salt and nutrient content in areas that would not normally be affected, and damages streams and rivers through damming and removal of water. Fertilizer, together with human and livestock waste, eventually enters groundwater, while nitrogen, phosphorus, and other chemicals from fertilizer can acidify both soils and water. Certain agricultural demands may grow faster than others as the global population becomes wealthier; demand for meat, for instance, is expected to double by 2050, which directly affects the global supply of fresh water. Cattle need more drinking water when temperatures are high and humidity is low, and more in extensive production systems, where finding food takes greater effort. Water is also needed to process the meat and to produce feed for the livestock. Manure can contaminate bodies of fresh water, and slaughterhouses, depending on how well they are managed, contribute waste such as blood, fat, hair, and other bodily contents to fresh water supplies.

The transfer of water from agricultural to urban and suburban use raises concerns about agricultural sustainability, rural socioeconomic decline, food security, an increased carbon footprint from imported food, and a weakened foreign trade balance. The depletion of fresh water in specific, populated areas increases water scarcity among the population and makes populations susceptible to economic, social, and political conflict in a number of ways: rising sea levels force migration from coastal areas to areas farther inland, pushing populations closer together and breaching borders and other geographical patterns, while agricultural surpluses and deficits driven by the availability of water induce trade problems and strain the economies of certain areas. Climate change is an important cause of involuntary migration and forced displacement. According to the Food and Agriculture Organization of the United Nations, global greenhouse gas emissions from animal agriculture exceed those from transportation.

Water management

A stream in the town of Amlwch, Anglesey, which is contaminated by acid mine drainage from the former copper mine at nearby Parys Mountain

The issue of the depletion of fresh water has stimulated increased efforts in water management. While water management systems are often flexible, adaptation to new hydrologic conditions may be very costly. Preventative approaches are necessary to avoid high costs of inefficiency and the need for rehabilitation of water supplies, and innovations to decrease overall demand may be important in planning water sustainability.

Water supply systems, as they exist now, were based on assumptions of the current climate and built to accommodate existing river flows and flood frequencies. Reservoirs are operated on the basis of past hydrologic records, and irrigation systems on historical temperature, water availability, and crop water requirements; these may not be a reliable guide to the future. Re-examining engineering designs, operations, optimizations, and planning, and re-evaluating legal, technical, and economic approaches to managing water resources, is very important for the future of water management in response to water degradation. Another approach is water privatization: whatever its economic and cultural effects, service quality and the overall quality of the water can be more easily controlled and distributed. A rational, sustainable approach is appropriate, and it requires limits on overexploitation and pollution as well as efforts in conservation.
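
To see why operating rules tuned to past records can fail, the toy mass balance below runs the same fixed release rule (storage plus inflow minus release, capped at capacity) against a historical-like inflow sequence and a drier one. All numbers are illustrative:

# Toy reservoir mass balance under a release rule designed from past inflows.
capacity = 1000.0       # reservoir capacity (arbitrary volume units)
initial_storage = 600.0
target_release = 80.0   # fixed release chosen from historical records

historical_like = [100, 90, 110, 95]  # inflows the rule was designed around
drier_future = [60, 50, 70, 55]       # inflows under an assumed drier climate

for label, inflows in [("historical", historical_like), ("drier", drier_future)]:
    storage = initial_storage
    for inflow in inflows:
        release = min(target_release, storage + inflow)  # cannot release more than exists
        storage = min(storage + inflow - release, capacity)  # spill above capacity
    print(f"{label:10s} final storage: {storage:6.1f}")

Under the drier sequence the same rule steadily drains storage, which is exactly the kind of mismatch that re-examined designs and operations are meant to catch.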

Bayesian inference

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Bayesian_inference

Bayesian inference ...