Friday, January 10, 2020

Anthropocene

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Anthropocene

The Anthropocene (/ænˈθrɒpəsiːn, -ˈθrɒpoʊ-/ ann-THROP-ə-seen, -THROP-oh-) is a proposed geological epoch dating from the commencement of significant human impact on Earth's geology and ecosystems, including, but not limited to, anthropogenic climate change.

As of June 2019, neither the International Commission on Stratigraphy (ICS) nor the International Union of Geological Sciences (IUGS) has officially approved the term as a recognised subdivision of geologic time, although the Anthropocene Working Group (AWG) of the Subcommission on Quaternary Stratigraphy (SQS) of the ICS voted in April 2016 to proceed towards a formal golden spike (GSSP) proposal to define the Anthropocene epoch in the geologic time scale, and presented the recommendation to the International Geological Congress in August 2016. In May 2019, the AWG voted in favour of submitting a formal proposal to the ICS by 2021, placing potential stratigraphic markers in the mid-twentieth century of the Common Era. This period coincides with the Great Acceleration, the post-World War II decades during which socioeconomic and Earth-system trends began to rise dramatically, and with the Atomic Age.

Various start dates for the Anthropocene have been proposed, ranging from the beginning of the Agricultural Revolution 12,000–15,000 years ago to as recently as the 1960s. The ratification process is ongoing, and thus a date remains to be decided definitively, but the peak in radionuclide fallout resulting from atomic bomb testing during the 1950s has been favoured over other candidates, placing a possible beginning of the Anthropocene at the detonation of the first atomic bomb in 1945 or at the Partial Nuclear Test Ban Treaty of 1963.

General

An early concept for the Anthropocene was the Noosphere by Vladimir Vernadsky, who in 1938 wrote of "scientific thought as a geological force." Scientists in the Soviet Union appear to have used the term "anthropocene" as early as the 1960s to refer to the Quaternary, the most recent geological period. Ecologist Eugene F. Stoermer subsequently used "anthropocene" with a different sense in the 1980s and the term was widely popularised in 2000 by atmospheric chemist Paul J. Crutzen, who regards the influence of human behavior on Earth's atmosphere in recent centuries as so significant as to constitute a new geological epoch. 

In 2008, the Stratigraphy Commission of the Geological Society of London considered a proposal to make the Anthropocene a formal unit of geological epoch divisions. A majority of the commission decided the proposal had merit and should be examined further. Independent working groups of scientists from various geological societies have begun to determine whether the Anthropocene will be formally accepted into the Geological Time Scale.

The term "anthropocene" is informally used in scientific contexts. The Geological Society of America entitled its 2011 annual meeting: Archean to Anthropocene: The past is the key to the future. The new epoch has no agreed start-date, but one proposal, based on atmospheric evidence, is to fix the start with the Industrial Revolution ca. 1780, with the invention of the steam engine. Other scientists link the new term to earlier events, such as the rise of agriculture and the Neolithic Revolution (around 12,000 years BP). Evidence of relative human impact – such as the growing human influence on land use, ecosystems, biodiversity, and species extinction – is substantial; scientists think that human impact has significantly changed (or halted) the growth of biodiversity. Those arguing for earlier dates posit that the proposed Anthropocene may have begun as early as 14,000–15,000 years before present, based on geologic evidence; this has led other scientists to suggest that "the onset of the Anthropocene should be extended back many thousand years"; this would be essentially synonymous with the current term, Holocene

The Trinity test in 1945 has been proposed as the start of the Anthropocene.
 
In January 2015, 26 of the 38 members of the International Anthropocene Working Group published a paper suggesting the Trinity test on 16 July 1945 as the starting point of the proposed new epoch. However, a significant minority supports one of several alternative dates. A March 2015 report suggested either 1610 or 1964 as the beginning of the Anthropocene. Other scholars point to the diachronous character of the physical strata of the Anthropocene, arguing that onset and impact are spread out over time and not reducible to a single instant or starting date.

A January 2016 report on the climatic, biological, and geochemical signatures of human activity in sediments and ice cores suggested the era since the mid-20th century should be recognised as a geological epoch distinct from the Holocene.

The Anthropocene Working Group met in Oslo in April 2016 to consolidate evidence supporting the argument for the Anthropocene as a true geologic epoch. Evidence was evaluated and the group voted to recommend "Anthropocene" as the new geological age in August 2016. Should the International Commission on Stratigraphy approve the recommendation, the proposal to adopt the term will have to be ratified by the IUGS before its formal adoption as part of the geologic time scale.

In April 2019, the Anthropocene Working Group announced that they would vote on a formal proposal to the International Commission on Stratigraphy, to continue the process started at the 2016 meeting. On 21 May 2019, 29 members of the 34-person AWG panel voted in favour of an official proposal to be made by 2021. The AWG also voted, with 29 votes in favour, for a starting date in the mid-20th century. Ten candidate sites for a Global Boundary Stratotype Section and Point have been identified, one of which will be chosen for inclusion in the final proposal. Possible markers include microplastics, heavy metals, or the radionuclides left by thermonuclear weapons tests.

Etymology

The name Anthropocene is a combination of anthropo- from anthropos (Ancient Greek: ἄνθρωπος) meaning "human" and -cene from kainos (Ancient Greek: καινός) meaning "new" or "recent."

As early as 1873, the Italian geologist Antonio Stoppani acknowledged the increasing power and effect of humanity on the Earth's systems and referred to an 'anthropozoic era'.

Although the biologist Eugene Stoermer is often credited with coining the term "anthropocene", it was in informal use in the mid-1970s. Paul Crutzen is credited with independently re-inventing and popularising it. Stoermer wrote, "I began using the term 'anthropocene' in the 1980s, but never formalised it until Paul contacted me." Crutzen has explained, "I was at a conference where someone said something about the Holocene. I suddenly thought this was wrong. The world has changed too much. So I said: 'No, we are in the Anthropocene.' I just made up the word on the spur of the moment. Everyone was shocked. But it seems to have stuck." In 2008, Zalasiewicz suggested in GSA Today that an Anthropocene epoch is now appropriate.

Nature of human effects



Homogenocene

Homogenocene (from Ancient Greek: homo-, "same"; geno-, "kind"; kainos-, "new"; and -cene, "period") is a more specific term used to describe our current geological epoch, in which biodiversity is diminishing and biogeography and ecosystems around the globe grow increasingly similar to one another, mainly because of invasive species that have been introduced around the globe either deliberately (crops, livestock) or inadvertently.

The term Homogenocene was first used by Michael Samways in his 1999 editorial in the Journal of Insect Conservation titled "Translocating fauna to foreign lands: Here comes the Homogenocene."

The term was used again by John L. Curnutt in the year 2000 in Ecology, in a short list titled "A Guide to the Homogenocene", which reviewed Alien species in North America and Hawaii: impacts on natural ecosystems by George Cox. Charles C. Mann, in his acclaimed book 1493: Uncovering the New World Columbus Created, gives a bird's eye view of the mechanisms and ongoing implications of the homogenocene.

Biodiversity

The human impact on biodiversity forms one of the primary attributes of the Anthropocene. Humankind has entered what is sometimes called the Earth's sixth major extinction. Most experts agree that human activities have accelerated the rate of species extinction, although the exact rate remains controversial – perhaps 100 to 1,000 times the normal background rate. A 2010 study found that marine phytoplankton – the vast range of tiny algae species accounting for roughly half of Earth's total photosynthetic biomass – had declined substantially in the world's oceans over the past century: since 1950 alone, algal biomass decreased by around 40%, probably in response to ocean warming, and the decline had gathered pace in recent years. Some authors have postulated that without human impacts the biodiversity of the planet would continue to grow at an exponential rate.

Rates of extinction have been elevated above background rates since at least 1500, and appear to have accelerated in the 19th century and further since. A New York Times op-ed on 13 July 2012 by ecologist Roger Bradbury predicted the end of biodiversity for the oceans, labelling coral reefs doomed: "Coral reefs will be the first, but certainly not the last, major ecosystem to succumb to the Anthropocene." This op-ed quickly generated much discussion among conservationists; The Nature Conservancy rebutted Bradbury on its website, defending its position of protecting coral reefs despite continued human impacts causing reef declines.

In a pair of studies published in 2015, extrapolation from observed extinctions of Hawaiian snails of the family Amastridae led to the conclusion that "the biodiversity crisis is real", and that 7% of all species on Earth may have disappeared already. Human predation was noted as unique in the history of life on Earth: humans act as a globally distributed 'superpredator', preying on the adults of other apex predators and exerting widespread impact on food webs worldwide. A study published in May 2017 in Proceedings of the National Academy of Sciences noted that a "biological annihilation" akin to a sixth mass extinction event is underway as a result of anthropogenic causes. The study suggested that as much as 50% of animal individuals that once lived on Earth are already extinct. A different study published in PNAS in May 2018 says that since the dawn of human civilisation, 83% of wild mammals have disappeared. Today, livestock makes up 60% of the biomass of all mammals on Earth, followed by humans (36%) and wild mammals (4%). According to the 2019 Global Assessment Report on Biodiversity and Ecosystem Services by IPBES, 25% of plant and animal species are threatened with extinction.

Biogeography and nocturnality

Permanent changes in the distribution of organisms from human influence will become identifiable in the geologic record. Researchers have documented the movement of many species into regions formerly too cold for them, often at rates faster than initially expected. This has occurred in part as a result of changing climate, but also in response to farming and fishing, and to the accidental introduction of non-native species to new areas through global travel. The ecosystem of the entire Black Sea may have changed during the last 2000 years as a result of nutrient and silica input from eroding deforested lands along the Danube River.

Researchers have found that the growth of the human population and expansion of human activity has resulted in many species of animals that are normally active during the day, such as elephants, tigers and boars, becoming nocturnal to avoid contact with humans.

Climate

One geological symptom resulting from human activity is increasing atmospheric carbon dioxide (CO2) content. During the glacial–interglacial cycles of the past million years, natural processes have varied CO2 by approximately 100 ppm (from 180 ppm to 280 ppm). As of 2013, anthropogenic net emissions of CO2 had increased the atmospheric concentration by a comparable amount: from 280 ppm (the Holocene or pre-industrial "equilibrium") to approximately 400 ppm, with 2015–2016 monthly monitoring data of CO2 displaying a rising trend above 400 ppm. This signal in the Earth's climate system is especially significant because it is occurring much faster, and to a greater extent, than previous, similar changes. Most of this increase is due to the combustion of fossil fuels such as coal, oil, and gas, although smaller fractions result from cement production and from land-use changes (such as deforestation).

Geomorphology

Changes in drainage patterns traceable to human activity will persist over geologic time in large parts of the continents where the geologic regime is erosional. This includes the paths of roads and highways defined by their grading and drainage control. Direct changes to the form of the Earth's surface by human activities (e.g., quarrying, landscaping) also record human impacts.

It has been suggested that the deposition of calthemite formations is one example of a natural process which did not occur prior to human modification of the Earth's surface, and which therefore represents a process unique to the Anthropocene. Calthemite is a secondary deposit, derived from concrete, lime, mortar or other calcareous material outside the cave environment. Calthemites grow on or under man-made structures (including mines and tunnels) and mimic the shapes and forms of cave speleothems, such as stalactites, stalagmites and flowstone.
 

Stratigraphy


Sedimentological record

Human activities like deforestation and road construction are believed to have elevated average total sediment fluxes across the Earth's surface. However, construction of dams on many rivers around the world means the rates of sediment deposition in any given place do not always appear to increase in the Anthropocene. For instance, many river deltas around the world are actually currently starved of sediment by such dams, and are subsiding and failing to keep up with sea level rise, rather than growing.

Fossil record

Increases in erosion due to farming and other operations will be reflected by changes in sediment composition and increases in deposition rates elsewhere. In land areas with a depositional regime, engineered structures will tend to be buried and preserved, along with litter and debris. Litter and debris thrown from boats or carried by rivers and creeks will accumulate in the marine environment, particularly in coastal areas. Such man-made artifacts preserved in stratigraphy are known as "technofossils".

Changes in biodiversity will also be reflected in the fossil record, as will species introductions. An example cited is the domestic chicken, originally the red junglefowl Gallus gallus, native to south-east Asia, which has since become the world's most common bird through human breeding and consumption: over 60 billion are consumed annually, and their bones would become fossilised in landfill sites. Hence, landfills are important resources to find "technofossils".

Trace elements

In terms of trace elements, there are distinct signatures left by modern societies. For example, in the Upper Fremont Glacier in Wyoming, there is a layer of chlorine present in ice cores from 1960s atomic weapon testing programs, as well as a layer of mercury associated with coal plants in the 1980s. From 1945 to 1951, nuclear fallout is found locally around atomic device test sites, whereas from 1952 to 1980, tests of thermonuclear devices left a clear, global signal of excess 14C, 239Pu, and other artificial radionuclides. The highest global concentration of radionuclides occurred in 1965, one of the dates proposed as a possible benchmark for the start of the formally defined Anthropocene.

Human burning of fossil fuels has also left distinctly elevated concentrations of black carbon, inorganic ash, and spherical carbonaceous particles in recent sediments across the world. Concentrations of these components increase markedly and almost simultaneously around the world beginning around 1950.

Temporal limit


"Early anthropocene" model

While much of the environmental change occurring on Earth is suspected to be a direct consequence of the Industrial Revolution, William Ruddiman has argued that the proposed Anthropocene began approximately 8,000 years ago with the development of farming and sedentary cultures. At this point, humans were dispersed across all of the continents (except Antarctica), and the Neolithic Revolution was ongoing. During this period, humans developed agriculture and animal husbandry to supplement or replace hunter-gatherer subsistence. Such innovations were followed by a wave of extinctions, beginning with large mammals and land birds. This wave was driven by both the direct activity of humans (e.g. hunting) and the indirect consequences of land-use change for agriculture. 

From past to present, some authors have considered the Anthropocene and the Holocene to be the same or coeval geologic time span, while others have viewed the Anthropocene as somewhat more recent. Ruddiman claims that significant human impact on greenhouse gas emissions began not in the industrial era but about 8,000 years ago, as ancient farmers cleared forests to grow crops. Ruddiman's work has, in turn, been challenged with data from an earlier interglaciation ("Stage 11", approximately 400,000 years ago), which suggests that 16,000 more years must elapse before the current Holocene interglaciation ends, and that the early anthropogenic hypothesis is therefore invalid. Furthermore, the argument that "something" is needed to explain the differences in the Holocene is challenged by more recent research showing that all interglacials differ.

Although 8,000 years ago the planet sustained a few million people, it was still fundamentally pristine. This claim is the basis for the assertion that an early date for the proposed Anthropocene does not account for a substantial human footprint on Earth.

Antiquity

One plausible starting point of the Anthropocene could be ca. 2,000 years ago, which roughly coincides with the start of the final phase of the Holocene, the Subatlantic.

At this time, the Roman Empire encompassed large portions of Europe, the Middle East, and North Africa. In China the classical dynasties were flowering. The Middle kingdoms of India already had the largest economy of the ancient and medieval world. The Napata/Meroitic kingdom extended over present-day Sudan and Ethiopia. The Olmecs controlled central Mexico and Guatemala, and the pre-Incan Chavín people managed areas of northern Peru. Although often far apart from each other and interspersed with buffering ecosystems, the areas directly affected by these civilisations and others were large. Additionally, some activities, such as mining, implied much more widespread perturbation of natural conditions. Over the last 11,500 years or so humans have spread around Earth, increased in number, and profoundly altered the material world. They have taken advantage of global environmental conditions not of their own making. The end of the last glacial period – when as much as 30% of Earth's surface was ice-bound – led to a warmer world with more water (H2O). Although humans existed in the previous Pleistocene epoch, it is only in the recent Holocene period that they have flourished. Today there are more humans alive than at any previous point in Earth's history.

European colonisation of the Americas

Maslin and Lewis argue that the start of the Anthropocene should be dated to the Orbis Spike, a trough in carbon dioxide levels associated with the arrival of Europeans in the Americas. Reaching a minimum around 1610, global carbon dioxide levels were depressed below 285 parts per million, largely as a result of sequestration due to forest regrowth in the Americas. This regrowth was likely caused by indigenous peoples abandoning farmland following a sharp population decline after initial contact with European diseases: around 50 million people, or 90% of the indigenous population, may have succumbed. For Maslin and Lewis, the Orbis Spike represents a GSSP, a kind of marker used to define the start of a new geological period. They also argue that associating the Anthropocene with the European arrival in the Americas makes sense given that the continent's colonisation was instrumental in the development of global trade networks and the capitalist economy, which played a significant role in initiating the Industrial Revolution and the Great Acceleration.

Industrial Revolution

Crutzen proposed the Industrial Revolution as the start of the Anthropocene. Lovelock proposes that the Anthropocene began with the first application of the Newcomen atmospheric engine in 1712. The Intergovernmental Panel on Climate Change takes the pre-industrial era (chosen as the year 1750) as the baseline for changes in long-lived, well-mixed greenhouse gases. Although it is apparent that the Industrial Revolution ushered in an unprecedented global human impact on the planet, much of Earth's landscape had already been profoundly modified by human activities. The human impact on Earth has grown progressively, with few substantial slowdowns.

Anthropocene marker

A marker that records a substantial global impact of humans on the total environment, comparable in scale to those associated with significant perturbations of the geological past, is needed, rather than minor changes in atmospheric composition.

A useful candidate for this purpose is the pedosphere, which can retain information about its climatic and geochemical history in features lasting for centuries or millennia. Human activity is now firmly established as the sixth factor of soil formation. It affects pedogenesis either directly – for example, through land levelling, trenching and embankment building, organic matter enrichment from additions of manure or other waste, organic matter impoverishment due to continued cultivation, and compaction from overgrazing – or indirectly, through the drift of eroded materials or pollutants. Anthropogenic soils are those markedly affected by human activities, such as repeated ploughing, the addition of fertilisers, contamination, sealing, or enrichment with artefacts (in the World Reference Base for Soil Resources they are classified as Anthrosols and Technosols). They are recalcitrant repositories of artefacts and properties that testify to the dominance of the human impact, and hence appear to be reliable markers for the Anthropocene. Some anthropogenic soils may be viewed as the 'golden spikes' of geologists (Global Boundary Stratotype Sections and Points), locations where strata successions show clear evidence of a worldwide event, including the appearance of distinctive fossils.

Drilling for fossil fuels has also created holes and tubes which are expected to be detectable for millions of years. The astrobiologist David Grinspoon has proposed that the site of the Apollo 11 lunar landing, with the disturbances and artifacts that are so uniquely characteristic of our species' technological activity and which will survive over geological time spans, could be considered as the 'golden spike' of the Anthropocene.

In culture


Humanities

The concept of the Anthropocene has also been approached via the humanities, such as philosophy, literature and art. In the scholarly world, it has been the subject of increasing attention through special journal issues, conferences, and disciplinary reports. The Anthropocene, its attendant timescale, and its ecological implications prompt questions about death and the ends of civilisation, memory and archives, the scope and methods of humanistic inquiry, and emotional responses to the "end of nature." It has also been criticised as an ideological construct. Some environmentalists on the political left suggest that "Capitalocene" is a more historically appropriate term. At the same time, others suggest that the Anthropocene is overly focused on the human species while ignoring systematic inequalities, such as imperialism and racism, that have also shaped the world.

Peter Brannen criticised the idea of the Anthropocene in an article published in The Atlantic, suggesting that its short timescale makes it a geologic event rather than an epoch, with hypothetical geologists of the far future being unlikely to notice the presence of a few thousand years of human civilisation. He eventually reconsidered his position after a response from members of the Anthropocene Working Group.

There are several philosophical approaches to handling the future of the Anthropocene: business as usual, mitigation, and geo-engineering options.

Popular culture

  • The concept gained the attention of the public via documentary films such as L'homme a mangé la Terre, Anthropocene: The Human Epoch and Anthropocene.
  • David Grinspoon makes a further distinction in the Anthropocene, namely the "proto-Anthropocene" and "mature Anthropocene". He also mentions the term "Terra Sapiens", or Wise Earth.
  • On 2 October 2019, the English musician Nick Mulvey released a music video on YouTube named "In The Anthropocene." In cooperation with Sharp's Brewery, the song was recorded on 105 vinyl records made of washed up plastic from the Cornish coast.

Genetic engineering techniques

From Wikipedia, the free encyclopedia

Genetic engineering can be accomplished using multiple techniques. There are a number of steps that are followed before a genetically modified organism (GMO) is created. Genetic engineers must first choose what gene they wish to insert, modify, or delete. The gene must then be isolated and incorporated, along with other genetic elements, into a suitable vector. This vector is then used to insert the gene into the host genome, creating a transgenic or edited organism. The ability to genetically engineer organisms is built on years of research and discovery on how genes function and how we can manipulate them. Important advances included the discovery of restriction enzymes and DNA ligases and the development of polymerase chain reaction and sequencing.

This allowed the gene of interest to be isolated and then incorporated into a vector. Often a promoter and terminator region were added, as well as a selectable marker gene. The gene may be modified further at this point to make it express more efficiently. This vector is then inserted into the host organism's genome. For animals, the gene is typically inserted into embryonic stem cells, while in plants it can be inserted into any tissue that can be cultured into a fully developed plant. Common techniques include microinjection, virus-mediated transfer, Agrobacterium-mediated transfer and biolistics. Further tests are carried out on the resulting organism to ensure stable integration, inheritance and expression. First-generation offspring are heterozygous, requiring them to be inbred to create the homozygous pattern necessary for stable inheritance. Homozygosity must be confirmed in second-generation specimens.

Traditional techniques inserted the genes randomly into the host's genome. Advances have allowed genes to be inserted at specific locations within a genome, which reduces the unintended side effects of random insertion. Early targeting systems relied on meganucleases and zinc finger nucleases. Since 2009, more accurate systems that are easier to implement have been developed. Transcription activator-like effector nucleases (TALENs) and the Cas9-guideRNA system (adapted from CRISPR) are the two most commonly used. They may potentially be useful in gene therapy and other procedures that require accurate or high-throughput targeting.

History

Many different discoveries and advancements led to the development of genetic engineering. Human-directed genetic manipulation began with the domestication of plants and animals through artificial selection in about 12,000 BC. Various techniques were developed to aid in breeding and selection. Hybridization was one way rapid changes in an organism's genetic makeup could be introduced. Crop hybridization most likely first occurred when humans began growing genetically distinct individuals of related species in close proximity. Some plants were able to be propagated by vegetative cloning.

Genetic inheritance was first discovered by Gregor Mendel in 1865, following experiments crossing peas. In 1928 Frederick Griffith proved the existence of a "transforming principle" involved in inheritance, which was identified as DNA in 1944 by Oswald Avery, Colin MacLeod, and Maclyn McCarty. Frederick Sanger developed a method for sequencing DNA in 1977, greatly increasing the genetic information available to researchers. 

After discovering the existence and properties of DNA, tools had to be developed that allowed it to be manipulated. In 1970 Hamilton Smith's lab discovered restriction enzymes, enabling scientists to isolate genes from an organism's genome. DNA ligases, which join broken DNA together, had been discovered earlier, in 1967. By combining the two enzymes it became possible to "cut and paste" DNA sequences to create recombinant DNA. Plasmids, discovered in 1952, became important tools for transferring information between cells and replicating DNA sequences. Polymerase chain reaction (PCR), developed by Kary Mullis in 1983, allowed small sections of DNA to be amplified (replicated) and aided identification and isolation of genetic material.

As well as manipulating DNA, techniques had to be developed for its insertion into an organism's genome. Griffith's experiment had already shown that some bacteria had the ability to naturally take up and express foreign DNA. Artificial competence was induced in Escherichia coli in 1970 by treating the cells with calcium chloride solution (CaCl2). Transformation using electroporation was developed in the late 1980s, increasing the efficiency and the range of bacteria that could be transformed. In 1907 a bacterium that causes plant tumours, Agrobacterium tumefaciens, was discovered. In the early 1970s it was found that this bacterium inserts its DNA into plants using a Ti plasmid. By removing the genes in the plasmid that caused the tumours and adding novel genes, researchers were able to infect plants with A. tumefaciens and let the bacteria insert their chosen DNA into the genomes of the plants.

Choosing target genes

The first step is to identify the target gene or genes to insert into the host organism. This is driven by the goal for the resultant organism. In some cases only one or two genes are affected. For more complex objectives, entire biosynthetic pathways involving multiple genes may be involved. Once found, genes and other genetic information from a wide range of organisms can be inserted into bacteria for storage and modification, creating genetically modified bacteria in the process. Bacteria are cheap, easy to grow, clonal, multiply quickly, relatively easy to transform and can be stored at -80 °C almost indefinitely. Once a gene is isolated it can be stored inside the bacteria, providing an unlimited supply for research.

Genetic screens can be carried out to determine potential genes, followed by other tests that identify the best candidates. A simple screen involves randomly mutating DNA with chemicals or radiation and then selecting those individuals that display the desired trait. For organisms where mutation is not practical, scientists instead look for individuals in the population that present the characteristic through naturally occurring mutations. Processes that look at a phenotype and then try to identify the gene responsible are called forward genetics. The gene then needs to be mapped by comparing the inheritance of the phenotype with known genetic markers. Genes that are close together are likely to be inherited together.
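
As a rough, hypothetical illustration of mapping by co-inheritance, the Python sketch below (with invented offspring data and marker names) counts how often each marker allele is inherited together with the phenotype; the marker with the highest co-inheritance rate is the closest candidate location for the gene.

# Hypothetical offspring data: each record notes whether the offspring shows the
# phenotype and which allele ("A" or "B") it carries at each genetic marker.
offspring = [
    {"phenotype": True,  "markers": {"m1": "A", "m2": "A", "m3": "B"}},
    {"phenotype": True,  "markers": {"m1": "A", "m2": "B", "m3": "B"}},
    {"phenotype": False, "markers": {"m1": "B", "m2": "A", "m3": "A"}},
    {"phenotype": False, "markers": {"m1": "B", "m2": "B", "m3": "B"}},
    {"phenotype": True,  "markers": {"m1": "A", "m2": "A", "m3": "A"}},
    {"phenotype": False, "markers": {"m1": "B", "m2": "A", "m3": "A"}},
]

def co_inheritance(marker):
    """Fraction of offspring in which allele 'A' at this marker co-occurs with the phenotype."""
    matches = sum((o["markers"][marker] == "A") == o["phenotype"] for o in offspring)
    return matches / len(offspring)

# Markers physically close to the causal gene should be co-inherited with the
# phenotype more often than unlinked markers.
for m in ("m1", "m2", "m3"):
    print(m, co_inheritance(m))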

Another option is reverse genetics. This approach involves targeting a specific gene with a mutation and then observing what phenotype develops. The mutation can be designed to inactivate the gene or to allow it to become active only under certain conditions. Conditional mutations are useful for identifying genes that are normally lethal if non-functional. As genes with similar functions share similar (homologous) sequences, it is possible to predict the likely function of a gene by comparing its sequence to those of well-studied genes from model organisms. The development of microarrays, transcriptomes and genome sequencing has made it much easier to find desirable genes.
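
A minimal sketch of this kind of sequence comparison, assuming invented gene names and short made-up sequences: it scores a candidate sequence against known genes by simple per-position identity. Real analyses use alignment tools such as BLAST; this only illustrates the idea that the closest match hints at the likely function.

# Toy "homology" comparison: per-position identity against well-studied genes.
known_genes = {
    "geneX_model_organism": "ATGGCTACCGTTAAAGGCTGA",
    "geneY_model_organism": "ATGAAACCCGGGTTTACCTGA",
}
candidate = "ATGGCTACCGTAAAAGGCTGA"  # hypothetical gene of unknown function

def identity(a, b):
    """Fraction of identical bases over the length of the shorter sequence."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a[:n], b[:n])) / n

best = max(known_genes, key=lambda name: identity(candidate, known_genes[name]))
print("Closest known gene:", best, "identity:", round(identity(candidate, known_genes[best]), 2))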

The bacterium Bacillus thuringiensis was first discovered in 1901 as the causative agent in the death of silkworms. Because of these insecticidal properties, the bacterium was used as a biological insecticide, developed commercially in 1938. The cry proteins were discovered to provide the insecticidal activity in 1956, and by the 1980s scientists had successfully cloned the gene that encodes this protein and expressed it in plants. The gene that provides resistance to the herbicide glyphosate was found after seven years of searching in bacteria living in the outflow pipe of a Monsanto RoundUp manufacturing facility. In animals, the majority of genes used are growth hormone genes.

Gene manipulation

All genetic engineering processes involve the modification of DNA. Traditionally DNA was isolated from the cells of organisms. Later, genes came to be cloned from a DNA segment after the creation of a DNA library or artificially synthesised. Once isolated, additional genetic elements are added to the gene to allow it to be expressed in the host organism and to aid selection. 

Extraction from cells

First the cell must be gently opened, exposing the DNA without causing too much damage to it. The methods used vary depending on the type of cell. Once open, the DNA must be separated from the other cellular components. A ruptured cell contains proteins and other cell debris. By mixing with phenol and/or chloroform, followed by centrifuging, the nucleic acids can be separated from this debris into an upper aqueous phase. This aqueous phase can be removed and further purified if necessary by repeating the phenol-chloroform steps. The nucleic acids can then be precipitated from the aqueous solution using ethanol or isopropanol. Any RNA can be removed by adding a ribonuclease that will degrade it. Many companies now sell kits that simplify the process.

Gene isolation

The gene of interest must be separated from the extracted DNA. If the sequence is not known, a common method is to break the DNA up by random digestion. This is usually accomplished using restriction enzymes (enzymes that cut DNA). A partial restriction digest cuts only some of the restriction sites, resulting in overlapping DNA fragments. The DNA fragments are put into individual plasmid vectors and grown inside bacteria. Once in the bacteria, the plasmid is copied as the bacteria divide. To determine if a useful gene is present on a particular fragment, the DNA library is screened for the desired phenotype. If the phenotype is detected then it is possible that the bacterium contains the target gene.
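
A minimal sketch of the digestion step, assuming the EcoRI recognition site GAATTC and an invented sequence: the snippet locates the sites and lists the fragments a complete digest would produce (a partial digest would additionally keep fragments spanning uncut sites).

# Toy restriction digest: cut an invented sequence at every EcoRI site (GAATTC).
# EcoRI cleaves one base into its recognition site (G^AATTC).
sequence = "TTGAATTCAAACCGGGGAATTCTTTTACGTGAATTCCC"
site, cut_offset = "GAATTC", 1

cut_positions = [i + cut_offset for i in range(len(sequence)) if sequence.startswith(site, i)]

fragments, start = [], 0
for pos in cut_positions + [len(sequence)]:
    fragments.append(sequence[start:pos])
    start = pos

print("Cut positions:", cut_positions)
print("Fragments:", fragments)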

If the gene does not have a detectable phenotype, or a DNA library does not contain the correct gene, other methods must be used to isolate it. If the position of the gene can be determined using molecular markers, then chromosome walking is one way to isolate the correct DNA fragment. If the gene shows close homology to a known gene in another species, then it could be isolated by searching for genes in the library that closely match the known gene.

For known DNA sequences, restriction enzymes that cut the DNA on either side of the gene can be used. Gel electrophoresis then sorts the fragments according to length. Some gels can separate sequences that differ by a single base pair. The DNA can be visualised by staining it with ethidium bromide and photographing under UV light. A marker with fragments of known lengths can be laid alongside the DNA to estimate the size of each band. The DNA band of the correct size should contain the gene and can be excised from the gel. Another technique to isolate genes of known sequences involves polymerase chain reaction (PCR). PCR is a powerful tool that can amplify a given sequence, which can then be isolated through gel electrophoresis. Its effectiveness drops with larger genes, and it has the potential to introduce errors into the sequence.
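
As an illustration of how a primer pair defines the amplified region, here is a small in-silico PCR sketch; the template and primer sequences are hypothetical, and real primer design must also consider melting temperature, secondary structure and polymerase error rates.

# In-silico PCR: find where the primers bind on an invented template and report the amplicon.
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq):
    return seq.translate(COMPLEMENT)[::-1]

template = "AGCTTAGGCTAATGACCGGTACCTTAGCGTTACCAGTCCAGCTAAGGTCATGTTACG"
fwd_primer = "ATGACCGGTACC"   # binds the top strand and defines the 5' end of the product
rev_primer = "CATGACCTTAGC"   # binds the bottom strand and defines the 3' end of the product

start = template.find(fwd_primer)
end = template.find(reverse_complement(rev_primer)) + len(rev_primer)

amplicon = template[start:end]
print("Amplicon length:", len(amplicon))
print("Amplicon:", amplicon)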

It is possible to artificially synthesise genes. Some synthetic sequences are available commercially, forgoing many of these early steps.

Modification

The gene to be inserted must be combined with other genetic elements in order for it to work properly. The gene can be modified at this stage for better expression or effectiveness. As well as the gene to be inserted, most constructs contain a promoter and terminator region as well as a selectable marker gene. The promoter region initiates transcription of the gene and can be used to control the location and level of gene expression, while the terminator region ends transcription. A selectable marker, which in most cases confers antibiotic resistance on the organism it is expressed in, is used to determine which cells are transformed with the new gene. The constructs are made using recombinant DNA techniques, such as restriction digests, ligations and molecular cloning.
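
A minimal sketch of such a construct, using invented placeholder sequences: the elements are simply concatenated in order (promoter, gene of interest, terminator, selectable marker), which is what the cloning steps achieve physically through digestion and ligation.

# Toy expression construct: join the genetic elements in order.
# All sequences are placeholders; a real construct is assembled by restriction
# digestion, ligation and molecular cloning, not by string concatenation.
parts = {
    "promoter":   "TTGACAAGCTTATAAT",    # initiates transcription of the gene
    "gene":       "ATGGCTACCGTTAAAGGCTGA",
    "terminator": "AATAAAGCGGCCGC",      # ends transcription
    "marker":     "ATGAAACGCATTAGCACC",  # e.g. an antibiotic-resistance gene
}
construct = "".join(parts[name] for name in ("promoter", "gene", "terminator", "marker"))
print(len(construct), "bp construct:", construct)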

Inserting DNA into the host genome

Once the gene is constructed it must be stably integrated into the target organism's genome or exist as extrachromosomal DNA. There are a number of techniques available for inserting the gene into the host genome, and they vary depending on the type of organism targeted. In multicellular eukaryotes, if the transgene is incorporated into the host's germline cells, the resulting host cell can pass the transgene to its progeny. If the transgene is incorporated into somatic cells, the transgene cannot be inherited.

Transformation

Bacterial transformation involves moving a gene from one bacterium to another. The gene is integrated into the recipient's plasmid and can then be expressed by the new host.

Transformation is the direct alteration of a cell's genetic components by passing the genetic material through the cell membrane. About 1% of bacteria are naturally able to take up foreign DNA, but this ability can be induced in other bacteria. Stressing the bacteria with a heat shock or electroporation can make the cell membrane permeable to DNA, which may then be incorporated into the genome or exist as extrachromosomal DNA. Typically the cells are incubated in a solution containing divalent cations (often calcium chloride) under cold conditions, before being exposed to a heat pulse (heat shock). Calcium chloride partially disrupts the cell membrane, which allows the recombinant DNA to enter the host cell. It is suggested that exposing the cells to divalent cations under cold conditions may change or weaken the cell surface structure, making it more permeable to DNA. The heat pulse is thought to create a thermal imbalance across the cell membrane, which forces the DNA to enter the cells through either cell pores or the damaged cell wall. Electroporation is another method of promoting competence. In this method the cells are briefly shocked with an electric field of 10–20 kV/cm, which is thought to create holes in the cell membrane through which the plasmid DNA may enter. After the electric shock, the holes are rapidly closed by the cell's membrane-repair mechanisms. Taken-up DNA can either integrate into the bacterium's genome or, more commonly, exist as extrachromosomal DNA.

A gene gun uses biolistics to insert DNA into plant tissue
 
A. tumefaciens attaching itself to a carrot cell
 
In plants the DNA is often inserted using Agrobacterium-mediated recombination, taking advantage of the Agrobacterium T-DNA sequence that allows natural insertion of genetic material into plant cells. Plant tissue is cut into small pieces and soaked in a fluid containing suspended Agrobacterium. The bacteria will attach to many of the plant cells exposed by the cuts. The bacterium uses conjugation to transfer a DNA segment called T-DNA from its plasmid into the plant. The transferred DNA is piloted to the plant cell nucleus and integrated into the host plant's genomic DNA. The plasmid T-DNA is integrated semi-randomly into the genome of the host cell.

By modifying the plasmid to express the gene of interest, researchers can insert their chosen gene stably into the plant's genome. The only essential parts of the T-DNA are its two small (25 base pair) border repeats, at least one of which is needed for plant transformation. The genes to be introduced into the plant are cloned into a plant transformation vector that contains the T-DNA region of the plasmid. An alternative method is agroinfiltration.

Another method used to transform plant cells is biolistics, where particles of gold or tungsten are coated with DNA and then shot into young plant cells or plant embryos. Some genetic material enters the cells and transforms them. This method can be used on plants that are not susceptible to Agrobacterium infection and also allows transformation of plant plastids. Plant cells can also be transformed using electroporation, which uses an electric shock to make the cell membrane permeable to plasmid DNA. Because of the damage caused to the cells and DNA, the transformation efficiency of biolistics and electroporation is lower than that of agrobacterial transformation.

Transfection

Transformation has a different meaning in relation to animals, indicating progression to a cancerous state, so the process used to insert foreign DNA into animal cells is usually called transfection. There are many ways to directly introduce DNA into animal cells in vitro. Often these cells are stem cells that are used for gene therapy. Chemical-based methods use natural or synthetic compounds to form particles that facilitate the transfer of genes into cells. These synthetic vectors have the ability to bind DNA and accommodate large genetic transfers. One of the simplest methods involves using calcium phosphate to bind the DNA and then exposing it to cultured cells. The solution, along with the DNA, is encapsulated by the cells, and a small amount of DNA can be integrated into the genome. Liposomes and polymers can be used as vectors to deliver DNA into cultured animal cells. Positively charged liposomes bind with DNA, while polymers can be designed that interact with DNA. They form lipoplexes and polyplexes respectively, which are then taken up by the cells. Other techniques include using electroporation and biolistics.

To create transgenic animals the DNA must be inserted into viable embryos or eggs. This is usually accomplished using microinjection, where DNA is injected through the cell's nuclear envelope directly into the nucleus. Superovulated fertilised eggs are collected at the single cell stage and cultured in vitro. When the pronuclei from the sperm head and egg are visible through the protoplasm the genetic material is injected into one of them. The oocyte is then implanted in the oviduct of a pseudopregnant animal. Another method is Embryonic Stem Cell-Mediated Gene Transfer. The gene is transfected into embryonic stem cells and then they are inserted into mouse blastocysts that are then implanted into foster mothers. The resulting offspring are chimeric, and further mating can produce mice fully transgenic with the gene of interest.

Transduction

Transduction is the process by which foreign DNA is introduced into a cell by a virus or viral vector. Genetically modified viruses can be used as viral vectors to transfer target genes to another organism in gene therapy. First the virulence genes are removed from the virus and the target genes are inserted instead. The sequences that allow the virus to insert the genes into the host organism must be left intact. Popular virus vectors are developed from retroviruses or adenoviruses. Other viruses used as vectors include lentiviruses, pox viruses and herpes viruses. The type of virus used will depend on the cells targeted and whether the DNA is to be altered permanently or temporarily.

Regeneration

As often only a single cell is transformed with genetic material, the organism must be regenerated from that single cell. In plants this is accomplished through the use of tissue culture. Each plant species has different requirements for successful regeneration. If successful, the technique produces an adult plant that contains the transgene in every cell. In animals it is necessary to ensure that the inserted DNA is present in the embryonic stem cells. Offspring can be screened for the gene. All offspring from the first generation are heterozygous for the inserted gene and must be inbred to produce a homozygous specimen. Bacteria consist of a single cell and reproduce clonally so regeneration is not necessary. Selectable markers are used to easily differentiate transformed from untransformed cells.

Cells that have been successfully transformed with the DNA contain the marker gene, while those not transformed will not. By growing the cells in the presence of an antibiotic or chemical that selects or marks the cells expressing that gene, it is possible to separate modified from unmodified cells. Another screening method involves a DNA probe that sticks only to the inserted gene. These markers are usually present in the transgenic organism, although a number of strategies have been developed that can remove the selectable marker from the mature transgenic plant.

Confirmation

Finding that a recombinant organism contains the inserted genes is not usually sufficient to ensure that they will be appropriately expressed in the intended tissues. Further testing using PCR, Southern hybridisation, and DNA sequencing is conducted to confirm that an organism contains the new gene. These tests can also confirm the chromosomal location and copy number of the inserted gene. Once its presence is confirmed, methods that look for and measure the gene products (RNA and protein) are used to assess gene expression, transcription, RNA processing patterns, and the expression and localisation of protein product(s). These include northern hybridisation, quantitative RT-PCR, Western blot, immunofluorescence, ELISA and phenotypic analysis. When appropriate, the organism's offspring are studied to confirm that the transgene and associated phenotype are stably inherited.

Gene targeting

Traditional methods of genetic engineering generally insert the new genetic material randomly within the host genome. This can impair or alter other genes within the organism. Methods were therefore developed to insert the new genetic material into specific sites within an organism's genome. Early methods that targeted genes at certain sites within a genome relied on homologous recombination (HR). By creating DNA constructs that contain a template matching the targeted genome sequence, it is possible for the HR processes within the cell to insert the construct at the desired location. Using this method on embryonic stem cells led to the development of transgenic mice with targeted genes knocked out. It has also been possible to knock in genes or alter gene expression patterns.

If a vital gene is knocked out it can prove lethal to the organism. In order to study the function of these genes, site-specific recombinases (SSR) were used. The two most common types are the Cre-LoxP and Flp-FRT systems. Cre recombinase is an enzyme that removes DNA by homologous recombination between binding sequences known as LoxP sites. The Flp-FRT system operates in a similar way, with the Flp recombinase recognising FRT sequences. By crossing an organism containing the recombinase sites flanking the gene of interest with an organism that expresses the SSR under the control of tissue-specific promoters, it is possible to knock out or switch on genes only in certain cells. This has also been used to remove marker genes from transgenic animals. Further modifications of these systems allowed researchers to induce recombination only under certain conditions, allowing genes to be knocked out or expressed at desired times or stages of development.

Genome editing uses artificially engineered nucleases that create specific double-stranded breaks at desired locations in the genome. The breaks are subject to cellular DNA repair processes that can be exploited for targeted gene knock-out, correction or insertion at high frequencies. If a donor DNA containing the appropriate sequence (homologies) is present, then new genetic material containing the transgene will be integrated at the targeted site with high efficiency by homologous recombination. There are four families of engineered nucleases: meganucleases, zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), and the CRISPR/Cas system (clustered regularly interspaced short palindromic repeats/CRISPR-associated protein, e.g. CRISPR/Cas9). Among the four types, TALEN and CRISPR/Cas are the two most commonly used. Recent advances have looked at combining multiple systems to exploit the best features of each (e.g. megaTALs, which are a fusion of a TALE DNA-binding domain and a meganuclease). Recent research has also focused on developing strategies to create gene knock-outs or corrections without creating double-stranded breaks (base editors).
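
The sketch below is a highly simplified, hypothetical model of the two repair outcomes described above: an error-prone end-joining repair that deletes a few bases around the cut (a knock-out-style indel), and repair from a donor template carrying the desired edit (a homology-directed outcome). Sequences and positions are invented.

# Simplified model of repair after a targeted double-stranded break (DSB).
genome = "AAACCCGGGTTTACGTACGTAGCATCGAT"
cut_site = 12  # position of the engineered nuclease cut (invented)

def nhej_repair(seq, cut, deletion=3):
    """Error-prone end joining: lose a few bases around the cut (knock-out-style indel)."""
    return seq[:cut] + seq[cut + deletion:]

def hdr_repair(seq, cut, donor, arm=5):
    """Homology-directed repair: replace the region around the cut using a donor template."""
    return seq[:cut - arm] + donor + seq[cut + arm:]

# Donor template: the sequence around the cut with a 6-bp insertion (the desired edit).
donor_template = genome[cut_site - 5:cut_site] + "GAATTC" + genome[cut_site:cut_site + 5]
print("NHEJ outcome:", nhej_repair(genome, cut_site))
print("HDR outcome: ", hdr_repair(genome, cut_site, donor_template))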

Meganucleases and Zinc finger nucleases

Meganucleases were first used in 1988 in mammalian cells. Meganucleases are endodeoxyribonucleases that function as restriction enzymes with long recognition sites, making them more specific to their target site than other restriction enzymes. This increases their specificity and reduces their toxicity, as they will not target as many sites within a genome. The most studied meganucleases are the LAGLIDADG family. While meganucleases are still quite susceptible to off-target binding, which makes them less attractive than other gene-editing tools, their smaller size makes them attractive for viral vectorisation in particular.

Zinc-finger nucleases (ZFNs), used for the first time in 1996, are typically created through the fusion of zinc-finger domains and the FokI nuclease domain, and thus have the ability to cleave DNA at target sites. By engineering the zinc-finger domain to target a specific site within the genome, it is possible to edit the genomic sequence at the desired location. ZFNs have greater specificity, but still hold the potential to bind to non-specific sequences. While a certain amount of off-target cleavage is acceptable for creating transgenic model organisms, it might not be optimal for all human gene therapy treatments.

TALEN and CRISPR

Access to the code governing DNA recognition by transcription activator-like effectors (TALEs) in 2009 opened the way to the development of a new class of efficient TAL-based gene-editing tools. TALEs, proteins secreted by the Xanthomonas plant pathogen, bind with great specificity to genes within the plant host and initiate transcription of genes that help infection. Engineering TALEs by fusing the DNA-binding core to the FokI nuclease catalytic domain allowed the creation of a new class of designer nucleases, the TALE nucleases (TALENs). They have one of the greatest specificities of all current engineered nucleases. Because of the presence of repeat sequences, they are difficult to construct through standard molecular biology procedures and rely on more complicated methods such as Golden Gate cloning.

In 2011, another major breakthrough technology was developed, based on CRISPR/Cas (clustered regularly interspaced short palindromic repeats / CRISPR-associated protein) systems that function as an adaptive immune system in bacteria and archaea. The CRISPR/Cas system allows bacteria and archaea to fight invading viruses by cleaving viral DNA and inserting pieces of that DNA into their own genome. The organism then transcribes this DNA into RNA and combines the RNA with Cas9 proteins to make double-stranded breaks in the invading viral DNA; the RNA serves as a guide RNA to direct the Cas9 enzyme to the correct spot in the viral DNA. By pairing Cas proteins with a designed guide RNA, CRISPR/Cas9 can be used to induce double-stranded breaks at specific points within DNA sequences. The break is repaired by cellular DNA repair enzymes, creating a small insertion/deletion-type mutation in most cases. Targeted DNA repair is possible by providing a donor DNA template that represents the desired change and that is (sometimes) used for double-strand break repair by homologous recombination. It was later demonstrated that CRISPR/Cas9 can edit human cells in a dish. Although the early generation lacked the specificity of TALENs, the major advantage of this technology is the simplicity of the design. It also allows multiple sites to be targeted simultaneously, allowing the editing of multiple genes at once. CRISPR/Cpf1 is a more recently discovered system that requires a different guide RNA and creates different double-stranded breaks (leaving overhangs when cleaving the DNA) compared with CRISPR/Cas9.
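
As a minimal illustration of how a Cas9 target site is chosen, the sketch below scans an invented sequence for 20-nucleotide protospacers that lie immediately 5' of an NGG PAM, the pattern Cas9 requires; a real design tool would also scan the opposite strand and score potential off-target matches elsewhere in the genome.

# Find candidate Cas9 target sites: a 20-nt protospacer followed by an NGG PAM.
# Only the top strand of an invented sequence is scanned, for simplicity.
genome = "TTTGCTAGCTAGGATCCGATCGATCGTAGCTAGCTAGGCTAGCTAAGGCCTAGG"
GUIDE_LEN = 20

candidates = []
for i in range(len(genome) - GUIDE_LEN - 2):
    pam = genome[i + GUIDE_LEN:i + GUIDE_LEN + 3]
    if pam.endswith("GG"):                   # NGG PAM
        protospacer = genome[i:i + GUIDE_LEN]
        cut_site = i + GUIDE_LEN - 3         # Cas9 cuts about 3 bp upstream of the PAM
        candidates.append((protospacer, pam, cut_site))

for protospacer, pam, cut in candidates:
    print("guide", protospacer, "PAM", pam, "predicted cut at", cut)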

CRISPR/Cas9 is efficient at gene disruption. The creation of HIV-resistant babies by the Chinese researcher He Jiankui is perhaps the most famous example of gene disruption using this method. It is far less effective at gene correction. Methods of base editing are under development, in which a "nuclease-dead" Cas9 endonuclease or a related enzyme is used for gene targeting while a linked deaminase enzyme makes a targeted base change in the DNA. The most recent refinement of CRISPR/Cas9 is called prime editing. This method links a reverse transcriptase to an RNA-guided engineered nuclease that makes only single-strand cuts, not double-strand breaks. It replaces the portion of DNA next to the cut by the successive action of nuclease and reverse transcriptase, introducing the desired change from an RNA template.

Vectors in gene therapy

From Wikipedia, the free encyclopedia
 
How vectors work to transfer genetic material

Gene therapy utilizes the delivery of DNA into cells, which can be accomplished by several methods, summarized below. The two major classes of methods are those that use recombinant viruses (sometimes called biological nanoparticles or viral vectors) and those that use naked DNA or DNA complexes (non-viral methods).

Viruses

All viruses bind to their hosts and introduce their genetic material into the host cell as part of their replication cycle. This genetic material contains basic 'instructions' of how to produce more copies of these viruses, hacking the body's normal production machinery to serve the needs of the virus. The host cell will carry out these instructions and produce additional copies of the virus, leading to more and more cells becoming infected. Some types of viruses insert their genome into the host's cytoplasm, but do not actually enter the cell. Others penetrate the cell membrane disguised as protein molecules and enter the cell. 

There are two main types of virus infection: lytic and lysogenic. Shortly after inserting its DNA, viruses of the lytic cycle quickly produce more viruses, burst from the cell and infect more cells. Lysogenic viruses integrate their DNA into the DNA of the host cell and may live in the body for many years before responding to a trigger. The virus reproduces as the cell does and does not inflict bodily harm until it is triggered. The trigger releases the DNA from that of the host and employs it to create new viruses.

Retroviruses

The genetic material in retroviruses is in the form of RNA molecules, while the genetic material of their hosts is in the form of DNA. When a retrovirus infects a host cell, it will introduce its RNA together with some enzymes, namely reverse transcriptase and integrase, into the cell. This RNA molecule from the retrovirus must produce a DNA copy from its RNA molecule before it can be integrated into the genetic material of the host cell. The process of producing a DNA copy from an RNA molecule is termed reverse transcription. It is carried out by one of the enzymes carried in the virus, called reverse transcriptase. After this DNA copy is produced and is free in the nucleus of the host cell, it must be incorporated into the genome of the host cell. That is, it must be inserted into the large DNA molecules in the cell (the chromosomes). This process is done by another enzyme carried in the virus called integrase.
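
A toy illustration of the reverse-transcription step, using an invented RNA fragment: the first cDNA strand is the base-paired complement of the RNA (with U pairing to A), and copying that strand gives double-stranded DNA ready for integration by the integrase.

# Reverse transcription in miniature: RNA -> first-strand cDNA -> double-stranded DNA.
RNA_TO_CDNA = str.maketrans("ACGU", "TGCA")   # base pairing used by reverse transcriptase
DNA_COMPLEMENT = str.maketrans("ACGT", "TGCA")

viral_rna = "AUGGCUACCGUUAAAGGCUGA"           # hypothetical retroviral RNA fragment

first_strand = viral_rna.translate(RNA_TO_CDNA)[::-1]         # cDNA, written 5'->3'
second_strand = first_strand.translate(DNA_COMPLEMENT)[::-1]  # same sequence as the RNA, with U -> T

print("RNA:          ", viral_rna)
print("cDNA strand:  ", first_strand)
print("second strand:", second_strand)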

Now that the genetic material of the virus has been inserted, it can be said that the host cell has been modified to contain new genes. If this host cell divides later, its descendants will all contain the new genes. Sometimes the genes of the retrovirus do not express their information immediately.

One of the problems of gene therapy using retroviruses is that the integrase enzyme can insert the genetic material of the virus into any arbitrary position in the genome of the host; it randomly inserts the genetic material into a chromosome. If genetic material happens to be inserted in the middle of one of the original genes of the host cell, this gene will be disrupted (insertional mutagenesis). If the gene happens to be one regulating cell division, uncontrolled cell division (i.e., cancer) can occur. This problem has recently begun to be addressed by utilizing zinc finger nucleases or by including certain sequences such as the beta-globin locus control region to direct the site of integration to specific chromosomal sites.

Gene therapy trials using retroviral vectors to treat X-linked severe combined immunodeficiency (X-SCID) represent the most successful application of gene therapy to date. More than twenty patients have been treated in France and Britain, with a high rate of immune system reconstitution observed. Similar trials were restricted or halted in the USA when leukemia was reported in patients treated in the French X-SCID gene therapy trial. To date, four children in the French trial and one in the British trial have developed leukemia as a result of insertional mutagenesis by the retroviral vector. All but one of these children responded well to conventional anti-leukemia treatment. Gene therapy trials to treat another form of SCID, caused by deficiency of the adenosine deaminase (ADA) enzyme, continue with relative success in the USA, Britain, Ireland, Italy and Japan.

Adenoviruses

Adenoviruses are viruses that carry their genetic material in the form of double-stranded DNA. They cause respiratory, intestinal, and eye infections in humans (including the common cold). When these viruses infect a host cell, they introduce their DNA molecule into the host, but the genetic material of the adenovirus is not incorporated into the host cell's genetic material; it remains transient. The DNA molecule is left free in the nucleus of the host cell, and the instructions in this extra DNA molecule are transcribed just like any other gene. The only difference is that these extra genes are not replicated when the cell undergoes cell division, so the descendants of that cell will not carry the extra gene.

As a result, treatment with an adenovirus will require readministration in a growing cell population, although the absence of integration into the host cell's genome should prevent the type of cancer seen in the SCID trials. This vector system has been promoted for treating cancer, and indeed the first gene therapy product licensed to treat cancer, Gendicine, is an adenovirus. Gendicine, an adenoviral p53-based gene therapy, was approved by the Chinese food and drug regulators in 2003 for treatment of head and neck cancer. Advexin, a similar gene therapy approach from Introgen, was turned down by the US Food and Drug Administration (FDA) in 2008.

Concerns about the safety of adenovirus vectors were raised after the 1999 death of Jesse Gelsinger while participating in a gene therapy trial. Since then, work using adenovirus vectors has focused on genetically crippled versions of the virus.

Envelope protein pseudotyping of viral vectors

The viral vectors described above have natural host cell populations that they infect most efficiently. Retroviruses have limited natural host cell ranges, and although adenovirus and adeno-associated virus are able to infect a relatively broader range of cells efficiently, some cell types are resistant to infection by these viruses as well. Attachment to and entry into a susceptible cell is mediated by the protein envelope on the surface of a virus. Retroviruses and adeno-associated viruses have a single protein coating their membrane, while adenoviruses are coated with both an envelope protein and fibers that extend away from the surface of the virus. The envelope proteins on each of these viruses bind to cell-surface molecules such as heparan sulfate, which localizes them upon the surface of the potential host, and to a specific protein receptor that either induces entry-promoting structural changes in the viral protein or localizes the virus in endosomes, where acidification of the lumen induces this refolding of the viral coat. In either case, entry into potential host cells requires a favorable interaction between a protein on the surface of the virus and a protein on the surface of the cell.

For the purposes of gene therapy, one might want either to limit or to expand the range of cells susceptible to transduction by a gene therapy vector. To this end, many vectors have been developed in which the endogenous viral envelope proteins have been replaced by either envelope proteins from other viruses or by chimeric proteins. Such chimeric proteins consist of those parts of the viral protein necessary for incorporation into the virion, together with sequences meant to interact with specific host cell proteins. Viruses in which the envelope proteins have been replaced in this way are referred to as pseudotyped viruses. For example, the most popular retroviral vector for use in gene therapy trials has been the lentivirus simian immunodeficiency virus coated with the envelope glycoprotein (G protein) from vesicular stomatitis virus. This vector is referred to as VSV-G-pseudotyped lentivirus and infects an almost universal set of cells; this broad tropism is characteristic of the VSV G protein with which the vector is coated. Many attempts have been made to limit the tropism of viral vectors to one or a few host cell populations. Such an advance would allow the systemic administration of a relatively small amount of vector, the potential for off-target cell modification would be limited, and many concerns from the medical community would be alleviated. Most attempts to limit tropism have used chimeric envelope proteins bearing antibody fragments. These vectors show great promise for the development of "magic bullet" gene therapies.

Replication-competent vectors

A replication-competent vector called ONYX-015 has been used in replicating tumor cells. It was found that, in the absence of the E1B-55kD viral protein, adenovirus caused very rapid apoptosis of infected p53(+) cells, resulting in dramatically reduced virus progeny and no subsequent spread. Apoptosis was mainly the result of the ability of E1A to inactivate p300. In p53(-) cells, deletion of E1B-55kD has no consequence in terms of apoptosis, and viral replication is similar to that of wild-type virus, resulting in massive killing of cells.

In a replication-defective vector, some genes essential for replication are deleted. Because the functions of these genes are still needed to produce the vector, they are supplied separately, by either a helper virus or a separate DNA molecule.

Cis and trans-acting elements

Replication-defective vectors always contain a "transfer construct". The transfer construct carries the gene to be transduced, or "transgene". The transfer construct also carries the sequences that are necessary for the general functioning of the viral genome: the packaging sequence, repeats for replication and, when needed, priming of reverse transcription. These are called cis-acting elements, because they need to be on the same piece of DNA as the viral genome and the gene of interest. Trans-acting elements are viral elements that can be encoded on a different DNA molecule; for example, the viral structural proteins can be expressed from a genetic element other than the viral genome.
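
To make the cis/trans distinction concrete, the sketch below models a hypothetical replication-defective vector system as plain data: cis-acting sequences travel on the transfer construct together with the transgene, while trans-acting functions are expressed from separate plasmids. The element names are illustrative (loosely modeled on a split lentiviral packaging design), not a production recipe.

```python
from dataclasses import dataclass, field

@dataclass
class TransferConstruct:
    """The construct that is actually packaged and delivered to target cells."""
    transgene: str
    # Cis-acting elements must sit on the same DNA molecule as the transgene.
    cis_elements: list = field(default_factory=lambda: [
        "LTR repeats (replication / integration)",
        "psi packaging signal",
        "primer binding site (priming of reverse transcription)",
    ])

@dataclass
class PackagingPlasmid:
    """Supplies trans-acting viral functions from a separate DNA molecule."""
    trans_elements: list

# Hypothetical split: structural/enzymatic genes and the envelope are
# expressed in trans, so the delivered vector itself cannot replicate.
vector_system = {
    "transfer": TransferConstruct(transgene="therapeutic cDNA"),
    "packaging": PackagingPlasmid(trans_elements=["gag", "pol"]),
    "envelope": PackagingPlasmid(trans_elements=["VSV-G envelope protein"]),
}
for name, part in vector_system.items():
    print(name, part)
```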

Herpes simplex virus

The herpes simplex virus is a human neurotropic virus, and it is mostly investigated for gene transfer into the nervous system. The wild-type HSV-1 virus is able to infect neurons and evade the host immune response, but it may still become reactivated and produce a lytic cycle of viral replication. Therefore, it is typical to use mutant strains of HSV-1 that are deficient in their ability to replicate. Although the latent virus is largely transcriptionally silent, it possesses neuron-specific promoters that can continue to function during latency. Antibodies to HSV-1 are common in humans; however, complications due to herpes infection are somewhat rare. Caution must be taken regarding rare cases of encephalitis, and this provides some rationale for using HSV-2 as a viral vector, as it generally has tropism for the neuronal cells innervating the urogenital area and could thus spare the host severe pathology in the brain.

Non-viral methods

Non-viral methods present certain advantages over viral methods, with simple large scale production and low host immunogenicity being just two. Previously, low levels of transfection and expression of the gene held non-viral methods at a disadvantage; however, recent advances in vector technology have yielded molecules and techniques with transfection efficiencies similar to those of viruses.

Injection of naked DNA

This is the simplest method of non-viral transfection. Clinical trials of intramuscular injection of a naked DNA plasmid have been carried out with some success; however, expression has been very low in comparison to other methods of transfection. In addition to trials with plasmids, there have been trials with naked PCR product, which have had similar or greater success. Cellular uptake of naked DNA is generally inefficient. Research efforts focused on improving the efficiency of naked DNA uptake have yielded several novel methods, such as electroporation, sonoporation, and the use of a "gene gun", which shoots DNA-coated gold particles into the cell using high-pressure gas.

Physical methods to enhance delivery


Electroporation

Electroporation is a method that uses short pulses of high voltage to carry DNA across the cell membrane. This shock is thought to cause temporary formation of pores in the cell membrane, allowing DNA molecules to pass through. Electroporation is generally efficient and works across a broad range of cell types. However, a high rate of cell death following electroporation has limited its use, including clinical applications.

More recently, an electroporation method termed electron-avalanche transfection has been used in gene therapy experiments. Using a high-voltage plasma discharge, DNA was efficiently delivered following very short (microsecond) pulses. Compared with conventional electroporation, the technique resulted in greatly increased efficiency and less cellular damage.

Gene gun

The use of particle bombardment, or the gene gun, is another physical method of DNA transfection. In this technique, DNA is coated onto gold particles and loaded into a device which generates a force to achieve penetration of the DNA into the cells, leaving the gold behind on a "stopping" disk. 

Sonoporation

Sonoporation uses ultrasonic frequencies to deliver DNA into cells. The process of acoustic cavitation is thought to disrupt the cell membrane and allow DNA to move into cells. 

Magnetofection

In a method termed magnetofection, DNA is complexed to magnetic particles, and a magnet is placed underneath the tissue culture dish to bring DNA complexes into contact with a cell monolayer. 

Hydrodynamic delivery

Hydrodynamic delivery involves rapid injection of a high volume of a solution into vasculature (such as into the inferior vena cava, bile duct, or tail vein). The solution contains molecules that are to be inserted into cells, such as DNA plasmids or siRNA, and transfer of these molecules into cells is assisted by the elevated hydrostatic pressure caused by the high volume of injected solution.
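
As a rough worked example of the volumes involved: hydrodynamic tail-vein injection in mice is commonly described as delivering on the order of 10% of body weight as saline within a few seconds. The calculation below simply applies that rule of thumb; the 10% figure is an assumption used here for illustration, not a protocol.

```python
def hydrodynamic_volume_ml(body_weight_g: float, fraction: float = 0.10) -> float:
    """Approximate injection volume for hydrodynamic delivery.

    Assumes the commonly cited rule of thumb of ~10% of body weight injected
    as saline, and that 1 g of solution is roughly 1 mL.
    """
    return body_weight_g * fraction

# A 25 g mouse would receive roughly 2.5 mL, injected over a few seconds.
print(hydrodynamic_volume_ml(25))  # 2.5
```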

Chemical methods to enhance delivery


Oligonucleotides

Synthetic oligonucleotides are used in gene therapy to deactivate the genes involved in the disease process. There are several strategies for achieving this. One uses antisense oligonucleotides specific to the target gene to disrupt the transcription of the faulty gene. Another uses small RNA molecules called siRNA to signal the cell to cleave specific unique sequences in the mRNA transcript of the faulty gene, disrupting translation of the faulty mRNA and therefore expression of the gene. A further strategy uses double-stranded oligodeoxynucleotides as a decoy for the transcription factors that are required to activate transcription of the target gene; the transcription factors bind to the decoys instead of the promoter of the faulty gene, which reduces transcription of the target gene and lowers expression. Additionally, single-stranded DNA oligonucleotides have been used to direct a single base change within a mutant gene. The oligonucleotide is designed to anneal with complementarity to the target gene, with the exception of a central base, the target base, which serves as the template base for repair. This technique is referred to as oligonucleotide-mediated gene repair, targeted gene repair, or targeted nucleotide alteration.
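
The last strategy above, targeted nucleotide alteration, lends itself to a small sketch: the repair oligonucleotide is essentially the complement of the target sequence, except that the base opposite the central (mutant) position carries the desired correction. The function and sequences below are invented for illustration and ignore the chemical modifications and strand-choice considerations used in practice.

```python
def repair_oligo(target: str, corrected_base: str) -> str:
    """Design a single-stranded repair oligo for targeted nucleotide alteration.

    The oligo is the complement of the (mutant) target strand, except that the
    base opposite the central position pairs with the desired correction, so it
    serves as the template base for repair.
    """
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    target = target.upper()
    centre = len(target) // 2
    bases = [comp[b] for b in target]
    bases[centre] = comp[corrected_base.upper()]  # central mismatch that templates the change
    # Reverse so the oligo reads 5'->3' on its own strand.
    return "".join(reversed(bases))

# Invented 21-nt target with a point mutation at the centre (T where a C belongs).
mutant_site = "GATCCGGAAC" + "T" + "GGCTTACCAG"
print(repair_oligo(mutant_site, corrected_base="C"))
```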

Lipoplexes

To improve the delivery of the new DNA into the cell, the DNA must be protected from damage and packaged into a positively charged complex. Initially, anionic and neutral lipids were used to construct lipoplexes as synthetic vectors. Although these lipids have little associated toxicity, are compatible with body fluids, and can in principle be adapted to be tissue specific, they are complicated and time-consuming to produce, so attention turned to the cationic versions.

Cationic lipids, due to their positive charge, were first used to condense negatively charged DNA molecules so as to facilitate the encapsulation of DNA into liposomes. Later it was found that the use of cationic lipids also significantly enhanced the stability of lipoplexes. As a result of their charge, cationic liposomes interact with the cell membrane, and endocytosis is widely believed to be the major route by which cells take up lipoplexes. Endosomes are formed as a result of endocytosis; however, if the genes cannot be released into the cytoplasm by breaking the endosomal membrane, they are trafficked to lysosomes, where the DNA is destroyed before it can carry out its function. It was also found that although cationic lipids themselves can condense and encapsulate DNA into liposomes, transfection efficiency is very low because of this lack of "endosomal escape". However, when helper lipids (usually electroneutral lipids, such as DOPE) were added to form lipoplexes, much higher transfection efficiency was observed. It was later shown that certain lipids can destabilize endosomal membranes so as to facilitate the escape of DNA from the endosome; such lipids are therefore called fusogenic lipids. Although cationic liposomes have been widely used as an alternative gene delivery vector, a dose-dependent toxicity of cationic lipids has also been observed, which could limit their therapeutic use.

The most common use of lipoplexes has been in gene transfer into cancer cells, where the supplied genes activate tumor suppressor genes in the cell and decrease the activity of oncogenes. Recent studies have shown lipoplexes to be useful in transfecting respiratory epithelial cells.

Polymersomes

Polymersomes are synthetic versions of liposomes (vesicles with a lipid bilayer), made of amphiphilic block copolymers. They can encapsulate either hydrophilic or hydrophobic contents and can be used to deliver cargo such as DNA, proteins, or drugs to cells. Advantages of polymersomes over liposomes include greater stability, mechanical strength, blood circulation time, and storage capacity.

Polyplexes

Complexes of polymers with DNA are called polyplexes. Most polyplexes consist of cationic polymers, and their fabrication is based on self-assembly through ionic interactions. One important difference between the modes of action of polyplexes and lipoplexes is that polyplexes cannot directly release their DNA load into the cytoplasm. As a result, co-transfection with endosome-lytic agents such as inactivated adenovirus was required to facilitate escape of the nanoparticle from the endocytic vesicle formed during particle uptake. However, a better understanding of the mechanisms by which DNA can escape the endolysosomal pathway (e.g., the proton sponge effect) has triggered new polymer synthesis strategies, such as the incorporation of protonatable residues in the polymer backbone, and has revitalized research on polycation-based systems.

Because of their low toxicity, high loading capacity, and ease of fabrication, polycationic nanocarriers show great promise compared with their rivals, such as viral vectors, which show high immunogenicity and potential carcinogenicity, and lipid-based vectors, which cause dose-dependent toxicity. Polyethyleneimine and chitosan are among the polymeric carriers that have been studied extensively for the development of gene delivery therapeutics. Other polycationic carriers, such as poly(beta-amino esters) and polyphosphoramidate, are being added to the library of potential gene carriers. In addition to the variety of polymers and copolymers, the ease of controlling the size, shape, and surface chemistry of these polymeric nanocarriers gives them an edge in targeting capability and in taking advantage of the enhanced permeability and retention effect.
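
Polyplex formulations are commonly described by their nitrogen-to-phosphate (N/P) ratio, the molar ratio of protonatable amines on the polycation to phosphates on the DNA. The sketch below computes the mass of polyethyleneimine (PEI) needed for a target N/P ratio, assuming the commonly used approximations of ~43 g/mol per ethylenimine repeat unit and ~330 g/mol per DNA nucleotide; treat these numbers as assumptions to verify for a specific polymer.

```python
# Assumed molar masses (common approximations; check for the actual polymer used):
PEI_REPEAT_G_PER_MOL = 43.0    # one protonatable nitrogen per ethylenimine repeat
DNA_NT_G_PER_MOL = 330.0       # one phosphate per nucleotide, average mass

def pei_mass_for_np_ratio(dna_mass_ug: float, np_ratio: float) -> float:
    """Mass of PEI (micrograms) needed to complex a given mass of plasmid DNA
    at a target nitrogen-to-phosphate (N/P) ratio."""
    moles_phosphate = dna_mass_ug / DNA_NT_G_PER_MOL    # ug / (g/mol) = umol
    moles_nitrogen = np_ratio * moles_phosphate
    return moles_nitrogen * PEI_REPEAT_G_PER_MOL        # umol * (g/mol) = ug

# Example: 1 ug of plasmid DNA at N/P = 10 needs roughly 1.3 ug of PEI.
print(round(pei_mass_for_np_ratio(1.0, 10), 2))
```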

Dendrimers

A dendrimer is a highly branched macromolecule with a spherical shape. The surface of the particle may be functionalized in many ways and many of the properties of the resulting construct are determined by its surface. 

In particular it is possible to construct a cationic dendrimer, i.e. one with a positive surface charge. When in the presence of genetic material such as DNA or RNA, charge complementarity leads to a temporary association of the nucleic acid with the cationic dendrimer. On reaching its destination the dendrimer-nucleic acid complex is then taken into the cell via endocytosis. 

In recent years the benchmark for transfection agents has been cationic lipids. Limitations of these competing reagents have been reported to include: the lack of ability to transfect some cell types, the lack of robust active targeting capabilities, incompatibility with animal models, and toxicity. Dendrimers offer robust covalent construction and extreme control over molecule structure, and therefore size. Together these give compelling advantages compared to existing approaches.

Producing dendrimers has historically been a slow and expensive process consisting of numerous slow reactions, an obstacle that severely curtailed their commercial development. The Michigan-based company Dendritic Nanotechnologies discovered a method to produce dendrimers using kinetically driven chemistry, a process that not only reduced the cost by roughly three orders of magnitude but also cut the reaction time from over a month to several days. These new "Priostar" dendrimers can be specifically constructed to carry a DNA or RNA payload that transfects cells at high efficiency with little or no toxicity.

Inorganic nanoparticles

Inorganic nanoparticles, such as gold, silica, iron oxide (e.g., in magnetofection) and calcium phosphate, have been shown to be capable of gene delivery. Some of the benefits of inorganic vectors are their storage stability, low manufacturing cost, often low immunogenicity, and resistance to microbial attack. Nanosized materials less than 100 nm have been shown to efficiently trap DNA or RNA and to allow its escape from the endosome without degradation. Inorganics have also been shown to exhibit improved in vitro transfection of attached cell lines because their increased density leads them to settle preferentially on the base of the culture dish. Quantum dots have also been used successfully and permit the coupling of gene therapy with a stable fluorescent marker. Engineered organic nanoparticles are also under development and could be used for co-delivery of genes and therapeutic agents.

Cell-penetrating peptides

Cell-penetrating peptides (CPPs), also known as peptide transduction domains (PTDs), are short peptides (< 40 amino acids) that efficiently pass through cell membranes while being covalently or non-covalently bound to various molecules, thus facilitating these molecules’ entry into cells. Cell entry occurs primarily by endocytosis but other entry mechanisms also exist. Examples of cargo molecules of CPPs include nucleic acids, liposomes, and drugs of low molecular weight.

CPP cargo can be directed into specific cell organelles by incorporating localization sequences into CPP sequences. For example, nuclear localization sequences are commonly used to guide CPP cargo into the nucleus. For guidance into mitochondria, a mitochondrial targeting sequence can be used; this method is used in protofection (a technique that allows for foreign mitochondrial DNA to be inserted into cells' mitochondria).
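
As a purely illustrative example of combining a CPP with a localization sequence, the snippet below concatenates a widely cited HIV-1 TAT-derived CPP with the classic SV40 large T-antigen nuclear localization signal into a single fusion-peptide string. The sequences and the GGS linker are illustrative defaults, and a real construct would also consider orientation, linker choice, and how the cargo is attached.

```python
# Commonly cited sequences (treat as illustrative; verify before any real design):
TAT_CPP = "GRKKRRQRRRPPQ"   # HIV-1 TAT-derived cell-penetrating peptide
SV40_NLS = "PKKKRKV"        # SV40 large T-antigen nuclear localization signal

def fusion_peptide(cpp: str, localization: str, linker: str = "GGS") -> str:
    """Join a CPP and a localization sequence with a short flexible linker."""
    return cpp + linker + localization

print(fusion_peptide(TAT_CPP, SV40_NLS))
# -> GRKKRRQRRRPPQGGSPKKKRKV
```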

Hybrid methods

Due to every method of gene transfer having shortcomings, there have been some hybrid methods developed that combine two or more techniques. Virosomes are one example; they combine liposomes with an inactivated HIV or influenza virus. This has been shown to have more efficient gene transfer in respiratory epithelial cells than either viral or liposomal methods alone. Other methods involve mixing other viral vectors with cationic lipids or hybridising viruses.

Operator (computer programming)

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Operator_(computer_programmin...