In futures studies and the history of technology, accelerating change is the observed exponential nature of the rate of technological change
in recent history, which may suggest faster and more profound change in
the future and may or may not be accompanied by equally profound social
and cultural change.
Early observations
In 1910, during the town planning conference of London, Daniel Burnham
noted, "But it is not merely in the number of facts or sorts of
knowledge that progress lies: it is still more in the geometric ratio of
sophistication, in the geometric widening of the sphere of knowledge, which every year is taking in a larger percentage of people as time goes on."
And later on, "It is the argument with which I began, that a mighty
change having come about in fifty years, and our pace of development
having immensely accelerated, our sons and grandsons are going to demand
and get results that would stagger us."
In 1938, Buckminster Fuller introduced the word ephemeralization to describe the trends of "doing more with less" in chemistry, health and other areas of industrial development.
In 1946, Fuller published a chart of the discoveries of the chemical
elements over time to highlight the development of accelerating
acceleration in human knowledge acquisition.
In 1958, Stanisław Ulam, recalling a conversation with John von Neumann, wrote that one conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.
Moravec's Mind Children
In a series of published articles from 1974 to 1979, and then in his 1988 book Mind Children, computer scientist and futurist Hans Moravec generalizes Moore's law to make predictions about the future of artificial life. Moore's law describes an exponential growth
pattern in the complexity of integrated semiconductor circuits. Moravec
extends this to include technologies from long before the integrated
circuit to future forms of technology. Moravec outlines a timeline and a
scenario in which robots will evolve into a new series of artificial species, starting around 2030–2040.
In Robot: Mere Machine to Transcendent Mind, published in 1998, Moravec further considers the implications of evolving robot intelligence, generalizing Moore's law to technologies predating the integrated circuit,
and also plotting the exponentially increasing computational power of
the brains of animals in evolutionary history. Extrapolating these
trends, he speculates about a coming "mind fire" of rapidly expanding superintelligence similar to the explosion of intelligence predicted by Vinge.
In his TV series Connections (1978)—and sequels Connections² (1994) and Connections³ (1997)—James Burke
explores an "Alternative View of Change" (the subtitle of the series)
that rejects the conventional linear and teleological view of historical
progress. Burke contends that one cannot consider the development of
any particular piece of the modern world in isolation. Rather, the
entire gestalt of the modern world is the result of a web of
interconnected events, each one consisting of a person or group acting on their own motives (e.g., profit, curiosity, religious conviction) with no concept of the final, modern result to which their actions, or those of their contemporaries, would lead. The interplay
of the results of these isolated events is what drives history and
innovation, and is also the main focus of the series and its sequels.
Burke also explores three corollaries to his initial thesis. The
first is that, if history is driven by individuals who act only on what
they know at the time, and not because of any idea as to where their
actions will eventually lead, then predicting the future course of
technological progress is merely conjecture. Therefore, if we are
astonished by the connections Burke is able to weave among past events,
then we will be equally surprised by where the events of today will eventually lead, especially events we are not even aware of at the time.
The second and third corollaries are explored most in the
introductory and concluding episodes, and they represent the downside of
an interconnected history. If history progresses because of the
synergistic interaction of past events and innovations, then as history
does progress, the number of these events and innovations increases.
This increase in possible connections causes the process of innovation
to not only continue, but to accelerate. Burke poses the question of
what happens when this rate of innovation, or more importantly change
itself, becomes too much for the average person to handle, and what this
means for individual power, liberty, and privacy.
Gerald Hawkins' Mindsteps
In his book Mindsteps to the Cosmos (HarperCollins, August 1983), Gerald S. Hawkins elucidated his notion of mindsteps, dramatic and irreversible changes to paradigms
or world views. He identified five distinct mindsteps in human history,
and the technology that accompanied these "new world views": the
invention of imagery, writing, mathematics, printing, the telescope,
rocket, radio, TV, computer... "Each one takes the collective mind
closer to reality, one stage further along in its understanding of the
relation of humans to the cosmos." He noted: "The waiting period between
the mindsteps is getting shorter. One can't help noticing the
acceleration." Hawkins' empirical 'mindstep equation' quantified this,
and gave dates for (to him) future mindsteps. The date of the next
mindstep (5; the series begins at 0) he cited as 2021, with two further,
successively closer mindsteps in 2045 and 2051, until the limit of the
series in 2053. His speculations ventured beyond the technological:
The mindsteps... appear to have
certain things in common—a new and unfolding human perspective, related
inventions in the area of memes and communications, and a long
formulative waiting period before the next mindstep comes along. None of
the mindsteps can be said to have been truly anticipated, and most were
resisted at the early stages. In looking to the future we may equally
be caught unawares. We may have to grapple with the presently
inconceivable, with mind-stretching discoveries and concepts.
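The mindstep equation itself is not reproduced in the text above, but the dates it yields are consistent with waiting intervals that shrink by a constant factor. A minimal sketch, assuming a geometric ratio of 1/4 (an illustrative fit of the quoted dates, not Hawkins' published formula):

```python
# Hedged reconstruction: the dates cited above (2021, 2045, 2051, limit
# 2053) are consistent with waiting intervals shrinking by a constant
# factor r = 1/4. This is an illustrative fit, not Hawkins' equation.
r = 0.25
date, interval = 2021.0, 24.0       # next mindstep, then the gap after it
dates = [date]
for _ in range(6):                  # a few more steps toward the limit
    date += interval
    dates.append(round(date, 2))
    interval *= r

limit = 2021 + 24 / (1 - r)         # geometric series sum
print(dates[:4], "limit:", limit)   # [2021.0, 2045.0, 2051.0, 2052.5] limit: 2053.0
```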
Vinge's exponentially accelerating change
The mathematician Vernor Vinge popularized his ideas about exponentially accelerating technological change in the science fiction novel Marooned in Realtime
(1986), set in a world of rapidly accelerating progress leading to the
emergence of more and more sophisticated technologies separated by
shorter and shorter time intervals, until a point beyond human
comprehension is reached. His subsequent Hugo Award-winning novel A Fire Upon the Deep (1992) opens with an imaginative description of the evolution of a superintelligence passing through exponentially accelerating developmental stages, ending in a transcendent, almost omnipotent power unfathomable by mere humans. His influential 1993 paper on the technological singularity compactly summarizes the basic ideas.
Kurzweil's The Law of Accelerating Returns
In his 1999 book The Age of Spiritual Machines, Ray Kurzweil
proposed "The Law of Accelerating Returns", according to which the rate
of change in a wide variety of evolutionary systems (including but not
limited to the growth of technologies) tends to increase exponentially. He gave further focus to this issue in a 2001 essay entitled "The Law of Accelerating Returns". In it, Kurzweil, after Moravec, argued for extending Moore's Law to describe exponential growth of diverse forms of technological
progress. Whenever a technology approaches some kind of a barrier,
according to Kurzweil, a new technology will be invented to allow us to
cross that barrier. He cites numerous past examples of this to
substantiate his assertions. He predicts that such paradigm shifts have become, and will continue to become, increasingly common, leading to
"technological change so rapid and profound it represents a rupture in
the fabric of human history". He believes the Law of Accelerating
Returns implies that a technological singularity will occur before the end of the 21st century, around 2045. The essay begins:
An analysis of the history of
technology shows that technological change is exponential, contrary to
the common-sense 'intuitive linear' view. So we won't experience 100
years of progress in the 21st century—it will be more like 20,000 years
of progress (at today's rate). The 'returns,' such as chip speed and
cost-effectiveness, also increase exponentially. There's even
exponential growth in the rate of exponential growth. Within a few
decades, machine intelligence will surpass human intelligence, leading
to the Singularity—technological change so rapid and profound it
represents a rupture in the fabric of human history. The implications
include the merger of biological and nonbiological intelligence,
immortal software-based humans, and ultra-high levels of intelligence
that expand outward in the universe at the speed of light.
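The "20,000 years" figure can be sanity-checked with a short back-of-envelope calculation. A minimal sketch, assuming (our illustrative assumption; the excerpt does not state a doubling period) that the rate of progress doubles every ten years:

```python
import math

# Back-of-envelope check of the "20,000 years" figure. Assumption (ours,
# not stated in the essay excerpt): the rate of progress doubles every
# `doubling_years`, so rate(t) = 2**(t / doubling_years), measured in
# "years of progress at today's rate" per calendar year.
doubling_years = 10.0
horizon = 100.0  # one calendar century

# total progress = integral of 2**(t/d) dt from 0 to `horizon`
total = (doubling_years / math.log(2)) * (2 ** (horizon / doubling_years) - 1)
print(f"{total:,.0f} years of progress at today's rate")  # ~14,759
```

The result, roughly 15,000 years of progress, is the same order of magnitude as Kurzweil's figure; his own estimate additionally assumes that the doubling period itself shrinks over time.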
Moore's Law expanded to other technologies.
An updated version of Moore's Law over 120 years (based on Kurzweil's graph). The seven most recent data points are all Nvidia GPUs.
The Law of Accelerating Returns has in many ways altered public perception of Moore's law. It is a common (but mistaken) belief that Moore's law makes predictions regarding all forms of technology, when really it only concerns semiconductor circuits. Many futurists still use the term "Moore's law" to describe ideas like those put forth by Moravec, Kurzweil and others.
According to Kurzweil, since the beginning of evolution,
more complex life forms have been evolving exponentially faster, with
shorter and shorter intervals between the emergence of radically new
life forms, such as human beings, who have the capacity to engineer
(i.e. intentionally design with efficiency) a new trait which replaces
relatively blind evolutionary mechanisms of selection for efficiency. By
extension, the rate of technical progress amongst humans has also been
exponentially increasing: as we discover more effective ways to do
things, we also discover more effective ways to learn, e.g. language, numbers, written language, philosophy, scientific method,
instruments of observation, tallying devices, mechanical calculators,
computers; each of these major advances in our ability to account for
information occurs increasingly close to the previous one. Already, within the span of living memory, life in the industrialized world has changed almost beyond recognition compared with the first half of the 20th century. This pattern will culminate in unimaginable
technological progress in the 21st century, leading to a singularity.
Kurzweil elaborates on his views in his books The Age of Spiritual Machines and The Singularity Is Near.
Limits of accelerating change
In the natural sciences, processes characterized by exponential acceleration in their initial stages typically enter a saturation phase. That an accelerating increase is observed over some period of time therefore does not imply that the process will continue forever; on the contrary, it often means that a plateau is near. Processes in the natural sciences suggest that the observed picture of accelerating scientific and technological progress will, after some time (in physical processes, usually a short one), give way to a slowdown and eventually a halt. Yet even if the acceleration of scientific and technological progress ceases in the foreseeable future, progress itself, and the social transformations that result from it, need not stop or even slow down: it may continue at the (possibly enormous) rate already achieved, which would then remain constant.
Accelerating change may not be restricted to the Anthropocene Epoch, but may instead be a general and predictable developmental feature of the universe.
The physical processes that generate an acceleration such as Moore's
law are positive feedback loops giving rise to exponential or
superexponential technological change.
These dynamics lead to increasingly efficient and dense configurations
of Space, Time, Energy, and Matter (STEM efficiency and density, or STEM
"compression").
At the physical limit, this developmental process of accelerating
change leads to black hole density organizations, a conclusion also
reached by studies of the ultimate physical limits of computation in the
universe.
Applying this vision to the search for extraterrestrial intelligence
leads to the idea that advanced intelligent life reconfigures itself
into a black hole. Such advanced life forms would be interested in inner
space, rather than outer space and interstellar expansion. They would thus in some way transcend reality and not be observable, a proposed solution to Fermi's paradox called the "transcension hypothesis".
Another solution is that the black holes we observe could actually be
interpreted as intelligent super-civilizations feeding on stars, or
"stellivores". This dynamics of evolution and development is an invitation to study the universe itself as evolving, developing. If the universe is a kind of superorganism, it may possibly tend to reproduce, naturally or artificially, with intelligent life playing a role.
Other estimates
Dramatic changes in the rate of economic growth have occurred in the past because of technological advances. Based on population growth,
the economy doubled every 250,000 years from the Paleolithic era until the Neolithic Revolution.
The new agricultural economy doubled every 900 years, a remarkable
increase. In the current era, beginning with the Industrial Revolution,
the world's economic output doubles every fifteen years, sixty times
faster than during the agricultural era. If the rise of superhuman
intelligence causes a similar revolution, argues Robin Hanson, then one would expect the economy to double at least quarterly and possibly on a weekly basis.
In his 1981 book Critical Path, futurist and inventor R. Buckminster Fuller estimated
that if we took all the knowledge that mankind had accumulated and
transmitted by the year One CE as equal to one unit of information, it
probably took about 1500 years (or until the sixteenth century) for that
amount of knowledge to double. The next doubling of knowledge from two
to four 'knowledge units' took only 250 years, until about 1750 CE. By
1900, one hundred and fifty years later, knowledge had doubled again to 8
units. The observed speed at which information doubled was getting
faster and faster.
In modern times, exponential knowledge progressions therefore change at
an ever-increasing rate. Depending on the progression, this tends to
lead toward explosive growth at some point. A simple exponential curve
that represents this accelerating change phenomenon could be modeled by a
doubling function. This fast rate of knowledge doubling leads up to the basic hypothesis of the technological singularity: the point at which technological progress outpaces human biological evolution.
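A minimal sketch of such a doubling function, using Fuller's milestones quoted above (each milestone is exactly one doubling, so the doubling time is simply the gap between years):

```python
import math

# A minimal sketch of the doubling function using Fuller's estimates as
# quoted above; the (year, knowledge units) pairs come from the text.
milestones = [(1, 1), (1500, 2), (1750, 4), (1900, 8)]

for (y0, k0), (y1, k1) in zip(milestones, milestones[1:]):
    doublings = math.log2(k1 / k0)          # 1.0 for each step here
    print(f"{y0}-{y1}: doubling time ~{(y1 - y0) / doublings:.0f} years")
# 1-1500: ~1499 years; 1500-1750: ~250 years; 1750-1900: ~150 years
```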
Criticisms
Both Theodore Modis
and Jonathan Huebner have argued—each from different perspectives—that
the rate of technological innovation has not only ceased to rise, but is
actually now declining.
Genome editing was pioneered in the 1990s, before the advent of the common current nuclease-based gene editing platforms, but its use was limited by low editing efficiency. Genome editing with engineered nucleases (the three major classes being zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs) and engineered meganucleases) was selected by Nature Methods as the 2011 Method of the Year. The CRISPR-Cas system was selected by Science as the 2015 Breakthrough of the Year.
In May 2019, lawyers in China reported, in light of the purported creation by Chinese scientist He Jiankui of the first gene-edited humans (see Lulu and Nana controversy), the drafting of regulations under which anyone manipulating the human genome by gene-editing techniques, such as CRISPR, would be held responsible for any related adverse consequences.
A cautionary perspective on the possible blind spots and risks of
CRISPR and related biotechnologies has been recently discussed, focusing on the stochastic nature of cellular control processes.
In February 2020, a US trial demonstrated the safe use of CRISPR gene editing in three cancer patients.
In 2020 Sicilian Rouge High GABA, a tomato that makes more of an amino
acid said to promote relaxation, was approved for sale in Japan.
In 2021, England (not the rest of the UK) planned to remove restrictions on gene-edited plants and animals, moving from European Union-compliant
regulation to rules closer to those of the US and some other countries.
An April 2021 European Commission report found "strong indications"
that the current regulatory regime was not appropriate for gene editing.
Later in 2021, researchers announced a CRISPR alternative labeled obligate mobile element-guided activity (OMEGA) proteins, including IscB, IsrB and TnpB, endonucleases found in transposons and guided by small ωRNAs.
Background
Genetic engineering as a method of introducing new genetic elements into organisms has been around since the 1970s. One drawback of this technology has been the random nature with which the DNA is inserted into the host's genome, which can impair or alter other genes within the organism. However, several methods have since been developed that target inserted genes to specific sites within an organism's genome.
Site-specific targeting has also enabled the editing of specific sequences within a genome, as well as reduced off-target effects. This could be used for research purposes, by targeting mutations to specific genes, and in gene therapy. By inserting a functional gene into an organism and targeting it to replace the defective one, it could be possible to cure certain genetic diseases.
Gene targeting
Homologous recombination
Early methods to target genes to certain sites within a genome of an organism (called gene targeting) relied on homologous recombination (HR).
By creating DNA constructs that contain a template that matches the
targeted genome sequence it is possible that the HR processes within the
cell will insert the construct at the desired location. Using this
method on embryonic stem cells led to the development of transgenic mice with targeted genes knocked out. It has also been possible to knock in genes or alter gene expression patterns.
In recognition of their discovery of how homologous recombination can
be used to introduce genetic modifications in mice through embryonic
stem cells, Mario Capecchi, Martin Evans and Oliver Smithies were awarded the 2007 Nobel Prize for Physiology or Medicine.
Conditional targeting
If a vital gene is knocked out it can prove lethal to the organism. In order to study the function of such genes, site-specific recombinases (SSRs) are used. The two most common types are the Cre-LoxP and Flp-FRT systems. Cre recombinase is an enzyme that removes DNA by homologous recombination between binding sequences known as LoxP sites. The Flp-FRT system operates in a similar way, with the Flp recombinase recognising FRT sequences. By
crossing an organism containing the recombinase sites flanking the gene
of interest with an organism that expresses the SSR under the control of tissue-specific promoters,
it is possible to knock out or switch on genes only in certain cells.
These techniques were also used to remove marker genes from transgenic
animals. Further modifications of these systems allowed researchers to
induce recombination only under certain conditions, allowing genes to be
knocked out or expressed at desired times or stages of development.
Process
Double strand break repair
A common form of genome editing relies on the mechanics of DNA double-strand break (DSB) repair. There are two major pathways that repair DSBs: non-homologous end joining (NHEJ) and homology directed repair
(HDR). NHEJ uses a variety of enzymes to directly join the DNA ends
while the more accurate HDR uses a homologous sequence as a template for
regeneration of missing DNA sequences at the break point. This can be
exploited by creating a vector with the desired genetic elements within a sequence that is homologous
to the flanking sequences of a DSB. This will result in the desired
change being inserted at the site of the DSB. While HDR based gene
editing is similar to the homologous recombination based gene targeting,
the rate of recombination is increased by at least three orders of
magnitude.
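To make the HDR exploit concrete, the following toy sketch assembles such a donor as a string. All sequences, lengths, and names here are invented for illustration; real homology arms typically run hundreds of base pairs:

```python
# Illustrative only: assembling an HDR donor as a string. The genomic
# sequence, cut site, and insert are toy values; real homology arms are
# typically hundreds of base pairs long.
genome = "ATGGCTAGCTTACGGATCCAGT" * 3     # toy "locus"
cut = len(genome) // 2                    # position of the double-strand break

arm_len = 15
left_arm = genome[cut - arm_len:cut]      # matches sequence 5' of the break
right_arm = genome[cut:cut + arm_len]     # matches sequence 3' of the break
insert = "GAATTC"                         # the desired genetic element

donor = left_arm + insert + right_arm     # template the HDR machinery copies
print(donor)
```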
Engineered nucleases
The key to genome editing is creating a DSB at a specific point
within the genome. Commonly used restriction enzymes are effective at
cutting DNA, but generally recognize and cut at multiple sites. To
overcome this challenge and create site-specific DSBs, four distinct classes of nucleases have been discovered and bioengineered to date. These are the zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), meganucleases and the clustered regularly interspaced short palindromic repeats (CRISPR/Cas9) system.
Meganucleases
Meganucleases, discovered in the late 1980s, are enzymes in the endonuclease family which are characterized by their capacity to recognize and cut large DNA sequences (from 14 to 40 base pairs). The most widespread and best known meganucleases are the proteins in the LAGLIDADG family, which owe their name to a conserved amino acid sequence.
Meganucleases, found commonly in microbial species, have the
unique property of having very long recognition sequences (>14 bp), thus making them naturally very specific.
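A back-of-envelope sketch illustrates why recognition-site length translates into specificity, assuming an idealized random genome with uniform base composition (which real genomes are not):

```python
# Idealized model: in a random genome with uniform base composition
# (real genomes are not), the expected number of chance matches for a
# recognition site of length L in a genome of G bp is roughly G / 4**L.
G = 3.2e9                        # approximate human genome size in bp
for L in (6, 14, 18, 40):        # typical restriction site vs. meganuclease sites
    print(f"{L:2d} bp site: ~{G / 4**L:.3g} expected chance matches")
# 6 bp: ~7.81e+05   14 bp: ~11.9   18 bp: ~0.0466   40 bp: ~2.65e-15
```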
However, there is virtually no chance of finding the exact meganuclease
required to act on a chosen specific DNA sequence. To overcome this
challenge, mutagenesis and high throughput screening methods have been used to create meganuclease variants that recognize unique sequences. Others have been able to fuse various meganucleases and create hybrid enzymes that recognize a new sequence.
Yet others have attempted to alter the DNA-interacting amino acids of the meganuclease to design sequence-specific meganucleases, an approach known as rationally designed meganucleases.
Another approach involves using computer models to try to predict as
accurately as possible the activity of the modified meganucleases and
the specificity of the recognized nucleic sequence.
A large bank containing several tens of thousands of protein
units has been created. These units can be combined to obtain chimeric
meganucleases that recognize the target site, thereby providing research
and development tools that meet a wide range of needs (fundamental
research, health, agriculture, industry, energy, etc.). These include the
industrial-scale production of two meganucleases able to cleave the
human XPC gene; mutations in this gene result in Xeroderma pigmentosum, a severe monogenic disorder that predisposes the patients to skin cancer and burns whenever their skin is exposed to UV rays.
Meganucleases have the benefit of causing less toxicity in cells than methods such as Zinc finger nuclease (ZFN), likely because of more stringent DNA sequence recognition;
however, the construction of sequence-specific enzymes for all possible
sequences is costly and time-consuming, as one is not benefiting from
combinatorial possibilities that methods such as ZFNs and TALEN-based
fusions utilize.
Zinc finger nucleases
As
opposed to meganucleases, the concept behind ZFNs and TALEN technology
is based on a non-specific DNA cutting catalytic domain, which can then
be linked to specific DNA sequence recognizing peptides such as zinc
fingers and transcription activator-like effectors (TALEs).
The first step to this was to find an endonuclease whose DNA
recognition site and cleaving site were separate from each other, a
situation that is not the most common among restriction enzymes.
Once this enzyme was found, its cleaving portion could be separated; being without recognition ability, this portion is very non-specific. It could then be linked to sequence-recognizing peptides, yielding very high specificity.
Zinc finger motifs occur in several transcription factors.
The zinc ion, found in 8% of all human proteins, plays an important
role in the organization of their three-dimensional structure. In
transcription factors, it is most often located at the protein-DNA
interaction sites, where it stabilizes the motif. The C-terminal part of
each finger is responsible for the specific recognition of the DNA
sequence.
The recognized sequences are short, made up of around 3 base
pairs, but by combining 6 to 8 zinc fingers whose recognition sites have
been characterized, it is possible to obtain specific proteins for
sequences of around 20 base pairs. It is therefore possible to control
the expression of a specific gene. It has been demonstrated that this
strategy can be used to promote a process of angiogenesis in animals.
It is also possible to fuse a protein constructed in this way with the
catalytic domain of an endonuclease in order to induce a targeted DNA
break, and therefore to use these proteins as genome engineering tools.
The method generally adopted for this involves associating two
DNA binding proteins – each containing 3 to 6 specifically chosen zinc
fingers – with the catalytic domain of the FokI endonuclease, which needs to dimerize to cleave double-stranded DNA.
The two proteins recognize two DNA sequences that are a few nucleotides
apart. Linking the two zinc finger proteins to their respective
sequences brings the two FokI domains closer together. FokI requires
dimerization to have nuclease activity and this means the specificity
increases dramatically as each nuclease partner would recognize a unique
DNA sequence. To enhance this effect, FokI nucleases have been engineered that can only function as heterodimers.
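The resulting search problem can be illustrated with a toy sketch: find a left half-site and, a short spacer away, a right half-site on the opposite strand, where the two FokI domains can dimerize. The half-site sequences below are invented, not validated zinc finger specificities:

```python
import re

# Toy illustration of ZFN paired-site geometry (invented sequences, not
# validated zinc finger specificities). Each 3-finger monomer binds a
# 9-bp half-site; the right half-site sits on the opposite strand, so it
# appears as its reverse complement on the top strand, 5-7 bp downstream.
def revcomp(s):
    return s.translate(str.maketrans("ACGT", "TGCA"))[::-1]

left_site, right_site = "GACGGTACC", "GTCCAAGTC"
dna = "TTGACGGTACCAGCTAAGACTTGGACTT"   # toy target region

pattern = re.compile(left_site + "[ACGT]{5,7}" + revcomp(right_site))
match = pattern.search(dna)
print(match.group(0) if match else "no paired site found")
# GACGGTACCAGCTAAGACTTGGAC
```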
Several approaches are used to design specific zinc finger
nucleases for the chosen sequences. The most widespread involves
combining zinc-finger units with known specificities (modular assembly).
Various selection techniques, using bacteria, yeast or mammal cells
have been developed to identify the combinations that offer the best
specificity and the best cell tolerance. Although the direct genome-wide
characterization of zinc finger nuclease activity has not been
reported, an assay that measures the total number of double-strand DNA
breaks in cells found that only one to two such breaks occur above
background in cells treated with zinc finger nucleases with a 24 bp
composite recognition site and obligate heterodimer FokI nuclease domains.
Heterodimer-functioning nucleases avoid the possibility of unwanted homodimer activity and thus increase the specificity of the DSB. Although the nuclease portions of both ZFN and TALEN constructs
have similar properties, the difference between these engineered
nucleases is in their DNA recognition peptide. ZFNs rely on Cys2-His2
zinc fingers and TALEN constructs on TALEs. Both of these DNA
recognizing peptide domains have the characteristic that they are
naturally found in combinations in their proteins. Cys2-His2 zinc fingers typically occur in repeats that each recognize about 3 bp of DNA and are found in diverse combinations in a variety of nucleic acid-interacting proteins such as transcription factors. No finger of the zinc finger domain is completely independent, however: the binding capacity of one finger is affected by its neighbors. TALEs, on the other hand, are found in repeats with a one-to-one recognition ratio between the amino acids and the recognized nucleotide pairs. Because
both zinc fingers and TALEs happen in repeated patterns, different
combinations can be tried to create a wide variety of sequence
specificities.
Zinc fingers are the more established of the two, and approaches such as modular assembly (where zinc fingers correlated with a triplet sequence are attached in a row to cover the required sequence), OPEN (low-stringency selection of peptide domains vs. triplet nucleotides followed by high-stringency selection of peptide combinations vs. the final target in bacterial systems), and bacterial one-hybrid screening of zinc finger libraries, among other methods, have been used to make site-specific nucleases.
Zinc finger nucleases
are research and development tools that have already been used to
modify a range of genomes, in particular by the laboratories in the Zinc
Finger Consortium. The US company Sangamo BioSciences uses zinc finger nucleases to carry out research into the genetic engineering of stem cells and the modification of immune cells for therapeutic purposes. Modified T lymphocytes are currently undergoing phase I clinical trials to treat a type of brain tumor (glioblastoma) and in the fight against AIDS.
TALEN
Transcription activator-like effector nucleases
(TALENs) are specific DNA-binding proteins that feature an array of 33- or 34-amino-acid repeats. TALENs are artificial restriction enzymes designed by fusing the DNA-cutting domain of a nuclease to TALE domains, which can be tailored to specifically recognize a unique DNA sequence. These fusion proteins serve as readily targetable "DNA scissors" for gene editing applications, enabling targeted genome modifications such as sequence insertion, deletion, repair and replacement in living cells. The DNA-binding domains, which can be designed to bind any desired DNA sequence, come from TAL effectors, DNA-binding proteins secreted by the plant pathogenic bacteria Xanthomonas spp.
TAL effectors consist of repeated domains, each of which contains a highly conserved sequence of 34 amino acids and recognizes a single DNA nucleotide within the target site. The nuclease can create double-strand
breaks at the target site that can be repaired by error-prone non-homologous end-joining
(NHEJ), resulting in gene disruptions through the introduction of small
insertions or deletions. Each repeat is conserved, with the exception
of the so-called repeat variable di-residues (RVDs) at amino acid
positions 12 and 13. The RVDs determine the DNA sequence to which the
TALE will bind. This simple one-to-one correspondence between the TALE
repeats and the corresponding DNA sequence makes the process of
assembling repeat arrays to recognize novel DNA sequences
straightforward. These TALEs can be fused to the catalytic domain from a
DNA nuclease, FokI, to generate a transcription activator-like effector
nuclease (TALEN). The resultant TALEN constructs combine specificity
and activity, effectively generating engineered sequence-specific
nucleases that bind and cleave DNA sequences only at pre-selected sites.
The TALEN target recognition system is based on an easy-to-predict code. TAL nucleases are specific to their target due in part to the length of their 30+ base pair binding site. TALEN editing can be performed within a 6-base-pair range of any single nucleotide in the entire genome.
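A minimal sketch of this one-repeat-per-base code, using the commonly cited RVD-to-nucleotide pairing (NI = A, HD = C, NG = T, NN = G); the target sequence and function name are illustrative:

```python
# A minimal sketch of the TALE one-repeat-per-base code. The RVD-to-base
# pairing (NI=A, HD=C, NG=T, NN=G) is the commonly cited one; the target
# sequence and function name are ours.
RVD_FOR_BASE = {"A": "NI", "C": "HD", "T": "NG", "G": "NN"}

def design_tale_repeats(target):
    """Return the RVD carried by each repeat for a target DNA sequence."""
    return [RVD_FOR_BASE[base] for base in target.upper()]

print(design_tale_repeats("TCGAGA"))  # ['NG', 'HD', 'NN', 'NI', 'NN', 'NI']
```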
TALEN constructs are used in a similar way to designed zinc
finger nucleases, and have three advantages in targeted mutagenesis: (1)
DNA binding specificity is higher, (2) off-target effects are lower, and (3) construction of DNA-binding domains is easier.
CRISPR
CRISPRs (Clustered Regularly Interspaced Short Palindromic Repeats) are genetic elements that bacteria use as a kind of acquired immunity
to protect against viruses. They consist of short sequences that
originate from viral genomes and have been incorporated into the
bacterial genome. Cas (CRISPR associated proteins) process these
sequences and cut matching viral DNA sequences. By introducing plasmids
containing Cas genes and specifically constructed CRISPRs into
eukaryotic cells, the eukaryotic genome can be cut at any desired
position.
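A simplified sketch of the targeting rule for the widely used S. pyogenes Cas9, which requires an "NGG" PAM immediately 3' of a roughly 20-nt protospacer; the sequence and helper name below are toy examples:

```python
import re

# Simplified sketch: S. pyogenes Cas9 requires an "NGG" PAM immediately
# 3' of a ~20-nt protospacer; the guide RNA is built to match the
# protospacer. The sequence and helper name below are toy examples.
def find_protospacers(dna, guide_len=20):
    pam_scan = re.compile(rf"(?=([ACGT]{{{guide_len}}})[ACGT]GG)")
    return [(m.start(1), m.group(1)) for m in pam_scan.finditer(dna)]

toy = "TTAC" + "GATTACAGATTACAGATTAC" + "TGG" + "AAGT"
print(find_protospacers(toy))  # [(4, 'GATTACAGATTACAGATTAC')]
```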
Editing by nucleobase modification (Base editing)
One of the earliest methods of efficiently editing nucleic acids, employing nucleobase-modifying enzymes directed by nucleic acid guide sequences, was first described in the 1990s and has seen a resurgence more recently. This method has the advantage that it does not require breaking the genomic DNA strands, and thus avoids the random insertions and deletions associated with DNA strand breakage. It is only appropriate for precise editing requiring single-nucleotide changes and has been found to be highly efficient for this type of editing.
ARCUT
ARCUT stands for artificial restriction DNA cutter, a technique developed by Komiyama. The method uses pseudo-complementary peptide nucleic acid (pcPNA) to identify the cleavage site within the chromosome. Once pcPNA specifies the site, excision is carried out by a cerium/EDTA chemical mixture, which performs the splicing function.
Precision and efficiency of engineered nucleases
The meganuclease method of gene editing is the least efficient of the methods mentioned above. Due to the nature of its DNA-binding element and its cleaving element, it is limited to recognizing one potential target every 1,000 nucleotides.
ZFNs were developed to overcome the limitations of meganucleases. The number of possible targets ZFNs can recognize was increased to one in every 140 nucleotides.
However, both methods are unpredictable because their DNA-binding elements affect each other. As a result, high degrees of expertise and lengthy and costly validation processes are required.
TALE nucleases, being the most precise and specific method, yield a higher efficiency than the previous two methods. They achieve such efficiency because the DNA-binding element consists of an array of TALE subunits, each of which can recognize a specific DNA nucleotide independently of the others, resulting in a higher number of target sites with high precision. New TALE nucleases take
about one week and a few hundred dollars to create, with specific
expertise in molecular biology and protein engineering.
CRISPR nucleases have a slightly lower precision than TALE nucleases. This is caused by the need to have a specific nucleotide at one end in order to produce the guide RNA that CRISPR uses to repair the double-strand break it induces. CRISPR has been shown to be
the quickest and cheapest method, only costing less than two hundred
dollars and a few days of time.
CRISPR also requires the least expertise in molecular biology, as the design lies in the guide RNA rather than in proteins. One major
advantage that CRISPR has over the ZFN and TALEN methods is that it can
be directed to target different DNA sequences using its ~80nt CRISPR
sgRNAs, while both ZFN and TALEN methods required construction and
testing of the proteins created for targeting each DNA sequence.
Because off-target activity
of an active nuclease would have potentially dangerous consequences at
the genetic and organismal levels, the precision of meganucleases, ZFNs,
CRISPR, and TALEN-based fusions has been an active area of research.
While variable figures have been reported, ZFNs tend to have more
cytotoxicity than TALEN methods or RNA-guided nucleases, while TALEN and
RNA-guided approaches tend to have the greatest efficiency and fewer
off-target effects.
Based on the maximum theoretical distance between DNA binding and
nuclease activity, TALEN approaches result in the greatest precision.
Multiplex Automated Genomic Engineering (MAGE)
The methods available to scientists and researchers wanting to study genomic diversity and all possible associated phenotypes were very slow, expensive, and inefficient. Prior to this new revolution, researchers would have to do single-gene manipulations and tweak the genome one small section at a time, observe the phenotype, and start the process over with a different single-gene manipulation.
Therefore, researchers at the Wyss Institute at Harvard University designed MAGE, a powerful technology that improves the process of in vivo genome editing. It allows quick and efficient manipulation of a genome, all happening in a machine small enough to put on top of a small kitchen table. Those mutations combine with the variation that naturally occurs during cell mitosis, creating billions of cellular mutations.
Chemically combined, synthetic single-stranded DNA (ssDNA) and a pool of oligonucleotides are introduced at targeted areas of the cell, thereby creating genetic modifications. The cyclical process involves
transformation of ssDNA (by electroporation)
followed by outgrowth, during which bacteriophage homologous
recombination proteins mediate annealing of ssDNAs to their genomic
targets. Experiments targeting selective phenotypic markers are screened and identified by plating the cells on differential media. Each cycle ultimately takes 2.5 hours to process, with additional time required to
grow isogenic cultures and characterize mutations. By iteratively
introducing libraries of mutagenic ssDNAs targeting multiple sites, MAGE
can generate combinatorial genetic diversity in a cell population.
Up to 50 genome edits, from single nucleotide base pairs to whole genomes or gene networks, can be made simultaneously, with results in a matter of days.
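As a rough illustration of how cycling compounds diversity, here is a toy probability model (our simplification, not taken from the MAGE literature):

```python
# Toy probability model (our simplification, not from the MAGE papers):
# if each cycle converts a given target site with probability p, the
# expected edited fraction per site after n cycles is 1 - (1 - p)**n;
# k independently edited sites allow up to 2**k genotype combinations.
p, k = 0.25, 10   # assumed per-site conversion rate and number of sites

for n in (1, 5, 10, 20):
    per_site = 1 - (1 - p) ** n
    print(f"after {n:2d} cycles: {per_site:.1%} edited per site, "
          f"up to {2**k:,} genotypes in the population")
```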
MAGE experiments can be divided into three classes, characterized
by varying degrees of scale and complexity: (i) many target sites,
single genetic mutations; (ii) single target site, many genetic
mutations; and (iii) many target sites, many genetic mutations. An example of the third class was demonstrated in 2009, when Church and colleagues programmed Escherichia coli to produce five times the normal amount of lycopene, an antioxidant normally found in tomatoes and linked to anti-cancer properties.
They applied MAGE to optimize the 1-deoxy-D-xylulose 5-phosphate (DXP) metabolic pathway in Escherichia coli
to overproduce isoprenoid lycopene. It took them about 3 days and just
over $1,000 in materials. The ease, speed, and cost efficiency in which
MAGE can alter genomes can transform how industries approach the
manufacturing and production of important compounds in the
bioengineering, bioenergy, biomedical engineering, synthetic biology,
pharmaceutical, agricultural, and chemical industries.
Applications
As of 2012 efficient genome editing had been developed for a wide
range of experimental systems ranging from plants to animals, often
beyond clinical interest, and was becoming a standard experimental
strategy in research labs. The recent generation of rat, zebrafish, maize and tobacco
ZFN-mediated mutants and the improvements in TALEN-based approaches
testify to the significance of the methods, and the list is expanding
rapidly. Genome editing with engineered nucleases will likely contribute
to many fields of life sciences from studying gene functions in plants
and animals to gene therapy in humans. For instance, the field of synthetic biology, which aims to engineer cells and organisms to perform novel functions, is likely to benefit from the ability of engineered nucleases to add or remove genomic elements and thereby create complex systems. In addition, gene functions can be studied using stem cells with engineered nucleases.
Some specific tasks this method can carry out are described below.
The combination of recent discoveries in genetic engineering, particularly gene editing, and the latest improvements in bovine reproduction technologies (e.g. in vitro embryo culture) allows for genome editing directly in fertilised oocytes using synthetic, highly specific endonucleases. RNA-guided endonucleases such as the clustered regularly interspaced short palindromic repeats-associated Cas9 (CRISPR/Cas9) are a new tool, further increasing the range of methods available. In particular, CRISPR/Cas9-engineered endonucleases allow the use of multiple guide RNAs for simultaneous knockouts (KO) in one step by cytoplasmic direct injection (CDI) into mammalian zygotes.
Furthermore, gene editing can be applied to certain types of fish
in aquaculture such as Atlantic salmon. Gene editing in fish is
currently experimental, but the possibilities include growth, disease
resistance, sterility, controlled reproduction, and colour. Selecting
for these traits can allow for a more sustainable environment and better
welfare for the fish.
AquAdvantage salmon
is a genetically modified Atlantic salmon developed by AquaBounty
Technologies. The growth hormone-regulating gene in the Atlantic salmon
is replaced with the growth hormone-regulating gene from the Pacific
Chinook salmon and a promoter sequence from the ocean pout.
Thanks to the parallel development of single-cell transcriptomics, genome editing and new stem cell models, we are now entering a scientifically exciting period where functional genetics is no longer restricted to animal models but can be performed directly in human samples. Single-cell gene expression analysis has resolved a transcriptional roadmap of human development, from which key candidate genes are being identified for functional studies. Using global transcriptomics data to guide experimentation, the CRISPR-based genome editing tool has made it feasible to disrupt or remove key genes in order to elucidate their function in a human setting.
Targeted gene modification in plants
Genome editing using meganucleases, ZFNs, and TALENs provides a new strategy for genetic manipulation in plants and is likely to assist in the engineering of desired plant traits by modifying endogenous genes. For instance, site-specific gene addition in major crop species can be used for 'trait stacking', whereby several desired traits are physically linked to ensure their co-segregation during the breeding process. Progress in such cases has recently been reported in Arabidopsis thaliana and Zea mays. In Arabidopsis thaliana,
using ZFN-assisted gene targeting, two herbicide-resistance genes (tobacco acetolactate synthase SuRA and SuRB) were introduced into SuR loci, with as many as 2% of transformed cells carrying mutations. In Zea mays, disruption of the target locus was achieved by ZFN-induced DSBs and the resulting NHEJ. ZFN was also used to drive a herbicide-tolerance gene expression cassette (PAT) into the targeted endogenous locus IPK1 in this case.
Such genome modification observed in the regenerated plants has been
shown to be inheritable and was transmitted to the next generation.
A potentially successful example of the application of genome editing
techniques in crop improvement can be found in banana, where scientists
used CRISPR/Cas9 editing to inactivate the endogenous banana streak virus in the B genome of banana (Musa spp.) to overcome a major challenge in banana breeding.
In addition, TALEN-based genome engineering has been extensively tested and optimized for use in plants. TALEN fusions have also been used by a U.S. food ingredient company, Calyxt, to improve the quality of soybean oil products and to increase the storage potential of potatoes.
Several optimizations need to be made in order to improve editing plant genomes using ZFN-mediated targeting.
There is a need for reliable design and subsequent test of the
nucleases, the absence of toxicity of the nucleases, the appropriate
choice of the plant tissue for targeting, the routes of induction of
enzyme activity, the lack of off-target mutagenesis, and a reliable detection of mutated cases.
A common delivery method for CRISPR/Cas9 in plants is Agrobacterium-based transformation. T-DNA is introduced directly into the plant genome by a T4SS mechanism. Cas9- and gRNA-based expression cassettes are cloned into Ti plasmids, which are transformed into Agrobacterium for plant application. To improve Cas9 delivery in live plants, viruses are being used for more effective transgene delivery.
Research
Gene therapy
The ideal gene therapy practice is that which replaces the defective gene with a normal allele at its natural location. This is advantageous over a virally delivered gene, as there is no need to include the full coding and regulatory sequences when only a small proportion of the gene needs to be altered, as is often the case.
The expression of the partially replaced genes is also more consistent
with normal cell biology than full genes that are carried by viral
vectors.
The first clinical use of TALEN-based genome editing was in the treatment of CD19+ acute lymphoblastic leukemia in an 11-month-old child in 2015. Modified donor T cells were engineered to attack the leukemia cells, to be resistant to alemtuzumab, and to evade detection by the host immune system after introduction.
Extensive research has been done in cells and animals using
CRISPR-Cas9 to attempt to correct genetic mutations which cause genetic
diseases such as Down syndrome, spina bifida, anencephaly, and Turner
and Klinefelter syndromes.
Researchers have used CRISPR-Cas9 gene drives to modify genes associated with sterility in A. gambiae, the vector for malaria. This technique has further implications in eradicating other vector-borne diseases such as yellow fever, dengue, and Zika.
The CRISPR-Cas9 system can be programmed to modulate the
population of any bacterial species by targeting clinical genotypes or
epidemiological isolates. It can selectively favor beneficial bacterial species over harmful ones by eliminating pathogens, which gives it an advantage over broad-spectrum antibiotics.
Antiviral applications for therapies targeting human viruses such
as HIV, herpes, and hepatitis B virus are under research. CRISPR can be
used to target the virus or the host to disrupt genes encoding the
virus cell-surface receptor proteins. In November 2018, He Jiankui announced that he had edited two human embryos, to attempt to disable the gene for CCR5, which codes for a receptor that HIV uses to enter cells. He said that twin girls, Lulu and Nana, had been born a few weeks earlier. He said that the girls still carried functional copies of CCR5 along with disabled CCR5 (mosaicism) and were still vulnerable to HIV. The work was widely condemned as unethical, dangerous, and premature.
In January 2019, scientists in China reported the creation of five identical cloned gene-edited monkeys, using the same cloning technique that was used with Zhong Zhong and Hua Hua (the first ever cloned monkeys) and Dolly the sheep, and the same CRISPR-Cas9 gene-editing technique allegedly used by He Jiankui in creating the first gene-modified human babies, Lulu and Nana. The monkey clones were made in order to study several medical diseases.
Prospects and limitations
In the future, an important goal of research into genome editing with engineered nucleases must be the improvement of the safety and specificity of the nucleases' action.
For example, improving the ability to detect off-target events can
improve our ability to learn about ways of preventing them. In addition,
zinc-fingers used in ZFNs are seldom completely specific, and some may
cause a toxic reaction. However, the toxicity has been reported to be
reduced by modifications done on the cleavage domain of the ZFN.
In addition, research by Dana Carroll
into modifying the genome with engineered nucleases has shown the need
for better understanding of the basic recombination and repair machinery
of DNA. In the future, a possible method to identify secondary targets
would be to capture broken ends from cells expressing the ZFNs and to
sequence the flanking DNA using high-throughput sequencing.
Because of the ease of use and cost-efficiency of CRISPR,
extensive research is currently being done on it. There are now more
publications on CRISPR than ZFN and TALEN despite how recent the
discovery of CRISPR is.
Both CRISPR and TALEN are favored to be the choices to be implemented
in large-scale productions due to their precision and efficiency.
Genome editing also occurs as a natural process without artificial genetic engineering. The agents competent to edit genetic codes are viruses and subviral RNA agents.
Although genome editing with engineered nucleases (GEEN) has higher efficiency than many other methods in reverse genetics, it is still not highly efficient; in many cases less than half of the treated populations obtain the desired changes.
For example, when one is planning to use the cell's NHEJ to create a
mutation, the cell's HDR systems will also be at work correcting the DSB
with lower mutational rates.
Traditionally, mice have been the most common choice for
researchers as a host of a disease model. CRISPR can help bridge the gap
between this model and human clinical trials by creating transgenic
disease models in larger animals such as pigs, dogs, and non-human
primates.
Using the CRISPR-Cas9 system, the programmed Cas9 protein and the sgRNA
can be directly introduced into fertilized zygotes to achieve the
desired gene modifications when creating transgenic models in rodents.
This allows bypassing of the usual cell targeting stage in generating
transgenic lines, and as a result, it reduces generation time by 90%.
One potential application that CRISPR's effectiveness enables is xenotransplantation. In previous research trials, CRISPR demonstrated the ability to target and eliminate endogenous retroviruses, which reduces the risk of transmitting diseases and reduces immune barriers. Eliminating these problems improves donor organ function, bringing this application closer to reality.
In plants, genome editing is seen as a viable solution to the conservation of biodiversity. Gene drives are a potential tool to alter the reproductive rate of invasive species, although there are significant associated risks.
Human enhancement
Many transhumanists see genome editing as a potential tool for human enhancement. Australian biologist and Professor of Genetics David Andrew Sinclair
notes that "the new technologies with genome editing will allow it to
be used on individuals (...) to have (...) healthier children" – designer babies.
According to a September 2016 report by the Nuffield Council on Bioethics, it may in the future be possible to enhance people with genes from other organisms, or with wholly synthetic genes, to, for example, improve night vision or sense of smell. George Church has compiled a list of potential genetic modifications for possibly advantageous traits, such as less need for sleep, cognition-related changes that protect against Alzheimer's disease, disease resistances and enhanced learning abilities, along with some of the associated studies and potential negative effects.
The American National Academy of Sciences and National Academy of Medicine issued a report in February 2017 giving qualified support to human genome editing.
They recommended that clinical trials for genome editing might one day
be permitted once answers have been found to safety and efficiency
problems "but only for serious conditions under stringent oversight."
Risks
In the 2016 Worldwide Threat Assessment of the US Intelligence Community, United States Director of National Intelligence James R. Clapper named genome editing as a potential weapon of mass destruction, stating that genome editing conducted by countries with regulatory or ethical standards "different from Western countries" probably increases the risk of the creation of harmful biological agents or products. According to the statement, given the broad distribution, low cost, and accelerated pace of development of this technology, its deliberate or unintentional misuse might lead to far-reaching economic and national security implications. For instance, technologies such as CRISPR could be used to make "killer mosquitoes" that cause plagues that wipe out staple crops.
According to a September 2016 report by the Nuffield Council on Bioethics, the simplicity and low cost of tools to edit the genetic code will allow amateurs – or "biohackers" –
to perform their own experiments, posing a potential risk from the
release of genetically modified bugs. The review also found that the
risks and benefits of modifying a person's genome – and having those
changes pass on to future generations – are so complex that they demand
urgent ethical scrutiny. Such modifications might have unintended
consequences which could harm not only the child, but also their future
children, as the altered gene would be in their sperm or eggs. In 2001, Australian researchers Ronald Jackson and Ian Ramshaw were criticized for publishing a paper in the Journal of Virology that explored the potential control of mice, a major pest in Australia, by infecting them with an altered mousepox virus that would cause infertility, because the sensitive information provided could enable potential bioterrorists to manufacture biological weapons, for example by creating vaccine-resistant strains of other pox viruses, such as smallpox, that could affect humans. Furthermore, there are additional concerns about the ecological risks of releasing gene drives into wild populations.
Nobel Prize
In 2007, the Nobel Prize for Physiology or Medicine
was awarded to Mario Capecchi, Martin Evans and Oliver Smithies "for
their discoveries of principles for introducing specific gene
modifications in mice by the use of embryonic stem cells."
Predictive medicine is a field of medicine that entails predicting the probability of disease
and instituting preventive measures in order to either prevent the
disease altogether or significantly decrease its impact upon the patient
(such as by preventing mortality or limiting morbidity).
While different prediction methodologies exist, such as genomics, proteomics, and cytomics,
the most fundamental way to predict future disease is based on
genetics. Although proteomics and cytomics allow for the early detection of disease, much of the time they detect biological markers that exist because a disease process has already started. Comprehensive genetic testing (such as through the use of DNA arrays or full genome sequencing), by contrast, allows for the estimation of disease risk years to decades before any disease even exists, and can even indicate whether a healthy fetus is at higher risk of developing a disease in adolescence or adulthood.
Individuals who are more susceptible to disease in the future can be
offered lifestyle advice or medication with the aim of preventing the
predicted illness.
Current genetic testing guidelines supported by health care professionals discourage purely predictive genetic testing of minors until they are competent to understand the relevance of genetic screening, so as to allow them to participate in the decision about whether or not it is appropriate for them. Genetic screening
of newborns and children in the field of predictive medicine is deemed
appropriate if there is a compelling clinical reason to do so, such as
the availability of prevention or treatment as a child that would
prevent future disease.
The goal
The goal of predictive medicine is to predict the probability of future disease so that health care professionals and the patient themselves can be proactive in instituting lifestyle modifications and increased physician surveillance, such as biannual full-body skin exams by a dermatologist or internist if a patient is found to have an increased risk of melanoma, an EKG and cardiology examination by a cardiologist if a patient is found to be at increased risk for a cardiac arrhythmia, or alternating MRIs and mammograms every six months if a patient is found to be at increased risk for breast cancer.
Predictive medicine is intended for both healthy individuals
("predictive health") and for those with diseases ("predictive
medicine"), its purpose being to predict susceptibility to a particular
disease and to predict progression and treatment response for a given
disease.
A number of association studies
have been published in scientific literature that show associations
between specific genetic variants in a person's genetic code and a
specific disease. Association and correlation studies have found that a
female individual with a mutation in the BRCA1 gene has a 65% cumulative risk of breast cancer. Additionally, new tests from Genetic Technologies Ltd and Phenogen Sciences Inc. comparing non-coding DNA with a woman's lifetime exposure to estrogen can now estimate a woman's probability of developing estrogen-positive breast cancer, also known as sporadic breast cancer (the most prevalent form of breast cancer). Genetic variants in the Factor V gene are associated with an increased tendency to form blood clots, such as deep vein thromboses (DVTs). Genetic tests are expected to reach the market more quickly than new medicines. Myriad Genetics is already generating revenue from genetic tests for BRCA1 and BRCA2.
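For intuition about what a quoted "cumulative risk" means, a toy constant-hazard model (our simplification; real penetrance curves are age-dependent) can convert the 65% lifetime figure into an implied annual hazard:

```python
import math

# Toy constant-hazard model (our simplification; real penetrance is
# age-dependent): with annual hazard h, cumulative risk over t years is
# 1 - exp(-h*t). Solving for the 65% BRCA1 figure over ~50 adult years:
target_risk, years = 0.65, 50
h = -math.log(1 - target_risk) / years
print(f"implied annual hazard: {h:.2%}")  # ~2.10%
print([round(1 - math.exp(-h * t), 2) for t in (10, 25, 50)])  # [0.19, 0.41, 0.65]
```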
Aside from genetic testing, predictive medicine utilizes a wide
variety of tools to predict health and disease, including assessments of
exercise, nutrition, spirituality, quality of life, and so on. This
integrative approach was adopted when Emory University and Georgia
Institute of Technology partnered to launch the Predictive Health Institute.
Predictive medicine changes the paradigm of medicine from being
reactive to being proactive and has the potential to significantly
extend the duration of health and to decrease the incidence, prevalence
and cost of diseases.
Types
Notable types of predictive medicine offered through health care professionals include:
Carrier testing:
Carrier testing is done to identify people who carry one copy of a gene
mutation that, when present in both copies, causes a genetic disorder.
This type of testing is offered to individuals who have a genetic disorder
in their family history or who belong to ethnic groups with an increased
risk of certain genetic diseases. If both parents are tested, carrier
testing can provide information about a couple's risk of having a child
with a genetic disorder (see the sketch after this list).
Diagnostic testing:
Diagnostic testing is conducted to aid in the specific diagnosis or
detection of a disease. It is often used to confirm a particular
diagnosis when a certain condition is suspected based on the subject's
mutations and physical symptoms. The diversity in diagnostic testing
ranges from common consulting room tests such as measuring blood pressure and urine tests to more invasive protocols such as biopsies.
Newborn screening:
Newborn screening is conducted just after birth to identify genetic
disorders that can be treated early in life. This testing of infants for
certain disorders is one of the most widespread uses of genetic
screening: all US states currently test infants for phenylketonuria and congenital hypothyroidism.
US state law mandates collecting a sample by pricking the heel of a
newborn baby to obtain enough blood to fill a few circles on filter
paper labeled with the names of the infant, parent, hospital, and primary
physician.
Prenatal testing: Prenatal testing is used to look for diseases and conditions in a fetus or embryo
before it is born. This type of testing is offered to couples who
have an increased risk of having a baby with a genetic or chromosomal
disorder. Screening can also determine the sex of the fetus. Prenatal
testing can help a couple decide whether to terminate
the pregnancy. Like diagnostic testing, prenatal testing can be
noninvasive or invasive. Non-invasive techniques include examination of
the woman's womb through ultrasonography
or maternal serum screens. These non-invasive techniques can evaluate the
risk of a condition but cannot determine with certainty whether the fetus
has it. More invasive prenatal methods carry slightly more risk
for the fetus and involve needles or probes being inserted into the womb or placenta, as in amniocentesis or chorionic villus sampling.
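The couple's-risk arithmetic mentioned under carrier testing above is simple enough to sketch directly. This is an illustrative calculation only; the roughly 1-in-25 cystic fibrosis carrier frequency used below is an approximate, population-dependent figure.

    def affected_child_risk(p_mother_carrier, p_father_carrier):
        """Probability of a child affected by an autosomal recessive disorder.

        Each carrier parent transmits the mutant allele with probability 1/2;
        the child is affected only if it inherits two mutant copies.
        """
        return p_mother_carrier * 0.5 * p_father_carrier * 0.5

    # Both parents confirmed carriers by testing: the textbook 1-in-4 risk.
    print(affected_child_risk(1.0, 1.0))        # 0.25
    # Untested couple, assuming an approximate cystic fibrosis carrier
    # frequency of about 1 in 25.
    print(affected_child_risk(1 / 25, 1 / 25))  # 0.0004, i.e., 1 in 2,500

Carrier testing thus converts a small population-level probability into a sharp individual one, which is what makes it informative for reproductive decisions.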
Health benefits
The
focus of medicine may shift from treating existing
diseases, typically late in their progression, to preventing disease
before it sets in. Predictive health and predictive medicine are based
on probabilities: while they evaluate susceptibility to diseases, they
cannot predict with 100% certainty that a specific disease will
occur. Unlike many preventive interventions that are directed at groups
(e.g., immunization programs), predictive medicine is conducted on an
individualized basis. For example, some forms of glaucoma
are monogenic, and their early detection can help prevent
permanent loss of vision. Predictive medicine is expected to be most
effective when applied to polygenic, multifactorial diseases that are
prevalent in industrialized countries, such as diabetes mellitus, hypertension, and myocardial infarction.
With careful usage, predictive medicine methods such as genetic
screens can help diagnose inherited genetic diseases caused by problems
with a single gene (such as cystic fibrosis) and enable early treatment.
Some forms of cancer and heart disease are inherited as single-gene
diseases and some people in these high-risk families may also benefit
from access to genetic tests. As more and more genes associated with
increased susceptibility to certain diseases are reported, predictive
medicine becomes more useful.
Direct-to-Consumer (DTC) genetic testing enables a consumer to screen
his or her own genes without having to go through a health care
professional. Such tests can be ordered without the permission of a
physician. DTC tests range from those for mutations
associated with cystic fibrosis to tests for breast cancer alleles.
DTC tests make predictive medicine directly
accessible to consumers. Benefits of DTC testing include this
accessibility, privacy of genetic information, and promotion of
proactive health care. Risks of DTC testing include the lack of
governmental regulation and the interpretation of genetic information
without professional counseling.
Limitations of predictive medicine
At the protein
level, structure is more conserved than sequence, so a change in gene
sequence does not necessarily alter protein function. Therefore, in many
diseases, having the faulty gene does not necessarily mean someone
will get the disease.
Common, complex diseases in the wider population are affected not only
by heredity, but also by external causes such as lifestyle and
environment. Therefore, genes are not perfect predictors of future
health; both individuals with the high-risk form of a gene and those
without it may develop the disease. Multiple environmental factors, in
particular smoking, diet and exercise, infection, and pollution, play important roles and can be more important than genetic make-up.
This makes the results and risks determined by predictive medicine
more difficult to quantify. Furthermore, the potential false positives
or false negatives that may arise from a predictive genetic screen can
cause substantial unnecessary strain on the individual.
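A short worked example makes the false-positive problem concrete. The 99% sensitivity and specificity and the 1-in-1,000 prevalence below are hypothetical round numbers, not figures for any real test.

    def positive_predictive_value(prevalence, sensitivity, specificity):
        """P(disease | positive result), by Bayes' theorem."""
        true_positives = sensitivity * prevalence
        false_positives = (1 - specificity) * (1 - prevalence)
        return true_positives / (true_positives + false_positives)

    # Even a screen that is 99% sensitive and 99% specific, applied to a
    # condition affecting 1 in 1,000 people, yields mostly false positives.
    ppv = positive_predictive_value(prevalence=0.001,
                                    sensitivity=0.99,
                                    specificity=0.99)
    print(f"PPV = {ppv:.1%}")  # about 9%: roughly 10 of every 11 positives are false

Because the condition is rare, true positives are swamped by the small error rate applied to the large healthy majority, which is why screening accuracy alone says little about an individual's actual risk.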
Targeting medication to people who are genetically susceptible to
a disease but do not yet show the symptoms of it can be a questionable
measure. In large populations, there is concern that most of the
people taking preventive medications would never have developed the
disease anyway. Many medications carry undesirable side effects that
high-risk individuals must then cope with. In contrast, several
population-based prevention measures (such as encouraging healthy diets
or banning tobacco advertising) carry a far lower likelihood of adverse
effects and are also less expensive.
Another potential drawback of commercially available genetic
testing lies in the psychological impact of access to such
data. For single-gene inherited diseases, counseling and the right to
refuse a test (the right "not to know") have been found to be important.
However, adequate individual counseling can be difficult to provide to
the potentially large proportion of the population likely to be
identified as at high risk of common complex disease. Some people are
vulnerable to adverse psychological reactions to genetic predictions of
stigmatized or feared conditions, such as cancer or mental illness.
Ethics and law
Predictive medicine raises a number of sensitive legal and ethical issues.
A delicate balance governs the intersection of predictive medicine and
occupational health: if an employee were dismissed because he was found
to be at risk from a certain chemical agent used in his workplace, would
his termination be considered discrimination or an act of prevention?
Several organizations believe that legislation is needed to prevent
insurers and employers from using predictive genetic test results to
decide who gets insurance or a job: "Ethical considerations, and legal,
are fundamental to the whole issue of genetic testing. The consequences
for individuals with regard to insurance and employment are also of the
greatest importance, together with the implications for stigma and
discrimination."
In the future, people may be required to reveal genetic predictions
about their health to their employers or insurers. The grim prospect of
discrimination based on a person's genetic make-up can lead to a
"genetic underclass" which does not receive equal opportunity for
insurance and employment.
Currently in the United States, health insurers do not require
applicants for coverage to undergo genetic testing. When health insurers
encounter genetic information, it receives the same confidentiality
protection as other sensitive health information under the Health Insurance Portability and Accountability Act (HIPAA).