
Saturday, September 3, 2022

Exascale computing

From Wikipedia, the free encyclopedia

Exascale computing refers to computing systems capable of calculating at least "10^18 IEEE 754 Double Precision (64-bit) operations (multiplications and/or additions) per second (exaFLOP)"; it is a measure of supercomputer performance.

Exascale computing is a significant achievement in computer engineering: primarily, it will allow improved scientific applications and better prediction in areas such as weather forecasting, climate modeling and personalised medicine. Exascale also reaches the estimated processing power of the human brain at the neural level, a target of the Human Brain Project. There has been a race to be the first country to build an exascale computer, typically ranked in the TOP500 list.

In 2022, the world's first public exascale computer, Frontier, was announced; as of June 2022, it is the world's fastest supercomputer.

Definitions

Floating point operations per second (FLOPS) are one measure of computer performance. FLOPS can be recorded using different measures of precision; however, the standard measure (used by the TOP500 supercomputer list) counts 64-bit (double-precision floating-point format) operations per second, measured with the High Performance LINPACK (HPLinpack) benchmark.

Whilst a distributed computing system had broken the 1 exaFLOP barrier before Frontier, the metric typically refers to single computing systems. Supercomputers had also previously broken the 1 exaFLOP barrier using alternative precision measures; again, these do not meet the criteria for exascale computing using the standard metric. It has been recognised that HPLinpack may not be a good general measure of supercomputer utility in real-world applications; however, it is the common standard for performance measurement.
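
Because published results mix units and benchmarks, a quick unit check is useful. The minimal Python sketch below (illustrative only; the figures are the ones quoted elsewhere in this article) converts reported results to exaFLOPS and tests them against the standard definition:

    # Minimal sketch (illustrative only): convert reported benchmark results to
    # exaFLOPS and test them against the standard exascale definition, which
    # counts only 64-bit (FP64) results on the HPLinpack benchmark.
    EXAFLOPS = 1e18  # operations per second

    # (reported FLOPS, benchmark) -- figures as quoted in this article
    results = {
        "Fugaku, Nov 2020":       (442e15, "HPL (FP64)"),             # 442 petaFLOPS
        "Fugaku, Jun 2020":       (1.42e18, "HPL-AI (mixed precision)"),
        "Frontier (anticipated)": (1.5e18, "HPL (FP64)"),
    }

    for name, (flops, benchmark) in results.items():
        exa = flops / EXAFLOPS
        qualifies = exa >= 1.0 and "FP64" in benchmark
        verdict = "meets" if qualifies else "does not meet"
        print(f"{name}: {exa:.3f} exaFLOPS on {benchmark} -> {verdict} the standard definition")

Only the last entry satisfies both conditions: at least 10^18 operations per second, achieved at 64-bit precision on HPLinpack.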

Technological challenges

It has been recognized that enabling applications to fully exploit the capabilities of exascale computing systems is not straightforward. Developing data-intensive applications over exascale platforms requires the availability of new and effective programming paradigms and runtime systems. The Folding@home project, the first to break this barrier, relied on a network of servers sending pieces of work to hundreds of thousands of clients using a client–server network architecture.

History

The first petascale (10^15 FLOPS) computer entered operation in 2008. At a supercomputing conference in 2009, Computerworld projected exascale implementation by 2018. In June 2014, the stagnation of the TOP500 supercomputer list led observers to question the possibility of exascale systems by 2020.

Although exascale computing was not achieved by 2018, in the same year the Summit OLCF-4 supercomputer performed 1.8×10^18 calculations per second using an alternative metric whilst analysing genomic information. The team performing this won the Gordon Bell Prize at the 2018 ACM/IEEE Supercomputing Conference.

The exaFLOPS barrier was first broken in March 2020 by the distributed computing network Folding@home coronavirus research project.

In June 2020 the Japanese supercomputer Fugaku achieved 1.42 exaFLOPS using the alternative HPL-AI benchmark.

Development

United States

In 2008, two US Department of Energy organisations, the Office of Science and the National Nuclear Security Administration, provided funding to the Institute for Advanced Architectures for the development of an exascale supercomputer; Sandia National Laboratories and Oak Ridge National Laboratory were also to collaborate on exascale designs. The technology was expected to be applied in various computation-intensive research areas, including basic research, engineering, earth science, biology, materials science, energy issues, and national security.

In January 2012, Intel purchased the InfiniBand product line from QLogic for US$125 million in order to fulfill its promise of developing exascale technology by 2018.

By 2012, the United States had allotted $126 million for exascale computing development.

In February 2013, the Intelligence Advanced Research Projects Activity started the Cryogenic Computer Complexity (C3) program, which envisions a new generation of superconducting supercomputers that operate at exascale speeds based on superconducting logic. In December 2014 it announced a multi-year contract with IBM, Raytheon BBN Technologies and Northrop Grumman to develop the technologies for the C3 program.

On 29 July 2015, Barack Obama signed an executive order creating a National Strategic Computing Initiative calling for the accelerated development of an exascale system and funding research into post-semiconductor computing. The Exascale Computing Project (ECP) aimed to build an exascale computer by 2021.

On 18 March 2019, the United States Department of Energy and Intel announced that the first exaFLOPS supercomputer would be operational at Argonne National Laboratory by late 2022. The computer, named Aurora, is to be delivered to Argonne by Intel and Cray (now Hewlett Packard Enterprise), is expected to use Intel Xe GPGPUs alongside a future Xeon Scalable CPU, and is to cost US$600 million.

On 7 May 2019, the U.S. Department of Energy announced a contract with Cray (now Hewlett Packard Enterprise) to build the Frontier supercomputer at Oak Ridge National Laboratory. Frontier is anticipated to be fully operational in 2022 and, with a performance of greater than 1.5 exaFLOPS, should then be the world's most powerful computer.

On 4 March 2020, the U.S. Department of Energy announced a contract with Hewlett Packard Enterprise and AMD to build the El Capitan supercomputer at a cost of US$600 million, to be installed at the Lawrence Livermore National Laboratory (LLNL). It is expected to be used primarily (but not exclusively) for nuclear weapons modeling. El Capitan was first announced in August 2019, when the DOE and LLNL revealed the purchase of a Shasta supercomputer from Cray. El Capitan will be operational in early 2023 and have a performance of 2 exaFLOPS. It will use AMD CPUs and GPUs, with 4 Radeon Instinct GPUs per EPYC Zen 4 CPU, to speed up artificial intelligence tasks. El Capitan should consume around 40 MW of electric power.

As of November 2021, the United States has three of the five fastest supercomputers in the world.

Japan

In Japan, in 2013, the RIKEN Advanced Institute for Computational Science began planning an exascale system for 2020, intended to consume less than 30 megawatts. In 2014, Fujitsu was awarded a contract by RIKEN to develop a next-generation supercomputer to succeed the K computer. The successor, called Fugaku, aims to have a performance of at least 1 exaFLOPS and to be fully operational in 2021. In 2015, Fujitsu announced at the International Supercomputing Conference that this supercomputer would use processors implementing the ARMv8 architecture with extensions it was co-designing with ARM Limited. Fugaku was partially put into operation in June 2020 and achieved 1.42 exaFLOPS (FP16 with FP64 precision) on the HPL-AI benchmark, making it the first supercomputer ever to achieve 1 exaOPS. Named after Mount Fuji, Japan's tallest peak, Fugaku retained the No. 1 ranking on the TOP500 supercomputer list announced on November 17, 2020, reaching a calculation speed of 442 quadrillion calculations per second, or 0.442 exaFLOPS.

China

As of June 2022, China had two of the Top Ten fastest supercomputers in the world. According to the national plan for the next generation of high performance computers and the head of the school of computing at the National University of Defense Technology (NUDT), China was supposed to develop an exascale computer during the 13th Five-Year-Plan period (2016–2020) which would enter service in the latter half of 2020. The government of Tianjin Binhai New Area, NUDT and the National Supercomputing Center in Tianjin are working on the project. After Tianhe-1 and Tianhe-2, the exascale successor is planned to be named Tianhe-3.

European Union

See also Supercomputing in Europe

In 2011, several projects aiming at developing technologies and software for exascale computing were started in the EU, including the CRESTA project (Collaborative Research into Exascale Systemware, Tools and Applications), the DEEP project (Dynamical ExaScale Entry Platform), and the Mont-Blanc project. A major European project based on the exascale transition is the MaX (Materials at the Exascale) project. The Energy oriented Centre of Excellence (EoCoE) exploits exascale technologies to support carbon-free energy research and applications.

In 2015, the Scalable, Energy-Efficient, Resilient and Transparent Software Adaptation (SERT) project, a major research project between the University of Manchester and the STFC Daresbury Laboratory in Cheshire, was awarded approximately £1 million by the UK's Engineering and Physical Sciences Research Council (EPSRC). The project was due to start in March 2015, funded by EPSRC under the Software for the Future II programme, in partnership with the Numerical Analysis Group (NAG), Cluster Vision and the Science and Technology Facilities Council (STFC).

On 28 September 2018, the European High-Performance Computing Joint Undertaking (EuroHPC JU) was formally established by the EU. The EuroHPC JU aims to build an exascale supercomputer by 2022/2023. The EuroHPC JU will be jointly funded by its public members with a budget of around €1 billion. The EU's financial contribution is €486 million.

Taiwan

In June 2017, Taiwan's National Center for High-Performance Computing began work towards the first Taiwanese exascale supercomputer by funding construction of an intermediary supercomputer based on a full technology transfer from Fujitsu of Japan, which is currently building the fastest and most powerful AI-based supercomputer in Japan. Other independent efforts in Taiwan have also focused on the rapid development of exascale supercomputing technology; for example, Foxconn recently designed and built the largest and fastest supercomputer in Taiwan, intended as a stepping stone in research and development towards a state-of-the-art exascale supercomputer.

India

In 2012, the Indian Government proposed to commit US$2.5 billion to supercomputing research during the 12th five-year plan period (2012–2017). The project was to be handled by the Indian Institute of Science (IISc), Bangalore. It was later revealed that India plans to develop a supercomputer with processing power in the exaFLOPS range, to be built by C-DAC within five years of approval and to use microprocessors developed indigenously by C-DAC in India.

Carotene

From Wikipedia, the free encyclopedia

A 3-dimensional stick diagram of β-carotene
 
Carotene is responsible for the orange colour of carrots and the colours of many other fruits and vegetables and even some animals.
 
Lesser Flamingos in the Ngorongoro Crater, Tanzania. The pink colour of wild flamingos is due to astaxanthin (a carotenoid) they absorb from their diet of brine shrimp. If fed a carotene-free diet they become white.

The term carotene (also carotin, from the Latin carota, "carrot") is used for many related unsaturated hydrocarbon substances having the formula C40Hx, which are synthesized by plants but in general cannot be made by animals (with the exception of some aphids and spider mites which acquired the synthesizing genes from fungi). Carotenes are photosynthetic pigments important for photosynthesis. Carotenes contain no oxygen atoms. They absorb ultraviolet, violet, and blue light and scatter orange or red light, and (in low concentrations) yellow light.

Carotenes are responsible for the orange colour of the carrot, after which this class of chemicals is named, and for the colours of many other fruits, vegetables and fungi (for example, sweet potatoes, chanterelle and orange cantaloupe melon). Carotenes are also responsible for the orange (but not all of the yellow) colours in dry foliage. They also (in lower concentrations) impart the yellow coloration to milk-fat and butter. Omnivorous animal species which are relatively poor converters of coloured dietary carotenoids to colourless retinoids have yellow-coloured body fat, as a result of carotenoid retention from the vegetable portion of their diet. The typical yellow-coloured fat of humans and chickens is a result of fat storage of carotenes from their diets.

Carotenes contribute to photosynthesis by transmitting the light energy they absorb to chlorophyll. They also protect plant tissues by helping to absorb the energy from singlet oxygen, an excited form of the oxygen molecule O2 which is formed during photosynthesis.

β-Carotene is composed of two retinyl groups, and is broken down in the mucosa of the human small intestine by β-carotene 15,15'-monooxygenase to retinal, a form of vitamin A. β-Carotene can be stored in the liver and body fat and converted to retinal as needed, thus making it a form of vitamin A for humans and some other mammals. The carotenes α-carotene and γ-carotene, due to their single retinyl group (β-ionone ring), also have some vitamin A activity (though less than β-carotene), as does the xanthophyll carotenoid β-cryptoxanthin. All other carotenoids, including lycopene, have no beta-ring and thus no vitamin A activity (although they may have antioxidant activity and thus biological activity in other ways).

Animal species differ greatly in their ability to convert retinyl (beta-ionone) containing carotenoids to retinals. Carnivores in general are poor converters of dietary ionone-containing carotenoids. Pure carnivores such as ferrets lack β-carotene 15,15'-monooxygenase and cannot convert any carotenoids to retinals at all (so carotenes are not a form of vitamin A for these species), while cats can convert a trace of β-carotene to retinol, although the amount is insufficient to meet their daily retinol needs.

Molecular structure

Carotenes are polyunsaturated hydrocarbons containing 40 carbon atoms per molecule, variable numbers of hydrogen atoms, and no other elements. Some carotenes are terminated by rings, on one or both ends of the molecule. All are coloured, due to the presence of conjugated double bonds. Carotenes are tetraterpenes, meaning that they are derived from eight 5-carbon isoprene units (or four 10-carbon terpene units).

Carotenes are found in plants in two primary forms designated by characters from the Greek alphabet: alpha-carotene (α-carotene) and beta-carotene (β-carotene). Gamma-, delta-, epsilon-, and zeta-carotene (γ, δ, ε, and ζ-carotene) also exist. Since they are hydrocarbons, and therefore contain no oxygen, carotenes are fat-soluble and insoluble in water (in contrast with other carotenoids, the xanthophylls, which contain oxygen and thus are less chemically hydrophobic).

History

The discovery of carotene from carrot juice is credited to Heinrich Wilhelm Ferdinand Wackenroder, a finding made during a search for antihelminthics, which he published in 1831. He obtained it in small ruby-red flakes soluble in ether, which when dissolved in fats gave 'a beautiful yellow colour'. William Christopher Zeise recognised its hydrocarbon nature in 1847, but his analyses gave him a composition of C5H8. It was Léon-Albert Arnaud in 1886 who confirmed its hydrocarbon nature and gave the formula C26H38, which is close to the theoretical composition of C40H56. In studies of the colouring matter in corpora lutea, also published in 1886, Adolf Lieben first came across carotenoids in animal tissue, but did not recognise the nature of the pigment. Johann Ludwig Wilhelm Thudichum, in 1868–1869, after stereoscopic spectral examination, applied the term 'luteine' (lutein) to this class of yellow crystallizable substances found in animals and plants. Richard Martin Willstätter, who gained the Nobel Prize in Chemistry in 1915, mainly for his work on chlorophyll, assigned the composition of C40H56, distinguishing it from the similar but oxygenated xanthophyll, C40H56O2. With Heinrich Escher, in 1910, Willstätter isolated lycopene from tomatoes and showed it to be an isomer of carotene. Later work by Escher also differentiated the 'luteal' pigments in egg yolk from the carotenes in cow corpus luteum.

Dietary sources

Foods such as carrots, sweet potatoes and cantaloupe melon contain carotenes in notable amounts.

Absorption from these foods is enhanced if eaten with fats, as carotenes are fat soluble, and if the food is cooked for a few minutes until the plant cell wall splits and the color is released into any liquid. 12 μg of dietary β-carotene supplies the equivalent of 1 μg of retinol, and 24 µg of α-carotene or β-cryptoxanthin provides the equivalent of 1 µg of retinol.
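
The retinol-equivalence figures above lend themselves to a small worked example. The following sketch (the intake values are hypothetical, chosen only for illustration) computes retinol equivalents from carotenoid intake:

    # Minimal sketch of the retinol-equivalence arithmetic quoted above:
    # 12 ug of dietary beta-carotene ~ 1 ug retinol; 24 ug of alpha-carotene
    # or beta-cryptoxanthin ~ 1 ug retinol. Intake values are hypothetical.
    UG_PER_UG_RETINOL = {
        "beta-carotene": 12,
        "alpha-carotene": 24,
        "beta-cryptoxanthin": 24,
    }

    daily_intake_ug = {"beta-carotene": 6000, "alpha-carotene": 600}  # hypothetical intake

    retinol_ug = sum(amount / UG_PER_UG_RETINOL[carotenoid]
                     for carotenoid, amount in daily_intake_ug.items())
    print(f"Retinol equivalent: about {retinol_ug:.0f} ug")  # -> about 525 ug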

Forms of carotene

α-carotene
 
β-carotene
 
γ-carotene
 
δ-carotene

The two primary isomers of carotene, α-carotene and β-carotene, differ in the position of a double bond (and thus a hydrogen) in the cyclic group at one end (the right end in the diagram at right).

β-Carotene is the more common form and can be found in yellow, orange, and green leafy fruits and vegetables. As a rule of thumb, the greater the intensity of the orange colour of the fruit or vegetable, the more β-carotene it contains.

Carotene protects plant cells against the destructive effects of ultraviolet light; β-carotene is an antioxidant.

β-Carotene and physiology

β-Carotene and cancer

An article from the American Cancer Society says that the Cancer Research Campaign has called for warning labels on β-carotene supplements to caution smokers that such supplements may increase the risk of lung cancer.

The New England Journal of Medicine published an article in 1994 about a trial which examined the relationship between daily supplementation of β-carotene and vitamin E (α-tocopherol) and the incidence of lung cancer. The study was done using supplements and researchers were aware of the epidemiological correlation between carotenoid-rich fruits and vegetables and lower lung cancer rates. The research concluded that no reduction in lung cancer was found in the participants using these supplements, and furthermore, these supplements may, in fact, have harmful effects.

The Journal of the National Cancer Institute and The New England Journal of Medicine published articles in 1996 about a trial with a goal to determine if vitamin A (in the form of retinyl palmitate) and β-carotene (at about 30 mg/day, which is 10 times the Reference Daily Intake) supplements had any beneficial effects to prevent cancer. The results indicated an increased risk of lung and prostate cancers for the participants who consumed the β-carotene supplement and who had lung irritation from smoking or asbestos exposure, causing the trial to be stopped early.

A review of all randomized controlled trials in the scientific literature by the Cochrane Collaboration, published in JAMA in 2007, found that synthetic β-carotene increased mortality by 1–8% (relative risk 1.05, 95% confidence interval 1.01–1.08). However, this meta-analysis included two large studies of smokers, so it is not clear that the results apply to the general population. The review only studied the influence of synthetic antioxidants, and the results should not be translated to potential effects of fruits and vegetables.

β-Carotene and photosensitivity

Oral β-carotene is prescribed to people suffering from erythropoietic protoporphyria. It provides them some relief from photosensitivity.

Carotenemia

Carotenemia or hypercarotenemia is excess carotene, but unlike excess vitamin A, carotene is non-toxic. Although hypercarotenemia is not particularly dangerous, it can lead to an orange discoloration of the skin (carotenodermia), but not of the conjunctiva of the eyes (thus easily distinguishing it visually from jaundice). It is most commonly associated with consumption of an abundance of carrots, but it also can be a medical sign of more dangerous conditions.

Production

Algae farm ponds in Whyalla, South Australia, used to produce β-carotene.

Carotenes are produced by the general route used for other terpenoids and terpenes, i.e. by coupling, cyclization, and oxygenation reactions of isoprene derivatives. Lycopene is the key precursor to carotenoids. It is formed by coupling of geranylgeranyl pyrophosphate and geranyllinalyl pyrophosphate.

Most of the world's synthetic supply of carotene comes from a manufacturing complex located in Freeport, Texas, and owned by DSM. The other major supplier, BASF, also uses a chemical process to produce β-carotene. Together these suppliers account for about 85% of the β-carotene on the market. In Spain, Vitatene produces natural β-carotene from the fungus Blakeslea trispora, as does DSM, but at a much lower volume compared with its synthetic β-carotene operation. In Australia, organic β-carotene is produced by Aquacarotene Limited from dried marine algae Dunaliella salina grown in harvesting ponds situated in Karratha, Western Australia. BASF Australia also produces β-carotene from microalgae grown at two sites in Australia that are the world's largest algae farms. In Portugal, the industrial biotechnology company Biotrend produces natural all-trans-β-carotene from a non-genetically-modified bacterium of the genus Sphingomonas isolated from soil.

Carotenes are also found in palm oil, corn, and in the milk of dairy cows, causing cow's milk to be light yellow, depending on the feed of the cattle, and the amount of fat in the milk (high-fat milks, such as those produced by Guernsey cows, tend to be yellower because their fat content causes them to contain more carotene).

Carotenes are also found in some species of termites, where they apparently have been picked up from the diet of the insects.

Synthesis

There are currently two commonly used methods of total synthesis of β-carotene. The first was developed by BASF and is based on the Wittig reaction with Wittig himself as patent holder:

Carotene synthesis by Wittig

The second is a Grignard reaction, elaborated by Hoffmann-La Roche from the original synthesis of Inhoffen et al. They are both symmetrical; the BASF synthesis is C20 + C20, and the Hoffmann-La Roche synthesis is C19 + C2 + C19.

Nomenclature

Carotenes are carotenoids containing no oxygen. Carotenoids containing some oxygen are known as xanthophylls.

The two ends of the β-carotene molecule are structurally identical, and are called β-rings. Specifically, the group of nine carbon atoms at each end form a β-ring.

The α-carotene molecule has a β-ring at one end; the other end is called an ε-ring. There is no such thing as an "α-ring".

These and similar names for the ends of the carotenoid molecules form the basis of a systematic naming scheme, according to which:

  • α-carotene is β,ε-carotene;
  • β-carotene is β,β-carotene;
  • γ-carotene (with one β ring and one uncyclized end that is labelled psi) is β,ψ-carotene;
  • δ-carotene (with one ε ring and one uncyclized end) is ε,ψ-carotene;
  • ε-carotene is ε,ε-carotene;
  • lycopene is ψ,ψ-carotene.

ζ-Carotene is the biosynthetic precursor of neurosporene, which is the precursor of lycopene, which, in turn, is the precursor of the carotenes α through ε.

Food additive

Carotene is used to colour products such as juice, cakes, desserts, butter and margarine. It is approved for use as a food additive in the EU (listed as additive E160a), Australia and New Zealand (listed as 160a) and the US.

Friday, September 2, 2022

Viral quasispecies

From Wikipedia, the free encyclopedia

A viral quasispecies is a population structure of viruses with a large number of variant genomes (related by mutations). Quasispecies result from high mutation rates, as mutants arise continually and change in relative frequency as viral replication and selection proceed.

The theory predicts that a viral quasispecies at a low but evolutionarily neutral and highly connected (that is, flat) region in the fitness landscape will outcompete a quasispecies located at a higher but narrower fitness peak in which the surrounding mutants are unfit. This phenomenon has been called 'the quasispecies effect' or, more recently, the 'survival of the flattest'.

The term quasispecies was adopted from a theory of the origin of life in which primitive replicons consisted of mutant distributions, as found experimentally with present-day RNA viruses within their host. The theory provided a new definition of wild type when describing viruses, and a conceptual framework for a deeper understanding of the adaptive potential of RNA viruses than is offered by classical studies based on simplified consensus sequences.

The quasispecies model is most applicable when the genome size is limited and the mutation rate is high, and so is most relevant to RNA viruses (including important pathogens) because they have high mutation rates (approximately one error per genome per round of replication), though the concepts can also apply to other biological entities such as reverse-transcribing DNA viruses like hepatitis B. In such scenarios, complex distributions of closely related variant genomes are subjected to genetic variation, competition and selection, and may act as a unit of selection. Therefore, the evolutionary trajectory of the viral infection cannot be predicted solely from the characteristics of the fittest sequence. High mutation rates also place an upper limit on the amount of genetic information that can be stably inherited. Crossing such a limit leads to RNA virus extinction, a transition that is the basis of an antiviral design termed lethal mutagenesis, and of relevance to antiviral medicine.

The relevance of quasispecies in virology has been the subject of extended debate. However, standard clonal analyses and deep sequencing methodologies have confirmed the presence of myriads of mutant genomes in viral populations, and their participation in adaptive processes.

History

Two equations express the major concepts implied by quasispecies theory. The first describes the change in concentration of molecule i as a function of its replication parameters and its production from other molecules of the same ensemble. The second is the error threshold relationship, indicating the maximum amount of information (νmax) and the maximum average error rate pmax (p = 1 − q, where q is the copying fidelity) compatible with maintenance of genetic information. An evolving mutant spectrum (with mutations represented as symbols on the genomes) can nevertheless retain an invariant consensus sequence.
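
In the generic notation of quasispecies theory (the symbols below are the conventional ones and may differ in detail from other presentations), these two relationships are commonly written as

    \frac{dx_i}{dt} = (A_i Q_i - D_i)\, x_i + \sum_{j \neq i} w_{ij}\, x_j - \Phi\, x_i

    \nu_{\max} = \frac{\ln \sigma_0}{1 - \bar{q}} = \frac{\ln \sigma_0}{p_{\max}}

where x_i is the concentration of sequence i, A_i its replication rate, Q_i the probability that it is copied without error, D_i its degradation rate, w_ij the rate at which erroneous copying of sequence j produces sequence i, Φ an outflow (dilution) term, ν_max the maximum genome length (information content) that can be stably maintained, σ_0 the superiority of the master sequence over the average of its competitors, and q̄ the average per-nucleotide copying fidelity (p = 1 − q̄).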

Quasispecies theory was developed in the 1970s by Manfred Eigen and Peter Schuster to explain self-organization and adaptability of primitive replicons (a term used to refer to any replicating entity), as an ingredient of hypercyclic organizations that link genotypic and phenotypic information, as an essential step in the origin of life. The theory portrayed early replicon populations as organized mutant spectra dominated by a master sequence, the one endowed with the highest fitness (replicative capacity) in the distribution. It introduced the notion of a mutant ensemble as a unit of selection, thus emphasizing the relevance of intra-population interactions to understand the response to selective constraints. One of its corollaries is the error threshold relationship, which marks the maximum mutation rate at which the master (or dominant) sequence can stabilize the mutant ensemble. Violation of the error threshold results in loss of dominance of the master sequence and drift of the population in sequence space.

The core quasispecies concepts are described by two fundamental equations: replication with production of error copies, and the error threshold relationship. They capture two major features of RNA viruses at the population level: the presence of a mutant spectrum, and the adverse effect of an increase of mutation rate on virus survival, each with several derivations.

Flow of conceptual derivations of quasispecies theory for viral populations, and some biological consequences.

The existence of a mutant spectrum was experimentally evidenced first by clonal analyses of RNA bacteriophage Qβ populations whose replication had been initiated by a single virus particle. Individual genomes differed from the consensus sequence in an average of one to two mutations per individual genome. Fitness of biological clones was inferior to that of the parental, uncloned population, a difference also documented for vesicular stomatitis virus (VSV). The replicative capacity of a population ensemble need not coincide with that of its individual components. The finding that a viral population was essentially a pool of mutants came at a time when mutations in general genetics were considered rare events, and virologists associated a viral genome with a defined nucleotide sequence, as still implied today in the contents of data banks. The cloud nature of Qβ was understood as a consequence of its high mutation rate, calculated at about 10^-4 mutations introduced per nucleotide copied, together with tolerance of individual genomes to accept an undetermined proportion of the newly arising mutations, despite fitness costs. The error rate estimated for bacteriophage Qβ has been confirmed, and is comparable to values calculated for other RNA viruses.

High mutation rates and quasispecies were verified for other RNA viruses based on dissection of viral populations by molecular or biological cloning, and sequence analysis of individual clones. John Holland and colleagues were the first to recognize that a rapidly evolving RNA world inserted in a DNA-based biosphere had multiple evolutionary and medical implications. Genome plasticity of RNA viruses had been suspected for many decades. Key early observations were variations in viral traits described by Findley in the 1930s, the studies of Granoff on transitions of plaque morphology of Newcastle disease virus, or the high frequency of conversions between drug resistance and dependence in Coxsackie A9 virus, among other studies with animal and plant viruses in the middle of the 20th century. When put in the context of present-day knowledge, we realize that these observations on phenotypic changes were the tip of the iceberg of an extremely complex reality of viral populations. High mutation rates and population heterogeneity characterize RNA viruses, with consequences for viral pathogenesis and the control of viral disease. Detailed studies on quasispecies dynamics in vivo have been performed with human immunodeficiency virus type 1 (HIV-1) and hepatitis C virus.

Current scope

The first mathematical formulation of quasispecies was deterministic; it assumed steady state mutant distributions in genetic equilibrium without perturbations derived from modifications of the environment or population size. These conditions are common in initial theoretical formulations of complex phenomena because they confer mathematical tractability. Since then, several extensions of the theory to non-equilibrium conditions with stochastic components have been developed, with the aim of finding general solutions for multi-peak fitness landscapes. These objectives approximate quasispecies to the real case of RNA viruses, which are compelled to deal with dramatic variations in population size and environment. Research on quasispecies has proceeded through several theoretical and experimental avenues that include continuing studies on evolutionary optimization and the origin of life, RNA-RNA interactions and replicator networks, the error threshold in variable fitness landscapes, consideration of chemical mutagenesis and proofreading mechanisms, evolution of tumor cells, bacterial populations or stem cells, chromosomal instability, drug resistance, and conformation distributions in prions (a class of proteins with conformation-dependent pathogenic potential; in this case the quasispecies is defined by a distribution of conformations). New inputs into experimental quasispecies research have come from deep sequencing to probe viral and cellular populations, recognition of interactions within mutant spectra, models of viral population dynamics related to disease progression and pathogen transmission, and new teachings from fidelity variants of viruses. Here we summarize the main aspects of quasispecies dynamics, and recent developments relevant to virus evolution and pathogenesis.

Dynamic heterogeneity

The molecular basis of high error rates is the limited template-copying fidelity of RNA-dependent RNA polymerases (RdRps) and RNA-dependent DNA polymerases (also termed reverse transcriptases, RTs). In addition, these enzymes are defective in proofreading because they lack the 3' to 5' exonuclease domain present in replicative cellular DNA polymerases. Also, post-replicative repair pathways, which abundantly correct genetic lesions in replicating cellular DNA, appear to be ineffective for double-stranded RNA or RNA-DNA hybrids. The presence of a proofreading-repair activity in coronaviruses increases their copying accuracy about 15-fold. This and other repair activities, which may act on standard RNA or retroviral genomes, do not prevent the formation of mutant spectra, although their amplitude may be lower than for other RNA viruses, at least in populations close to a clonal (single genome) origin. Quasispecies dynamics will operate in any viral or cellular system in which mutant spectra are rapidly generated due to high mutation rates (as a result of low-fidelity nucleic acid polymerases or environmental alterations).

Studies with different virus-host systems have established some general observations on the mechanisms of mutant generation, and implications of quasispecies dynamics. In RNA virus genetics, when we speak of "a mutant", the entity we handle is a cloud of mutants in which the specific mutation to which we direct our attention is present in all (or the great majority of) individual genomes. There is no such thing as "a" wild type or "a" mutant virus. They are always clouds of mutants. Changes in the relative dominance of components of mutant spectra are particularly severe during in vivo infections, with complex dynamics of intra-host heterogeneity and variations. Bioinformatic procedures have been developed to unveil the relationships among different but closely related genome types that may suggest some hierarchical order of mutation acquisition or identification of transmission clusters (examples are Partition Analysis of Quasispecies, PAQ, or QUasispecies Evolution, Network-based Transmission Inference, QUENTIN).
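
As a toy illustration of the kind of distance-based partitioning such tools build on (this is not the published PAQ or QUENTIN algorithm, and the sequences are invented), closely related haplotypes can be grouped by pairwise Hamming distance:

    from itertools import combinations

    # Toy sketch: group pre-aligned, equal-length haplotypes whose pairwise
    # Hamming distance is within a chosen radius. Not the published PAQ or
    # QUENTIN method; sequences are invented.
    haplotypes = {
        "h1": "ACGTACGTAC",
        "h2": "ACGTACGTAT",   # one substitution away from h1
        "h3": "ACGAACGTAT",   # two substitutions away from h1
        "h4": "TTGTTCGTTT",   # a distant variant
    }

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    radius = 2
    clusters = {name: {name} for name in haplotypes}
    for a, b in combinations(haplotypes, 2):
        if hamming(haplotypes[a], haplotypes[b]) <= radius:
            merged = clusters[a] | clusters[b]
            for member in merged:
                clusters[member] = merged

    print({frozenset(c) for c in clusters.values()})
    # -> {frozenset({'h1', 'h2', 'h3'}), frozenset({'h4'})}

Published tools add many refinements (frequency weighting, choice of centre sequences, network models of transmission), but the underlying input is this kind of pairwise distance structure among closely related genomes.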

Phenotypic reservoirs

Upon isolation from an infected host (middle boxes), a virus sample may be adapted to cultured cells and subjected to large population or bottleneck transfers (left box), or be adapted to a different host in vivo (right box). Relevant adaptive mutations are highlighted with colored symbols.

The crux of the matter regarding quasispecies implications is that at any given time, the viral population includes a reservoir not only of genotypic but also of phenotypic variants, conferring upon the population some adaptive pluripotency. Accumulating laboratory and clinical evidence renders untenable that minority components of mutant spectra should be dismissed on the grounds of their being neutral. They can participate in selective processes and cannot be excluded from interpretations of virus behavior. Variation universally involves point mutations and it can also include recombination (in its replicative and non-replicative modes), and genome segment reassortment. All modes of molecular variation are compatible, only restricted by the scope of mechanisms accessible to the replicative machinery, and by the need for viral genomes to remain functional. David Evans and colleagues identified many recombination events associated with enterovirus replication, and only a few recombinants made their way towards continued replication. Recombination can mediate adaptability and virulence. High mutation and recombination rates have led to the conceptual distinction between mechanistically unavoidable and evolutionarily relevant variation, in connection with the issue of clonal versus non-clonal nature of virus evolution (microbial evolution in general). Only a minority of the nascent variation during replication can be successfully propagated. Within limits that are set by biological constraints, each population is made of an array of variant genomes, with a total number which is commensurate with the virus population size. To infect a plant, animal or cell culture with 10^3 infectious units can have very different consequences than to infect with 10^10 infectious units, not only because the host defense systems may be overwhelmed by the high infectious dose, but also because the mutant repertoire that engages in adaptive explorations is larger. Part of the variants of a mutant spectrum, either in isolation or in consortium with others, may perform better than other members of the same population in the event of an environmental change. Selective pressures favor replication of some components of a mutant spectrum over others, despite all of them being interconnected by mutation. Differential performance can be at the level of viral genomes (during replication, intracellular gene expression, interaction with host factors, etc.) or viral particles (for thermal stability, entry into or exit from cells, to withstand neutralizing antibodies, etc.). Adaptability of RNA viruses is linked to parameters that facilitate exploration of sequence space: genome size (1.8 to 33 kb), population size (variable, but able to attain an impressive 10^12 individual genomes in an infected host at a given time), replication rate, mutation rate, fecundity (yield of viral particles per cell), and number of mutations required for a phenotypic change (surprisingly low for several relevant traits).
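
The effect of population size on the available mutant repertoire can be made concrete with a back-of-the-envelope calculation (the genome length and mutation rate below are assumed round numbers within the ranges cited in this article, not measurements):

    # Back-of-the-envelope sketch: how population size changes the mutant
    # repertoire available for adaptive exploration. Genome length and
    # mutation rate are assumed round numbers, not measurements.
    L = 10_000          # genome length in nucleotides (within the 1.8-33 kb range)
    mu = 1e-4           # mutations per nucleotide copied (order cited for phage Qbeta)

    single_mutants = 3 * L                   # possible one-substitution variants
    double_mutants = 9 * L * (L - 1) // 2    # possible two-substitution variants
    print(f"possible single mutants: {single_mutants:,}; double mutants: {double_mutants:,}")

    for population in (1e3, 1e10):
        # expected copies of any one specific point substitution generated
        # in a single round of copying the whole population
        expected_copies = population * mu / 3
        print(f"population {population:.0e}: ~{expected_copies:.2g} copies of each specific single mutant")

At 10^3 genomes most specific point mutants are simply absent at any given moment, whereas at 10^10 genomes every possible single-point mutant is expected to be present in many copies, which is the sense in which a larger infecting population carries a larger adaptive repertoire.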

Mutant spectrum dynamics has been depicted in different ways, and we have chosen one that encompasses frequent events in natural populations and research designs, such as virus isolation from an infected host, adaptation to cell culture for studies on experimental evolution, or adaptation to alternative hosts in vivo. The reality is even more complex, given the large population sizes, with an indeterminate proportion of genomes actively replicating at any given time (sometimes equated with the effective population size in general genetics), and harboring multiple mutations per genome. The scenarios suggested by current experimental data defy our imagination. The relative frequency of individual mutations fluctuates in an unceasing exploration of sequence space, with phenotypic changes (not only genotypic changes) being far more frequent than previously thought. The experimental evolution design that consists of passaging viral populations for long time periods (many sequential infections) is often extremely revealing. In foot-and-mouth disease virus (FMDV) such a design led to a remarkable phenotypic diversification into subpopulations of colonizers and competitors, that modulated virulence of the mutant ensemble. In HCV such a design unveiled continuous mutation waves and a more accurate understanding of the types of fitness landscapes occupied by high fitness viruses.

Limitations and indeterminacies

The nucleotide sequence of an individual genome from a population (no matter what the degree of population complexity might be) can be determined either following a biological or molecular cloning event or by deep sequencing of entire viral genomes, in a manner that allows mutation linkage (assignment of different mutations to the same genome molecule) to be established. Each of these procedures implies some limitations: biological cloning can bias the representation in favor of infectious genomes, while molecular cloning can introduce non-infectious (defective) genomes in the analysis. Whole genome quasispecies description is still technically challenging due to the artifactual introduction of mutations. Most current deep sequencing platforms yield sequences of short reads for a given amplicon (sequence under analysis); minority mutations in an amplicon cannot be reliably linked to mutations in a different amplicon of the same genome; at most, statistical inferences on linkage can be proposed. Despite these limitations, control experiments and improvements of bioinformatic procedures support that the majority of sequence heterogeneity analyzed in viral populations indeed reflects differences in the natural template populations. If mutation linkage can be solved on a routine basis, a new wave of molecular information relevant to epistatic interactions will enter the picture.
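
A minimal sketch of the site-wise part of such an analysis is shown below (the reference and reads are invented toy data; a real pipeline must also handle alignment, sequencing error, quality filtering and coverage):

    from collections import Counter

    # Toy sketch: per-position variant frequencies from short reads covering
    # one amplicon. Reference and reads are invented; real data require
    # alignment, error correction and quality filtering.
    reference = "ACGTACGT"
    reads = [
        "ACGTACGT",
        "ACGTACGT",
        "ACGAACGT",   # minority substitution T->A at position 3
        "ACGTACGT",
        "ACGAACGT",
    ]

    for pos, ref_base in enumerate(reference):
        counts = Counter(read[pos] for read in reads)
        for base, n in counts.items():
            if base != ref_base:
                print(f"position {pos}: {ref_base}->{base} at frequency {n / len(reads):.2f}")
    # -> position 3: T->A at frequency 0.40

Such counts are purely site-wise: whether two minority substitutions at different positions reside on the same genome (mutation linkage) can only be asserted when a single read, or an assembled haplotype, spans both sites, which is exactly the limitation discussed above.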

There are additional levels of indeterminacy in the sequential analysis of viral populations, in particular those replicating in vivo. Components of the mutant spectrum represented at a given time in the sample taken for sequencing may differ from those in the next time point, due either to sampling uncertainties or bona fide fluctuations of genome frequencies. It is not justified to accept a rough similarity because even a single mutation in a given sequence context may affect biological properties. In the words of John Holland and colleagues: "It is important to remember that every quasispecies genome swarm in an infected individual is unique and "new" in the sense that no identical population of genomes has ever existed before and none such will ever exist again". On top of the fleeting nature of any mutant distribution, the standard methods available for quasispecies characterization provide genomic sequences of a minority of the population (estimated in 10−8 to 10−13 for molecular cloning-Sanger sequencing, and in 10−6 to 10−11 for deep sequencing). We can only have an approximate representation of viral populations and their dynamics, as evidenced by many experimental studies.

Non-consensus-based descriptors

The points summarized in the previous sections fully justify directing analytical tools towards the mutant spectrum rather than ignoring it or considering its presence a side issue. Use of consensus sequences to describe the genome of a virus isolate, despite being warranted by the difficulties of conveying the information recapitulated in a mutant spectrum, blurs and enfeebles biological interpretations. Experimental results have demonstrated that minority genomes from a mutant spectrum (that cannot be identified by examining the consensus sequence) can include mutations that confer resistance to antiviral inhibitors, neutralizing antibodies or cytotoxic T cells, or that can alter the capacity to induce interferon (IFN) or to respond to IFN, virulence or particle stability, among other phenotypic traits. Mutant spectra can also mediate cyclical adaptation to different cell types. A mutant spectrum defines a consensus, but the consensus is an abstraction; it may not be represented in the population. Many events in viral pathogenesis and evolution are due to mutant spectrum modifications or interactions which cannot be properly interpreted solely on the basis of consensus sequences.

Collective response

Mutant spectra are not mere aggregates of mutants acting independently. They are often engaged in collective responses. Two major types are those that depend on the presence of sets of variants, and those that rely on intra-mutant spectrum interactions.

Variants that drive responses to selective constraints

Behavior of reconstructed quasispecies

In some cases of sweeping selection (very strong selection for a trait), an individual (or a limited number of individuals) that encodes signatures prone to be selected may approach dominance while becoming the founder of a mutant cloud (because formation of a cloud is inherent to replication). Conditions for dominance (in this case in response to selection) are that the genome senses the selective sweep and that its replication in the new selective environment is permitted. In other cases, a collection of mutants is selected. This was illustrated with an FMDV quasispecies that was reconstructed in the laboratory with multiple antigenic variants (each at low frequency) that belonged to two different categories, and shared resistance to the same monoclonal antibody. One category included mutants with an amino acid substitution that affected receptor recognition (since the antigenic determinant overlapped with the integrin receptor recognition site); in the other category, the substitutions affected the antigenic determinant but not the receptor recognition site. Passages of the virus in the absence of the monoclonal antibody resulted in dominance of antigenic variants that maintained the receptor recognition capacity, but the dominant variants were surrounded by a cloud of mutants of the other antigenic variant category. Conversely, passages in the presence of the antibody led to selection of variants with altered receptor recognition, surrounded by a cloud of antigenic variants that maintained receptor recognition. The results underlined the role of mutant clouds in selective events, and unveiled a new mechanism of antigenic flexibility.

Quasispecies memory

Quasispecies memory is a type of molecular memory dependent on the recent history of the evolutionary lineage and the integrity of the mutant spectrum. The search for memory was prompted by the complex adaptive system behavior of a viral quasispecies, suggested by the presence of core information (considered the one that defines viral identity) despite variation of constitutive elements (the mutant spectrum). A well-known example is memory in the immune system that mobilizes and expands minority components in response to stimuli previously faced by the system. In the experiments designed to identify memory in viral quasispecies, members of the mutant spectrum increased in frequency as a consequence of their replication during a selection event that drove them towards dominance. When the selective constraint was withdrawn, memory genomes remained at levels that were 10- to 100-fold higher than the basal levels attributable solely to their generation by mutation, as documented with independent FMDV genetic markers, and with HIV-1 in vivo. Thus, memory is a history-dependent, collective property of the quasispecies that confers a selective advantage to respond to environmental changes previously experienced by the same evolutionary lineage. It can be manifested only if the mutant spectrum maintains its completeness, since memory is lost when the population undergoes a bottleneck event that excludes minorities. A relevant example of the consequences of memory occurs in antiviral pharmacology with the administration for a second time of the same or a related antiviral agent (capable of evoking shared resistance mutations) used in a previous treatment. The second intervention may face inhibitor-resistant memory genomes from the earlier treatment, thus contributing to virus escape. This is an aspect that has not received adequate attention in the planning of antiviral interventions for patients who fail a first treatment and have to be subjected to a second treatment.

Intra-mutant spectrum interactions for interference, complementation or cooperation

Individual genomes surrounded by a cloud of related mutants can be either suppressed to be kept at low frequency, or helped to be maintained in the population. The two alternative fates are dependent on several factors, one being the surrounding mutant spectrum in those steps of the infectious cycle in which an effective competition among variants is established, for example within replication complexes. This important concept was first derived theoretically, and then approached experimentally with several viruses. In an early study, Juan Carlos de la Torre and John Holland described suppression of high fitness VSV by mutant spectra of inferior fitness. Suppressive effects have since been documented with standard and mutagenized viral populations. Some examples are:

  • Suppression of high fitness antigenic variants of FMDV by low fitness antibody-escape mutants.
  • Suppression of virulent poliovirus (PV) by attenuated virus in poliovirus vaccines.
  • Suppression of pathogenic lymphocytic choriomeningitis virus (LCMV) (which causes growth hormone deficiency in mice) by non-pathogenic LCMV variants.
  • Suppression of FMDV by a mutagenized FMDV population.
  • Suppression of FMDV by capsid and polymerase FMDV mutants.
  • Suppression of drug-resistant viral mutants during antiviral therapy.

Opposite to suppression is maintenance of a mutant either by a favorable position in a fitness landscape or by interactions of complementation or cooperation with members of the mutant spectrum. The position in a fitness landscape influences vulnerability to mutations, as popularized with the terms "advantage of the flattest" or "survival of the flattest", indicating that a variant located at the top of a sharp fitness peak has a higher probability of decreasing in fitness as a result of new mutations than the same variant located on a fitness plateau. Survival of the flattest has also been proposed as an ingredient in some models of the error threshold.
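
A small mutation-selection simulation can illustrate this effect. In the sketch below (the genotype space, fitness values and mutation scheme are invented for illustration and do not correspond to any real virus), a sharp high-fitness peak dominates at a low mutation rate, while a broader, flatter, lower plateau takes over at a high mutation rate:

    import numpy as np

    # Toy mutation-selection model illustrating "survival of the flattest".
    # Genotypes lie on a line; one has a sharp high-fitness peak, a separate
    # region forms a broad plateau of lower fitness. All numbers are invented.
    N = 25
    fitness = np.full(N, 0.5)
    fitness[3] = 3.0          # sharp, high peak
    fitness[12:19] = 2.0      # broad, flat plateau

    def mutation_matrix(mu):
        """Column-stochastic matrix: a copy of genotype i stays with
        probability 1-mu or moves to a neighbour with probability mu/2 each."""
        M = np.zeros((N, N))
        for i in range(N):
            M[i, i] += 1.0 - mu
            M[max(i - 1, 0), i] += mu / 2
            M[min(i + 1, N - 1), i] += mu / 2
        return M

    def equilibrium(mu, steps=2000):
        x = np.full(N, 1.0 / N)
        W = mutation_matrix(mu) @ np.diag(fitness)   # replicate, then mutate
        for _ in range(steps):
            x = W @ x
            x /= x.sum()                             # keep relative frequencies
        return x

    for mu in (0.05, 0.5):
        x = equilibrium(mu)
        print(f"mu={mu:.2f}  sharp peak region: {x[2:5].sum():.2f}  "
              f"flat plateau region: {x[11:20].sum():.2f}")

At the low mutation rate essentially all of the population sits around the sharp peak; at the high rate the flatter but lower plateau takes over, because its members lose less fitness per mutation.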

Collective behavior of viruses was documented with mutant RNA viruses resistant to nucleotide analogues. The study of this class of mutants has been instrumental for understanding the molecular basis of template-copying fidelity, and the consequences of fidelity alterations for the adaptive capacity and pathogenic potential of RNA viruses. In the first mutant studied, amino acid substitution G46S in the PV polymerase resulted in an approximately four-fold increase in template-copying fidelity. This modification reduced PV adaptability and infective potential in vivo. The mutant in isolation did not replicate efficiently in the brain of susceptible mice, but it did when its mutant spectrum was broadened by 5-fluorouracil mutagenesis or when it was co-inoculated with wild type PV.

Complementation (often occurring when a functional protein encoded by a set of genomes is used by another set of genomes whose encoded protein is not functional) may underlie some collective responses of quasispecies, such as the fitness of individuals isolated from a population being inferior to the fitness of the population. Complementation was described between two truncated FMDV genomic forms. The genomes with internal deletions became detectable upon high multiplicity passage of a clonal population of standard FMDV, a virus with a monopartite single-stranded RNA genome. Infectivity was generated by complementation of the two truncated forms, in the absence of standard, full-length FMDV genomes. For complementation to be effective, prior exploration of sequence space through point mutations was a requirement. The system underwent a remarkable evolutionary transition akin to genome segmentation. Drastic genetic lesions in viral genomes are difficult to observe unless a mechanism such as complementation comes to the rescue of the deviant genomes. Additional examples of complementation among RNA viruses have been reported. Complementation is a means to maintain defective genomes at detectable frequencies in viral populations.

A distinction has been made between complementation and cooperation, in which two different genomes give rise to a new phenotype through the interaction between two variant proteins. An example of cooperation was characterized during studies with measles virus on membrane fusion, which is essential for virus entry into cells. For this virus, fusion is mediated by two proteins termed H and F. A truncated H was deficient in cell fusion, but the activity was regained when the truncated H was accompanied by two forms of F, though not by either form individually.

Therefore, complementation, cooperation, interference and suppression can emerge from interactions among components of mutant spectra that have their origin in random mutations. Selection acts on whatever sets of mutants can provide a useful trait, to turn random occurrences into biological meaning.

Bottlenecks

Illustration of bottleneck of different severity, defined by the different circles inserted in the entire population (large rectangle) and the outer rectangles. Symbols represent mutant classes.

A means to interrupt the participation of individual genomes in interactions with their mutant spectrum is for the quasispecies swarm to undergo drastic reductions in population size that isolate one or a few individual genomes from their surroundings. Such reductions are termed bottlenecks, and they play an important part in shaping evolutionary lineages for all kinds of organisms, and also for viruses. They occur frequently not only upon host-to-host transmission but also inside infected hosts, and they can perturb positive and negative selection events in processes that are difficult to identify and characterize.

Drastic bottleneck events have been reproduced with laboratory populations of viruses in the form of plaque-to-plaque transfers. This design served to verify experimentally the operation of Müller's ratchet, or fitness decrease by the irreversible incorporation of mutations in asexual organisms in the absence of compensatory mechanisms. The serial bottleneck transfers unveiled the presence of rare mutations, not seen in standard laboratory or natural viral populations. In the absence of forced bottleneck events, such rare mutations would be lost by negative selection because of their fitness cost. The investigation of how FMDV clones debilitated by Müller's ratchet regained replicative fitness revealed several alternative molecular pathways for fitness recovery. The implications of this observation went largely unnoticed until recent results with hepatitis C virus (HCV) also suggested the accessibility of multiple pathways for fitness gain. Also, extensive passage of a biological clone of FMDV in BHK-21 cells conferred the capacity to infect several human cell lines in addition to the expected fitness increase for multiplication in BHK-21 cells. Thus, several lines of evidence suggest that fitness gain in a specific environment may paradoxically broaden the phenotypic potential of a virus. It will be interesting to investigate whether focused adaptation of other viruses to a specific environment may also entail a broadening of diversity, with many phenotypic variants attaining similar fitness levels. If generalized, this broadening of phenotypic space would provide a new interpretation of the molecular basis of adaptation, and explain why adaptation to alternative environments may not lead to attenuation.

Depriving an individual virus of possible suppression, complementation or cooperation may represent a liberation to initiate a new evolutionary process, or a condemnation to extinction. If liberated from suppression, the isolated genome must replicate and be able to reconstruct a mutant cloud to regain adaptive capability. This has led to the suggestion that high mutation rates evolved to allow such mutant spectrum recovery following bottlenecks. Other models attribute high mutation rates to adaptive optimization independent of bottlenecks, or to a mechanistic consequence of rapid replication. Whatever their ultimate origins, high mutation rates serve the purpose of adaptation in multiple circumstances, not only following bottlenecks. A founder virus can introduce a different phenotype for the ensuing evolution. Evolution of viruses in nature and as disease agents can be viewed as a succession of mutant spectrum alterations, subjected to expansions and reductions of population size in a continuous interplay of positive and negative selection and random drift. While short-term (for example, intra-host) evolution is observable and measurable, viruses may appear to be relatively static in the long term for decades (as seen with antigenic variants of FMDV) or longer. Intra-host evolution is generally more rapid than inter-host evolution, as documented with viruses and other biological systems. Apparent invariance may be the result of selection for long-term survival of populations that have previously frenziedly tested evolutionary outcomes in short-term processes.

Viral disease

Soon after quasispecies was evidenced for viruses, some medical implications were made explicit. Several specific and general points are listed below.

  • High mutation rates and population heterogeneity endow viruses with the potential to escape immune pressures (including those due to vaccination) and antiviral inhibitors used in therapy. It is an open question if vaccination can promote long-term evolution of antigenic determinants.
  • Attenuated RNA virus vaccines can revert to virulent forms. RNA viruses released in nature for pest control purposes can mutate to new phenotypes.
  • Virus attenuation and virulence are dependent on viral genetic traits. Variant forms of a given virus may display increased virulence or atypical disease.
  • Components of a mutant spectrum can exhibit a different cell tropism or host range than most genomes in the same population, with implications for the emergence and re-emergence of viral disease.
  • Viral pathogenesis is influenced by microevolutionary processes in which some viral subpopulations are replaced by others to persist or to invade new cell types, tissues or organs.
  • The larger the actively replicating (effective) population size and the replication rate, the more effective the exploration of sequence space for phenotypic expansions that favor survival and persistence.
  • There is a connection between four parameters that characterize viruses during infection processes: replication rate (the rate at which viral RNA or DNA is synthesized intracellularly for viral progeny production), viral load (the total amount of virus quantified in an infected host or host compartment), genetic heterogeneity, and replicative fitness (the yield of infectious particles that can contribute to the next generation). They can influence disease progression, and any of them can be targeted for disease control.

In all interactions conducive to disease, the host cells, individually and as groups in tissues and organs, play decisive roles. The consequences of a viral infection are always host-dependent. However, the virus itself poses a major challenge that a deeper understanding of quasispecies dynamics is helping to confront.

Antiviral strategies

There is an increasing perception that Darwinian principles should assist in the planning of antiviral designs. The aim of vaccination is to evoke a protective response that prevents either virus replication or disease. The aim of an antiviral pharmacological intervention is to inhibit virus replication to provide the immune system with an opportunity to clear the virus. Expressed simply, the direct danger for vaccination and treatment is that the virus can escape through selection of mutants resistant to vaccine-triggered defense components or to the externally administered inhibitors. This has led to several proposals to confront viral disease, which can be summarized as follows.

Vaccine exposure of multiple B cell and T cell epitopes

Vaccines should include repertoires of B cell and T cell epitopes to evoke an ample immune response. The broad response should minimize selection of escape mutants that may be present as minority components in mutant spectra, as repeatedly documented experimentally. Of the types of vaccine currently available, those that best comply with the multiple-epitope requirement are, in order of expected efficacy at conferring protection against highly variable viruses: attenuated > inactivated whole virus > several expressed proteins > one expressed protein > multiple synthetic peptide antigens > single peptide antigen. The scarcity of effective synthetic vaccines for RNA viral pathogens, despite huge scientific and economic efforts, is a reflection of the underlying problems.

Antiviral agents used in combination

Antiviral monotherapy (use of a single antiviral agent) is to be avoided. The following recommendations have been made and in some cases successfully implemented:

  • Inhibitors used in combination should target different viral gene products.
  • Splitting a treatment into two steps: first an induction regimen, followed by a maintenance regimen, with different drugs administered in the two steps.
  • Targeting of cellular functions needed for the virus life cycle.
  • Use of innate immune response-stimulating drugs (for example, inhibitors of enzymes involved in pyrimidine biosynthesis).
  • Combined use of immunotherapy and chemotherapy.
  • Lethal mutagenesis, that is, virus extinction by an excess of mutations introduced during viral replication.

These strategies have as their main objective the avoidance of selection of treatment-escape mutants, achieved by imposing multiple selective constraints that cannot be surmounted by the virus. Control is effective either because exploration of sequence space cannot reach the required multiple mutations (even when recombination is available) or because the multiple mutations inflict a severe fitness cost. Vaccines exposing multiple epitopes and combination therapies follow the same strategy, whose aim is to limit the escape routes available to a viral quasispecies in the face of the suppressive constraint.
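
As a back-of-the-envelope illustration of why multiple simultaneous constraints are hard to surmount (the frequencies below are illustrative assumptions, not figures taken from this article): if a resistance mutation to each of three independent inhibitors pre-exists in the mutant spectrum at a frequency of roughly 10^-4, and the mutations arise independently, a genome carrying all three is expected at a frequency of about

\[
\left(10^{-4}\right)^{3} = 10^{-12},
\]

so an intra-host population of, say, 10^9 genomes would be expected to contain about 10^5 single-resistant genomes but fewer than one triple-resistant genome before treatment begins.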

Lethal mutagenesis

Lethal mutagenesis is the process of virus extinction caused by raising the error rate of replication beyond the level at which a virus can maintain its genetic information. Application of lethal mutagenesis as an antiviral strategy deserves attention in the context of the present article because its origins lie in quasispecies theory, in the form of the error threshold relationship. Both the error threshold and lethal mutagenesis are highly fitness landscape-dependent, but both can occur in complex fitness landscapes such as those pertinent to viral populations. The term lethal mutagenesis was coined by Lawrence Loeb and colleagues, and it is now widely used to describe the antiviral activity of base and nucleoside analogues that increase the viral mutation rate. Although several models have been proposed to account for virus extinction by excess mutations, violation of an extended error threshold stands as a likely mechanism. Interestingly, some antiviral agents licensed for human use, initially thought to act only as inhibitors of viral replication, may actually exert their antiviral activity against some RNA viruses at least partially through lethal mutagenesis. This is the case of favipiravir (T-705; 6-fluoro-3-hydroxy-2-pyrazinecarboxamide) and ribavirin (1-β-D-ribofuranosyl-1H-1,2,4-triazole-3-carboxamide), which are currently being intensively investigated as lethal mutagens.

Defense mechanisms based on genome modification of invading genetic parasites, such as the editing cellular activities that are recruited as part of the innate immune response (ADAR, APOBEC, RIP, etc.), represent a natural counterpart of the principle utilized by lethal mutagenesis. Applicability to pathogenic cellular elements is a real possibility, and lethal mutagenesis to control tumor cells is an active field of investigation. Thus, the recognition of quasispecies dynamics has suggested some fundamental guidelines for disease prevention and control that are gradually permeating clinical practice. This is in line with the recognized need to apply Darwinian principles to the control of infectious disease.

Error threshold

This may be defined as "The inability of a genetic element to be maintained in a population as the fidelity of its replication machinery decreases beyond a certain threshold value".

In theory, if the mutation rate were sufficiently high, the viral population would not be able to maintain the genotype with the highest fitness, and therefore the ability of the population to adapt to its environment would be compromised. A practical application of this dynamic is in antiviral drugs employing lethal mutagenesis. For example, increased doses of the mutagen ribavirin reduce the infectivity of poliovirus.
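
A standard quantitative form of this idea comes from Eigen's quasispecies theory; the sketch below uses conventional symbols from that literature rather than terms defined in this article: q is the per-site copying fidelity, L the genome length, and σ the selective superiority of the master sequence over the average of its mutant spectrum. The master sequence can only be maintained while its probability of error-free copying exceeds the reciprocal of its superiority:

\[
Q \;=\; q^{L} \;>\; \frac{1}{\sigma}
\qquad\Longleftrightarrow\qquad
L\,(1-q) \;\lesssim\; \ln\sigma ,
\]

so the maximum maintainable genome length is roughly L_max ≈ ln(σ)/(1 − q); equivalently, for a fixed genome length there is a maximum tolerable per-site error rate, which is the threshold that lethal mutagenesis aims to push the virus beyond.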

However, these models assume that only the mutations that occur in the fittest sequence are deleterious and, furthermore, that they are non-lethal. It has been argued that, if the deleterious effect of mutations on the population of variants and the fact that many mutations are lethal are taken into account, then the error threshold disappears, i.e. the fittest sequence always maintains itself. Empirical data on the effect of mutations in viruses are scarce, but they appear to be consistent with this scenario.

Possible evolutionary consequences

[Figure: visualization of "survival of the flattest" in evolutionary biology.]

Mutational robustness

Quasispecies dynamics may influence the long-term evolution of a virus in that it may be a better evolutionarily stable strategy to generate a broad quasispecies whose members are of approximately equal fitness than to maintain a sharply defined 'fittest' single genotype whose mutational neighbours are substantially less fit. This has been called 'survival of the flattest', referring to the fitness profiles of the two strategies, respectively.

Over the long term, a flatter fitness profile might better allow a quasispecies to exploit changes in selection pressure, analogous to the way sexual organisms use recombination to preserve diversity in a population. At least in simulations, a slower replicator can be shown to outcompete a faster one in cases where it is more robust and the mutation rate is high.
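
A minimal sketch of the kind of simulation referred to above (the fitness values, mutation effects and two-type structure are illustrative assumptions, not parameters from the studies this article summarizes): a fast but mutationally fragile replicator competes with a slower but robust one, and the robust type takes over once the per-genome mutation probability is high enough.

# Illustrative "survival of the flattest" competition under high mutation rates.
# Type A sits on a sharp fitness peak (high fitness, mutants inviable);
# type B sits on a flat region (lower fitness, mutants remain viable).

def effective_growth(w, mu, frac_viable_mutants):
    """Expected viable offspring per genome per generation: unmutated copies
    (probability 1 - mu) keep fitness w; mutated copies survive within the
    lineage with probability frac_viable_mutants (1.0 = fully robust)."""
    return w * ((1 - mu) + mu * frac_viable_mutants)

def compete(mu, generations=200):
    """Deterministic frequency dynamics of fragile type A versus robust type B."""
    freq_a, freq_b = 0.5, 0.5
    for _ in range(generations):
        a = freq_a * effective_growth(2.0, mu, 0.0)  # sharp peak, fragile
        b = freq_b * effective_growth(1.5, mu, 1.0)  # flatter, robust
        total = a + b
        freq_a, freq_b = a / total, b / total
    return freq_a, freq_b

if __name__ == "__main__":
    for mu in (0.1, 0.25, 0.4):
        fa, fb = compete(mu)
        print(f"mu={mu:.2f}  fragile A: {fa:.3f}  robust B: {fb:.3f}")

With these illustrative numbers the crossover occurs at mu = 0.25, where the fragile type's effective growth rate 2.0 × (1 − mu) falls below the robust type's 1.5; above that mutation rate the flatter type dominates.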

However, whether mutational robustness evolved or is intrinsic to genetic systems is unconfirmed, because the basic mechanism behind robustness would depend upon the peculiarities of each system.

Cooperation

Experimental manipulation of poliovirus to give it a higher-fidelity polymerase, and hence a reduced mutation rate, showed these variants to have lower pathogenicity than wild-type sequences. Pathogenicity could then be restored by mutagen application. This was interpreted to mean that the lower mutation rates had reduced the adaptability (or breadth) of the quasispecies. The mutant viruses extracted from brain tissue were not themselves pathogenic, and the authors speculated that there may be complementation between variant members of the quasispecies that enables viruses to colonize different host tissues and systems.

Distance education

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Distance_...