
Monday, March 21, 2022

Metabarcoding

From Wikipedia, the free encyclopedia
 
Differences in the standard methods for DNA barcoding and metabarcoding. While DNA barcoding focuses on a specific species, metabarcoding examines whole communities.

Metabarcoding is the barcoding of DNA/RNA (or eDNA/eRNA) in a manner that allows for the simultaneous identification of many taxa within the same sample. The main difference between barcoding and metabarcoding is that metabarcoding does not focus on one specific organism, but instead aims to determine species composition within a sample.

A barcode consists of a short, variable gene region that is useful for taxonomic assignment, flanked by highly conserved gene regions that can be used for primer design. The idea of general barcoding originated in 2003 with researchers at the University of Guelph.

The metabarcoding procedure, like general barcoding, proceeds in order through the stages of DNA extraction, PCR amplification, sequencing and data analysis. Different genes are used depending on whether the aim is to barcode a single species or to metabarcode several species; in the latter case, a more universal gene is used. Metabarcoding does not use single-species DNA/RNA as a starting point, but DNA/RNA from several different organisms derived from one environmental or bulk sample.

Environmental DNA

Environmental DNA or eDNA describes the genetic material present in environmental samples such as sediment, water, and air, including whole cells, extracellular DNA and potentially whole organisms. eDNA can be captured from environmental samples and preserved, extracted, amplified, sequenced, and categorized based on its sequence. From this information, detection and classification of species is possible. eDNA may come from skin, mucus, saliva, sperm, secretions, eggs, feces, urine, blood, roots, leaves, fruit, pollen, and rotting bodies of larger organisms, while microorganisms may be obtained in their entirety. eDNA production depends on the biomass, age and feeding activity of the organism as well as its physiology, life history, and space use.

By 2019 methods in eDNA research had been expanded to be able to assess whole communities from a single sample. This process involves metabarcoding, which can be precisely defined as the use of general or universal polymerase chain reaction (PCR) primers on mixed DNA samples from any origin followed by high-throughput next-generation sequencing (NGS) to determine the species composition of the sample. This method has been common in microbiology for years, but, as of 2020, it is only just finding its footing in the assessment of macroorganisms. Ecosystem-wide applications of eDNA metabarcoding have the potential to not only describe communities and biodiversity, but also to detect interactions and functional ecology over large spatial scales, though it may be limited by false readings due to contamination or other errors. Altogether, eDNA metabarcoding increases speed, accuracy, and identification over traditional barcoding and decreases cost, but needs to be standardized and unified, integrating taxonomy and molecular methods for full ecological study.

Applications of environmental DNA metabarcoding in aquatic and terrestrial ecosystems 
 
Global ecosystem and biodiversity monitoring with environmental DNA metabarcoding

eDNA metabarcoding has applications to diversity monitoring across all habitats and taxonomic groups, ancient ecosystem reconstruction, plant-pollinator interactions, diet analysis, invasive species detection, pollution responses, and air quality monitoring. eDNA metabarcoding is a unique method still in development and will likely remain in flux for some time as technology advances and procedures become standardized. However, as metabarcoding is optimized and its use becomes more widespread, it is likely to become an essential tool for ecological monitoring and global conservation study.

Community DNA

Since the inception of high-throughput sequencing (HTS), the use of metabarcoding as a biodiversity detection tool has drawn immense interest. However, there is not yet clarity regarding what source material is used to conduct metabarcoding analyses (e.g., environmental DNA versus community DNA). Without clarity between these two source materials, differences in sampling, as well as differences in laboratory procedures, can impact the bioinformatics pipelines used for data processing and complicate the interpretation of spatial and temporal biodiversity patterns. It is therefore important to clearly differentiate between the prevailing source materials and their effects on downstream analysis and interpretation, comparing environmental DNA metabarcoding of animals and plants with community DNA metabarcoding.

With community DNA metabarcoding of animals and plants, the targeted groups are most often collected in bulk (e.g., soil, malaise trap or net), and individuals are removed from other sample debris and pooled together prior to bulk DNA extraction. In contrast, macro‐organism eDNA is isolated directly from an environmental material (e.g., soil or water) without prior segregation of individual organisms or plant material from the sample and implicitly assumes that the whole organism is not present in the sample. Of course, community DNA samples may contain DNA from parts of tissues, cells and organelles of other organisms (e.g., gut contents, cutaneous intracellular or extracellular DNA). Likewise, macro‐organism eDNA samples may inadvertently capture whole microscopic nontarget organisms (e.g., protists, bacteria). Thus, the distinction can at least partly break down in practice.

Another important distinction between community DNA and macro-organism eDNA is that sequences generated from community DNA metabarcoding can be taxonomically verified when the specimens are not destroyed in the extraction process, since sequences can then be generated from voucher specimens using Sanger sequencing. As samples for eDNA metabarcoding lack whole organisms, no such in situ comparisons can be made. Taxonomic affinities can therefore only be established by directly comparing the obtained sequences (or bioinformatically generated molecular operational taxonomic units, MOTUs) with taxonomically annotated sequences, such as those in NCBI's GenBank nucleotide database or BOLD, or with self-generated reference databases from Sanger-sequenced DNA. (A MOTU is a group identified through the use of clustering algorithms and a predefined percentage sequence similarity, for example 97%.) Then, to at least partially corroborate the resulting list of taxa, comparisons are made with conventional physical, acoustic or visual survey methods conducted at the same time, or with historical records from surveys of the location (see Table 1).
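As an illustration, grouping sequences into MOTUs at a predefined similarity threshold can be sketched in a few lines of Python. This is only a minimal, order-dependent greedy clustering over hypothetical, already-aligned reads of equal length; real pipelines use dedicated tools (for example, clustering in QIIME or VSEARCH) with far more sophisticated algorithms.

# Minimal sketch: greedy clustering of equal-length, aligned reads into MOTUs
# at a 97% identity threshold. Sequences below are hypothetical.

def percent_identity(a: str, b: str) -> float:
    """Fraction of matching positions between two aligned, equal-length sequences."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / len(a)

def cluster_motus(reads, threshold=0.97):
    """Assign each read to the first MOTU centroid it matches at >= threshold,
    otherwise start a new MOTU (greedy, order-dependent)."""
    centroids = []            # representative sequence per MOTU
    assignments = []          # MOTU index for each read
    for read in reads:
        for i, centroid in enumerate(centroids):
            if percent_identity(read, centroid) >= threshold:
                assignments.append(i)
                break
        else:
            centroids.append(read)
            assignments.append(len(centroids) - 1)
    return centroids, assignments

# Hypothetical aligned reads (identical length for simplicity)
reads = [
    "ATGCGTACGTTAGCATGCAA",
    "ATGCGTACGTTAGCATGCAT",   # 95% identical to the first read -> new MOTU at 97%
    "ATGCGTACGTTAGCATGCAA",   # identical -> same MOTU as the first read
]
centroids, assignments = cluster_motus(reads)
print(assignments)            # [0, 1, 0]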

The difference in source material between community DNA and eDNA therefore has distinct ramifications for interpreting the spatial and temporal scale of inference about the biodiversity detected. From community DNA, it is clear that the individual species were present at that time and place. For eDNA, however, the organism that produced the DNA may be upstream of the sampled location, the DNA may have been transported in the faeces of a more mobile predatory species (e.g., birds depositing fish eDNA), or the organism may have been previously present but no longer active in the community, with detection coming from DNA shed years to decades earlier. This means that the scale of inference in both space and time must be considered carefully when inferring the presence of a species in the community based on eDNA.

Metabarcoding stages

Six steps in DNA barcoding and metabarcoding 

There are six stages or steps in DNA barcoding and metabarcoding. The DNA barcoding of animals (and specifically of bats) is used as an example in the diagram at the right and in the discussion immediately below.

First, suitable DNA barcoding regions are chosen to answer a specific research question. The most commonly used DNA barcode region for animals is a segment of about 600 base pairs of the mitochondrial gene cytochrome oxidase I (CO1). This locus provides large sequence variation between species yet a relatively small amount of variation within species. Other barcode regions commonly used for species identification of animals are ribosomal DNA (rDNA) regions such as 16S, 18S and 12S, and mitochondrial regions such as cytochrome b. These markers have advantages and disadvantages and are used for different purposes. Longer barcode regions (at least 600 base pairs long) are often needed for accurate species delimitation, especially to differentiate close relatives. Identifying the organism that produced remains such as faeces, hairs and saliva can serve as a proxy measure to verify the absence or presence of a species in an ecosystem. The DNA in such remains is usually of low quality and quantity, so shorter barcodes of around 100 base pairs are used in these cases. Similarly, DNA in dung is often degraded, so short barcodes are needed to identify the prey consumed.

Second, a reference database needs to be built of all DNA barcodes likely to occur in a study. Ideally, these barcodes are generated from vouchered specimens deposited in a publicly accessible place, such as a natural history museum or another research institute. Building such reference databases is currently being done all over the world. Partner organizations collaborate in international projects such as the International Barcode of Life Project (iBOL) and the Consortium for the Barcode of Life (CBOL), aiming to construct a DNA barcode reference that will be the foundation for DNA-based identification of the world's biota. Well-known barcode repositories are NCBI GenBank and the Barcode of Life Data System (BOLD).

Third, the cells containing the DNA of interest must be broken open to expose their DNA. This step, DNA extraction and purification, should be performed on the substrate under investigation, and several procedures are available. Specific techniques must be chosen to isolate DNA from substrates with partly degraded DNA, for example fossil samples, and from samples containing inhibitors, such as blood, faeces and soil. Extractions in which DNA yield or quality is expected to be low should be carried out in an ancient-DNA facility, using established protocols to avoid contamination with modern DNA. Experiments should always be performed in duplicate and with positive controls included.

Fourth, amplicons have to be generated from the extracted DNA, either from a single specimen or from complex mixtures, using primers based on the DNA barcode regions selected in step one. To keep track of their origin, labelled nucleotide tags (molecular identifiers, or MID labels) need to be added in the case of metabarcoding. These labels are needed later in the analyses to trace reads from a bulk data set back to their origin.
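As a rough illustration of how MID labels make this possible, the Python sketch below assigns pooled reads back to hypothetical samples by reading the tag at the 5' end of each read. The tag sequences, tag length and sample names are invented for the example; real demultiplexing tools also tolerate mismatches in the tag and trim primer sequences.

# Minimal sketch: demultiplexing pooled metabarcoding reads by their MID tag.
# Tag sequences and sample names are hypothetical.

MID_TO_SAMPLE = {
    "ACGAGTGCGT": "pond_A",
    "ACGCTCGACA": "pond_B",
}
TAG_LENGTH = 10

def demultiplex(reads):
    """Group reads by sample based on the MID at the 5' end, and strip the tag."""
    by_sample = {name: [] for name in MID_TO_SAMPLE.values()}
    unassigned = []
    for read in reads:
        tag, insert = read[:TAG_LENGTH], read[TAG_LENGTH:]
        sample = MID_TO_SAMPLE.get(tag)
        if sample is None:
            unassigned.append(read)       # tag not recognised
        else:
            by_sample[sample].append(insert)
    return by_sample, unassigned

reads = [
    "ACGAGTGCGT" + "TTAGCCGGAATTC",       # belongs to pond_A
    "ACGCTCGACA" + "GGATCCATTAGCA",       # belongs to pond_B
]
groups, leftovers = demultiplex(reads)
print({k: len(v) for k, v in groups.items()}, len(leftovers))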

History of sequencing technology 

Fifth, the appropriate technique for DNA sequencing should be chosen. The classic Sanger chain-termination method relies on the selective incorporation of chain-elongating inhibitors of DNA polymerase during DNA replication; the resulting fragments are separated by size using electrophoresis and identified by laser detection. The Sanger method can produce only a single read at a time and is therefore suitable for generating DNA barcodes from substrates that contain only a single species. Emerging technologies such as nanopore sequencing have helped reduce the cost of DNA sequencing from about USD 30,000 per megabase in 2002 to about USD 0.60 in 2016. Modern next-generation sequencing (NGS) technologies can handle thousands to millions of reads in parallel and are therefore suitable for mass identification of a mix of different species present in a substrate, summarized as metabarcoding.

Finally, bioinformatic analyses need to be carried out to match the DNA barcodes obtained with Barcode Index Numbers (BINs) in reference libraries. Each BIN, or barcode cluster, can be identified to species level when it shows high (>97%) concordance with DNA barcodes linked to a species in a reference library; when taxonomic identification to the species level is still lacking, the cluster is treated as an operational taxonomic unit (OTU), which may correspond to a group of species (i.e., a genus, family or higher taxonomic rank). (See binning (metagenomics).) The results of the bioinformatics pipeline must be pruned, for example by filtering out unreliable singletons, superfluous duplicates, low-quality reads and/or chimeric reads. This is generally done by carrying out serial BLAST searches in combination with automatic filtering and trimming scripts. Standardized thresholds are needed to discriminate between different species and between correct and incorrect identifications.
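A minimal sketch of this matching step, under simplifying assumptions, is given below in Python: singleton clusters are discarded, each remaining cluster's representative sequence is compared against a small hypothetical reference library, and a species name is assigned only when identity exceeds 97%; otherwise the cluster is kept as an unnamed OTU. Production pipelines do this with BLAST searches against GenBank or BOLD rather than the toy comparison used here, and the reference sequences below are invented.

# Minimal sketch: assign barcode clusters to species by comparing their
# representative sequence against a small reference library at a >97% identity
# threshold, after discarding singleton clusters. Reference sequences are
# hypothetical placeholders.

from collections import Counter

REFERENCE = {
    "Myotis daubentonii": "ATGCGTACGTTAGCATGCAA",
    "Myotis mystacinus":  "ATGCGTTCGTAAGCATGCTA",
}

def identity(a, b):
    """Fraction of matching positions between two aligned, equal-length sequences."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def assign(cluster_seqs, threshold=0.97):
    """cluster_seqs maps cluster id -> list of member reads (aligned, equal length)."""
    results = {}
    for cid, members in cluster_seqs.items():
        if len(members) < 2:
            continue                              # filter out unreliable singletons
        representative = Counter(members).most_common(1)[0][0]
        best_name, best_id = None, 0.0
        for species, ref_seq in REFERENCE.items():
            score = identity(representative, ref_seq)
            if score > best_id:
                best_name, best_id = species, score
        if best_id > threshold:
            results[cid] = best_name              # species-level identification
        else:
            results[cid] = f"OTU_{cid}"           # no species match; keep as OTU
    return results

clusters = {
    0: ["ATGCGTACGTTAGCATGCAA", "ATGCGTACGTTAGCATGCAA"],
    1: ["CCCCGTACGTTAGCATGCAA"],                  # singleton, removed
}
print(assign(clusters))                           # {0: 'Myotis daubentonii'}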

Metabarcoding workflow

Despite the obvious power of the approach, eDNA metabarcoding is affected by precision and accuracy challenges distributed throughout the workflow: in the field, in the laboratory and at the keyboard. As set out in the diagram at the right, following the initial study design (hypothesis/question, targeted taxonomic group, etc.), the current eDNA workflow consists of three components: field, laboratory and bioinformatics. The field component consists of sample collection (e.g., water, sediment, air), with samples preserved or frozen prior to DNA extraction. The laboratory component has four basic steps: (i) DNA is concentrated (if not already done in the field) and purified, (ii) PCR is used to amplify a target gene or region, (iii) unique nucleotide sequences called "indexes" (also referred to as "barcodes") are incorporated using PCR or are ligated (bound) onto different PCR products, creating a "library" whereby multiple samples can be pooled together, and (iv) pooled libraries are then sequenced on a high-throughput machine. The final step after laboratory processing is to computationally process the output files from the sequencer using a robust bioinformatics pipeline.

Questions for consideration in the design and implementation phases of an environmental DNA metabarcoding study
 
Decisions involved in a molecular ecology workflow
Samples can be collected from a variety of different environments using appropriate collection techniques. DNA is then prepared and used to answer a variety of ecological questions: metabarcoding is used to answer questions about "who" is present, while the function of communities or individuals can be established using metagenomics, single-cell genomics or metatranscriptomics.

Method and visualisation

Visualization and diversity metrics from environmental sequencing data
a) Alpha diversity displayed as taxonomy bar charts, showing relative abundance of taxa across samples using the Phinch data visualization framework (Bik & Pitch Interactive 2014).
b) Beta diversity patterns illustrated via Principal Coordinate Analyses (PCoA) carried out in QIIME, where each dot represents a sample and colors distinguish different classes of sample. The closer two sample points are in 3D space, the more similar their community assemblages are (see the sketch below).
c) GraPhlAn phylogenetic visualization of environmental data, with circular heatmaps and abundance bars used to convey quantitative taxon traits.
d) Edge PCA, a tree‐based diversity metric that identifies specific lineages (green/orange branches) that contribute most to community changes observed in samples distributed across different PCA axes.
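The beta-diversity ordination in panel (b) can be sketched, under simplifying assumptions, with a few lines of Python: a hypothetical sample-by-taxon count table is converted into Bray-Curtis dissimilarities, which are then ordinated by classical PCoA (double-centring followed by eigendecomposition). QIIME and scikit-bio provide tested implementations of both steps; the counts below are illustrative only.

# Minimal sketch: Bray-Curtis dissimilarities followed by classical PCoA,
# using a hypothetical sample-by-taxon table of read counts.

import numpy as np

# rows = samples, columns = taxa (hypothetical read counts)
counts = np.array([
    [120,  30,   0,  10],
    [100,  40,   5,  12],
    [  2,   1, 150,  90],
])

def bray_curtis(x, y):
    return np.abs(x - y).sum() / (x + y).sum()

n = counts.shape[0]
D = np.array([[bray_curtis(counts[i], counts[j]) for j in range(n)] for i in range(n)])

# Classical PCoA: double-centre the squared dissimilarity matrix, then eigendecompose
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
eigvals, eigvecs = np.linalg.eigh(B)
order = np.argsort(eigvals)[::-1]                     # largest eigenvalues first
coords = eigvecs[:, order] * np.sqrt(np.clip(eigvals[order], 0, None))

print(coords[:, :2])   # first two principal coordinates; nearby points = similar communities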

The method requires each collected DNA sequence to be archived with its corresponding "type specimen" (one for each taxon), in addition to the usual collection data. These types are stored in specific institutions (museums, molecular laboratories, universities, zoological gardens, botanical gardens, herbaria, etc.), one for each country; in some cases the same institution is assigned to hold the types of more than one country, where some nations lack the technology or financial resources to do so.

In this way, the creation of type specimens of genetic codes represents a methodology parallel to that carried out by traditional taxonomy.

In a first stage, the region of DNA to be used for the barcode was defined. It had to be short and yield a high percentage of unique sequences. For animals, algae and fungi, a portion of the mitochondrial gene coding for subunit 1 of the cytochrome oxidase enzyme (CO1), a region of around 648 base pairs, has provided high percentages of unique sequences (about 95%).

In the case of plants, the use of CO1 has not been effective because they have low levels of variability in that region, in addition to difficulties produced by the frequent effects of polyploidy, introgression, and hybridization, so the chloroplast genome seems more suitable.

Applications

Pollinator networks

Pollination networks based on metabarcoding (left) and on visit surveys (right):
(a,b) plant-pollinator groups
(c,d) plant-pollinator species
(e,f) individual pollinator-plant species (Empis leptempis pandellei)
Apis: Apis mellifera; Bomb.: Bombus sp.; W.bee: wild bees; O.Hym.: other Hymenoptera; O.Dipt.: other Diptera; Emp.: Empididae; Syrph.: Syrphidae; Col.: Coleoptera; Lep.: Lepidoptera; Musc.: Muscidae.
Line thickness indicates the proportion of interactions.

The diagram on the right shows a comparison of pollination networks based on DNA metabarcoding with more traditional networks based on direct observations of insect visits to plants. By detecting numerous additional hidden interactions, metabarcoding data largely alters the properties of the pollination networks compared to visit surveys. Molecular data shows that pollinators are much more generalist than expected from visit surveys. However, pollinator species were composed of relatively specialized individuals and formed functional groups highly specialized upon floral morphs.

As a consequence of ongoing global changes, a dramatic and parallel worldwide decline in pollinators and animal-pollinated plant species has been observed. Understanding the responses of pollination networks to these declines is urgently required to diagnose the risks ecosystems may incur, as well as to design and evaluate the effectiveness of conservation actions. Early studies of animal pollination dealt with simplified systems, i.e. specific pairwise interactions, or involved small subsets of plant-animal communities. However, the impacts of disturbances occur through highly complex interaction networks, and these complex systems are now a major research focus. Assessing the true networks (determined by ecological processes) from field surveys that are subject to sampling effects still presents challenges.

Recent research has clearly benefited from network concepts and tools to study the interaction patterns in large species assemblages. These studies showed that plant-pollinator networks are highly structured, deviating significantly from random associations. Commonly, networks have (1) a low connectance (the realized fraction of all potential links in the community), suggesting a low degree of generalization; (2) a high nestedness (more-specialist species tend to interact with proper subsets of the species that more-generalist species interact with); (3) a cumulative distribution of connectivity (number of links per species, s) that follows a power law or a truncated power law function, characterized by a few supergeneralists with more links than expected by chance and many specialists; and (4) a modular organization. A module is a group of plant and pollinator species that exhibits high levels of within-module connectivity and that is poorly connected to species of other groups.
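To make properties (1) and (3) concrete, the short Python sketch below computes connectance and per-species connectivity from a small, hypothetical plant-pollinator incidence matrix; the matrix values are illustrative only.

# Minimal sketch: connectance and per-species connectivity (degree) of a
# bipartite plant-pollinator network, using a hypothetical incidence matrix
# (rows = plants, columns = pollinators, 1 = interaction observed).

import numpy as np

web = np.array([
    [1, 1, 1, 1, 0],   # a generalist plant
    [1, 0, 0, 0, 0],   # a specialist plant
    [1, 1, 0, 0, 0],
    [1, 0, 0, 0, 1],
])

links = web.sum()
n_plants, n_pollinators = web.shape
connectance = links / (n_plants * n_pollinators)   # realized fraction of possible links

plant_degree = web.sum(axis=1)        # links per plant species (s)
pollinator_degree = web.sum(axis=0)   # links per pollinator species (s)

print(f"connectance = {connectance:.2f}")
print("plant connectivity:", plant_degree)
print("pollinator connectivity:", pollinator_degree)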

The low level of connectivity and the high proportion of specialists in pollination networks contrast with the view that generalization, rather than specialization, is the norm in networks. Indeed, most plant species are visited by a diverse array of pollinators, which exploit floral resources from a wide range of plant species. A main cause invoked to explain this apparent contradiction is the incomplete sampling of interactions. Indeed, most network properties are highly sensitive to sampling intensity and network size. Network studies are basically phytocentric, i.e. based on observations of pollinator visits to flowers. This plant-centered approach nevertheless suffers from inherent limitations which may hamper understanding of the mechanisms contributing to community assembly and biodiversity patterns. First, direct observations of pollinator visits to certain taxa such as orchids are often scarce, and rare interactions are very difficult to detect in the field in general. Pollinator and plant communities usually consist of a few abundant species and many rare species that are poorly recorded in visit surveys. These rare species appear as specialists, whereas in fact they could be typical generalists. Because of the positive relationship between interaction frequency (f) and connectivity (s), undersampled interactions may lead to overestimating the degree of specialization in networks. Second, network analyses have mostly operated at the species level. Networks have very rarely been scaled up to functional groups or scaled down to individual-based networks, and most of the latter have focused on one or two species only. The behavior of either individuals or colonies is commonly ignored, although it may influence the structure of species networks. Species counted as generalists in species networks could therefore comprise cryptic specialized individuals or colonies. Third, flower visitors are by no means always effective pollinators, as they may deposit no conspecific pollen and/or a lot of heterospecific pollen. Animal-centered approaches based on the investigation of pollen loads on visitors and plant stigmas may be more efficient at revealing plant-pollinator interactions.

Disentangling food webs

Arthropod predators and vertebrate predators in a millet field 

(A) Trophic network of arthropod and vertebrate predators; arrows represent biomass flow between predators and prey.
(B) Intraguild interactions among arthropod predators, parasitoids of arthropods, and insectivorous vertebrates.

Metabarcoding offers new opportunities for deciphering trophic linkages between predators and their prey within food webs. Compared with traditional, time-consuming methods such as microscopic or serological analyses, DNA metabarcoding allows the identification of prey species without prior knowledge of the predator's prey range. In addition, metabarcoding can be used to characterize a large number of species in a single PCR reaction and to analyze several hundred samples simultaneously. Such an approach is increasingly used to explore the functional diversity and structure of food webs in agroecosystems. Like other molecular approaches, metabarcoding only gives qualitative results on the presence or absence of prey species in gut or faecal samples. However, this knowledge of the identity of prey consumed by predators of the same species in a given environment provides a "pragmatic and useful surrogate for truly quantitative information".

In food web ecology, "who eats whom" is a fundamental issue for gaining a better understanding of the complex trophic interactions existing between pests and their natural enemies within a given ecosystem. The dietary analysis of arthropod and vertebrate predators allows the identification of key predators involved in the natural control of arthropod pests and gives insights into the breadth of their diet (generalist vs. specialist) and intraguild predation.

The diagram on the right summarises results from a 2020 study which used metabarcoding to untangle the functional diversity and structure of the food web associated with a pair of millet fields in Senegal. After the identified OTUs were assigned to species, 27 arthropod prey taxa were identified from nine arthropod predators. The mean number of prey taxa detected per sample was highest in carabid beetles, ants and spiders, and lowest in the remaining predators, including anthocorid bugs, pentatomid bugs, and earwigs. Across predatory arthropods, a high diversity of arthropod prey was observed in spiders, carabid beetles, ants, and anthocorid bugs. In contrast, the diversity of prey species identified in earwigs and pentatomid bugs was relatively low. Lepidoptera, Hemiptera, Diptera and Coleoptera were the most common insect prey taxa detected from predatory arthropods.

Conserving functional biodiversity and related ecosystem services, especially by controlling pests using their natural enemies, offers new avenues for tackling the challenges of sustainably intensifying food production systems. Predation of crop pests by generalist predators, including arthropods and vertebrates, is a major component of natural pest control. A particularly important trait of most generalist predators is that they can colonize crops early in the season by first feeding on alternative prey. However, the breadth of the "generalist" diet entails some drawbacks for pest control, such as intraguild predation. An accurate diagnosis of diet breadth in generalist predators, including predation of non-pest prey, is thus needed to better disentangle food webs (e.g., exploitation competition and apparent competition) and ultimately to identify the key drivers of natural pest control in agroecosystems. However, the importance of generalist predators in the food web is generally difficult to assess, due to the ephemeral nature of individual predator-prey interactions. The only conclusive evidence of predation comes from direct observation of prey consumption, identification of prey residues within predators' guts, and analyses of regurgitates or feces.

Marine biosecurity

Metabarcoding eDNA and eRNA in marine biosecurity
Global biodiversity of operational taxonomic units (OTUs) for DNA-only, shared eDNA/eRNA, and RNA-only datasets. Charts show the relative abundance of sequences at highest assigned taxonomic levels.
 
Tunicate colony of Didemnum vexillum
 
Species like these survive passage through unfiltered pumping systems

The spread of non-indigenous species (NIS) represents a significant and increasing risk to ecosystems. In marine systems, NIS that survive transport and adapt to new locations can have significant adverse effects on local biodiversity, including the displacement of native species and shifts in biological communities and associated food webs. Once NIS are established, they are extremely difficult and costly to eradicate, and further regional spread may occur through natural dispersal or via anthropogenic transport pathways. While vessel hull fouling and ships' ballast water are well known as important anthropogenic pathways for the international spread of NIS, comparatively little is known about the potential of regionally transiting vessels to contribute to the secondary spread of marine pests through bilge water translocation.

Recent studies have revealed that the water and associated debris entrained in the bilge spaces of small vessels (<20 m) can act as a vector for the spread of NIS at regional scales. Bilge water is defined as any water retained on a vessel (other than ballast) that is not deliberately pumped on board. It can accumulate on or below the vessel's deck (e.g., under floor panels) through a variety of mechanisms, including wave action, leaks, entry via the propeller stern glands, and the loading of items such as diving, fishing, aquaculture or scientific equipment. Bilge water may therefore contain seawater as well as living organisms at various life stages, cell debris and contaminants (e.g., oil, dirt, detergent), all of which are usually discharged using automatic bilge pumps or self-drained using duckbill valves. Bilge water pumped from small vessels (manually or automatically) is not usually treated prior to discharge to sea, in contrast with larger vessels, which are required to separate oil and water using filtration systems, centrifugation, or carbon absorption. If propagules remain viable through this process, the discharge of bilge water may result in the spread of NIS.

In 2017, Fletcher et al. used a combination of laboratory and field experiments to investigate the diversity, abundance, and survival of biological material contained in bilge water samples taken from small coastal vessels. Their laboratory experiment showed that ascidian colonies or fragments, and bryozoan larvae, can survive passage through an unfiltered pumping system largely unharmed. They also conducted the first morpho-molecular assessment (using eDNA metabarcoding) of the biosecurity risk posed by bilge water discharged from 30 small vessels (sailboats and motorboats) of various origins and sailing times. Using eDNA metabarcoding they characterised approximately three times more taxa than via traditional microscopic methods, including the detection of five species recognised as non-indigenous in the study region.

To assist in understanding the risks associated with different NIS introduction vectors, traditional microscope-based biodiversity assessments are increasingly being complemented by eDNA metabarcoding. This allows a wide range of diverse taxonomic assemblages, at many life stages, to be identified, and can also enable the detection of NIS that may have been overlooked using traditional methods. Despite the great potential of eDNA metabarcoding tools for broad-scale taxonomic screening, a key challenge for eDNA in the context of environmental monitoring of marine pests, and particularly when monitoring enclosed environments such as some bilge spaces or ballast tanks, is differentiating dead from viable organisms. Extracellular DNA can persist in dark or cold environments for extended periods of time (months to years), so many of the organisms detected using eDNA metabarcoding may not have been viable at the location of sample collection for days or weeks. In contrast, ribonucleic acid (RNA) deteriorates rapidly after cell death, likely providing a more accurate representation of viable communities. Recent metabarcoding studies have explored the use of co-extracted eDNA and eRNA molecules for monitoring benthic sediment samples around marine fish farms and oil drilling sites, and have collectively found slightly stronger correlations between biological and physico-chemical variables along impact gradients when using eRNA. From a marine biosecurity perspective, the detection of living NIS may represent a more serious and immediate threat than detection of NIS based purely on a DNA signal. Environmental RNA may therefore offer a useful method for identifying living organisms in samples.

Miscellaneous

The construction of the genetic barcode library was initially focused on fish and birds, which were followed by butterflies and other invertebrates. In the case of birds, the DNA sample is usually obtained from the chest.

Researchers have already developed specific catalogs for large animal groups, such as bees, birds, mammals and fish. Another use is to analyze the complete zoocenosis of a given geographic area, as in the "Polar Life Bar Code" project, which aims to collect the genetic traits of all organisms living in the polar regions at both poles of the Earth. A related approach is the barcoding of the entire ichthyofauna of a hydrographic basin, for example the project begun in the Rio São Francisco basin in northeastern Brazil.

The potential uses of barcodes are very wide, including the discovery of numerous cryptic species (which has already yielded positive results), the identification of species at any stage of their life, and the secure identification of protected species that are being illegally trafficked.

Potentials and shortcomings

A region of the gene for the cytochrome c oxidase enzyme is used to distinguish species in the Barcode of Life Data Systems database.

Potentials

DNA barcoding has been proposed as a way to distinguish species suitable even for non-specialists to use.

Shortcomings

In general, the shortcomings of DNA barcoding also apply to metabarcoding. One particular drawback of metabarcoding studies is that there is no consensus yet regarding the optimal experimental design and bioinformatics criteria to be applied in eDNA metabarcoding. However, there are current joint attempts, such as the EU COST network DNAqua-Net, to move forward by exchanging experience and knowledge to establish best-practice standards for biomonitoring.

The so-called barcode is a region of mitochondrial DNA within the gene for cytochrome c oxidase. A database, Barcode of Life Data Systems (BOLD), contains DNA barcode sequences from over 190,000 species. However, scientists such as Rob DeSalle have expressed concern that classical taxonomy and DNA barcoding, which they consider a misnomer, need to be reconciled, as they delimit species differently. Genetic introgression mediated by endosymbionts and other vectors can further make barcodes ineffective in the identification of species.

Status of barcode species

In microbiology, genes can move freely even between distantly related bacteria, possibly extending to the whole bacterial domain. As a rule of thumb, microbiologists have assumed that kinds of Bacteria or Archaea with 16S ribosomal RNA gene sequences more similar than 97% to each other need to be checked by DNA-DNA hybridisation to decide if they belong to the same species or not. This concept was narrowed in 2006 to a similarity of 98.7%.

DNA-DNA hybridisation is outdated, and results have sometimes led to misleading conclusions about species, as with the pomarine and great skua. Modern approaches compare sequence similarity using computational methods.

Sunday, March 20, 2022

Digitization

From Wikipedia, the free encyclopedia

Digitization is the process of converting information into a digital (i.e. computer-readable) format. The result is the representation of an object, image, sound, document or signal (usually an analog signal) obtained by generating a series of numbers that describe a discrete set of points or samples. The result is called a digital representation or, more specifically, a digital image for the object and a digital form for the signal. In modern practice, the digitized data is in the form of binary numbers, which facilitates processing by digital computers and other operations, but strictly speaking digitizing simply means the conversion of analog source material into a numerical format; the decimal or any other number system can be used instead.

Digitization is of crucial importance to data processing, storage and transmission, because it "allows information of all kinds in all formats to be carried with the same efficiency and also intermingled". Though analog data is typically more stable, digital data has the potential to be more easily shared and accessed and, in theory, can be propagated indefinitely without generation loss, provided it is migrated to new, stable formats as needed. This potential has led to institutional digitization projects designed to improve access, and to the rapid growth of the digital preservation field.

Digitization and digital preservation are sometimes mistaken for the same thing; they are different, although digitization is often a vital first step in digital preservation. Libraries, archives, museums and other memory institutions digitize items to preserve fragile materials and to create more access points for patrons. Doing this creates challenges for information professionals, and solutions can be as varied as the institutions that implement them. Some analog materials, such as audio and video tapes, are nearing the end of their life cycle, and it is important to digitize them before equipment obsolescence and media deterioration make the data irretrievable.

There are challenges and implications surrounding digitization including time, cost, cultural history concerns and creating an equitable platform for historically marginalized voices. Many digitizing institutions develop their own solutions to these challenges.

Mass digitization projects have had mixed results over the years, but some institutions have had success even if not in the traditional Google Books model.

Technological changes can happen often and quickly, so digitization standards are difficult to keep updated. Professionals in the field can attend conferences and join organizations and working groups to keep their knowledge current and add to the conversation.

Process

The term digitization is often used when diverse forms of information, such as an object, text, sound, image or voice, are converted into a single binary code. The core of the process is the compromise between the capturing device and the playback device, so that the rendered result represents the original source with the greatest possible fidelity. The advantage of digitization is the speed and accuracy with which this form of information can be transmitted, with no degradation compared with analog information.

Digital information exists as one of two digits, either 0 or 1. These are known as bits (a contraction of binary digits), and groups of eight bits are called bytes.

Analog signals are continuously variable, both in the number of possible values of the signal at a given time, as well as in the number of points in the signal in a given period of time. However, digital signals are discrete in both of those respects – generally a finite sequence of integers – therefore a digitization can, in practical terms, only ever be an approximation of the signal it represents.

Digitization occurs in two parts:

Discretization
The reading of an analog signal A and, at regular time intervals (the sampling frequency), sampling of the value of the signal at that point. Each such reading is called a sample and may be considered to have infinite precision at this stage;
Quantization
Samples are rounded to a fixed set of numbers (such as integers), a process known as quantization.

In general, these can occur at the same time, though they are conceptually distinct.

A series of digital integers can be transformed into an analog output that approximates the original analog signal. Such a transformation is called a digital-to-analog (DA) conversion. The sampling rate and the number of bits used to represent the integers together determine how closely the digitization approximates the analog signal.
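A minimal Python sketch of the two steps described above, using an illustrative 5 Hz sine wave as the "analog" source, a sampling rate of 100 samples per second and 8-bit quantization (all parameters chosen only for the example):

# Minimal sketch of digitization: sampling at regular intervals, then
# quantizing each sample to an 8-bit integer, followed by a crude DA conversion.

import numpy as np

sample_rate = 100          # samples per second
bit_depth = 8              # number of bits per sample
duration = 1.0             # seconds

# Discretization: read the signal value at regular time intervals
t = np.arange(0, duration, 1 / sample_rate)
analog = np.sin(2 * np.pi * 5 * t)                  # "analog" signal, values in [-1, 1]

# Quantization: round each sample to one of 2**bit_depth integer levels
levels = 2 ** bit_depth
digital = np.round((analog + 1) / 2 * (levels - 1)).astype(int)   # integers 0..255

# DA conversion: map the integers back to approximate analog values
reconstructed = digital / (levels - 1) * 2 - 1
max_error = np.abs(reconstructed - analog).max()
print(f"{len(digital)} samples, worst-case quantization error ~ {max_error:.4f}")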

Examples

Digitization of the first issue of the Estonian popular science magazine Horisont, published in January 1967.

The term is used to describe, for example, the scanning of analog sources (such as printed photos or taped videos) into computers for editing, 3D scanning that creates 3D modeling of an object's surface, and audio (where sampling rate is often measured in kilohertz) and texture map transformations. In this last case, as in normal photos, the sampling rate refers to the resolution of the image, often measured in pixels per inch.

Digitizing is the primary way of storing images in a form suitable for transmission and computer processing, whether scanned from two-dimensional analog originals or captured using an image sensor-equipped device such as a digital camera, tomographical instrument such as a CAT scanner, or acquiring precise dimensions from a real-world object, such as a car, using a 3D scanning device.

Digitizing is central to making digital representations of geographical features, using raster or vector images, in a geographic information system, i.e., the creation of electronic maps, either from various geographical and satellite imaging (raster) or by digitizing traditional paper maps or graphs (vector).

"Digitization" is also used to describe the process of populating databases with files or data. While this usage is technically inaccurate, it originates with the previously proper use of the term to describe that part of the process involving digitization of analog sources, such as printed pictures and brochures, before uploading to target databases.

Digitizing may also be used in the field of apparel, where an image may be recreated with the help of embroidery digitizing software tools and saved as embroidery machine code. This machine code is fed into an embroidery machine and applied to the fabric. The most widely supported format is the DST file. Apparel companies also digitize clothing patterns.

History

  • 1957 Russell Kirsch used a rotating drum scanner and photomultiplier connected to the Standards Eastern Automatic Computer (SEAC) to create the first digital image (176x176 pixels) from a photo of his infant son. The image was stored in SEAC memory via a staticizer and viewed on a cathode ray oscilloscope.
  • 1971 Invention of Charge-Coupled Devices that made conversion from analog data to a digital format easy.
  • 1986 work started on the JPEG format.
  • 1990s Libraries began scanning collections to provide access via the world wide web.

Analog signals to digital

Analog signals are continuous electrical signals; digital signals are non-continuous. Analog signals can be converted to digital signals by using an analog-to-digital converter.

The process of converting analog to digital consists of two parts: sampling and quantizing. Sampling measures the wave's amplitude at regular intervals and assigns each measurement a numerical value, while quantizing rounds measurements that fall between the representable binary values up or down to the nearest level.

Nearly all recorded music has been digitized, and about 12 percent of the 500,000+ movies listed on the Internet Movie Database are digitized and were released on DVD.

Digitization of home movies, slides, and photographs is a popular method of preserving and sharing personal multimedia. Slides and photographs may be scanned quickly using an image scanner, but analog video requires a video tape player to be connected to a computer while the item plays in real time. Slides can be digitized more quickly with a slide scanner such as the Nikon Coolscan 5000ED.

Another example of digitization is the VisualAudio process developed by the Swiss Fonoteca Nazionale in Lugano: by scanning a high-resolution photograph of a record, the archive is able to extract and reconstruct the sound from the processed image.

Digitization of analog tapes before they degrade, or after damage has already occurred, can rescue the only copies of local and traditional cultural music for future generations to study and enjoy.

Analog texts to digital

Image of a rare book in a book scanner where it will be digitized.
Book scanner in the digitization lab at the University of Liège, Belgium.

Academic and public libraries, foundations, and private companies like Google are scanning older print books and applying optical character recognition (OCR) technologies so they can be keyword searched, but as of 2006, only about 1 in 20 texts had been digitized. Librarians and archivists are working to increase this statistic and in 2019 began digitizing 480,000 books published between 1923 and 1964 that had entered the public domain.

Unpublished manuscripts and other rare papers and documents housed in special collections are being digitized by libraries and archives, but backlogs often slow this process and keep materials with enduring historical and research value hidden from most users (see digital libraries). Digitization has not completely replaced other archival imaging options, such as microfilming which is still used by institutions such as the National Archives and Records Administration (NARA) to provide preservation and access to these resources.

While digital versions of analog texts can potentially be accessed from anywhere in the world, they are not as stable as most print materials or manuscripts and are unlikely to be accessible decades from now without further preservation efforts, whereas many books, manuscripts and scrolls have already survived for centuries. However, for some materials that have been damaged by water, insects, or catastrophes, digitization might be the only option for continued use.

Library preservation

In the context of libraries, archives, and museums, digitization, the creation of digital surrogates of analog materials such as books, newspapers, microfilm and videotapes, offers a variety of benefits, including increasing access, especially for patrons at a distance; contributing to collection development through collaborative initiatives; enhancing the potential for research and education; and supporting preservation activities. Digitization can preserve the content of materials by creating an accessible facsimile of the object that puts less strain on already fragile originals. For sound, digitization of legacy analog recordings is essential insurance against technological obsolescence. A fundamental aspect of planning digitization projects is to ensure that the digital files themselves are preserved and remain accessible; the term "digital preservation," in its most basic sense, refers to the array of activities undertaken to maintain access to digital materials over time.

The prevalent Brittle Books issue facing libraries across the world is being addressed with a digital solution for long-term book preservation. Since the mid-1800s, books have been printed on wood-pulp paper, which turns acidic as it decays. Deterioration may advance to the point where a book is completely unusable. In theory, if these widely circulated titles are not treated with de-acidification processes, the material on those acidic pages will be lost. As digital technology evolves, it is increasingly preferred as a method of preserving these materials, mainly because it can provide easier access points and significantly reduce the need for physical storage space.

Cambridge University Library is working on the Cambridge Digital Library, which will initially contain digitised versions of many of its most important works relating to science and religion. These include examples such as Isaac Newton's personally annotated first edition of his Philosophiæ Naturalis Principia Mathematica as well as college notebooks and other papers, and some Islamic manuscripts such as a Quran from Tipu Sahib's library.

Google, Inc. has taken steps towards attempting to digitize every title with "Google Book Search". While some academic libraries have been contracted by the service, issues of copyright law violations threaten to derail the project. However, it does provide – at the very least – an online consortium for libraries to exchange information and for researchers to search for titles as well as review the materials.

Digitization versus digital preservation

Digitizing something is not the same as digitally preserving it. To digitize something is to create a digital surrogate (copy or format) of an existing analog item (book, photograph, or record); this is often described as converting it from analog to digital, although both copies remain. An example would be scanning a photograph and keeping the original in a photo album while a digital copy is saved to a computer. Digitization is essentially the first step in digital preservation, which is to maintain the digital copy over a long period of time and to make sure it remains authentic and accessible.

Digitization is done once with the technology currently available, while digital preservation is more complicated because technology changes so quickly that a once popular storage format may become obsolete before it physically fails. An example is the 5 1/4" floppy disk: computers are no longer made with such drives, and obtaining the hardware to convert a file stored on a 5 1/4" floppy disc can be expensive. To combat this risk, equipment must be upgraded as newer technology becomes affordable (about every 2 to 5 years), but before older technology becomes unobtainable (about 5 to 10 years).

Digital preservation can also apply to born-digital material, such as a Microsoft Word document or a social media post. In contrast, digitization applies exclusively to analog materials. Born-digital materials present a unique challenge to digital preservation, not only because of technological obsolescence but also because of the inherently unstable nature of digital storage and maintenance. Most websites last between 2.5 and 5 years, depending on the purpose for which they were designed.

The Library of Congress provides numerous resources and tips for individuals looking to practice digitization and digital preservation for their personal collections.

Digital reformatting

Digital reformatting is the process of converting analog materials into a digital format as a surrogate of the original. The digital surrogates perform a preservation function by reducing or eliminating the use of the original. Digital reformatting is guided by established best practices to ensure that materials are being converted at the highest quality.

Digital reformatting at the Library of Congress

The Library of Congress has been actively reformatting materials for its American Memory project and developed best standards and practices pertaining to book handling during the digitization process, scanning resolutions, and preferred file formats. Some of these standards are:

  • The use of ISO 16067-1 and ISO 16067-2 standards for resolution requirements.
  • Recommended 400 ppi resolution for OCR'ed printed text.
  • The use of 24-bit color when color is an important attribute of a document.
  • The use of the scanning device's maximum resolution for digitally reproducing photographs.
  • TIFF as the standard file format.
  • Attachment of descriptive, structural, and technical metadata to all digitized documents.

A list of archival standards for digital preservation can be found on the ARL website.

The Library of Congress has constituted a Preservation Digital Reformatting Program. The three main components of the program are:

  • Selection Criteria for digital reformatting
  • Digital reformatting principles and specifications
  • Life cycle management of LC digital data

Audio digitization and reformatting

Audio media offers a rich source of historic ethnographic information, with the earliest forms of recorded sound dating back to 1890. According to the International Association of Sound and Audiovisual Archives (IASA), these sources of audio data, as well as the aging technologies used to play them back, are in imminent danger of permanent loss due to degradation and obsolescence. These primary sources are called “carriers” and exist in a variety of formats, including wax cylinders, magnetic tape, and flat discs of grooved media, among others. Some formats are susceptible to more severe, or quicker, degradation than others. For instance, lacquer discs suffer from delamination. Analog tape may deteriorate due to sticky shed syndrome.

1/4" analog tape being played back on a Studer A810 tape machine for digitization at Smithsonian Folkways Recordings.

Archival workflows and file standards have been developed to minimize the loss of information from the original carrier to the resulting digital file as digitization is underway. For most at-risk formats (magnetic tape, grooved cylinders, etc.), a similar workflow can be observed. Examination of the source carrier will help determine what, if any, steps need to be taken to repair the material prior to transfer. A similar inspection must be undertaken for the playback machines. If satisfactory conditions are met for both carrier and playback machine, the transfer can take place, mediated by an analog-to-digital converter. The digital signal is then represented visually for the transfer engineer by a digital audio workstation, such as Audacity, WaveLab, or Pro Tools. Reference access copies can be made at smaller sample rates. For archival purposes, it is standard to transfer at a sample rate of 96 kHz and a bit depth of 24 bits per channel.

Challenges

Many libraries, archives, museums, and other memory institutions struggle with catching up and staying current regarding digitization, and with the expectation that everything should already be online. The time spent planning, doing the work, and processing the digital files, along with the expense involved and the fragility of some materials, are some of the most common challenges.

Time spent

Digitization is a time-consuming process, even more so when the condition or format of the analog resources requires special handling. Deciding what part of a collection to digitize can sometimes take longer than digitizing it in its entirety. Each digitization project is unique and workflows for one will be different from every other project that goes through the process, so time must be spent thoroughly studying and planning each one to create the best plan for the materials and the intended audience.

Expense

Cost of equipment, staff time, metadata creation, and digital storage media make large scale digitization of collections expensive for all types of cultural institutions.

Ideally, all institutions want their digital copies to have the best image quality so that a high-quality copy can be maintained over time. However, smaller institutions may not be able to afford such equipment or manpower, which limits how much material can be digitized, so archivists and librarians must know what their patrons need and prioritize digitization of those items. Often the cost of the time and expertise involved in describing materials and adding metadata is greater than the cost of the digitization process itself.

Fragility of materials

Some materials, such as brittle books, are so fragile that undergoing the process of digitization could damage them irreparably. Despite potential damage, one reason for digitizing fragile materials is because they are so heavily used that creating a digital surrogate will help preserve the original copy long past its expected lifetime and increase access to the item.

Copyright

Copyright is not only a problem faced by projects like Google Books, but also by institutions that may need to contact private citizens or institutions mentioned in archival documents for permission to scan the items for digital collections. It can be time-consuming to make sure all potential copyright holders have given permission, and if copyright cannot be determined or cleared, it may be necessary to restrict even digital materials to in-library use.

Solutions

Institutions can make digitization more cost-effective by planning before a project begins, including outlining what they hope to accomplish and the minimum amount of equipment, time, and effort that can meet those goals. If a budget needs more money to cover the cost of equipment or staff, an institution might investigate if grants are available.

Collaboration

Collaborations between institutions have the potential to save money on equipment, staff, and training as individual members share their equipment, manpower, and skills rather than pay outside organizations to provide these services. Collaborations with donors can build long-term support of current and future digitization projects.

Outsourcing

Outsourcing can be an option if an institution does not want to invest in equipment but since most vendors require an inventory and basic metadata for materials, this is not an option for institutions hoping to digitize without processing.

Non-traditional staffing

Many institutions have the option of using volunteers, student employees, or temporary employees on projects. While this saves on staffing costs, it can add costs elsewhere such as on training or having to re-scan items due to poor quality.

MPLP

One way to save time and resources is by using the More Product, Less Process (MPLP) method to digitize materials while they are being processed. Since GLAM (Galleries, Libraries, Archives, and Museums) institutions are already committed to preserving analog materials from special collections, digital access copies do not need to be high-resolution preservation copies, just good enough to provide access to rare materials. Sometimes institutions can get by with 300 dpi JPGs rather than a 600 dpi TIFF for images, and a 300 dpi grayscale scan of a document rather than a color one at 600 dpi.

Digitizing marginalized voices

Digitization can be used to highlight voices of historically marginalized peoples and add them to the greater body of knowledge. Many projects, some community archives created by members of those groups, are doing this in a way that supports the people, values their input and collaboration, and gives them a sense of ownership of the collection. Examples of projects are Gi-gikinomaage-min and the South Asian American Digital Archive (SAADA).

Gi-gikinomaage-min

Gi-gikinomaage-min is Anishinaabemowin for "We are all teachers", and the project's main purpose is "to document the history of Native Americans in Grand Rapids, Michigan." It combines new audio and video oral histories with digitized flyers, posters, and newsletters from Grand Valley State University's analog collections. Although not an entirely newly digitized project, it also added item-level metadata to enhance context. From the start, collaboration between several university departments and the Native American population was deemed important and remained strong throughout the project.

SAADA

The South Asian American Digital Archive (SAADA) has no physical building; it is entirely digital and everything is handled by volunteers. The archive was started by Michelle Caswell and Samip Mallick and collects a broad variety of materials "created by or about people residing in the United States who trace their heritage to Bangladesh, Bhutan, India, Maldives, Nepal, Pakistan, Sri Lanka, and the many South Asian diaspora communities across the globe" (Caswell, 2015, 2). The collection of digitized items includes private, government, and university-held materials.

Black Campus Movement Collection (BCM)

Kent State University began its BCM collection when it acquired the papers of African American alumnus Lafayette Tolliver, which included about 1,000 photographs chronicling the black student experience at Kent State from 1968 to 1971. The collection continues to add materials from the 1960s up to and including the current student body, and several oral histories have been added since it debuted. When the items were digitized, it was necessary to work with alumni to create descriptions for the images. This collaboration led to changes in the local controlled vocabularies the libraries use to create metadata for the images.

Mass digitization

The expectation that everything should be online has led to mass digitization practices, but this is an ongoing process with obstacles that have led to alternatives. As new technology makes automated scanning safer for materials and decreases the need for cropping and de-skewing, mass digitization should be able to increase.

Obstacles

Digitization can be a physically slow process involving selection and preparation of collections that can take years if materials need to be compared for completeness or are vulnerable to damage. The price of specialized equipment, storage costs, website maintenance, quality control, and retrieval system limitations all add to the problems of working at a large scale.

Successes

Digitization on demand

Scanning materials as users ask for them provides copies for others to use and cuts down on repeated copying of popular items. If one part of a folder, document, or book is asked for, scanning the entire object can save time in the future, since the material is already accessible if someone else needs it. Digitizing on demand can increase volume because time that would have been spent on selection and preparation is used for scanning instead.

Google Books

From the start, Google has concentrated on text rather than images or special collections. Although criticized in the past for poor image quality, selection practices, and a lack of long-term preservation plans, its focus on quantity over quality has enabled Google to digitize more books than other digitizers.

Standards

Digitization is not a static field and standards change with new technology, so it is up to digitization managers to stay current with new developments. Although each digitization project is different, common standards in formats, metadata, quality, naming, and file storage should be used to give the best chance of interoperability and patron access. As digitization is often the first step in digital preservation, questions about how to handle digital files should be addressed in institutional standards.

Resources to create local standards are available from the Society of American Archivists, the Smithsonian, and the Northeast Document Conservation Center.

Implications

Cultural Heritage Concerns

Digitization of community archives by indigenous and other marginalized people has led traditional memory institutions to reassess how they digitize and handle objects in their collections that may have ties to these groups. The topics being rethought are varied and include how items are chosen for digitization projects, what metadata to use to convey proper context so that items are retrievable by the groups they represent, and whether an item should be accessible to the world or only to those whom the groups originally intended to have access, such as elders. Many institutions navigate these concerns by collaborating with the communities they seek to represent through their digitized collections.

Lean philosophy

The broad use of the internet and the increasing popularity of lean philosophy have also increased the use and meaning of "digitizing" to describe improvements in the efficiency of organizational processes. Lean philosophy refers to the approach that considers any use of time and resources which does not lead directly to creating a product as waste, and therefore a target for elimination. This will often involve some kind of lean process intended to simplify process activities, with the aim of implementing new "lean and mean" processes by digitizing data and activities. Digitization can help to eliminate time waste by introducing wider access to data, or by the implementation of enterprise resource planning systems.

Fiction

Works of science fiction often use the term digitize for the act of transforming people into digital signals and sending them into digital technology. When that happens, the people disappear from the real world and appear in a virtual world (as featured in the cult film Tron, the animated series Code: Lyoko, and the late-1980s live-action series Captain Power and the Soldiers of the Future). In the video game Beyond Good & Evil, the protagonist's holographic friend digitizes the player's inventory items. One Super Friends cartoon episode showed Wonder Woman and Jayna freeing the world's men (including the male superheroes), who had been digitized onto computer tape by the female villainess Medula.
