Sunday, March 20, 2022

Digitization

From Wikipedia, the free encyclopedia

Digitization is the process of converting information into a digital (i.e. computer-readable) format. The result is the representation of an object, image, sound, document or signal (usually an analog signal) as a series of numbers that describe a discrete set of points or samples. The result is called a digital representation or, more specifically, a digital image for the object and digital form for the signal. In modern practice, the digitized data is in the form of binary numbers, which facilitates processing by digital computers and other operations, but strictly speaking digitizing simply means the conversion of analog source material into a numerical format; the decimal or any other number system could be used instead.

Digitization is of crucial importance to data processing, storage and transmission, because it "allows information of all kinds in all formats to be carried with the same efficiency and also intermingled". Though analog data is typically more stable, digital data has the potential to be more easily shared and accessed and, in theory, can be propagated indefinitely without generation loss, provided it is migrated to new, stable formats as needed. This potential has led to institutional digitization projects designed to improve access, and to the rapid growth of the digital preservation field.

Digitization and digital preservation are sometimes mistaken for the same thing; they are distinct, though digitization is often a vital first step in digital preservation. Libraries, archives, museums and other memory institutions digitize items to preserve fragile materials and create more access points for patrons. Doing this creates challenges for information professionals, and solutions can be as varied as the institutions that implement them. Some analog materials, such as audio and video tapes, are nearing the end of their life-cycle, and it is important to digitize them before equipment obsolescence and media deterioration make the data irretrievable.

There are challenges and implications surrounding digitization including time, cost, cultural history concerns and creating an equitable platform for historically marginalized voices. Many digitizing institutions develop their own solutions to these challenges.

Mass digitization projects have had mixed results over the years, but some institutions have had success even if not in the traditional Google Books model.

Technological changes can happen often and quickly, so digitization standards are difficult to keep updated. Professionals in the field can attend conferences and join organizations and working groups to keep their knowledge current and add to the conversation.

Process

The term digitization is often used when diverse forms of information, such as an object, text, sound, image or voice, are converted into a single binary code. The core of the process is the compromise between the capturing device and the player device, so that the rendered result represents the original source with the greatest possible fidelity. The advantage of digitization is the speed and accuracy with which this form of information can be transmitted, with no degradation compared with analog information.

Digital information exists as one of two digits, either 0 or 1. These are known as bits (a contraction of binary digits), and a group of eight bits is called a byte.
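For instance, a single ASCII character occupies one byte, and Python can display its eight constituent bits directly (a minimal illustration, not part of any standard):

```python
# The letter "A" stored as one byte: eight bits, 01000001 in binary
b = "A".encode("ascii")
print(len(b), format(b[0], "08b"))   # 1 01000001
```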

Analog signals are continuously variable, both in the number of possible values of the signal at a given time, as well as in the number of points in the signal in a given period of time. However, digital signals are discrete in both of those respects – generally a finite sequence of integers – therefore a digitization can, in practical terms, only ever be an approximation of the signal it represents.

Digitization occurs in two parts:

Discretization
Reading an analog signal A and, at regular time intervals (the sampling frequency), recording the value of the signal at that point. Each such reading is called a sample and may be considered to have infinite precision at this stage;
Quantization
Samples are rounded to a fixed set of numbers (such as integers), a process known as quantization.

In general, these can occur at the same time, though they are conceptually distinct.
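The two steps can be sketched in a few lines of Python; this is a toy model, and the function names, signal, and parameter values are illustrative only:

```python
import math

def digitize(signal, duration, sample_rate, bits):
    """Discretize a continuous signal at regular time intervals, then
    quantize each sample to one of 2**bits integer levels in [-1, 1]."""
    step = 2.0 / (2 ** bits - 1)       # quantization step size
    samples = []
    for n in range(int(duration * sample_rate)):
        t = n / sample_rate            # discretization: sampling instant
        x = signal(t)                  # "infinite precision" reading
        samples.append(round((x + 1.0) / step))  # quantization: round to a level
    return samples

# One second of a 1 Hz sine wave, 8 samples per second, 3-bit depth
digital = digitize(lambda t: math.sin(2 * math.pi * t), 1.0, 8, 3)
print(digital)   # eight integers, each between 0 and 7
```

As the prose notes, both steps happen in one pass here even though they are conceptually distinct.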

A series of digital integers can be transformed into an analog output that approximates the original analog signal. Such a transformation is called a digital-to-analog (DA) conversion. The sampling rate and the number of bits used to represent each integer together determine how closely the digitization approximates the analog signal.
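The effect of bit depth on fidelity can be checked numerically. In this sketch (helper names are mine, not standard terminology), integers are mapped back to amplitudes and the worst-case round-trip error is measured; each extra bit roughly halves the gap between the approximation and the original value:

```python
def quantize(x, bits):
    """Round an amplitude in [-1, 1] to the nearest of 2**bits levels."""
    step = 2.0 / (2 ** bits - 1)
    return round((x + 1.0) / step)

def reconstruct(q, bits):
    """DA conversion: map a quantized integer back to an amplitude."""
    step = 2.0 / (2 ** bits - 1)
    return q * step - 1.0

def worst_error(bits, n=1000):
    """Largest round-trip error over n+1 evenly spaced test amplitudes."""
    xs = [i / n * 2 - 1 for i in range(n + 1)]
    return max(abs(x - reconstruct(quantize(x, bits), bits)) for x in xs)

# Worst-case error is about half the quantization step, so it shrinks
# as the bit depth grows
for bits in (4, 8, 16):
    print(f"{bits:2d} bits -> max error {worst_error(bits):.7f}")
```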

Examples

Digitization of the first issue of the Estonian popular science magazine Horisont, published in January 1967.

The term is used to describe, for example, the scanning of analog sources (such as printed photos or taped videos) into computers for editing, 3D scanning that creates 3D modeling of an object's surface, and audio (where sampling rate is often measured in kilohertz) and texture map transformations. In this last case, as in normal photos, the sampling rate refers to the resolution of the image, often measured in pixels per inch.

Digitizing is the primary way of storing images in a form suitable for transmission and computer processing, whether scanned from two-dimensional analog originals or captured using an image sensor-equipped device such as a digital camera, tomographical instrument such as a CAT scanner, or acquiring precise dimensions from a real-world object, such as a car, using a 3D scanning device.

Digitizing is central to making digital representations of geographical features, using raster or vector images, in a geographic information system, i.e., the creation of electronic maps, either from various geographical and satellite imaging (raster) or by digitizing traditional paper maps or graphs (vector).

"Digitization" is also used to describe the process of populating databases with files or data. While this usage is technically inaccurate, it originates with the previously proper use of the term to describe that part of the process involving digitization of analog sources, such as printed pictures and brochures, before uploading to target databases.

Digitizing may also be used in the field of apparel, where an image may be recreated with the help of embroidery digitizing software tools and saved as embroidery machine code. This machine code is fed into an embroidery machine and applied to the fabric. The most widely supported format is the DST file. Apparel companies also digitize clothing patterns.

History

  • 1957 Russell Kirsch used a rotating drum scanner and photomultiplier connected to the Standards Eastern Automatic Computer (SEAC) to create the first digital image (176x176 pixels) from a photo of his infant son. This image was stored in SEAC memory via a staticizer and viewed via a cathode ray oscilloscope.
  • 1971 Charge-coupled devices (CCDs), invented in 1969, made conversion from analog data to a digital format easy.
  • 1986 Work began on the JPEG format.
  • 1990s Libraries began scanning collections to provide access via the world wide web.

Analog signals to digital

Analog signals are continuous electrical signals; digital signals are non-continuous. Analog signals can be converted to digital signals by using an analog-to-digital converter.

The process of converting analog to digital consists of two parts: sampling and quantizing. Sampling measures the wave's amplitude at regular intervals and assigns each measurement a numerical value, while quantizing rounds measurements that fall between representable binary values up or down.

Nearly all recorded music has been digitized, and about 12 percent of the 500,000+ movies listed on the Internet Movie Database are digitized and were released on DVD.

Digitization of home movies, slides, and photographs is a popular method of preserving and sharing personal multimedia. Slides and photographs may be scanned quickly using an image scanner, but analog video requires a video tape player to be connected to a computer while the item plays in real time. Slides can be digitized more quickly with a dedicated slide scanner such as the Nikon Coolscan 5000ED.

Another example of digitization is the VisualAudio process developed by the Swiss Fonoteca Nazionale in Lugano: by scanning a high-resolution photograph of a record, researchers can extract and reconstruct the sound from the processed image.

Digitization of analog tapes before they degrade, or after damage has already occurred, can rescue the only copies of local and traditional cultural music for future generations to study and enjoy.

Analog texts to digital

Image of a rare book in a book scanner where it will be digitized.
Book scanner in the digitization lab at the University of Liège, Belgium.

Academic and public libraries, foundations, and private companies like Google are scanning older print books and applying optical character recognition (OCR) technologies so they can be keyword searched, but as of 2006 only about 1 in 20 texts had been digitized. Librarians and archivists are working to raise that figure, and in 2019 began digitizing 480,000 books published between 1923 and 1964 that had entered the public domain.

Unpublished manuscripts and other rare papers and documents housed in special collections are being digitized by libraries and archives, but backlogs often slow this process and keep materials with enduring historical and research value hidden from most users (see digital libraries). Digitization has not completely replaced other archival imaging options, such as microfilming which is still used by institutions such as the National Archives and Records Administration (NARA) to provide preservation and access to these resources.

While digital versions of analog texts can potentially be accessed from anywhere in the world, they are not as stable as most print materials or manuscripts and are unlikely to be accessible decades from now without further preservation efforts, whereas many books, manuscripts, and scrolls have already survived for centuries. However, for some materials that have been damaged by water, insects, or catastrophes, digitization might be the only option for continued use.

Library preservation

In the context of libraries, archives, and museums, digitization, the creation of digital surrogates of analog materials such as books, newspapers, microfilm and videotapes, offers a variety of benefits: increasing access, especially for patrons at a distance; contributing to collection development through collaborative initiatives; enhancing the potential for research and education; and supporting preservation activities. Digitization can preserve the content of materials by creating an accessible facsimile of the object, putting less strain on already fragile originals. For sound, digitization of legacy analog recordings is essential insurance against technological obsolescence. A fundamental aspect of planning digitization projects is ensuring that the digital files themselves are preserved and remain accessible; the term "digital preservation," in its most basic sense, refers to the array of activities undertaken to maintain access to digital materials over time.

The prevalent brittle books issue facing libraries across the world is being addressed with a digital solution for long-term book preservation. Since the mid-1800s, books have been printed on wood-pulp paper, which turns acidic as it decays. Deterioration may advance to the point where a book is completely unusable. Unless these widely circulated titles are treated with de-acidification processes, the material on those acid pages will be lost. As digital technology evolves, it is increasingly preferred as a method of preserving these materials, mainly because it can provide easier access points and significantly reduce the need for physical storage space.

Cambridge University Library is working on the Cambridge Digital Library, which will initially contain digitised versions of many of its most important works relating to science and religion. These include examples such as Isaac Newton's personally annotated first edition of his Philosophiæ Naturalis Principia Mathematica as well as college notebooks and other papers, and some Islamic manuscripts such as a Quran from Tipu Sahib's library.

Google, Inc. has taken steps towards attempting to digitize every title with "Google Book Search". While some academic libraries have been contracted by the service, issues of copyright law violations threaten to derail the project. However, it does provide – at the very least – an online consortium for libraries to exchange information and for researchers to search for titles as well as review the materials.

Digitization versus digital preservation

Digitizing something is not the same as digitally preserving it. To digitize something is to create a digital surrogate (copy or format) of an existing analog item (a book, photograph, or record); this is often described as converting it from analog to digital, but both copies remain. An example would be scanning a photograph, keeping the original in a photo album, and saving a digital copy to a computer. Digitization is essentially the first step in digital preservation, which is the work of maintaining the digital copy over a long period of time and making sure it remains authentic and accessible.

Digitization is done once with the technology currently available, while digital preservation is more complicated because technology changes so quickly that a once-popular storage format may become obsolete before it physically fails. An example is the 5 1/4" floppy disk: computers are no longer made with drives for them, and obtaining the hardware to read a file stored on a 5 1/4" floppy disk can be expensive. To combat this risk, equipment must be upgraded as newer technology becomes affordable (about every 2 to 5 years), but before older technology becomes unobtainable (about 5 to 10 years).

Digital preservation can also apply to born-digital material, such as a Microsoft Word document or a social media post. In contrast, digitization applies only to analog materials. Born-digital materials present a unique challenge to digital preservation, not only due to technological obsolescence but also because of the inherently unstable nature of digital storage and maintenance. Most websites last between 2.5 and 5 years, depending on the purpose for which they were designed.

The Library of Congress provides numerous resources and tips for individuals looking to practice digitization and digital preservation for their personal collections.

Digital reformatting

Digital reformatting is the process of converting analog materials into a digital format as a surrogate of the original. The digital surrogates perform a preservation function by reducing or eliminating the use of the original. Digital reformatting is guided by established best practices to ensure that materials are being converted at the highest quality.

Digital reformatting at the Library of Congress

The Library of Congress has been actively reformatting materials for its American Memory project and developed best standards and practices pertaining to book handling during the digitization process, scanning resolutions, and preferred file formats. Some of these standards are:

  • The use of ISO 16067-1 and ISO 16067-2 standards for resolution requirements.
  • Recommended 400 ppi resolution for OCR'ed printed text.
  • The use of 24-bit color when color is an important attribute of a document.
  • The use of the scanning device's maximum resolution for digitally reproducing photographs.
  • TIFF as the standard file format.
  • Attachment of descriptive, structural, and technical metadata to all digitized documents.

A list of archival standards for digital preservation can be found on the ARL website.
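The resolution and bit-depth figures above translate directly into storage requirements. A back-of-the-envelope sketch (the function and the letter-size page example are illustrative, not part of the LC guidance):

```python
def scan_size_bytes(width_in, height_in, ppi, bits_per_pixel):
    """Uncompressed raster size: total pixels times bits per pixel, in bytes."""
    pixels = (width_in * ppi) * (height_in * ppi)
    return pixels * bits_per_pixel // 8

# An 8.5 x 11 inch page at 400 ppi in 24-bit color
size = scan_size_bytes(8.5, 11, 400, 24)
print(f"{size / 1_000_000:.1f} MB uncompressed")   # 44.9 MB
```

Figures like this explain why uncompressed TIFF masters, while preferred for preservation, drive up storage costs quickly.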

The Library of Congress has established a Preservation Digital Reformatting Program. The three main components of the program are:

  • Selection Criteria for digital reformatting
  • Digital reformatting principles and specifications
  • Life cycle management of LC digital data

Audio digitization and reformatting

Audio media offers a rich source of historic ethnographic information, with the earliest forms of recorded sound dating back to 1890. According to the International Association of Sound and Audiovisual Archives (IASA), these sources of audio data, as well as the aging technologies used to play them back, are in imminent danger of permanent loss due to degradation and obsolescence. These primary sources are called “carriers” and exist in a variety of formats, including wax cylinders, magnetic tape, and flat discs of grooved media, among others. Some formats are susceptible to more severe, or quicker, degradation than others. For instance, lacquer discs suffer from delamination. Analog tape may deteriorate due to sticky shed syndrome.

1/4" analog tape being played back on a Studer A810 tape machine for digitization at Smithsonian Folkways Recordings.

Archival workflow and file standardization have been developed to minimize loss of information from the original carrier to the resulting digital file as digitization is underway. For most at-risk formats (magnetic tape, grooved cylinders, etc.), a similar workflow can be observed. Examination of the source carrier will help determine what, if any, steps need to be taken to repair material prior to transfer. A similar inspection must be undertaken for the playback machines. If satisfactory conditions are met for both carrier and playback machine, the transfer can take place, moderated by an analog-to-digital converter. The digital signal is then represented visually for the transfer engineer by a digital audio workstation, like Audacity, WaveLab, or Pro Tools. Reference access copies can be made at smaller sample rates. For archival purposes, it is standard to transfer at a sample rate of 96 kHz and a bit depth of 24 bits per channel.
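The archival transfer specification implies substantial storage per carrier. A quick sketch of the arithmetic (the function name and the one-hour stereo example are my assumptions, not an IASA figure):

```python
def audio_bytes(seconds, sample_rate=96_000, bit_depth=24, channels=2):
    """Uncompressed PCM size for an archival transfer, in bytes."""
    return seconds * sample_rate * bit_depth * channels // 8

# One hour of stereo audio at the archival standard of 96 kHz / 24-bit
hour = audio_bytes(3600)
print(f"{hour / 1_000_000_000:.2f} GB per hour")   # 2.07 GB per hour
```

Lower-rate reference copies, as the text notes, can be derived from these masters for everyday access.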

Challenges

Many libraries, archives, museums, and other memory institutions struggle to catch up and stay current with digitization, and with the expectation that everything should already be online. The time spent planning, doing the work, and processing the digital files, along with the expense and the fragility of some materials, are some of the most common challenges.

Time spent

Digitization is a time-consuming process, even more so when the condition or format of the analog resources requires special handling. Deciding what part of a collection to digitize can sometimes take longer than digitizing it in its entirety. Each digitization project is unique, and its workflow will differ from that of every other project, so time must be spent thoroughly studying and planning each one to create the best plan for the materials and the intended audience.

Expense

The cost of equipment, staff time, metadata creation, and digital storage media makes large-scale digitization of collections expensive for all types of cultural institutions.

Ideally, all institutions want their digital copies to have the best image quality, so that a high-quality copy can be maintained over time. However, smaller institutions may not be able to afford such equipment or staffing, which limits how much material can be digitized, so archivists and librarians must know what their patrons need and prioritize digitization of those items. Often the cost of the time and expertise involved in describing materials and adding metadata exceeds the cost of the digitization itself.

Fragility of materials

Some materials, such as brittle books, are so fragile that undergoing the process of digitization could damage them irreparably. Despite potential damage, one reason for digitizing fragile materials is because they are so heavily used that creating a digital surrogate will help preserve the original copy long past its expected lifetime and increase access to the item.

Copyright

Copyright is a problem faced not only by projects like Google Books but also by institutions that may need to contact private citizens or organizations mentioned in archival documents for permission to scan the items for digital collections. Making sure all potential copyright holders have given permission can be time consuming, and if copyright cannot be determined or cleared, it may be necessary to restrict even digital materials to in-library use.

Solutions

Institutions can make digitization more cost-effective by planning before a project begins, including outlining what they hope to accomplish and the minimum amount of equipment, time, and effort that can meet those goals. If a budget needs more money to cover the cost of equipment or staff, an institution might investigate if grants are available.

Collaboration

Collaborations between institutions have the potential to save money on equipment, staff, and training as individual members share their equipment, manpower, and skills rather than pay outside organizations to provide these services. Collaborations with donors can build long-term support of current and future digitization projects.

Outsourcing

Outsourcing can be an option if an institution does not want to invest in equipment but since most vendors require an inventory and basic metadata for materials, this is not an option for institutions hoping to digitize without processing.

Non-traditional staffing

Many institutions have the option of using volunteers, student employees, or temporary employees on projects. While this saves on staffing costs, it can add costs elsewhere such as on training or having to re-scan items due to poor quality.

MPLP

One way to save time and resources is to use the More Product, Less Process (MPLP) method and digitize materials while they are being processed. Since GLAM (Galleries, Libraries, Archives, and Museums) institutions are already committed to preserving analog materials from special collections, digital access copies do not need to be high-resolution preservation copies, just good enough to provide access to rare materials. Sometimes institutions can get by with a 300 dpi JPEG rather than a 600 dpi TIFF for images, and with a 300 dpi grayscale scan of a document rather than a 600 dpi color one.

Digitizing marginalized voices

Digitization can be used to highlight voices of historically marginalized peoples and add them to the greater body of knowledge. Many projects, some community archives created by members of those groups, are doing this in a way that supports the people, values their input and collaboration, and gives them a sense of ownership of the collection. Examples of projects are Gi-gikinomaage-min and the South Asian American Digital Archive (SAADA).

Gi-gikinomaage-min

Gi-gikinomaage-min is Anishinaabemowin for "We are all teachers", and the project's main purpose is "to document the history of Native Americans in Grand Rapids, Michigan." It combines new audio and video oral histories with digitized flyers, posters, and newsletters from Grand Valley State University's analog collections. Although not everything in the project was newly digitized, item-level metadata was added to enhance context. From the start, collaboration between several university departments and the Native American population was deemed important and remained strong throughout the project.

SAADA

The South Asian American Digital Archive (SAADA) has no physical building; it is entirely digital, and everything is handled by volunteers. The archive was started by Michelle Caswell and Samip Mallick and collects a broad variety of materials "created by or about people residing in the United States who trace their heritage to Bangladesh, Bhutan, India, Maldives, Nepal, Pakistan, Sri Lanka, and the many South Asian diaspora communities across the globe" (Caswell, 2015, 2). The collection of digitized items includes privately, government-, and university-held materials.

Black Campus Movement Collection (BCM)

Kent State University began its BCM collection when it acquired the papers of African American alumnus Lafayette Tolliver, which included about 1,000 photographs that chronicled the black student experience at Kent State from 1968 to 1971. The collection continues to add materials from the 1960s up to and including the current student body, and several oral histories have been added since it debuted. When digitizing the items, it was necessary to work with alumni to create descriptions for the images. This collaboration led to changes in the local controlled vocabularies the libraries use to create metadata for the images.

Mass digitization

The expectation that everything should be online has led to mass digitization practices, but mass digitization is an ongoing process whose obstacles have led to alternatives. As new technology makes automated scanning safer for materials and reduces the need for cropping and de-skewing, mass digitization should be able to increase.

Obstacles

Digitization can be a physically slow process involving selection and preparation of collections, and can take years if materials need to be compared for completeness or are vulnerable to damage. The price of specialized equipment, storage costs, website maintenance, quality control, and retrieval-system limitations all add to the problems of working at a large scale.

Successes

Digitization on demand

Scanning materials as users request them provides copies for others to use and cuts down on repeated copying of popular items. If one part of a folder, document, or book is requested, scanning the entire object can save time in the future, since the material is already accessible if someone else needs it. Digitizing on demand can increase volume because time otherwise spent on selection and preparation is spent on scanning instead.

Google Books

From the start, Google has concentrated on text rather than images or special collections. Although criticized in the past for poor image quality, selection practices, and lacking long-term preservation plans, their focus on quantity over quality has enabled Google to digitize more books than other digitizers.

Standards

Digitization is not a static field and standards change with new technology, so it is up to digitization managers to stay current with new developments. Although each digitization project is different, common standards in formats, metadata, quality, naming, and file storage should be used to give the best chance of interoperability and patron access. As digitization is often the first step in digital preservation, questions about how to handle digital files should be addressed in institutional standards.

Resources to create local standards are available from the Society of American Archivists, the Smithsonian, and the Northeast Document Conservation Center.

Implications

Cultural Heritage Concerns

Digitization of community archives by indigenous and other marginalized people has led traditional memory institutions to reassess how they digitize and handle objects in their collections that may have ties to these groups. The topics they are rethinking are varied and include how items are chosen for digitization projects, what metadata will convey proper context and make items retrievable by the groups they represent, and whether an item should be accessible to the world or only to those whom the groups originally intended to have access, such as elders. Many institutions navigate these concerns by collaborating with the communities they seek to represent through their digitized collections.

Lean philosophy

The broad use of the internet and the increasing popularity of lean philosophy have also increased the use of "digitizing" to describe improvements in the efficiency of organizational processes. Lean philosophy refers to the approach that considers any use of time and resources that does not lead directly to creating a product as waste, and therefore a target for elimination. Applying it often involves simplifying process activities, with the aim of implementing new "lean and mean" processes by digitizing data and activities. Digitization can help eliminate wasted time by introducing wider access to data, or by implementing enterprise resource planning systems.

Fiction

Works of science-fiction often use the term digitize for the act of transforming people into digital signals and sending them into digital technology. When that happens, the people disappear from the real world and appear in a virtual world (as featured in the cult film Tron, the animated series Code: Lyoko, and the late 1980s live-action series Captain Power and the Soldiers of the Future). In the video game Beyond Good & Evil, the protagonist's holographic friend digitizes the player's inventory items. One Super Friends cartoon episode showed Wonder Woman and Jayna freeing the world's men (including the male superheroes), who had been digitized onto computer tape by the villainess Medula.

Endogenous retrovirus

From Wikipedia, the free encyclopedia

Dendrogram of various classes of endogenous retroviruses

Endogenous retroviruses (ERVs) are endogenous viral elements in the genome that closely resemble and can be derived from retroviruses. They are abundant in the genomes of jawed vertebrates, and they comprise up to 5–8% of the human genome (lower estimates of ~1%).

ERVs are vertically inherited proviral sequences and a subclass of a type of gene called a transposon, which can normally be packaged and moved within the genome to serve a vital role in gene expression and regulation. ERVs, however, lack most transposon functions, are typically not infectious, and are often defective genomic remnants of the retroviral replication cycle. They are distinguished as germline provirus retroelements due to their integration and reverse transcription into the nuclear genome of the host cell.

Researchers have suggested that retroviruses evolved from a type of transposon called a retrotransposon, a Class I element; these genes can mutate and, instead of moving to another location in the genome, become exogenous or pathogenic. This means that not all ERVs may have originated as an insertion by a retrovirus; some may have been the source of the genetic information in the retroviruses they resemble. When integration of viral DNA occurs in the germline, it can give rise to an ERV, which can later become fixed in the gene pool of the host population.

Formation

The replication cycle of a retrovirus entails the insertion ("integration") of a DNA copy of the viral genome into the nuclear genome of the host cell. Most retroviruses infect somatic cells, but occasional infection of germline cells (cells that produce eggs and sperm) can also occur. Rarely, retroviral integration may occur in a germline cell that goes on to develop into a viable organism. This organism will carry the inserted retroviral genome as an integral part of its own genome—an "endogenous" retrovirus (ERV) that may be inherited by its offspring as a novel allele. Many ERVs have persisted in the genome of their hosts for millions of years. However, most of these have acquired inactivating mutations during host DNA replication and are no longer capable of producing the virus. ERVs can also be partially excised from the genome by a process known as recombinational deletion, in which recombination between the identical sequences that flank newly integrated retroviruses results in deletion of the internal, protein-coding regions of the viral genome.
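Recombinational deletion can be illustrated with a toy string model (the sequence labels are placeholders, not real genomic sequence): recombination between the two identical flanking LTRs removes the internal coding region and leaves a single "solo LTR" behind.

```python
LTR = "ltr"                               # identical flanking repeats
provirus = LTR + "gag-pol-env" + LTR      # internal coding region between LTRs

def recombinational_deletion(genome, ltr):
    """Collapse an LTR...internal...LTR stretch down to one solo LTR."""
    first, last = genome.find(ltr), genome.rfind(ltr)
    return genome[:first] + ltr + genome[last + len(ltr):]

host = "host5-" + provirus + "-host3"
print(recombinational_deletion(host, LTR))   # host5-ltr-host3
```

This is why, as noted below, most ERVs in a genome survive only as solo LTRs rather than complete proviruses.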

The general retrovirus genome consists of three genes vital for the invasion, replication, escape, and spread of its viral genome. These three genes are gag (encoding the structural proteins of the viral core), pol (encoding reverse transcriptase, integrase, and protease), and env (encoding the coat proteins of the virus's exterior). These viral proteins are encoded as polyproteins. In order to carry out its life cycle, the retrovirus relies heavily on the host cell's machinery. Protease cleaves peptide bonds of the viral polyproteins, making the separate proteins functional. Reverse transcriptase synthesizes viral DNA from the viral RNA in the host cell's cytoplasm before it enters the nucleus. Integrase guides the integration of viral DNA into the host genome.

Over time, the genomes of ERVs not only acquire point mutations but also shuffle and recombine with other ERVs. ERVs with a decayed env sequence become more likely to propagate.

Role in genomic evolution

Diagram displaying the integration of viral DNA into a host genome

Endogenous retroviruses can play an active role in shaping genomes. Most studies in this area have focused on the genomes of humans and higher primates, but other vertebrates, such as mice and sheep, have also been studied in depth. The long terminal repeat (LTR) sequences that flank ERV genomes frequently act as alternate promoters and enhancers, often contributing to the transcriptome by producing tissue-specific variants. In addition, the retroviral proteins themselves have been co-opted to serve novel host functions, particularly in reproduction and development. Recombination between homologous retroviral sequences has also contributed to gene shuffling and the generation of genetic variation. Furthermore, where retroviral sequences have potentially antagonistic effects, repressor genes have co-evolved to combat them.

About 90% of endogenous retroviruses are solo LTRs, lacking all open reading frames (ORFs). Solo LTRs and LTRs associated with complete retroviral sequences have been shown to act as transcriptional elements on host genes. They act mainly through insertion into the 5' UTRs of protein-coding genes; however, they have been known to act upon genes up to 70–100 kb away. The majority of these elements are inserted in the sense direction relative to their corresponding genes, but there is evidence of LTRs acting in the antisense direction and as bidirectional promoters for neighboring genes. In a few cases, the LTR functions as the major promoter for the gene.

For example, in humans AMY1C has a complete ERV sequence in its promoter region; the associated LTR confers salivary specific expression of the digestive enzyme amylase. Also, the primary promoter for bile acid-CoA:amino acid N-acyltransferase (BAAT), which codes for an enzyme that is integral in bile metabolism, is of LTR origin.

The insertion of a solo ERV-9 LTR may have produced a functional open reading frame, causing the rebirth of the human immunity related GTPase gene (IRGM). ERV insertions have also been shown to generate alternative splice sites either by direct integration into the gene, as with the human leptin hormone receptor, or driven by the expression of an upstream LTR, as with the phospholipase A-2 like protein.

Most of the time, however, the LTR functions as one of many alternate promoters, often conferring tissue-specific expression related to reproduction and development. In fact, 64% of known LTR-promoted transcription variants are expressed in reproductive tissues. For example, the gene CYP19 codes for aromatase P450, an important enzyme for estrogen synthesis that is normally expressed in the brain and reproductive organs of most mammals. In primates, however, an LTR-promoted transcriptional variant confers expression in the placenta and is responsible for controlling estrogen levels during pregnancy. Furthermore, the neuronal apoptosis inhibitory protein (NAIP), normally widespread, has an LTR of the HERV-P family acting as a promoter that confers expression in the testis and prostate. Other proteins, such as nitric oxide synthase 3 (NOS3), interleukin-2 receptor B (IL2RB), and another mediator of estrogen synthesis, HSD17B1, are also alternatively regulated by LTRs that confer placental expression, but their specific functions are not yet known. The high degree of reproductive expression is thought to be an aftereffect of the method by which they were endogenized; however, it may also be due to a lack of DNA methylation in germ-line tissues.

The best-characterized instance of placental protein expression comes not from an alternatively promoted host gene but from a complete co-option of a retroviral protein. Retroviral fusogenic env proteins, which play a role in the entry of the virion into the host cell, have had an important impact on the development of the mammalian placenta. In mammals, intact env proteins called syncytins are responsible for the formation and function of syncytiotrophoblasts. These multinucleated cells are mainly responsible for maintaining nutrient exchange and separating the fetus from the mother's immune system. It has been suggested that the selection and fixation of these proteins for this function have played a critical role in the evolution of viviparity.

In addition, the insertion of ERVs and their respective LTRs has the potential to induce chromosomal rearrangement due to recombination between viral sequences at inter-chromosomal loci. These rearrangements have been shown to induce gene duplications and deletions that contribute greatly to genome plasticity and dramatically change the dynamics of gene function. Furthermore, retroelements in general are largely prevalent in rapidly evolving, mammal-specific gene families whose function is largely related to the response to stress and external stimuli. In particular, both human class I and class II MHC genes have a high density of HERV elements compared to other multi-locus gene families. It has been shown that HERVs contributed to the formation of extensively duplicated duplicon blocks that make up the HLA class I family of genes. More specifically, HERVs primarily occupy regions within and between the break points of these blocks, suggesting that considerable duplication and deletion events, typically associated with unequal crossover, facilitated their formation. These blocks, inherited as immunohaplotypes, act as a protective polymorphism against a wide range of antigens, which may have given humans an advantage over other primates.

The placenta is an evolutionarily very distinct organ between different species, a characteristic that has been suggested to result from the co-option of ERV enhancers. Regulatory mutations, rather than mutations in genes that encode hormones and growth factors, support the known evolution of placental morphology, especially since the majority of hormone and growth factor genes are expressed in response to pregnancy, not during placental development. Researchers studied the regulatory landscape of placental development in the rat and the mouse, two closely related species, by mapping all regulatory elements of rat trophoblast stem cells (TSCs) and comparing them to their orthologs in mouse TSCs. TSCs were observed because they reflect the initial cells that develop in the fetal placenta. Despite the species' tangible similarities, enhancer and repressed regions were mostly species-specific, although most promoter sequences were conserved between mouse and rat. In concluding their study, the researchers proposed that ERVs influenced species-specific placental evolution through mediation of placental growth, immunosuppression, and cell fusion.

Another example of ERVs exploiting cellular mechanisms involves p53, a tumor suppressor gene (TSG). DNA damage and cellular stress induce the p53 pathway, which results in cell apoptosis. Using chromatin immunoprecipitation with sequencing, researchers found that thirty percent of all p53-binding sites were located within copies of a few primate-specific ERV families. A study suggested that this benefits retroviruses because the p53 mechanism provides rapid induction of transcription, which leads to the exit of viral RNA from the host cell.

Finally, the insertion of ERVs or ERV elements into genic regions of host DNA, or the overexpression of their transcriptional variants, has a much higher potential to produce deleterious effects than positive ones. Their appearance in the genome has created a host-parasite co-evolutionary dynamic that drove the duplication and expansion of repressor genes. The most clear-cut example of this involves the rapid duplication and proliferation of tandem zinc-finger genes in mammalian genomes. Zinc-finger genes, particularly those that include a KRAB domain, exist in high copy number in vertebrate genomes, and their range of functions is limited to transcriptional roles. It has been shown in mammals, however, that the diversification of these genes was due to multiple duplication and fixation events in response to new retroviral sequences or their endogenous copies, whose transcription they repress.

Role in disease

The majority of ERVs that occur in vertebrate genomes are ancient, inactivated by mutation, and have reached genetic fixation in their host species. For these reasons, they are extremely unlikely to have negative effects on their hosts except under unusual circumstances. Nevertheless, it is clear from studies in birds and non-human mammal species including mice, cats and koalas, that younger (i.e., more recently integrated) ERVs can be associated with disease. The number of active ERVs in the genome of mammals is negatively related to their body size suggesting a contribution to the Peto's paradox through cancer pathogenesis. This has led researchers to propose a role for ERVs in several forms of human cancer and autoimmune disease, although conclusive evidence is lacking.

Neurological disorders

In humans, ERVs have been proposed to be involved in multiple sclerosis (MS). A specific association between MS and the ERVWE1, or "syncytin", gene, which is derived from an ERV insertion, has been reported, along with the presence of an "MS-associated retrovirus" (MSRV), in patients with the disease. Human ERVs (HERVs) have also been implicated in ALS and addiction.

In 2004 it was reported that antibodies to HERVs were found at greater frequency in the sera of people with schizophrenia. Additionally, the cerebrospinal fluid of people with recent-onset schizophrenia contained levels of a retroviral marker, reverse transcriptase, four times higher than those of control subjects. Researchers continue to look at a possible link between HERVs and schizophrenia, with the additional possibility of a triggering infection inducing schizophrenia.

Immunity

ERVs have been found to be associated with disease not only through direct disease-causing relationships, but also through immunity. The frequency of ERVs in long terminal repeats (LTRs) likely correlates with viral adaptations that take advantage of immunity signaling pathways promoting viral transcription and replication. A study done in 2016 investigated the benefit of ancient viral DNA integrated into a host through gene regulation networks induced by interferons, a branch of innate immunity. These cytokines are among the first to respond to viral infection and are also important in immunosurveillance for malignant cells. ERVs are predicted to act as cis-regulatory elements, but many of the adaptive consequences of this for particular physiological functions are still unknown. There are data supporting a general role for ERVs in the regulation of the human interferon response, specifically to interferon-gamma (IFNG). For example, interferon-stimulated genes were found to be greatly enriched with ERVs bound by signal transducer and activator of transcription 1 (STAT1) and/or interferon regulatory factor 1 (IRF1) in CD14+ macrophages.

HERVs also play various roles in shaping the human innate immune response, with some sequences activating the system and others suppressing it. They may also protect against exogenous retroviral infections: virus-like transcripts can activate pattern recognition receptors, and the proteins can interfere with active retroviruses. A Gag protein from HERV-K(HML2) has been shown to intermix with HIV Gag, impairing HIV capsid formation as a result.

Gene regulation

Another proposed idea is that ERVs from the same family played a role in recruiting multiple genes into the same regulatory network. MER41 elements were found to provide additional, redundant regulatory enhancement to genes located near STAT1 binding sites.

Role in medicine

Porcine endogenous retrovirus

For humans, porcine endogenous retroviruses (PERVs) pose a concern when using porcine tissues and organs in xenotransplantation, the transplanting of living cells, tissues, and organs from an organism of one species to an organism of a different species. Although pigs are generally the most suitable donors for treating human organ diseases for practical, financial, safety, and ethical reasons, PERVs could not previously be removed from pigs because of their ability to integrate into the host genome and be passed on to offspring. In 2017, however, one laboratory used CRISPR-Cas9 to remove all 62 copies of PERVs from the pig genome. The consequences of cross-species transmission remain unexplored and potentially dangerous.

Researchers have indicated that infection of human tissues by PERVs is very possible, especially in immunosuppressed individuals. An immunosuppressed condition could potentially permit more rapid and tenacious replication of viral DNA, and the virus would later have less difficulty adapting to human-to-human transmission. Although known infectious pathogens present in the donor organ or tissue can be eliminated by breeding pathogen-free herds, unknown retroviruses can still be present in the donor. These retroviruses are often latent and asymptomatic in the donor, but can become active in the recipient. Some examples of endogenous viruses that can infect and multiply in human cells come from baboons (BaEV), cats (RD114), and mice.

There are three different classes of PERVs: PERV-A, PERV-B, and PERV-C. PERV-A and PERV-B are polytropic and can infect human cells in vitro, while PERV-C is ecotropic and does not replicate in human cells. The major differences between the classes are in the receptor-binding domain of the env protein and in the long terminal repeats (LTRs) that influence the replication of each class. PERV-A and PERV-B display LTRs that have repeats in the U3 region; however, PERV-A and PERV-C variants with repeatless LTRs have also been found. Researchers found that PERVs in culture actively adapted the repeat structure of their LTRs in order to achieve the best replication performance in a given host cell. At the end of their study, the researchers concluded that the repeatless PERV LTR evolved from the repeat-harboring LTR, likely through insertional mutation, a conclusion supported by data on the LTR and env/Env. It is thought that the generation of repeatless LTRs could reflect an adaptation process of the virus, changing from an exogenous to an endogenous lifestyle.

A clinical trial study performed in 1999 sampled 160 patients who were treated with different living pig tissues and observed no evidence of persistent PERV infection in the 97% of patients for whom a sufficient amount of DNA was available for PCR amplification of PERV sequences. The study noted, however, that retrospective studies are limited in their ability to detect the true incidence of infection or associated clinical symptoms. It suggested using closely monitored prospective trials, which would provide a more complete and detailed evaluation of possible cross-species PERV transmission.

Human endogenous retroviruses

Human endogenous retroviruses (HERVs) comprise a significant part of the human genome, with approximately 98,000 ERV elements and fragments making up 5–8%. According to a study published in 2005, no HERVs capable of replication had been identified; all appeared to be defective, containing major deletions or nonsense mutations (although this does not hold for HERV-K). This is because most HERVs are merely traces of the original viruses, having first integrated millions of years ago. An analysis of HERV integrations is ongoing as part of the 100,000 Genomes Project.

Human endogenous retroviruses were discovered by accident in a couple of different experiments. Human genomic libraries were screened under low-stringency conditions using probes from animal retroviruses, allowing the isolation and characterization of multiple, though defective, proviruses representing various families. Another experiment depended on oligonucleotides with homology to viral primer-binding sites.

HERVs are classified based on their homologies to animal retroviruses. Families belonging to Class I are similar in sequence to mammalian Gammaretroviruses (type C) and Epsilonretroviruses (type E). Families belonging to Class II show homology to mammalian Betaretroviruses (type B) and Deltaretroviruses (type D). Families belonging to Class III are similar to foamy viruses. For all classes, if homologies appear well conserved in the gag, pol, and env genes, they are grouped into a superfamily. More Class I families are known than families of the other classes. The families themselves are named in a less uniform manner, with a mixture of naming based on an exogenous retrovirus, the priming tRNA (HERV-W, HERV-K), a neighboring gene (HERV-ADP), a clone number (HERV-S71), or an amino acid motif (HERV-FRD). A proposed nomenclature aims to clean up the sometimes paraphyletic standards.

There are two proposals for how HERVs became fixed in the human genome. The first assumes that, sometime during human evolution, exogenous progenitors of HERVs inserted themselves into germ-line cells and then replicated along with the host's genes, using and exploiting the host's cellular machinery. Because of their distinct genomic structure, HERVs were subjected to many rounds of amplification and transposition, which led to a widespread distribution of retroviral DNA. The second hypothesis proposes the continuous evolution of retroelements from more simply structured ancestors.

Nevertheless, one family of viruses has been active since the divergence of humans and chimpanzees. This family, termed HERV-K (HML2), makes up less than 1% of HERV elements but is one of the most studied. There are indications it has even been active in the past few hundred thousand years, e.g., some human individuals carry more copies of HML2 than others. Traditionally, age estimates of HERVs are performed by comparing the 5' and 3' LTR of a HERV; however, this method is only relevant for full-length HERVs. A recent method, called cross-sectional dating, uses variations within a single LTR to estimate the ages of HERV insertions. This method is more precise in estimating HERV ages and can be used for any HERV insertions. Cross-sectional dating has been used to suggest that two members of HERV-K (HML2), HERV-K106 and HERV-K116, were active in the last 800,000 years and that HERV-K106 may have infected modern humans 150,000 years ago. However, the absence of known infectious members of the HERV-K (HML2) family, and the lack of elements with a full coding potential within the published human genome sequence, suggests to some that the family is less likely to be active at present. In 2006 and 2007, researchers working independently in France and the US recreated functional versions of HERV-K (HML2).
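The traditional 5'-versus-3' LTR dating described above can be sketched numerically: the two LTRs are identical at the moment of integration, so their observed divergence, divided by twice the host's neutral substitution rate, approximates the insertion age. The sequences and the rate in this sketch are hypothetical illustration values, not real HERV data.

```python
# Sketch of traditional LTR-LTR dating: at integration the 5' and 3' LTRs are
# identical, and each afterwards accumulates mutations independently, so
# age ~ divergence / (2 * substitution rate). All inputs below are hypothetical.

def pairwise_divergence(seq_a: str, seq_b: str) -> float:
    """Fraction of aligned sites that differ (assumes equal length, no indels)."""
    assert len(seq_a) == len(seq_b)
    diffs = sum(1 for a, b in zip(seq_a, seq_b) if a != b)
    return diffs / len(seq_a)

def ltr_age_years(ltr5: str, ltr3: str, subs_per_site_per_year: float) -> float:
    """Estimate insertion age: divergence accrues on both LTRs independently."""
    d = pairwise_divergence(ltr5, ltr3)
    return d / (2 * subs_per_site_per_year)

# Hypothetical 20-bp LTRs differing at one site, with an assumed neutral rate
# of 2.5e-9 substitutions per site per year:
age = ltr_age_years("ACGTACGTACGTACGTACGT",
                    "ACGTACGTACGTACGTACGA",
                    2.5e-9)
```

Note that, as the text says, this approach only works for full-length proviruses with both LTRs intact; cross-sectional dating instead uses variation within a single LTR across individuals.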

MER41.AIM2 is an HERV-derived element that regulates the transcription of AIM2 (Absent in Melanoma 2), which encodes a sensor of foreign cytosolic DNA. The element acts as an interferon-inducible enhancer and is necessary for the transcription of AIM2. Researchers showed this by deleting MER41.AIM2 in HeLa cells using CRISPR/Cas9, which led to undetectable AIM2 transcript levels in the modified cells; control cells, which still contained the MER41.AIM2 element, showed normal amounts of AIM2 transcript. In terms of immunity, the researchers concluded that MER41.AIM2 is necessary for an inflammatory response to infection.

Immunological studies have shown some evidence for T cell immune responses against HERVs in HIV-infected individuals. The hypothesis that HIV induces HERV expression in HIV-infected cells led to the proposal that a vaccine targeting HERV antigens could specifically eliminate HIV-infected cells. The potential advantage of this novel approach is that, by using HERV antigens as surrogate markers of HIV-infected cells, it could circumvent the difficulty inherent in directly targeting notoriously diverse and fast-mutating HIV antigens.

There are a few classes of human endogenous retroviruses that still have intact open reading frames. For example, the expression of HERV-K, a biologically active family of HERV, produces proteins found in placenta. Furthermore, the expression of the envelope genes of HERV-W (ERVW-1) and HERV-FRD (ERVFRD-1) produces syncytins which are important for the generation of the syncytiotrophoblast cell layer during placentogenesis by inducing cell-cell fusion. The HUGO Gene Nomenclature Committee (HGNC) approves gene symbols for transcribed human ERVs.

Techniques for characterizing ERVs

Whole genome sequencing

Example: a porcine ERV (PERV) isolate from a Chinese-born minipig, PERV-A-BM, was sequenced completely, along with isolates from different breeds and cell lines, in order to understand its genetic variation and evolution. The observed number of nucleotide substitutions among the different genome sequences helped researchers estimate the age at which PERV-A-BM integrated into its host genome, which was found to be evolutionarily earlier than that of isolates from European-born pigs.

Chromatin immunoprecipitation with sequencing (ChIP-seq)

This technique is used to find histone marks indicative of promoters and enhancers, which are binding sites for DNA-binding proteins, as well as repressed regions marked by histone trimethylation. DNA methylation has been shown to be vital for maintaining silencing of ERVs in mouse somatic cells, while histone marks are vital for the same purpose in embryonic stem cells (ESCs) and early embryogenesis.

Applications

Constructing phylogenies

Because most HERVs have no function, are selectively neutral, and are very abundant in primate genomes, they readily serve as phylogenetic markers for linkage analysis. They can be exploited by comparing integration-site polymorphisms or the evolving proviral nucleotide sequences of orthologs. To estimate when integration occurred, researchers have used distances from each phylogenetic tree to find the rate of molecular evolution at each particular locus. It is also useful that ERVs are abundant in the genomes of many species (e.g. plants, insects, mollusks, fish, rodents, domestic pets, and livestock), because they can be applied to a variety of phylogenetic questions.
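The integration-site logic above can be illustrated with a toy comparison: a shared ERV insertion at the same locus is strong evidence of common ancestry, so species sharing more insertions are grouped more closely. The species names are real primates but the loci and presence/absence data below are entirely hypothetical.

```python
# Toy sketch: ERV integration sites as near-homoplasy-free phylogenetic markers.
# Species that share more insertion loci are inferred to be more closely related.
from itertools import combinations

insertions = {  # hypothetical presence data: species -> loci carrying an ERV
    "human":      {"locus1", "locus2", "locus3"},
    "chimpanzee": {"locus1", "locus2", "locus3"},
    "gorilla":    {"locus1", "locus2"},
    "macaque":    {"locus1"},
}

def shared_insertions(a: str, b: str) -> int:
    """Number of insertion loci present in both species."""
    return len(insertions[a] & insertions[b])

# Rank species pairs by shared insertions: the closest pair shares the most.
pairs = sorted(combinations(insertions, 2),
               key=lambda p: shared_insertions(*p), reverse=True)
closest = pairs[0]  # human and chimpanzee under these hypothetical data
```

Real analyses must also account for incomplete lineage sorting and recombinational deletion of loci, which this sketch ignores.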

Designating the age of provirus and the time points of species separation events

This is accomplished by comparing HERVs from different evolutionary periods. For example, such a study was done for different hominoids, ranging from humans to apes and monkeys. This is difficult to do with PERVs because of the large diversity present.

Further research

Epigenetic variability

Researchers could analyze individual epigenomes and transcriptomes to study the reactivation of dormant transposable elements through epigenetic release, their potential associations with human disease, and the specifics of gene regulatory networks.

Immunological problems of xenotransplantation

Little is known about an effective way of overcoming hyperacute rejection (HAR), which follows the activation of complement initiated by xenoreactive antibodies recognizing galactose-alpha-1,3-galactose (alpha-Gal) antigens on the donor epithelium.

Risk factors of HERVs in gene therapy

Because retroviruses are able to recombine with each other and with other endogenous DNA sequences, it would be beneficial for gene therapy to explore the potential risks HERVs can cause, if any. Also, this ability of HERVs to recombine can be manipulated for site-directed integration by including HERV sequences in retroviral vectors.

HERV gene expression

Researchers believe that RNA and proteins encoded by HERV genes should continue to be explored for putative functions in cell physiology and in pathological conditions. Examining these would help to more deeply define the biological significance of the proteins synthesized.

Cancer pharmacogenomics

Aspects of cancer pharmacogenomics include the consideration of the tumor genome and the germline genome to make better decisions on cancer treatment.

Cancer pharmacogenomics is the study of how variation in the genome influences an individual's response to different cancer drug treatments. It is a subset of the broader field of pharmacogenomics, the area of study aimed at understanding how genetic variants influence drug efficacy and toxicity.

Cancer is a genetic disease in which changes to genes can cause cells to grow and divide out of control. Each cancer can have a unique combination of genetic mutations, and even cells within the same tumour may have different genetic changes. In clinical settings, it has commonly been observed that the same types and doses of treatment can result in substantial differences in efficacy and toxicity across patients. Thus, the application of pharmacogenomics within the field of cancer can offer key advantages for personalizing cancer therapy, minimizing treatment toxicity, and maximizing treatment efficacy. This can include choosing drugs that target specific mutations within cancer cells, identifying patients at risk for severe toxicity to a drug, and identifying treatments that a patient is most likely to benefit from. Applying pharmacogenomics within cancer differs considerably from its application in other complex diseases, as there are two genomes that need to be considered: the germline and the tumour. The germline genome captures inter-individual inherited genetic variation, and the tumour genome captures any somatic mutations that accrue as a cancer evolves. The accumulation of somatic mutations within the tumour genome represents variation in disease and plays a major role in understanding how individuals will respond to treatments. Additionally, the germline genome affects toxicity reactions to a specific treatment through its influence on drug exposure; specifically, pharmacokinetic genes participate in the inactivation and elimination of active compounds. Therefore, differences within the germline genome should also be considered.

Strategies

Advances in cancer diagnostics and treatment have shifted the use of traditional methods of physical examination, in vivo, and histopathological analysis to assessment of cancer drivers, mutations, and targetable genomic biomarkers. There are an increasing number of genomic variants being studied and identified as potential therapeutically actionable targets and drug metabolism modifiers. Thus, a patient's genomic information, in addition to information about the patient's tumour, can be used to determine a personalized approach to cancer treatment.

Cancer-driven DNA alterations

Cancer-driven DNA alterations can include somatic DNA mutations and inherited DNA variants. They are not a direct focus of pharmacogenomic studies, but they can have an impact on pharmacogenomic strategies. These alterations can affect the pharmacokinetics and pharmacodynamics of metabolic pathways, making them potentially actionable drug-targets.   

As whole-genome technologies continue to advance, there will be increased opportunities to discover mutations and variants that are involved in tumour progression, response to therapy, and drug-metabolism.

Polymorphism search

Candidate polymorphism search refers to finding polymorphic DNA sequences within specific genes that are candidates for certain traits. Within pharmacogenomics, this method tries to resolve pharmacokinetic or pharmacodynamic traits of a compound to a candidate polymorphism level. This type of information can contribute to selecting effective therapeutic strategies for a patient.

To understand the potential functional impact of a polymorphic DNA sequence, gene silencing can be used. siRNAs have commonly been used to suppress gene expression, and more recently siRNAs have been suggested for use in studying and developing therapeutics.

Another new method being applied is Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR). CRISPR, combined with the Cas9 enzyme, forms the basis of the technology known as CRISPR-Cas9. This system can recognize and cleave specific DNA sequences, and thus is a powerful system for gene silencing purposes.
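Because CRISPR-Cas9 targeting hinges on recognizing a ~20-nt protospacer immediately followed by a PAM motif (NGG for SpCas9), the site-finding step can be sketched briefly. The NGG requirement for SpCas9 is established; the input sequence below is hypothetical, and for simplicity the sketch scans only one strand.

```python
import re

# Minimal sketch: find candidate SpCas9 target sites on one strand of a DNA
# sequence. SpCas9 requires an "NGG" PAM immediately 3' of a 20-nt protospacer.

def find_cas9_sites(dna: str):
    """Return (protospacer, pam, start) tuples for 20-nt targets with NGG PAM."""
    sites = []
    # lookahead so that overlapping candidate sites are all reported
    for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", dna):
        sites.append((m.group(1), m.group(2), m.start()))
    return sites

# Hypothetical sequence: a 20-nt run followed by an "AGG" PAM.
dna = "T" * 20 + "AGG" + "ACGT"
sites = find_cas9_sites(dna)
```

A practical guide-design tool would also scan the reverse complement and score off-target matches, which this sketch omits.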

Pathway search

An extension of the previous strategies is candidate pathway search. This type of analysis considers a group of related genes whose altered function may have an effect on therapy, rather than focusing solely on one gene. It can provide insight into additional information such as gene-gene interactions, epistatic effects, or influences from cis-regulatory elements, all of which contribute to understanding variations in drug efficacy and toxicity between patients.

Whole-Genome Strategies

Advancements in the cost and throughput of sequencing technologies are making it possible to perform whole-genome sequencing at higher rates. The ability to perform whole-genome analysis for cancer patients can aid in identifying markers of predisposition to drug toxicity and efficacy. Strategies for pharmacogenomic discovery using whole-genome sequences include targeting frequently mutated gene stretches (known as hotspots) to identify markers of prognostic and diagnostic significance, or targeting specific genes that are known to be associated with a particular disease.

Gene target examples

HER2

HER2 is an established therapeutic target within breast cancer, and the activation of HER2 is observed in approximately 20% of breast cancers as a result of overexpression. Trastuzumab, the first HER2-targeted drug, developed in 1990, interferes with HER2 signalling. In 2001, a study showed that adding trastuzumab to chemotherapy improved overall survival in women with HER2-positive metastatic breast cancer. Then, in 2005, trastuzumab was shown to be effective as an adjuvant treatment in women with early-stage breast cancer. Thus, trastuzumab has become a standard-of-care treatment in both metastatic and early-stage HER2-positive breast cancer. Many genome sequencing studies have also revealed that other cancer tumours harbour HER2 alterations, including overexpression, amplifications, and other mutations. Because of this, there has been considerable interest in studying the efficacy of HER2-targeted therapies within a range of cancer types, including bladder, colorectal, and gastro-esophageal cancers.

BCR-ABL

The majority of chronic myelogenous leukemia cases are caused by a rearrangement between chromosomes 9 and 22. This results in the fusion of the genes BCR and ABL. This atypical gene fusion encodes for unregulated tyrosine kinase activity, which results in the rapid and continuous division of white blood cells. Drugs known as tyrosine kinase inhibitors target BCR-ABL, and are the standard treatment for chronic myelogenous leukemia. Imatinib was the first tyrosine kinase inhibitor discovered with high specificity for targeting BCR-ABL. However, after imatinib was used as the first-line therapy, several BCR-ABL-dependent and BCR-ABL-independent mechanisms of resistance developed. Thus, new second-line and third-line drugs have also been developed to address new, mutated forms of BCR-ABL. These include dasatinib, nilotinib, bosutinib, and ponatinib.

Pharmacokinetic genes

Components of the pharmacokinetic influence of genes on drug exposure.

Cancer pharmacogenomics has also contributed to the understanding of how pharmacokinetic genes affect exposure to cancer drugs, which can help predict patient sensitivity to treatment toxicity. Some of these findings have been successfully translated into clinical practice in the form of professional guidelines from the Clinical Pharmacogenetics Implementation Consortium (CPIC) and other institutions.

TPMT

The TPMT gene encodes the thiopurine S-methyltransferase (TPMT) enzyme, which participates in the S-methylation of thiopurine drugs, including 6-mercaptopurine, 6-thioguanine, and azathioprine. The first two drugs are indicated for leukemias and lymphomas, while azathioprine is used in nonmalignant conditions such as Crohn's disease. These purine antimetabolites are activated, through hypoxanthine phosphoribosyltransferase, into thioguanine nucleotides (6-TGN) that affect DNA replication when incorporated into DNA; the resulting antimetabolites are inactivated by TPMT. It has been established that a patient's TPMT genotype can affect the level of exposure to the active metabolites, which has an impact on treatment toxicity and efficacy. Specifically, TPMT-deficient patients, such as those homozygous for the *2 and *3 alleles, can experience myelosuppression up to pancytopenia. In a study of 1214 European Caucasian individuals, a trimodal distribution of TPMT genotypes was found, with 89.5% normal-to-high methylators, 9.9% intermediate methylators, and 0.6% deficient methylators. CPIC guidelines recommend reducing the dose to 5-10% of the standard dose, applied at a lower frequency, in individuals who are TPMT poor metabolizers.
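The genotype-to-dose logic described above can be sketched as a simple lookup. The poor-metabolizer fraction follows the 5-10% range mentioned in the text; the intermediate-metabolizer fraction is an assumed illustrative value, and this is not a clinical dosing tool.

```python
# Illustrative sketch of genotype-guided thiopurine dosing. The "poor" fraction
# reflects the 5-10% range described in the text; the "intermediate" value is
# an assumption for illustration only. Not for clinical use.

DOSE_FRACTION = {
    "normal": 1.0,        # normal-to-high methylators: standard dose
    "intermediate": 0.5,  # assumed illustrative reduction for intermediates
    "poor": 0.10,         # upper end of the 5-10% range for deficient patients
}

def adjusted_dose_mg(standard_dose_mg: float, phenotype: str) -> float:
    """Scale the standard dose by the metabolizer phenotype's fraction."""
    return standard_dose_mg * DOSE_FRACTION[phenotype]

dose = adjusted_dose_mg(75.0, "poor")  # a hypothetical 75 mg standard dose
```

Real guidelines also reduce dosing frequency for poor metabolizers, which a fraction alone does not capture.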

DPD

The dihydropyrimidine dehydrogenase (DPD) protein is responsible for the inactivation of more than 80% of the anticancer drug 5-fluorouracil (5-FU) in the liver. This drug is commonly used in colorectal cancer treatment, and increased exposure to it can cause myelosuppression, mucositis, neurotoxicity, hand-foot syndrome, and diarrhea. The genotype of DPYD (the gene that encodes DPD) has been linked to severe 5-FU toxicity in several studies summarized in meta-analyses. The CPIC has provided guidelines for the implementation of DPYD pharmacogenetics, indicating that homozygous carriers of low-activity variants should be prescribed an alternative drug, while heterozygotes should receive half of the normal dose.
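The DPYD recommendation above (alternative drug for homozygous carriers, half dose for heterozygotes) can be sketched as a per-allele activity score that is summed over the diplotype. This is an illustration only, not clinical guidance; the variant names and their scores below are assumptions chosen to exercise the three branches:

```python
# Illustrative sketch only -- not clinical guidance. Each DPYD allele
# contributes an activity score (1.0 = functional, 0.0 = no function);
# the summed score drives the 5-FU dosing advice described in the text:
# homozygous low-activity carriers get an alternative drug, heterozygotes
# half the normal dose. Allele scores here are assumptions for illustration.
ALLELE_ACTIVITY = {
    "reference": 1.0,   # fully functional allele
    "*2A": 0.0,         # treated as a no-function variant (assumed score)
}

def fluoropyrimidine_advice(allele1: str, allele2: str) -> str:
    score = ALLELE_ACTIVITY[allele1] + ALLELE_ACTIVITY[allele2]
    if score == 2.0:
        return "standard dosing"
    if score >= 1.0:                      # heterozygous carrier
        return "reduce starting dose by ~50%"
    return "avoid 5-FU; choose an alternative drug"

print(fluoropyrimidine_advice("reference", "*2A"))  # reduce starting dose by ~50%
```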

UGT1A1

UDP-glucuronosyltransferase 1A1 (UGT1A1) is a hepatic enzyme involved in the glucuronidation of exogenous and endogenous substrates, such as bilirubin. Over 100 variants have been identified in UGT1A1, and some mutations are implicated in Gilbert syndrome and Crigler-Najjar syndrome. Two variants in particular, UGT1A1*28 and UGT1A1*6, are associated with the pharmacogenomics of irinotecan chemotherapy. The UGT1A1*28 allele carries 7 TA repeats in the gene's promoter sequence, instead of the normal 6, while UGT1A1*6 is characterized by a SNP in exon 1.

Irinotecan is a prodrug used in the treatment of many solid tumours, including colorectal, pancreatic, and lung cancer. It is metabolized into its active compound, SN-38, which inhibits topoisomerase-1, an enzyme involved in DNA replication. This active metabolite is inactivated by glucuronidation, performed mainly by UGT1A1. High exposure to SN-38 can result in neutropenia and gastrointestinal toxicity. The decreased activity of UGT1A1 in UGT1A1*28 individuals has been found to increase exposure to the active compound and toxicity. For UGT1A1*6, this relationship is more controversial: some studies find it predicts irinotecan toxicity, while others do not. Prospective studies assessing the adequate dose of irinotecan in Asian patients have supported the use of lower doses in patients carrying both UGT1A1*28 and UGT1A1*6. The results from these and other pharmacogenomic studies have been translated into clinical guidelines by organizations in the United States, Canada, France, the Netherlands, and Europe, all of which recommend a dose reduction in UGT1A1*28 patients.
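Because UGT1A1*28 is defined above purely by repeat count (7 TA repeats in the promoter versus the normal 6), calling the allele can be reduced to counting consecutive TA dinucleotides in a promoter fragment. The sketch below is an illustration only, not clinical guidance; the sequences and function names are toy assumptions:

```python
# Illustrative sketch only -- not clinical guidance. Calls UGT1A1*28 by
# counting the longest run of consecutive "TA" dinucleotides in a promoter
# fragment (7 repeats = *28, 6 = reference), then applies the dose advice
# described in the text for *28 homozygotes.
import re

def ta_repeat_count(promoter_fragment: str) -> int:
    """Length of the longest run of consecutive TA dinucleotides."""
    runs = re.findall(r"(?:TA)+", promoter_fragment.upper())
    return max((len(r) // 2 for r in runs), default=0)

def ugt1a1_allele(promoter_fragment: str) -> str:
    return "*28" if ta_repeat_count(promoter_fragment) >= 7 else "*1"

def irinotecan_advice(allele1: str, allele2: str) -> str:
    if allele1 == "*28" and allele2 == "*28":
        return "consider a reduced starting dose"  # per the guidelines cited
    return "standard starting dose"

seq_star28 = "CCT" + "TA" * 7 + "AGG"   # toy promoter fragment, 7 TA repeats
print(ugt1a1_allele(seq_star28))        # *28
```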

Challenges

One of the biggest challenges in using pharmacogenomics to study cancer is the difficulty of conducting studies in humans. Drugs used for chemotherapy are too toxic to give to healthy individuals, which makes it difficult to perform genetic studies in related individuals. Furthermore, some mutations occur at high frequencies while others occur at very low frequencies, so a large number of patients must often be screened to identify those carrying a particular genetic marker. And although genomic-driven analysis is effective for stratifying patients and identifying possible treatment options, it is often difficult for laboratories to be reimbursed for these genomic sequencing tests. Tracking clinical outcomes for patients who undergo sequencing is therefore key to demonstrating both the clinical utility and the cost-effectiveness of pharmacogenomics in cancer.

Another challenge is that cancer patients are often treated with different combinations and dosages of drugs, so large samples of patients treated in the same way are rare. This makes studying the pharmacogenomics of a specific drug of interest difficult, and because additional identical trials may not be feasible, it can be hard to replicate discoveries.

Furthermore, studies have shown that drug efficacy and toxicity are likely multigenic traits. Since pathways contain multiple genes, various combinations of driver mutations could promote tumour progression, which can make it difficult to distinguish functional driver mutations from random, nonfunctional mutations.

Future

Personalized cancer therapy.

With new tools and technologies continuing to develop, there are growing opportunities to analyze cancer at the single-cell level. Approaches used for whole-genome sequencing can also be applied to single-cell sequences and analyses. This level of pharmacogenomics has implications for personalized medicine, as single-cell RNA sequencing and genotyping can characterize subclones of the same tumour and lead to the identification of therapy-resistant cells, as well as their corresponding pathways.

As the ability to analyze and profile cancers continues to improve, so will the therapies developed to treat them. And with increasing attention being given to whole-genome and single-cell sequencing, there will be a growing amount of pharmacogenomic data to analyze. These analyses will rely on new and improved bioinformatics tools to identify targetable genes and pathways, helping to select safer and more effective therapies for cancer patients.
