Friday, December 5, 2025

Epigenetics

From Wikipedia, the free encyclopedia
Epigenetic mechanisms

Epigenetics is the study of changes in gene expression that occur without altering the DNA sequence. The Greek prefix epi- (ἐπι- "over, outside of, around") in epigenetics implies features that are "on top of" or "in addition to" the traditional DNA-sequence-based mechanism of inheritance. Epigenetics usually involves changes that persist through cell division, and affect the regulation of gene expression. Such effects on cellular and physiological traits may result from environmental factors, or be part of normal development.

The term also refers to the mechanism behind these changes: functionally relevant alterations to the genome that do not involve mutations in the nucleotide sequence. Examples of mechanisms that produce such changes are DNA methylation and histone modification, each of which alters how genes are expressed without altering the underlying DNA sequence. Further, non-coding RNA sequences have been shown to play a key role in the regulation of gene expression. Gene expression can be controlled through the action of repressor proteins that attach to silencer regions of the DNA. These epigenetic changes may last through cell divisions for the duration of the cell's life, and may also last for multiple generations, even though they do not involve changes in the underlying DNA sequence of the organism; instead, non-genetic factors cause the organism's genes to behave (or "express themselves") differently.

One example of an epigenetic change in eukaryotic biology is the process of cellular differentiation. During morphogenesis, totipotent stem cells become the various pluripotent cell lines of the embryo, which in turn become fully differentiated cells. In other words, as a single fertilized egg cell – the zygote – continues to divide, the resulting daughter cells develop into the different cell types in an organism, including neurons, muscle cells, epithelium, endothelium of blood vessels, etc., by activating some genes while inhibiting the expression of others.

Definitions

The term epigenesis has a generic meaning of "extra growth" that has been used in English since the 17th century. In scientific publications, the term epigenetics started to appear in the 1930s (see figure). However, its contemporary meaning emerged only in the 1990s.

Number of patent families and non-patent documents with the term "epigenetic*" by publication year

A definition of the concept of epigenetic trait as a "stably heritable phenotype resulting from changes in a chromosome without alterations in the DNA sequence" was formulated at a Cold Spring Harbor meeting in 2008, although alternate definitions that include non-heritable traits are still being used widely.

Waddington's canalisation, 1940s

The hypothesis of epigenetic changes affecting the expression of chromosomes was put forth by the Russian biologist Nikolai Koltsov. From the generic meaning, and the associated adjective epigenetic, British embryologist C. H. Waddington coined the term epigenetics in 1942 as pertaining to epigenesis, in parallel to Valentin Haecker's 'phenogenetics' (Phänogenetik). Epigenesis in the context of the biology of that period referred to the differentiation of cells from their initial totipotent state during embryonic development.

When Waddington coined the term, the physical nature of genes and their role in heredity was not known. He used it instead as a conceptual model of how genetic components might interact with their surroundings to produce a phenotype; he used the phrase "epigenetic landscape" as a metaphor for biological development. Waddington held that cell fates were established during development in a process he called canalisation much as a marble rolls down to the point of lowest local elevation. Waddington suggested visualising increasing irreversibility of cell type differentiation as ridges rising between the valleys where the marbles (analogous to cells) are travelling.

In recent times, Waddington's notion of the epigenetic landscape has been rigorously formalized in the context of the systems dynamics state approach to the study of cell fate. Cell-fate determination is predicted to exhibit certain dynamics, such as convergence to an attractor (which can be an equilibrium point, a limit cycle, or a strange attractor) or oscillatory behaviour.

Contemporary

In 1990, Robin Holliday defined epigenetics as "the study of the mechanisms of temporal and spatial control of gene activity during the development of complex organisms."

More recent usage of the word in biology follows stricter definitions. As defined by Arthur Riggs and colleagues, it is "the study of mitotically and/or meiotically heritable changes in gene function that cannot be explained by changes in DNA sequence."

The term has also been used, however, to describe processes which have not been demonstrated to be heritable, such as some forms of histone modification. Consequently, there have been attempts to redefine "epigenetics" in broader terms that avoid the constraints of requiring heritability. For example, Adrian Bird defined epigenetics as "the structural adaptation of chromosomal regions so as to register, signal or perpetuate altered activity states." This definition is inclusive of transient modifications associated with DNA repair or cell-cycle phases as well as stable changes maintained across multiple cell generations, but excludes others such as templating of membrane architecture and prions unless they impinge on chromosome function. Such redefinitions, however, are not universally accepted and are still subject to debate. The NIH "Roadmap Epigenomics Project", which ran from 2008 to 2017, used the following definition: "For purposes of this program, epigenetics refers to both heritable changes in gene activity and expression (in the progeny of cells or of individuals) and also stable, long-term alterations in the transcriptional potential of a cell that are not necessarily heritable."

The similarity of the word to "genetics" has generated many parallel usages. The "epigenome" is a parallel to the word "genome", referring to the overall epigenetic state of a cell, and epigenomics refers to global analyses of epigenetic changes across the entire genome. The phrase "genetic code" has also been adapted – the "epigenetic code" has been used to describe the set of epigenetic features that create different phenotypes in different cells from the same underlying DNA sequence. Taken to its extreme, the "epigenetic code" could represent the total state of the cell, with the position of each molecule accounted for in an epigenomic map, a diagrammatic representation of the gene expression, DNA methylation and histone modification status of a particular genomic region. More typically, the term is used in reference to systematic efforts to measure specific, relevant forms of epigenetic information such as the histone code or DNA methylation patterns.

Mechanisms

Covalent modifications of either DNA (e.g. cytosine methylation and hydroxymethylation) or of histone proteins (e.g. lysine acetylation, lysine and arginine methylation, serine and threonine phosphorylation, and lysine ubiquitination and sumoylation) play central roles in many types of epigenetic inheritance. Therefore, the word "epigenetics" is sometimes used as a synonym for these processes. However, this can be misleading: chromatin remodeling is not always inherited, and not all epigenetic inheritance involves chromatin remodeling. In 2019, a further lysine modification, lactylation, appeared in the scientific literature, linking epigenetic modification to cell metabolism.

DNA associates with histone proteins to form chromatin.

Because the phenotype of a cell or individual is affected by which of its genes are transcribed, heritable transcription states can give rise to epigenetic effects. There are several layers of regulation of gene expression. One way that genes are regulated is through the remodeling of chromatin. Chromatin is the complex of DNA and the histone proteins with which it associates. If the way that DNA is wrapped around the histones changes, gene expression can change as well. Chromatin remodeling is accomplished through two main mechanisms:

  1. The first way is post-translational modification of the amino acids that make up histone proteins. Histone proteins are made up of long chains of amino acids. If the amino acids in the chain are chemically modified, the shape of the histone may change. DNA is not completely unwound during replication, so it is possible that the modified histones are carried into each new copy of the DNA. Once there, these histones may act as templates, initiating the shaping of the surrounding new histones in the same manner. By altering the shape of the histones around them, these modified histones would ensure that a lineage-specific transcription program is maintained after cell division.
  2. The second way is the addition of methyl groups to the DNA, mostly at CpG sites, to convert cytosine to 5-methylcytosine. 5-Methylcytosine performs much like a regular cytosine, pairing with a guanine in double-stranded DNA. However, when methylated cytosines are present in CpG sites in the promoter and enhancer regions of genes, the genes are often repressed. When methylated cytosines are present in CpG sites in the gene body (in the coding region excluding the transcription start site), expression of the gene is often enhanced. Transcription of a gene usually depends on a transcription factor binding to a short (10 bases or fewer) recognition sequence at the enhancer that interacts with the promoter region of that gene. About 22% of transcription factors are inhibited from binding when the recognition sequence contains a methylated cytosine. In addition, the presence of methylated cytosines at a promoter region can attract methyl-CpG-binding domain (MBD) proteins. All MBDs interact with nucleosome remodeling and histone deacetylase complexes, which leads to gene silencing. A further covalent modification involving methylated cytosine is its demethylation by TET enzymes. Hundreds of such demethylations occur, for instance, during learning and memory formation in neurons.
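
The CpG logic in point 2 can be sketched in a few lines of code; the sequence below is a made-up example, not a real promoter:

```python
# Illustrative sketch: locate CpG dinucleotides in a DNA sequence.
# Methylation machinery acts at exactly these positions.

def cpg_sites(seq: str) -> list[int]:
    """Return 0-based positions of every CpG dinucleotide in seq."""
    seq = seq.upper()
    return [i for i in range(len(seq) - 1) if seq[i:i + 2] == "CG"]

promoter = "ATCGCGTACGGATCCGTA"   # made-up example sequence
print(cpg_sites(promoter))        # [2, 4, 8, 14]
```

Each reported position marks a cytosine that could be converted to 5-methylcytosine, repressing the gene if the site lies in a promoter or enhancer.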

There is frequently a reciprocal relationship between DNA methylation and histone lysine methylation. For instance, the methyl binding domain protein MBD1, attracted to and associating with methylated cytosine in a DNA CpG site, can also associate with H3K9 methyltransferase activity to methylate histone 3 at lysine 9. On the other hand, DNA maintenance methylation by DNMT1 appears to partly rely on recognition of histone methylation on the nucleosome present at the DNA site to carry out cytosine methylation on newly synthesized DNA. There is further crosstalk between DNA methylation carried out by DNMT3A and DNMT3B and histone methylation so that there is a correlation between the genome-wide distribution of DNA methylation and histone methylation.

Mechanisms of heritability of histone state are not well understood; however, much is known about the mechanism of heritability of DNA methylation state during cell division and differentiation. Heritability of methylation state depends on certain enzymes (such as DNMT1) that have a higher affinity for 5-methylcytosine than for cytosine. If such an enzyme reaches a "hemimethylated" portion of DNA (where 5-methylcytosine is present in only one of the two DNA strands), it will methylate the complementary strand. It is now known that DNMT1 physically interacts with the protein UHRF1, which has recently been recognized as essential for DNMT1-mediated maintenance of DNA methylation. UHRF1 specifically recognizes hemimethylated DNA, thereby bringing DNMT1 to its substrate to maintain DNA methylation.
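
The maintenance logic described above can be sketched as a toy model. The site positions and the all-or-none copying are illustrative assumptions, not DNMT1's actual biochemistry:

```python
# Toy model of maintenance methylation. Each CpG site is tracked per
# strand: True = methylated cytosine. After semiconservative
# replication the duplex is hemimethylated (parental marks only);
# a DNMT1-like step then copies each mark to the new strand.

def replicate_strand(parent: dict[int, bool]) -> dict[int, bool]:
    """The newly synthesized complementary strand starts unmethylated."""
    return {site: False for site in parent}

def dnmt1_maintenance(parent: dict[int, bool],
                      new: dict[int, bool]) -> dict[int, bool]:
    """At hemimethylated sites (parent True, new False), a DNMT1-like
    enzyme, delivered by UHRF1, methylates the new strand."""
    return {site: new[site] or parent[site] for site in parent}

parent = {12: True, 47: True, 88: False}      # hypothetical CpG positions
new = dnmt1_maintenance(parent, replicate_strand(parent))
print(new)   # {12: True, 47: True, 88: False} -- the pattern is copied
```

Because the daughter strand ends up with exactly the parental pattern, the methylation state survives cell division, which is the sense in which it is heritable.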

Some acetylations and some methylations of lysines (symbol K) are activation signals for transcription when present on a nucleosome, as shown in the top figure. Some methylations on lysines or arginine (R) are repression signals for transcription when present on a nucleosome, as shown in the bottom figure. Nucleosomes consist of four pairs of histone proteins in a tightly assembled core region plus up to 30% of each histone remaining in a loosely organized tail (only one tail of each pair is shown). DNA is wrapped around the histone core proteins in chromatin. The lysines (K) are designated with a number showing their position as, for instance (K4), indicating lysine as the 4th amino acid from the amino (N) end of the tail in the histone protein. Methylations [Me], and acetylations [Ac] are common post-translational modifications on the lysines of the histone tails.

Although histone modifications occur throughout the entire sequence, the unstructured N-termini of histones (called histone tails) are particularly highly modified. These modifications include acetylation, methylation, ubiquitylation, phosphorylation, sumoylation, ribosylation and citrullination. Acetylation is the most highly studied of these modifications. For example, acetylation of the K14 and K9 lysines of the tail of histone H3 by histone acetyltransferase enzymes (HATs) is generally related to transcriptional competence (see Figure).
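
The shorthand used in this section (histone, residue letter, position, modification, e.g. H3K9ac, H3K4me3) can be parsed mechanically. The naming convention is standard; the parser itself is an illustrative sketch covering only the marks mentioned here:

```python
import re

# Parse histone-mark shorthand like "H3K9ac" or "H3K4me3" into its parts.
MARK_PATTERN = re.compile(
    r"^(H2A|H2B|H3|H4)"      # histone protein
    r"([KRST])(\d+)"         # residue letter (K, R, S, T) and position
    r"(ac|me[123]?|ph|ub)$"  # acetyl, (mono/di/tri)methyl, phospho, ubiquityl
)

def parse_mark(mark: str) -> dict:
    m = MARK_PATTERN.match(mark)
    if m is None:
        raise ValueError(f"unrecognized histone mark: {mark}")
    histone, residue, pos, mod = m.groups()
    return {"histone": histone, "residue": residue,
            "position": int(pos), "modification": mod}

print(parse_mark("H3K4me3"))
# {'histone': 'H3', 'residue': 'K', 'position': 4, 'modification': 'me3'}
```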

One mode of thinking is that this tendency of acetylation to be associated with "active" transcription is biophysical in nature. Because it normally has a positively charged nitrogen at its end, lysine can bind the negatively charged phosphates of the DNA backbone. The acetylation event converts the positively charged amine group on the side chain into a neutral amide linkage. This removes the positive charge, thus loosening the DNA from the histone. When this occurs, complexes like SWI/SNF and other transcriptional factors can bind to the DNA and allow transcription to occur. This is the "cis" model of the epigenetic function. In other words, changes to the histone tails have a direct effect on the DNA itself.
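
The charge argument can be made concrete with back-of-the-envelope arithmetic: count roughly +1 for each lysine or arginine side chain at physiological pH, and remove the charge of each acetylated lysine. The 15-residue fragment below is the start of the histone H3 tail (with K4, K9 and K14 numbered as in the text); treating every K and R as exactly +1 is a simplifying assumption:

```python
# Approximate the positive charge of a histone H3 tail fragment
# before and after HAT-mediated acetylation of K9 and K14.

H3_TAIL = "ARTKQTARKSTGGKA"   # residues 1-15 from the N-terminus of H3

def tail_charge(seq: str, acetylated: set[int]) -> int:
    """Rough positive charge; positions are 1-based from the N-terminus."""
    charge = 0
    for pos, aa in enumerate(seq, start=1):
        if aa == "R":
            charge += 1                             # arginine: +1
        elif aa == "K":
            charge += 0 if pos in acetylated else 1  # acetylation neutralizes
    return charge

print(tail_charge(H3_TAIL, set()))      # 5  (unmodified tail)
print(tail_charge(H3_TAIL, {9, 14}))    # 3  (K9 and K14 acetylated)
```

The drop in positive charge weakens the tail's grip on the negatively charged DNA backbone, which is the "cis" mechanism the text describes.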

Another model of epigenetic function is the "trans" model. In this model, changes to the histone tails act indirectly on the DNA. For example, lysine acetylation may create a binding site for chromatin-modifying enzymes (or transcription machinery as well). This chromatin remodeler can then cause changes to the state of the chromatin. Indeed, a bromodomain – a protein domain that specifically binds acetyl-lysine – is found in many enzymes that help activate transcription, including the SWI/SNF complex. It may be that acetylation acts in this and the previous way to aid in transcriptional activation.

The idea that modifications act as docking modules for related factors is borne out by histone methylation as well. Methylation of lysine 9 of histone H3 has long been associated with constitutively transcriptionally silent chromatin (constitutive heterochromatin) (see bottom Figure). It has been determined that a chromodomain (a domain that specifically binds methyl-lysine) in the transcriptionally repressive protein HP1 recruits HP1 to K9 methylated regions. One example that seems to refute this biophysical model for methylation is that tri-methylation of histone H3 at lysine 4 is strongly associated with (and required for full) transcriptional activation (see top Figure). Tri-methylation, in this case, would introduce a fixed positive charge on the tail.

Histone lysine methyltransferases (KMTs) have been shown to be responsible for this methylation activity on histones H3 and H4. These enzymes utilize a catalytically active site called the SET domain (Suppressor of variegation, Enhancer of Zeste, Trithorax). The SET domain is a 130-amino-acid sequence involved in modulating gene activities. This domain has been demonstrated to bind to the histone tail and cause the methylation of the histone.

Differing histone modifications are likely to function in differing ways; acetylation at one position is likely to function differently from acetylation at another position. Also, multiple modifications may occur at the same time, and these modifications may work together to change the behavior of the nucleosome. The idea that multiple dynamic modifications regulate gene transcription in a systematic and reproducible way is called the histone code, although the idea that histone state can be read linearly as a digital information carrier has been largely debunked. One of the best-understood systems that orchestrate chromatin-based silencing is the SIR protein based silencing of the yeast hidden mating-type loci HML and HMR.

DNA methylation

DNA methylation often occurs in repeated sequences and helps to suppress the expression and movement of transposable elements. Because 5-methylcytosine can spontaneously deaminate to thymine (an amine group is replaced by a carbonyl oxygen), CpG sites are frequently mutated and have become rare in the genome, except at CpG islands, where they typically remain unmethylated. Epigenetic changes of this type thus have the potential to direct increased frequencies of permanent genetic mutation. DNA methylation patterns are known to be established and modified in response to environmental factors by a complex interplay of at least three independent DNA methyltransferases, DNMT1, DNMT3A, and DNMT3B, the loss of any of which is lethal in mice. DNMT1 is the most abundant methyltransferase in somatic cells, localizes to replication foci, has a 10–40-fold preference for hemimethylated DNA and interacts with the proliferating cell nuclear antigen (PCNA).
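
The CpG-depletion argument above can be quantified with the observed/expected CpG ratio commonly used to call CpG islands (the classic Gardiner-Garden and Frommer statistic). The sequences below are short artificial examples, not real genomic DNA:

```python
# Observed/expected CpG ratio: bulk genomic DNA, where methylated CpGs
# have mutated away over evolutionary time, scores far below 1;
# unmethylated CpG islands score near or above 1.

def cpg_obs_exp(seq: str) -> float:
    seq = seq.upper()
    n = len(seq)
    c, g = seq.count("C"), seq.count("G")
    cpg = sum(1 for i in range(n - 1) if seq[i:i + 2] == "CG")
    expected = c * g / n   # expected CpG count if C and G were independent
    return cpg / expected if expected else 0.0

island   = "CGGCGCGCAGCGCCGGCGCG"   # artificial CpG-island-like sequence
depleted = "CATGCTGTGACCTGGAGTCA"   # artificial bulk-genome-like sequence
print(round(cpg_obs_exp(island), 2), round(cpg_obs_exp(depleted), 2))  # 1.56 0.0
```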

By preferentially modifying hemimethylated DNA, DNMT1 transfers patterns of methylation to a newly synthesized strand after DNA replication, and therefore is often referred to as the 'maintenance' methyltransferase. DNMT1 is essential for proper embryonic development, imprinting and X-inactivation. To emphasize the difference of this molecular mechanism of inheritance from the canonical Watson-Crick base-pairing mechanism of transmission of genetic information, the term 'Epigenetic templating' was introduced. Furthermore, in addition to the maintenance and transmission of methylated DNA states, the same principle could work in the maintenance and transmission of histone modifications and even cytoplasmic (structural) heritable states.

RNA methylation

N6-methyladenosine (m6A), the most abundant RNA modification in eukaryotes, has recently been recognized as an important gene-regulatory mechanism.

In 2011, it was demonstrated that the methylation of mRNA plays a critical role in human energy homeostasis. The obesity-associated FTO gene was shown to be able to demethylate N6-methyladenosine in RNA.

Histone modifications

Histones H3 and H4 can also be manipulated through demethylation using histone lysine demethylase (KDM). This recently identified enzyme has a catalytically active site called the Jumonji domain (JmjC). The demethylation occurs when JmjC utilizes multiple cofactors to hydroxylate the methyl group, thereby removing it. JmjC is capable of demethylating mono-, di-, and tri-methylated substrates.

Chromosomal regions can adopt stable and heritable alternative states resulting in bistable gene expression without changes to the DNA sequence. Epigenetic control is often associated with alternative covalent modifications of histones. The stability and heritability of states of larger chromosomal regions are suggested to involve positive feedback, where modified nucleosomes recruit enzymes that similarly modify nearby nucleosomes. Simplified stochastic models of this type of epigenetic regulation have been proposed.
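
The positive-feedback idea can be illustrated with a deliberately simplified stochastic sketch (a toy model, not a reproduction of any published one): each nucleosome is either modified or unmodified, modified nucleosomes recruit enzymes that modify randomly chosen neighbours, and every mark is also lost at a low background rate.

```python
import random

def step(region, recruit=0.3, decay=0.05, rng=random):
    """One sweep over the region: recruited modification plus random loss."""
    n = len(region)
    for _ in range(n):
        i = rng.randrange(n)
        if region[i] == 1 and rng.random() < recruit:
            region[rng.randrange(n)] = 1   # modified nucleosome recruits
        if rng.random() < decay:
            region[rng.randrange(n)] = 0   # spontaneous loss of the mark

rng = random.Random(0)                     # fixed seed for reproducibility
region = [1] * 10 + [0] * 10               # start with half the region modified
for _ in range(200):
    step(region, rng=rng)
print(sum(region), "of", len(region), "nucleosomes modified")
```

Because recruitment requires at least one existing mark, the all-unmodified state is self-perpetuating, while a mostly modified region tends to sustain itself; this coexistence of two stable states is the bistability described above.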

It has been suggested that chromatin-based transcriptional regulation could be mediated by the effect of small RNAs. Small interfering RNAs can modulate transcriptional gene expression via epigenetic modulation of targeted promoters.

RNA transcripts

Sometimes, a gene, once activated, transcribes a product that directly or indirectly sustains its own activity. For example, Hnf4 and MyoD enhance the transcription of many liver-specific and muscle-specific genes, respectively, including their own, through the transcription factor activity of the proteins they encode. Descendants of the cell in which the gene was turned on will inherit this activity, even if the original stimulus for gene activation is no longer present. These genes are often turned on or off by signal transduction, although in some systems where syncytia or gap junctions are important, RNA may spread directly to other cells or nuclei by diffusion. RNA signalling also includes the differential recruitment of a hierarchy of generic chromatin-modifying complexes and DNA methyltransferases to specific loci by RNAs during differentiation and development. Other epigenetic changes are mediated by the production of different splice forms of RNA, or by the formation of double-stranded RNA (RNAi). A large amount of RNA and protein is contributed to the zygote by the mother during oogenesis or via nurse cells, resulting in maternal effect phenotypes. A smaller quantity of sperm RNA is transmitted from the father, but there is recent evidence that this epigenetic information can lead to visible changes in several generations of offspring.

MicroRNAs

MicroRNAs (miRNAs) are non-coding RNAs that range in size from 17 to 25 nucleotides. miRNAs regulate a large variety of biological functions in plants and animals. As of 2013, about 2,000 miRNAs had been discovered in humans, and these can be found online in miRNA databases. Each miRNA expressed in a cell may target about 100 to 200 messenger RNAs (mRNAs) that it downregulates. Most of the downregulation of mRNAs occurs by causing the decay of the targeted mRNA, while some downregulation occurs at the level of translation into protein.
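
Target selection by a miRNA is commonly described by seed matching: nucleotides 2–8 of the miRNA (the "seed") pair, by Watson-Crick complementarity, with a site in the mRNA 3' UTR. The sketch below uses that standard convention, but the sequences and the exact matching rule are simplified for illustration:

```python
# Find positions in a 3' UTR that are complementary to a miRNA seed.
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def seed_match_sites(mirna: str, utr: str) -> list[int]:
    """0-based positions in utr where the miRNA seed (nt 2-8) can pair."""
    seed = mirna[1:8]                               # nucleotides 2-8
    # the mRNA site reads as the reverse complement of the seed
    site = "".join(COMPLEMENT[nt] for nt in reversed(seed))
    k = len(site)
    return [i for i in range(len(utr) - k + 1) if utr[i:i + k] == site]

mirna = "UGAGGUAGUAGGUUGUAUAGUU"    # illustrative 22-mer
utr   = "AAACUACCUCAAAGCUACCUCAAA"  # illustrative 3' UTR with two sites
print(seed_match_sites(mirna, utr))  # [3, 14]
```

Each matched site is a point where the miRNA-loaded silencing complex could bind and trigger decay or translational repression of the mRNA.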

It appears that about 60% of human protein-coding genes are regulated by miRNAs. Many miRNAs are themselves epigenetically regulated. About 50% of miRNA genes are associated with CpG islands, which may be repressed by epigenetic methylation. Transcription from methylated CpG islands is strongly and heritably repressed. Other miRNAs are epigenetically regulated by either histone modifications or by combined DNA methylation and histone modification.

sRNAs

sRNAs are small (50–250 nucleotides), highly structured, non-coding RNA fragments found in bacteria. They control gene expression, including virulence genes in pathogens, and are viewed as new targets in the fight against drug-resistant bacteria. They play an important role in many biological processes, binding to mRNA and protein targets in prokaryotes. Analyses of sRNAs, for example of their sRNA–mRNA target interactions or protein-binding properties, are used to build comprehensive databases, and sRNA-gene maps based on their targets in microbial genomes have also been constructed.

Long non-coding RNAs

Numerous investigations have demonstrated the pivotal involvement of long non-coding RNAs (lncRNAs) in the regulation of gene expression and chromosomal modifications, thereby exerting significant control over cellular differentiation. Long non-coding RNAs also contribute to genomic imprinting and the inactivation of the X chromosome. In invertebrates such as the honey bee, a social insect, long non-coding RNAs have been detected, through reciprocal crosses, as a possible epigenetic mechanism acting via allele-specific gene expression underlying aggression.

Prions

Prions are infectious forms of proteins. In general, proteins fold into discrete units that perform distinct cellular functions, but some proteins are also capable of forming an infectious conformational state known as a prion. Although often viewed in the context of infectious disease, prions are more loosely defined by their ability to catalytically convert other native state versions of the same protein to an infectious conformational state. It is in this latter sense that they can be viewed as epigenetic agents capable of inducing a phenotypic change without a modification of the genome.

Fungal prions are considered by some to be epigenetic because the infectious phenotype caused by the prion can be inherited without modification of the genome. PSI+ and URE3, discovered in yeast in 1965 and 1971 respectively, are the two best-studied prions of this type. Prions can have a phenotypic effect through the sequestration of protein in aggregates, thereby reducing that protein's activity. In PSI+ cells, the loss of functional Sup35 protein (which is involved in termination of translation) causes ribosomes to have a higher rate of read-through of stop codons, an effect that results in suppression of nonsense mutations in other genes. The ability of Sup35 to form prions may be a conserved trait. It could confer an adaptive advantage by giving cells the ability to switch into a PSI+ state and express dormant genetic features normally terminated by stop codon mutations.

Prion-based epigenetics has also been observed in Saccharomyces cerevisiae.

Molecular basis

Epigenetic changes modify the activation of certain genes, but not the genetic code sequence of DNA. The microstructure (not code) of DNA itself or the associated chromatin proteins may be modified, causing activation or silencing. This mechanism enables differentiated cells in a multicellular organism to express only the genes that are necessary for their own activity. Epigenetic changes are preserved when cells divide. Most epigenetic changes occur only within the course of one individual organism's lifetime; however, these epigenetic changes can be transmitted to the organism's offspring through a process called transgenerational epigenetic inheritance. Moreover, if gene inactivation occurs in a sperm or egg cell that takes part in fertilization, this epigenetic modification may also be transferred to the next generation.

Specific epigenetic processes include paramutation, bookmarking, imprinting, gene silencing, X chromosome inactivation, position effect, DNA methylation reprogramming, transvection, maternal effects, the progress of carcinogenesis, many effects of teratogens, regulation of histone modifications and heterochromatin, and technical limitations affecting parthenogenesis and cloning.

DNA damage

DNA damage can also cause epigenetic changes. DNA damage is very frequent, occurring on average about 60,000 times a day per cell of the human body (see DNA damage (naturally occurring)). These damages are largely repaired; however, epigenetic changes can still remain at the site of DNA repair. In particular, a double-strand break in DNA can initiate unprogrammed epigenetic gene silencing, both by inducing DNA methylation and by promoting silencing types of histone modifications (chromatin remodeling; see next section). In addition, the enzyme Parp1 (poly(ADP)-ribose polymerase) and its product poly(ADP)-ribose (PAR) accumulate at sites of DNA damage as part of the repair process. This accumulation, in turn, directs recruitment and activation of the chromatin remodeling protein ALC1, which can cause nucleosome remodeling. Nucleosome remodeling has been found to cause, for instance, epigenetic silencing of the DNA repair gene MLH1. DNA-damaging chemicals, such as benzene, hydroquinone, styrene, carbon tetrachloride and trichloroethylene, cause considerable hypomethylation of DNA, some through the activation of oxidative stress pathways.

Diet is known to alter the epigenetics of rats. Some food components epigenetically increase the levels of the DNA repair proteins MGMT and MLH1 and of p53. Other food components, such as soy isoflavones, can reduce DNA damage. In one study, markers of oxidative stress, such as modified nucleotides that can result from DNA damage, were decreased by a 3-week diet supplemented with soy. A decrease in oxidative DNA damage was also observed 2 hours after consumption of anthocyanin-rich bilberry (Vaccinium myrtillus L.) pomace extract.

DNA repair

Damage to DNA is very common and is constantly being repaired. Epigenetic alterations can accompany DNA repair of oxidative damage or double-strand breaks. In human cells, oxidative DNA damage occurs about 10,000 times a day and DNA double-strand breaks occur about 10 to 50 times a cell cycle in somatic replicating cells (see DNA damage (naturally occurring)). The selective advantage of DNA repair is to allow the cell to survive in the face of DNA damage. The selective advantage of epigenetic alterations that occur with DNA repair is not clear.

Repair of oxidative DNA damage can alter epigenetic markers

In the steady state (with endogenous damages occurring and being repaired), there are about 2,400 oxidatively damaged guanines that form 8-oxo-2'-deoxyguanosine (8-OHdG) in the average mammalian cell DNA. 8-OHdG constitutes about 5% of the oxidative damages commonly present in DNA. The oxidized guanines do not occur randomly among all guanines in DNA. There is a sequence preference for the guanine at a methylated CpG site (a cytosine followed by guanine along its 5' → 3' direction and where the cytosine is methylated (5-mCpG)). A 5-mCpG site has the lowest ionization potential for guanine oxidation.

Initiation of DNA demethylation at a CpG site. In adult somatic cells DNA methylation typically occurs in the context of CpG dinucleotides (CpG sites), forming 5-methylcytosine-pG, or 5mCpG. Reactive oxygen species (ROS) may attack guanine at the dinucleotide site, forming 8-hydroxy-2'-deoxyguanosine (8-OHdG), and resulting in a 5mCp-8-OHdG dinucleotide site. The base excision repair enzyme OGG1 targets 8-OHdG and binds to the lesion without immediate excision. OGG1, present at a 5mCp-8-OHdG site recruits TET1 and TET1 oxidizes the 5mC adjacent to the 8-OHdG. This initiates demethylation of 5mC.

Oxidized guanine has mispairing potential and is mutagenic. Oxoguanine glycosylase (OGG1) is the primary enzyme responsible for the excision of oxidized guanine during DNA repair. OGG1 finds and binds to an 8-OHdG within a few seconds. However, OGG1 does not immediately excise 8-OHdG. In HeLa cells, half-maximum removal of 8-OHdG occurs in 30 minutes, and in irradiated mice, the 8-OHdGs induced in the mouse liver are removed with a half-life of 11 minutes.
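
The removal kinetics quoted above can be read as first-order decay. A small sketch, assuming simple exponential kinetics (which the measured time courses may only approximate):

```python
import math

def fraction_remaining(t_min: float, half_life_min: float) -> float:
    """Fraction of 8-OHdG lesions still present after t_min minutes,
    assuming first-order removal with the given half-life."""
    return math.exp(-math.log(2) * t_min / half_life_min)

# With the 11-minute half-life reported for mouse liver, most lesions
# are gone within an hour: prints 0.5, 0.151 and 0.023.
for t in (11, 30, 60):
    print(t, "min:", round(fraction_remaining(t, 11.0), 3))
```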

When OGG1 is present at an oxidized guanine within a methylated CpG site it recruits TET1 to the 8-OHdG lesion (see Figure). This allows TET1 to demethylate an adjacent methylated cytosine. Demethylation of cytosine is an epigenetic alteration.

As an example, when human mammary epithelial cells were treated with H2O2 for six hours, 8-OHdG increased about 3.5-fold in DNA, and this caused about 80% demethylation of the 5-methylcytosines in the genome. Demethylation of CpGs in a gene promoter by TET enzyme activity increases transcription of the gene into messenger RNA. In cells treated with H2O2, one particular gene, BACE1, was examined. The methylation level of the BACE1 CpG island was reduced (an epigenetic alteration), and this allowed a roughly 6.5-fold increase in expression of BACE1 messenger RNA.

While six-hour incubation with H2O2 causes considerable demethylation of 5-mCpG sites, shorter times of H2O2 incubation appear to promote other epigenetic alterations. Treatment of cells with H2O2 for 30 minutes causes the mismatch repair protein heterodimer MSH2-MSH6 to recruit DNA methyltransferase 1 (DNMT1) to sites of some kinds of oxidative DNA damage. This could cause increased methylation of cytosines (epigenetic alterations) at these locations.

Jiang et al. treated HEK 293 cells with agents causing oxidative DNA damage (potassium bromate (KBrO3) or potassium chromate (K2CrO4)). Base excision repair (BER) of oxidative damage occurred with the DNA repair enzyme polymerase beta localizing to oxidized guanines. Polymerase beta is the main human polymerase in short-patch BER of oxidative DNA damage. Jiang et al. also found that polymerase beta recruited the DNA methyltransferase protein DNMT3b to BER repair sites. They then evaluated the methylation pattern at the single-nucleotide level in a small region of DNA including the promoter region and the early transcription region of the BRCA1 gene. Oxidative DNA damage from bromate modulated the DNA methylation pattern (caused epigenetic alterations) at CpG sites within the region of DNA studied. In untreated cells, CpGs located at −189, −134, −29, −19, +16, and +19 of the BRCA1 gene had methylated cytosines (where numbering is from the messenger RNA transcription start site, and negative numbers indicate nucleotides in the upstream promoter region). Bromate-induced oxidation resulted in the loss of cytosine methylation at −189, −134, +16 and +19, while also leading to the formation of new methylation at the CpGs located at −80, −55, −21 and +8 after DNA repair was allowed.
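
The BRCA1 result above can be restated as position sets (coordinates relative to the mRNA transcription start site, exactly as given in the text) to make the lost and gained methylation sites explicit:

```python
# CpG positions with methylated cytosines in the BRCA1 region studied,
# before and after bromate treatment plus repair (values from the text).
before = {-189, -134, -29, -19, +16, +19}   # untreated cells
after  = {-29, -19, -80, -55, -21, +8}      # after bromate + DNA repair

lost   = before - after    # methylation removed by damage/repair
gained = after - before    # new methylation introduced by repair

print("lost:", sorted(lost))      # [-189, -134, 16, 19]
print("gained:", sorted(gained))  # [-80, -55, -21, 8]
```

Only −29 and −19 keep their original marks, showing that repair of oxidative damage rewrote, rather than merely erased, the local methylation pattern.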

Homologous recombinational repair alters epigenetic markers

At least four articles report the recruitment of DNA methyltransferase 1 (DNMT1) to sites of DNA double-strand breaks. During homologous recombinational repair (HR) of the double-strand break, the involvement of DNMT1 causes the two repaired strands of DNA to have different levels of methylated cytosines. One strand becomes frequently methylated at about 21 CpG sites downstream of the repaired double-strand break. The other DNA strand loses methylation at about six CpG sites that were previously methylated downstream of the double-strand break, as well as losing methylation at about five CpG sites that were previously methylated upstream of the double-strand break. When the chromosome is replicated, this gives rise to one daughter chromosome that is heavily methylated downstream of the previous break site and one that is unmethylated in the region both upstream and downstream of the previous break site. With respect to the gene that was broken by the double-strand break, half of the progeny cells express that gene at a high level and in the other half of the progeny cells expression of that gene is repressed. When clones of these cells were maintained for three years, the new methylation patterns were maintained over that time period.
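The strand asymmetry described above can be captured in a toy model (all names and states are illustrative, not taken from the cited studies): each repaired strand templates one daughter chromosome at replication, so the two methylation states segregate into two expression states among the progeny cells.

```python
# Toy model (illustrative only) of the strand asymmetry after homologous
# recombinational repair: one repaired strand is heavily methylated downstream
# of the break site, while the other has lost its previous methylation.

repaired_strands = {
    "strand_A": "methylated",    # ~21 CpG sites methylated downstream of break
    "strand_B": "unmethylated",  # methylation lost upstream and downstream
}

def daughter_state(parent_strand_state: str) -> str:
    # Semi-conservative replication plus maintenance methylation copies the
    # parental strand's pattern onto each daughter duplex.
    return parent_strand_state

# Half of the progeny repress the broken gene, half express it at a high level.
expression = {
    name: ("repressed" if daughter_state(state) == "methylated" else "expressed")
    for name, state in repaired_strands.items()
}
print(expression)  # {'strand_A': 'repressed', 'strand_B': 'expressed'}
```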

In mice carrying a CRISPR-mediated homology-directed recombination insertion in their genome, many CpG sites within the insertion associated with the double-strand break showed increased methylation.

Non-homologous end joining can cause some epigenetic marker alterations

Non-homologous end joining (NHEJ) repair of a double-strand break can cause a small number of demethylations of pre-existing cytosine DNA methylations downstream of the repaired double-strand break. Further work by Allen et al. showed that NHEJ of a DNA double-strand break in a cell could give rise to some progeny cells having repressed expression of the gene harboring the initial double-strand break and some progeny having high expression of that gene due to epigenetic alterations associated with NHEJ repair. The frequency of epigenetic alterations causing repression of a gene after an NHEJ repair of a DNA double-strand break in that gene may be about 0.9%.

Techniques used to study epigenetics

Epigenetic research uses a wide range of molecular biological techniques to further understanding of epigenetic phenomena. These techniques include chromatin immunoprecipitation (together with its large-scale variants ChIP-on-chip and ChIP-Seq), fluorescent in situ hybridization, methylation-sensitive restriction enzymes, DNA adenine methyltransferase identification (DamID) and bisulfite sequencing. Furthermore, the use of bioinformatics methods has a role in computational epigenetics.

Chromatin Immunoprecipitation

Chromatin immunoprecipitation (ChIP) has helped bridge the gap between DNA and epigenetic interactions. With ChIP, researchers can gain insight into gene regulation, transcription mechanisms, and chromatin structure.

Fluorescent in situ hybridization

Fluorescent in situ hybridization (FISH) is an important tool for understanding epigenetic mechanisms. FISH can be used to find the location of genes on chromosomes, as well as to locate noncoding RNAs. FISH is predominantly used for detecting chromosomal abnormalities in humans.

Methylation-sensitive restriction enzymes

Methylation-sensitive restriction enzymes paired with PCR provide a way to evaluate methylation in DNA, specifically at CpG sites. If the DNA is methylated, the restriction enzymes will not cleave the strand. Conversely, if the DNA is not methylated, the enzymes will cleave the strand, and the fragment will not be amplified by PCR.
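The assay's logic can be sketched as a small truth table in code (the function name is illustrative): methylation blocks cleavage, an intact template amplifies, so a PCR product signals a methylated site.

```python
# Hedged sketch of methylation-sensitive restriction digestion followed by PCR
# across the restriction site. Methylation protects the site from cleavage, so
# an intact template (and hence a PCR product) indicates methylation.

def pcr_product_detected(cpg_is_methylated: bool) -> bool:
    enzyme_cuts = not cpg_is_methylated  # methylation blocks cleavage
    template_intact = not enzyme_cuts
    return template_intact               # PCR amplifies only an intact template

print(pcr_product_detected(True))   # True  -> site was methylated
print(pcr_product_detected(False))  # False -> site was unmethylated
```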

Bisulfite sequencing

Bisulfite sequencing is another way to evaluate DNA methylation. Treatment with sodium bisulfite converts unmethylated cytosine to uracil, whereas methylated cytosines are unaffected.
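A minimal sketch of how methylation calls fall out of this conversion, assuming an idealized fully converted read aligned to its reference (the sequences and function name are made up):

```python
# Illustrative sketch of bisulfite-based methylation calling: after conversion
# and PCR, an unmethylated C reads as T, while a methylated C still reads as C.
# Comparing a converted read to the reference reveals methylation per cytosine.

def call_methylation(reference: str, bisulfite_read: str) -> list[bool]:
    """True where a reference C remained C (methylated) in the converted read."""
    calls = []
    for ref_base, read_base in zip(reference, bisulfite_read):
        if ref_base == "C":
            calls.append(read_base == "C")  # C -> C: methylated; C -> T: not
    return calls

# Reference with three cytosines; only the middle one is methylated here.
print(call_methylation("ACGTCGACGA", "ATGTCGATGA"))  # [False, True, False]
```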

Nanopore sequencing

Certain sequencing methods, such as nanopore sequencing, allow sequencing of native DNA. Native (unamplified) DNA retains the epigenetic modifications that would otherwise be lost during the amplification step. Nanopore basecaller models can distinguish between the signals obtained for epigenetically modified bases and unaltered bases, and provide an epigenetic profile in addition to the sequencing result.

Structural inheritance

In ciliates such as Tetrahymena and Paramecium, genetically identical cells show heritable differences in the patterns of ciliary rows on their cell surface. Experimentally altered patterns can be transmitted to daughter cells. It seems existing structures act as templates for new structures. The mechanisms of such inheritance are unclear, but reasons exist to assume that multicellular organisms also use existing cell structures to assemble new ones.

Nucleosome positioning

Eukaryotic genomes contain numerous nucleosomes. Nucleosome position is not random, and it determines the accessibility of DNA to regulatory proteins. Promoters active in different tissues have been shown to have different nucleosome positioning features. This determines differences in gene expression and cell differentiation. It has been shown that at least some nucleosomes are retained in sperm cells (where most, but not all, histones are replaced by protamines). Thus nucleosome positioning is heritable to some degree. Recent studies have uncovered connections between nucleosome positioning and other epigenetic factors, such as DNA methylation and hydroxymethylation.

Histone variants

Different histone variants are incorporated into specific regions of the genome non-randomly. Their differential biochemical characteristics can affect genome functions via their roles in gene regulation, and maintenance of chromosome structures.

Genomic architecture

The three-dimensional configuration of the genome (the 3D genome) is complex, dynamic and crucial for regulating genomic function and nuclear processes such as DNA replication, transcription and DNA-damage repair.

Functions and consequences

In the brain

Memory

Memory formation and maintenance are due to epigenetic alterations that cause the required dynamic changes in gene transcription that create and renew memory in neurons.

An event can set off a chain of reactions that result in altered methylations of a large set of genes in neurons, which give a representation of the event, a memory.

Areas of the brain important in the formation of memories include the hippocampus, medial prefrontal cortex (mPFC), anterior cingulate cortex and amygdala.

When a strong memory is created, as in a rat subjected to contextual fear conditioning (CFC), one of the earliest events to occur is that more than 100 DNA double-strand breaks are formed by topoisomerase IIB in neurons of the hippocampus and the medial prefrontal cortex (mPFC). These double-strand breaks are at specific locations that allow activation of transcription of immediate early genes (IEGs) that are important in memory formation, allowing their expression in mRNA, with peak mRNA transcription at seven to ten minutes after CFC.

Two important IEGs in memory formation are EGR1 and the alternative promoter variant of DNMT3A, DNMT3A2. EGR1 protein binds to DNA at its binding motifs, 5′-GCGTGGGCG-3′ or 5′-GCGGGGGCGG-3′, and there are about 12,000 genomic locations at which EGR1 protein can bind. EGR1 protein binds to DNA in gene promoter and enhancer regions. EGR1 recruits the demethylating enzyme TET1, bringing it to about 600 locations on the genome where TET1 can then demethylate and activate the associated genes.
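A small sketch of scanning one strand of a DNA sequence for the two EGR1 motifs quoted above (the demo sequence is invented; real motif analysis would use position weight matrices and scan both strands):

```python
# Illustrative motif scan for the EGR1 binding motifs quoted in the text.
# The demo sequence is made up; function and variable names are assumptions.

MOTIFS = ("GCGTGGGCG", "GCGGGGGCGG")

def find_egr1_sites(seq: str) -> list[tuple[int, str]]:
    """Return (position, motif) for every motif occurrence on one strand."""
    hits = []
    for motif in MOTIFS:
        start = seq.find(motif)
        while start != -1:
            hits.append((start, motif))
            start = seq.find(motif, start + 1)
    return sorted(hits)

demo = "TTGCGTGGGCGAAATGCGGGGGCGGTT"
print(find_egr1_sites(demo))  # [(2, 'GCGTGGGCG'), (15, 'GCGGGGGCGG')]
```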

Cytosine and 5-methylcytosine

The DNA methyltransferases DNMT3A1, DNMT3A2 and DNMT3B can all methylate cytosines at CpG sites in or near the promoters of genes. As shown by Manzo et al., these three DNA methyltransferases differ in their genomic binding locations and DNA methylation activity at different regulatory sites. Manzo et al. located 3,970 genome regions exclusively enriched for DNMT3A1, 3,838 regions for DNMT3A2 and 3,432 regions for DNMT3B. When DNMT3A2 is newly induced as an IEG (when neurons are activated), many new cytosine methylations occur, presumably in the target regions of DNMT3A2. Oliveira et al. found that the neuronal activity-inducible IEG levels of Dnmt3a2 in the hippocampus determined the ability to form long-term memories.

Rats form long-term associative memories after contextual fear conditioning (CFC). Duke et al. found that 24 hours after CFC in rats, in hippocampus neurons, 2,097 genes (9.17% of the genes in the rat genome) had altered methylation. When newly methylated cytosines are present in CpG sites in the promoter regions of genes, the genes are often repressed, and when newly demethylated cytosines are present the genes may be activated. After CFC, there were 1,048 genes with reduced mRNA expression and 564 genes with upregulated mRNA expression. Similarly, when mice undergo CFC, one hour later in the hippocampus region of the mouse brain there are 675 demethylated genes and 613 hypermethylated genes. However, memories do not remain in the hippocampus, but after four or five weeks the memories are stored in the anterior cingulate cortex. In the studies on mice after CFC, Halder et al. showed that four weeks after CFC there were at least 1,000 differentially methylated genes and more than 1,000 differentially expressed genes in the anterior cingulate cortex, while at the same time the altered methylations in the hippocampus were reversed.
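As a quick arithmetic cross-check of the figures quoted above (a sketch; gene totals are approximate): 2,097 genes said to be 9.17% of the rat genome implies a genome of roughly 22,900 genes, and the 1,048 down-regulated plus 564 up-regulated genes account for 1,612 of the 2,097 methylation-altered genes.

```python
# Sanity-check arithmetic on the figures quoted in the text (approximate).

altered_genes = 2097
fraction_of_genome = 0.0917            # 9.17%

implied_total = altered_genes / fraction_of_genome
print(round(implied_total))            # 22868 (~22,900 genes)

# Genes with changed mRNA levels 24 h after contextual fear conditioning:
changed_expression = 1048 + 564
print(changed_expression)              # 1612 of the 2097 methylation-altered genes
```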

The epigenetic alteration of methylation after a new memory is established creates a different pool of nuclear mRNAs. As reviewed by Bernstein, the epigenetically determined new mix of nuclear mRNAs are often packaged into neuronal granules, or messenger RNP, consisting of mRNA, small and large ribosomal subunits, translation initiation factors and RNA-binding proteins that regulate mRNA function. These neuronal granules are transported from the neuron nucleus and are directed, according to 3′ untranslated regions of the mRNA in the granules (their "zip codes"), to neuronal dendrites. Roughly 2,500 mRNAs may be localized to the dendrites of hippocampal pyramidal neurons and perhaps 450 transcripts are in excitatory presynaptic nerve terminals (dendritic spines). The altered assortments of transcripts (dependent on epigenetic alterations in the neuron nucleus) have different sensitivities in response to signals, which is the basis of altered synaptic plasticity. Altered synaptic plasticity is often considered the neurochemical foundation of learning and memory.

Aging

Epigenetics plays a major role in brain aging and age-related cognitive decline, with relevance to life extension.

Other and general

In adulthood, changes in the epigenome are important for various higher cognitive functions. Dysregulation of epigenetic mechanisms is implicated in neurodegenerative disorders and diseases. Epigenetic modifications in neurons are dynamic and reversible. Epigenetic regulation impacts neuronal action, affecting learning, memory, and other cognitive processes.

Early events, including during embryonic development, can influence development, cognition, and health outcomes through epigenetic mechanisms.

Epigenetic mechanisms have been proposed as "a potential molecular mechanism for effects of endogenous hormones on the organization of developing brain circuits".

Nutrients could interact with the epigenome to "protect or boost cognitive processes across the lifespan".

At the axo-ciliary synapse, serotonergic axons communicate with the antenna-like primary cilia of CA1 pyramidal neurons, altering the neuron's epigenetic state in the nucleus via signalling distinct from that at the plasma membrane, with longer-term effects.

Epigenetics also plays a major role in the evolution of the human brain.

Development

Developmental epigenetics can be divided into predetermined and probabilistic epigenesis. Predetermined epigenesis is a unidirectional movement from structural development in DNA to the functional maturation of the protein. "Predetermined" here means that development is scripted and predictable. Probabilistic epigenesis, on the other hand, is a bidirectional structure-function development, with experiences and external factors molding development.

Somatic epigenetic inheritance, particularly through DNA and histone covalent modifications and nucleosome repositioning, is very important in the development of multicellular eukaryotic organisms. The genome sequence is static (with some notable exceptions), but cells differentiate into many different types, which perform different functions, and respond differently to the environment and intercellular signaling. Thus, as individuals develop, morphogens activate or silence genes in an epigenetically heritable fashion, giving cells a memory. In mammals, most cells terminally differentiate, with only stem cells retaining the ability to differentiate into several cell types ("totipotency" and "multipotency"). In mammals, some stem cells continue producing newly differentiated cells throughout life, such as in neurogenesis, but mammals cannot respond to the loss of some tissues – for example, they are unable to regenerate limbs, as some other animals can. Epigenetic modifications regulate the transition from neural stem cells to glial progenitor cells (for example, differentiation into oligodendrocytes is regulated by the deacetylation and methylation of histones). Unlike animals, plant cells do not terminally differentiate, remaining totipotent with the ability to give rise to a new individual plant. While plants do utilize many of the same epigenetic mechanisms as animals, such as chromatin remodeling, it has been hypothesized that some kinds of plant cells do not use or require "cellular memories", resetting their gene expression patterns using positional information from the environment and surrounding cells to determine their fate.

Epigenetic changes can occur in response to environmental exposure – for example, mice whose mothers received dietary genistein supplementation (250 mg/kg) show epigenetic changes affecting expression of the agouti gene, which affects their fur color, weight, and propensity to develop cancer. Ongoing research is focused on exploring the impact of other known teratogens, such as diabetic embryopathy, on methylation signatures.

Controversial results from one study suggested that traumatic experiences might produce an epigenetic signal that is capable of being passed to future generations. Mice were trained, using foot shocks, to fear a cherry blossom odor. The investigators reported that the mouse offspring had an increased aversion to this specific odor. They suggested epigenetic changes that increase gene expression, rather than changes in the DNA itself, in M71, a gene that governs the functioning of an odor receptor in the nose that responds specifically to this cherry blossom smell. There were physical changes that correlated with olfactory (smell) function in the brains of the trained mice and their descendants. Several criticisms were reported, including the study's low statistical power, as evidence of some irregularity such as bias in reporting results. Due to limits of sample size, there is a probability that an effect will not be demonstrated to within statistical significance even if it exists. The criticism suggested that the probability that all the experiments reported would show positive results, if an identical protocol was followed and the claimed effects existed, is merely 0.4%. The authors also did not indicate which mice were siblings, and treated all of the mice as statistically independent. The original researchers pointed out negative results in the paper's appendix that the criticism omitted from its calculations, and undertook to track which mice were siblings in the future.

Transgenerational

Epigenetic mechanisms were a necessary part of the evolutionary origin of cell differentiation. Although epigenetics in multicellular organisms is generally thought to be a mechanism involved in differentiation, with epigenetic patterns "reset" when organisms reproduce, there have been some observations of transgenerational epigenetic inheritance (e.g., the phenomenon of paramutation observed in maize). Although most of these multigenerational epigenetic traits are gradually lost over several generations, the possibility remains that multigenerational epigenetics could be another aspect to evolution and adaptation. As mentioned above, some define epigenetics as heritable.

A sequestered germ line or Weismann barrier is specific to animals, and epigenetic inheritance is more common in plants and microbes. Eva Jablonka, Marion J. Lamb and Étienne Danchin have argued that these effects may require enhancements to the standard conceptual framework of the modern synthesis and have called for an extended evolutionary synthesis. Other evolutionary biologists, such as John Maynard Smith, have incorporated epigenetic inheritance into population-genetics models or are openly skeptical of the extended evolutionary synthesis (Michael Lynch). Thomas Dickins and Qazi Rahman state that epigenetic mechanisms such as DNA methylation and histone modification are genetically inherited under the control of natural selection and therefore fit under the earlier "modern synthesis".

Two important ways in which epigenetic inheritance can differ from traditional genetic inheritance, with important consequences for evolution, are:

  • rates of epimutation can be much faster than rates of mutation
  • the epimutations are more easily reversible

In plants, heritable DNA methylation mutations are 100,000 times more likely to occur compared to DNA mutations. An epigenetically inherited element such as the PSI+ system can act as a "stop-gap", good enough for short-term adaptation that allows the lineage to survive for long enough for mutation and/or recombination to genetically assimilate the adaptive phenotypic change. The existence of this possibility increases the evolvability of a species.

More than 100 cases of transgenerational epigenetic inheritance phenomena have been reported in a wide range of organisms, including prokaryotes, plants, and animals. For instance, mourning-cloak butterflies will change color through hormone changes in response to experimentation with varying temperatures.

The filamentous fungus Neurospora crassa is a prominent model system for understanding the control and function of cytosine methylation. In this organism, DNA methylation is associated with relics of a genome-defense system called RIP (repeat-induced point mutation) and silences gene expression by inhibiting transcription elongation.

The yeast prion PSI is generated by a conformational change of a translation termination factor, which is then inherited by daughter cells. This can provide a survival advantage under adverse conditions, exemplifying epigenetic regulation which enables unicellular organisms to respond rapidly to environmental stress. Prions can be viewed as epigenetic agents capable of inducing a phenotypic change without modification of the genome.

Direct detection of epigenetic marks in microorganisms is possible with single molecule real time sequencing, in which polymerase sensitivity allows for measuring methylation and other modifications as a DNA molecule is being sequenced. Several projects have demonstrated the ability to collect genome-wide epigenetic data in bacteria.

Epigenetics in bacteria

Escherichia coli bacteria

While epigenetics is of fundamental importance in eukaryotes, especially metazoans, it plays a different role in bacteria. Most importantly, eukaryotes use epigenetic mechanisms primarily to regulate gene expression, which bacteria rarely do. However, bacteria make widespread use of postreplicative DNA methylation for the epigenetic control of DNA-protein interactions. Bacteria also use DNA adenine methylation (rather than DNA cytosine methylation) as an epigenetic signal. DNA adenine methylation is important in bacterial virulence in organisms such as Escherichia coli, Salmonella, Vibrio, Yersinia, Haemophilus, and Brucella. In Alphaproteobacteria, methylation of adenine regulates the cell cycle and couples gene transcription to DNA replication. In Gammaproteobacteria, adenine methylation provides signals for DNA replication, chromosome segregation, mismatch repair, packaging of bacteriophage, transposase activity and regulation of gene expression. There exists a genetic switch controlling Streptococcus pneumoniae (the pneumococcus) that allows the bacterium to randomly change its characteristics into six alternative states, a finding that could pave the way to improved vaccines. Each form is randomly generated by a phase-variable methylation system. The ability of the pneumococcus to cause deadly infections is different in each of these six states. Similar systems exist in other bacterial genera. In Bacillota such as Clostridioides difficile, adenine methylation regulates sporulation, biofilm formation and host adaptation.

Medicine

Epigenetics has many and varied potential medical applications.

Twins

Direct comparisons of identical twins constitute an optimal model for interrogating environmental epigenetics. In the case of humans with different environmental exposures, monozygotic (identical) twins were epigenetically indistinguishable during their early years, while older twins had remarkable differences in the overall content and genomic distribution of 5-methylcytosine DNA and histone acetylation. The twin pairs who had spent less of their lifetime together and/or had greater differences in their medical histories were those who showed the largest differences in their levels of 5-methylcytosine DNA and acetylation of histones H3 and H4.

Dizygotic (fraternal) and monozygotic (identical) twins show evidence of epigenetic influence in humans. Sequence differences, which would be abundant in a singleton-based study, do not interfere with the analysis. Environmental differences can produce long-term epigenetic effects, and different developmental monozygotic twin subtypes may differ in their susceptibility to becoming discordant from an epigenetic point of view.

A high-throughput study, which denotes technology that looks at extensive genetic markers, focused on epigenetic differences between monozygotic twins, comparing global and locus-specific changes in DNA methylation and histone modifications in a sample of 40 monozygotic twin pairs. In this case, only healthy twin pairs were studied, but a wide range of ages was represented, between 3 and 74 years. One of the major conclusions from this study was that there is an age-dependent accumulation of epigenetic differences between the two siblings of twin pairs. This accumulation suggests the existence of epigenetic "drift". Epigenetic drift is the term given to epigenetic modifications as they occur as a direct function of age. While age is a known risk factor for many diseases, age-related methylation has been found to occur differentially at specific sites along the genome. Over time, this can result in measurable differences between biological and chronological age. Epigenetic changes have been found to be reflective of lifestyle and may act as functional biomarkers of disease before clinical thresholds are reached.

A more recent study, in which 114 monozygotic twins and 80 dizygotic twins were analyzed for the DNA methylation status of around 6,000 unique genomic regions, concluded that epigenetic similarity at the time of blastocyst splitting may also contribute to phenotypic similarities in monozygotic co-twins. This supports the notion that the microenvironment at early stages of embryonic development can be quite important for the establishment of epigenetic marks. Congenital genetic disease is well understood, and it is clear that epigenetics can play a role, for example, in the case of Angelman syndrome and Prader–Willi syndrome. These are normal genetic diseases caused by gene deletions or gene inactivation, but they are unusually common because individuals are essentially hemizygous due to genomic imprinting, and therefore a single gene knockout is sufficient to cause the disease, where most cases would require both copies to be knocked out.

Genomic imprinting

Some human disorders are associated with genomic imprinting, a phenomenon in mammals where the father and mother contribute different epigenetic patterns for specific genomic loci in their germ cells. The best-known case of imprinting in human disorders is that of Angelman syndrome and Prader–Willi syndrome – both can be produced by the same genetic mutation, chromosome 15q partial deletion, and the particular syndrome that will develop depends on whether the mutation is inherited from the child's mother or from their father.

In the Överkalix study, paternal (but not maternal) grandsons of Swedish men who were exposed during preadolescence to famine in the 19th century were less likely to die of cardiovascular disease. If food was plentiful, then diabetes mortality in the grandchildren increased, suggesting that this was a transgenerational epigenetic inheritance. The opposite effect was observed for females – the paternal (but not maternal) granddaughters of women who experienced famine while in the womb (and therefore while their eggs were being formed) lived shorter lives on average.

Examples of drugs altering gene expression from epigenetic events

Beta-lactam antibiotics can alter glutamate receptor activity, and cyclosporine acts on multiple transcription factors. Additionally, lithium can impact autophagy of aberrant proteins, and chronic use of opioid drugs can increase the expression of genes associated with addictive phenotypes.

Parental nutrition, in utero exposure to stress or endocrine disrupting chemicals, male-induced maternal effects such as the attraction of differential mate quality, and maternal as well as paternal age, and offspring gender could all possibly influence whether a germline epimutation is ultimately expressed in offspring and the degree to which intergenerational inheritance remains stable throughout posterity. However, whether and to what extent epigenetic effects can be transmitted across generations remains unclear, particularly in humans.

Addiction

Addiction is a disorder of the brain's reward system which arises through transcriptional and neuroepigenetic mechanisms and occurs over time from chronically high levels of exposure to an addictive stimulus (e.g., morphine, cocaine, sexual intercourse, gambling). Transgenerational epigenetic inheritance of addictive phenotypes has been noted to occur in preclinical studies. However, robust evidence in support of the persistence of epigenetic effects across multiple generations has yet to be established in humans; an example would be an epigenetic effect of prenatal exposure to smoking observed in great-grandchildren who had not been exposed.

Research

The two forms of heritable information, namely genetic and epigenetic, are collectively called dual inheritance. Members of the APOBEC/AID family of cytosine deaminases may concurrently influence genetic and epigenetic inheritance using similar molecular mechanisms, and may be a point of crosstalk between these conceptually compartmentalized processes.

Fluoroquinolone antibiotics induce epigenetic changes in mammalian cells through iron chelation. This leads to epigenetic effects through inhibition of α-ketoglutarate-dependent dioxygenases that require iron as a co-factor.

Various pharmacological agents are applied for the production of induced pluripotent stem cells (iPSC) or to maintain the embryonic stem cell (ESC) phenotype via epigenetic approaches. Adult stem cells, such as bone marrow stem cells, have also shown a potential to differentiate into cardiac-competent cells when treated with the G9a histone methyltransferase inhibitor BIX01294.

Cell plasticity, which is the adaptation of cells to stimuli without changes in their genetic code, requires epigenetic changes. These have been observed in cell plasticity in cancer cells during epithelial-to-mesenchymal transition and also in immune cells, such as macrophages. Interestingly, metabolic changes underlie these adaptations, since various metabolites play crucial roles in the chemistry of epigenetic marks. This includes for instance alpha-ketoglutarate, which is required for histone demethylation, and acetyl-Coenzyme A, which is required for histone acetylation.

Epigenome editing

Mechanisms of epigenetic regulation of gene expression that can be altered or exploited in epigenome editing include mRNA/lncRNA modification, DNA methylation modification and histone modification.

CpG sites, SNPs and biological traits

Methylation is a widely characterized mechanism of genetic regulation that can determine biological traits. However, strong experimental evidence correlates methylation patterns at SNPs with biological traits, an important additional feature beyond the classical activation/inhibition epigenetic dogma. Molecular interaction data, supported by colocalization analyses, identify multiple nuclear regulatory pathways, linking sequence variation to disturbances in DNA methylation and to molecular and phenotypic variation.

UBASH3B locus

UBASH3B encodes a protein with tyrosine phosphatase activity, which has previously been linked to advanced neoplasia. SNP rs7115089 was identified as influencing DNA methylation and expression of this locus, as well as body mass index (BMI). In fact, SNP rs7115089 is strongly associated with BMI and with genetic variants linked to other cardiovascular and metabolic traits in GWASs. New studies suggest UBASH3B is a potential mediator of adiposity and cardiometabolic disease. In addition, animal models demonstrated that UBASH3B expression is an indicator of caloric restriction that may drive programmed susceptibility to obesity, and it is associated with other measures of adiposity in human peripheral blood.

NFKBIE locus

SNP rs730775 is located in the first intron of NFKBIE and is a cis-eQTL for NFKBIE in whole blood. Nuclear factor (NF)-κB inhibitor ε (NFKBIE) directly inhibits NF-κB1 activity, is significantly co-expressed with NF-κB1, and is associated with rheumatoid arthritis. Colocalization analysis supports the idea that variants at the majority of the CpG sites near SNP rs730775 reflect genetic variation at the NFKBIE locus, which is suggested to be linked to rheumatoid arthritis through trans-acting regulation of DNA methylation by NF-κB.

FADS1 locus

Fatty acid desaturase 1 (FADS1) is a key enzyme in the metabolism of fatty acids. Moreover, rs174548 in the FADS1 gene shows increased correlation with DNA methylation in people with a high abundance of CD8+ T cells. SNP rs174548 is strongly associated with concentrations of arachidonic acid and other metabolites in fatty acid metabolism, with blood eosinophil counts, and with inflammatory diseases such as asthma. Interaction results indicated a correlation between rs174548 and asthma, providing new insights about fatty acid metabolism in CD8+ T cells and immune phenotypes.

Pseudoscience

As epigenetics is in the early stages of development as a science and is surrounded by sensationalism in the public media, David Gorski and geneticist Adam Rutherford have advised caution against the proliferation of false and pseudoscientific conclusions by new age authors making unfounded suggestions that a person's genes and health can be manipulated by mind control. Misuse of the scientific term by quack authors has produced misinformation among the general public.

Wednesday, December 3, 2025

Innatism

From Wikipedia, the free encyclopedia

In the philosophy of mind, innatism is the view that the mind is born with already-formed ideas, knowledge, and beliefs. The opposing doctrine, that the mind is a tabula rasa (blank slate) at birth and all knowledge is gained from experience and the senses, is called empiricism.

Difference from nativism

Innatism and nativism are generally synonymous terms referring to the notion of preexisting ideas in the mind. More specifically, however, innatism refers to the philosophy of Descartes, who assumed that God or a similar being or process placed innate ideas and principles in the human mind. The innatist principles in this regard may overlap with similar philosophical concepts such as natural order and the state of nature.

Nativism represents an adaptation of this, grounded in the fields of genetics, cognitive psychology, and psycholinguistics. Nativists hold that innate beliefs are in some way genetically programmed in our minds: they are the phenotypes of certain genotypes that all humans share. Nativism is a modern view rooted in innatism. The advocates of nativism are mainly philosophers who also work in the field of cognitive psychology or psycholinguistics, most notably Noam Chomsky and Jerry Fodor (although the latter adopted a more critical attitude toward nativism in his later writings). The nativists' general objection to empiricism is the same as that raised by the rationalists: the mind of a newborn child is not a tabula rasa but is equipped with an inborn structure.

History

Although individual human beings vary in many ways (culturally, ethnically, linguistically, and so on), innate ideas are the same for everyone everywhere. For example, the philosopher René Descartes theorized that knowledge of God is innate in everybody. Philosophers such as Descartes and Plato were rationalists. Other philosophers, most notably the empiricists, were critical of innate ideas and denied they existed.

The debate over innate ideas is central to the conflict between rationalists (who believe certain ideas exist independently of experience) and empiricists (who believe knowledge is derived from experience).

Many believe the German philosopher Immanuel Kant synthesized these two early modern traditions in his philosophical thought.

Plato

Plato argues that if there are certain concepts that we know to be true but did not learn from experience, it must be because we have innate knowledge of them, and that this knowledge must have been gained before birth. In the Meno, Plato recounts a scene in which his mentor Socrates questions a slave boy about geometry. Though the slave boy had no previous experience with geometry, he was able to answer correctly. Plato reasoned that this was possible because Socrates' questions sparked the innate knowledge of mathematics the boy had from birth.

Descartes

Descartes conveys the idea that innate knowledge or ideas are something inborn, much as one might say that a certain disease is 'innate' to signify that a person is at risk of contracting it. He suggests that something 'innate' is effectively present from birth, and while it may not reveal itself then, it is likely to present itself later in life. Descartes compares innate knowledge to an innate disease whose symptoms may show up only later in life: if some factor prevents someone from exhibiting an innate behaviour or piece of knowledge, this does not mean the knowledge never existed, but rather that it was not expressed. In other words, innate beliefs, ideas and knowledge require experiences to be triggered, or they may never be expressed. Experiences are not the source of knowledge, as proposed by John Locke, but catalysts for the uncovering of knowledge.

Gottfried Wilhelm Leibniz

Gottfried Wilhelm Leibniz suggested that we are born with certain innate ideas, the most identifiable of these being mathematical truisms. The idea that 1 + 1 = 2 is evident to us without the necessity of empirical evidence. Leibniz argues that empiricism can only show us that concepts are true in the present: the observation of one apple and then another in one instance, and in that instance only, leads to the conclusion that one and another equals two. However, the suggestion that one and another will always equal two requires an innate idea, as that would be a claim about things unwitnessed.

Leibniz called such concepts, mathematical truisms among them, "necessary truths". Other examples are the phrases "What is, is" and "It is impossible for the same thing to be and not to be". Leibniz argues that such truisms are universally assented to (acknowledged by all to be true); this being the case, it must be due to their status as innate ideas. Some ideas are acknowledged as necessarily true but are not universally assented to; Leibniz would suggest that this is simply because the person in question has not become aware of the innate idea, not because they do not possess it. Leibniz argues that empirical evidence can serve to bring to the surface certain principles that are already innately embedded in our minds, much as one needs to hear only the first few notes of a melody to recall the rest.
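Leibniz's distinction can be made concrete: a "necessary truth" is one that follows from definitions alone, with no appeal to observation. As an illustrative sketch (not anything Leibniz himself had available), both of his examples can be stated and checked in the Lean proof assistant by pure computation on definitions:

```lean
-- 1 + 1 = 2 holds by unfolding the definition of addition on the
-- natural numbers; no apples need to be observed.
example : 1 + 1 = 2 := rfl

-- "What is, is": every proposition entails itself.
example (P : Prop) : P → P := fun h => h
```

That both proofs go through without any empirical premise is the modern analogue of Leibniz's claim that such truths are knowable independently of experience.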

John Locke

The main antagonist to the concept of innate ideas is John Locke, a contemporary of Leibniz. Locke argued that the mind is in fact devoid of all knowledge or ideas at birth; it is a blank sheet or tabula rasa. He argued that all our ideas are constructed in the mind via a process of constant composition and decomposition of the input that we receive through our senses.

Locke, in An Essay Concerning Human Understanding, suggests that the concept of universal assent in fact proves nothing, except perhaps that everyone is in agreement; in short, universal assent proves that there is universal assent and nothing else. Moreover, Locke goes on to suggest that in fact there is no universal assent. Even a phrase such as "What is, is" is not universally assented to; infants and severely mentally disabled adults do not generally acknowledge this truism. Locke also attacks the idea that an innate idea can be imprinted on the mind without the owner realizing it. For Locke, such reasoning would allow one to conclude the absurd: "All the Truths a Man ever comes to know, will, by this account, be, every one of them, innate." To return to the musical analogy, we may not be able to recall the entire melody until we hear the first few notes, but we were aware that we knew the melody and that upon hearing the first few notes we would be able to recall the rest.

Locke ends his attack upon innate ideas by suggesting that the mind is a tabula rasa or "blank slate", and that all ideas come from experience; all our knowledge is founded in sensory experience.

Essentially, the same knowledge thought to be a priori by Leibniz is, according to Locke, the result of empirical knowledge whose origin has been forgotten by the inquirer. The inquirer, not being cognizant of this, experiences what he believes to be a priori knowledge.

  1. The theory of innate knowledge is excessive. Even innatists accept that most of our knowledge is learned through experience; if experience can be extended to account for all knowledge (we learn color through seeing it, for example), then there is no need for a theory of an innate understanding of color.
  2. No ideas are universally held. Do we all possess the idea of God? Do we all believe in justice and beauty? Do we all understand the law of identity? If not, these ideas may have been acquired through impressions, experience, or social interaction, rather than being innate.
  3. Even if there are some universally agreed statements, what is innate may be just the ability of the human brain to organize learned ideas and words. An "ability to organize" is not the same as "possessing propositional knowledge" (e.g., a computer with no saved files has all the operations programmed in but has an empty memory).

Contemporary approaches

Linguistics

In his Meno, Plato raises an important epistemological quandary: how is it that we have certain ideas that are not conclusively derivable from our environments? Noam Chomsky has taken this problem as a philosophical framework for the scientific inquiry into innatism. His linguistic theory, which derives from 18th-century classical-liberal thinkers such as Wilhelm von Humboldt, attempts to explain in cognitive terms how we can develop knowledge of systems which are said, by supporters of innatism, to be too rich and complex to be derived from our environment. One such example is our linguistic faculty. Our linguistic systems contain a systemic complexity which supposedly could not be empirically derived: the environment seems too poor, variable and indeterminate, according to Chomsky, to explain the extraordinary ability of very young children to learn complex concepts. Essentially, their accurate grammatical knowledge cannot have originated from their experiences, as their experiences are not adequate. It follows that humans must be born with a universal innate grammar, which is determinate, has a highly organized directive component, and enables the language learner to ascertain and categorize heard language into a system. Chomsky states that the ability to learn how to properly construct sentences, or to know which sentences are grammatically incorrect, is gained from innate knowledge. As evidence for this theory, Chomsky cites the apparent invariability, on his view, of human languages at a fundamental level. In this way, linguistics may provide a window into the human mind and establish scientific theories of innateness which would otherwise remain merely speculative.

One implication of Noam Chomsky's innatism, if correct, is that at least a part of human knowledge consists in cognitive predispositions, which are triggered and developed by the environment but not determined by it. Chomsky suggests that we can look at how a belief is acquired as an input-output situation. He supports the doctrine of innatism by noting that human beliefs gathered from sensory experience are much richer and more complex than the experience itself. He asserts that the extra information comes from the mind itself, as it cannot come solely from experience: humans derive an excess of information from their environment, so some of that information must be pre-determined.

Interplanetary contamination

From Wikipedia, the free encyclopedia

Interplanetary contamination refers to biological contamination of a planetary body by a space probe or spacecraft, either deliberate or unintentional.

There are two types of interplanetary contamination:

  • Forward contamination is the transfer of life and other forms of contamination from Earth to another celestial body.
  • Back contamination is the introduction of extraterrestrial organisms and other forms of contamination into Earth's biosphere. It also covers infection of humans and human habitats in space and on other celestial bodies by extraterrestrial organisms, if such organisms exist.

The main focus is on microbial life and on potentially invasive species. Non-biological forms of contamination have also been considered, including contamination of sensitive deposits (such as lunar polar ice deposits) of scientific interest. In the case of back contamination, multicellular life is thought unlikely but has not been ruled out. In the case of forward contamination, contamination by multicellular life (e.g. lichens) is unlikely to occur for robotic missions, but it becomes a consideration in crewed missions to Mars.

Current space missions are governed by the Outer Space Treaty and the COSPAR guidelines for planetary protection. Forward contamination is prevented primarily by sterilizing the spacecraft. In the case of sample-return missions, the aim of the mission is to return extraterrestrial samples to Earth, and sterilization of the samples would make them of much less interest. So, back contamination would be prevented mainly by containment, and breaking the chain of contact between the planet of origin and Earth. It would also require quarantine procedures for the materials and for anyone who comes into contact with them.

Overview

Most of the Solar System appears hostile to life as we know it, and no extraterrestrial life has ever been discovered. But if extraterrestrial life exists, it may be vulnerable to interplanetary contamination by foreign microorganisms: some extremophiles may be able to survive space travel to another planet, and foreign life could possibly be introduced by spacecraft from Earth. Some believe this possibility poses scientific and ethical concerns.

Locations within the Solar System where life might exist today include the oceans of liquid water beneath the icy surface of Europa, Enceladus, and Titan (its surface has oceans of liquid ethane / methane, but it may also have liquid water below the surface and ice volcanoes).

There are multiple consequences for both forward- and back-contamination. If a planet becomes contaminated with Earth life, it might then be difficult to tell whether any lifeforms discovered originated there or came from Earth. Furthermore, the organic chemicals produced by the introduced life would confuse sensitive searches for biosignatures of living or ancient native life. The same applies to other more complex biosignatures. Life on other planets could have a common origin with Earth life, since in the early Solar System there was much exchange of material between the planets which could have transferred life as well. If so, it might be based on nucleic acids too (RNA or DNA).

Most species of microorganisms on Earth are not yet well understood, characterized, or DNA-sequenced; many cannot be cultured in labs and are known only from DNA fragments obtained with swabs. This particularly applies to the unculturable archaea. Such organisms are difficult to study because they may depend on the presence of other microorganisms, be slow-growing, or depend on other conditions not yet understood; in typical habitats, 99% of microorganisms are not culturable. On a contaminated planet, it might therefore be difficult to distinguish the DNA of extraterrestrial life from the DNA of life brought to the planet by exploring spacecraft. Introduced Earth life could also contaminate resources of value for future human missions, such as water.

If there is life on the planet, invasive species from Earth could outcompete it or consume it. Experience on Earth shows that species moved from one continent to another may be able to outcompete the native life adapted to that continent. Moreover, evolutionary processes on Earth might have developed biological pathways different from those of extraterrestrial organisms, allowing Earth life to outcompete it. The same is also possible the other way around for contamination introduced into Earth's biosphere.

In addition to science research concerns, there are also attempts to raise ethical and moral concerns regarding intentional or unintentional interplanetary transport of life.

Evidence for possible habitats outside Earth

Enceladus and Europa show the best evidence for current habitats, mainly due to the possibility of their hosting liquid water and organic compounds.

Mars

There is ample evidence to suggest that Mars once offered habitable conditions for microbial life. It is therefore possible that microbial life may have existed on Mars, although no evidence has been found.

It is thought that many bacterial spores (endospores) from Earth were transported to Mars on spacecraft. Some may still be protected within Martian rovers and landers on the shallow surface of the planet. In that sense, Mars may have already been contaminated.

Certain lichens from the Arctic permafrost are able to photosynthesize and grow in the absence of any liquid water, simply by using the humidity from the atmosphere. They are also highly tolerant of UV radiation, using melanin and other more specialized chemicals to protect their cells.

Although numerous studies point to resistance to some Martian conditions, they do so separately; none has considered the full range of Martian surface conditions (temperature, pressure, atmospheric composition, radiation, humidity, oxidizing regolith, and others) all at the same time and in combination. Laboratory simulations show that whenever multiple lethal factors are combined, survival rates plummet quickly.

Other studies have suggested the potential for life to survive using deliquescing salts. These organisms, similarly to the lichens, use the humidity of the atmosphere. If the mixture of salts is right, the organisms may obtain liquid water at times of high atmospheric humidity, the salts capturing enough moisture to be capable of supporting life.

Research published in July 2017 shows that when irradiated with a simulated Martian UV flux, perchlorates become even more lethal to bacteria (a bactericidal effect). Even dormant spores lost viability within minutes. In addition, two other compounds of the Martian surface, iron oxides and hydrogen peroxide, act in synergy with irradiated perchlorates to cause a 10.8-fold increase in cell death compared with cells exposed to UV radiation alone, after 60 seconds of exposure. It was also found that abraded silicates (quartz and basalt) lead to the formation of toxic reactive oxygen species. The researchers concluded that "the surface of Mars is lethal to vegetative cells and renders much of the surface and near-surface regions uninhabitable." This research demonstrates that the present-day surface is even less habitable than previously thought, and reinforces the need to inspect at least a few meters into the ground, where radiation levels would be relatively low.

Enceladus

The Cassini spacecraft directly sampled the plumes escaping from Enceladus. Measured data indicates that these geysers are made primarily of salt rich particles with an 'ocean-like' composition, which is thought to originate from a subsurface ocean of liquid saltwater, rather than from the moon's icy surface. Data from the geyser flythroughs also indicate the presence of organic chemicals in the plumes. Heat scans of Enceladus's surface also indicate higher temperatures around the fissures where the geysers originate, with temperatures reaching −93 °C (−135 °F), which is 115 °C (207 °F) warmer than the surrounding surface regions.
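The temperature figures quoted above can be cross-checked with a simple unit conversion. A minimal sketch, using only the numbers given in the text:

```python
def c_to_f(c):
    """Convert a Celsius temperature to Fahrenheit."""
    return c * 9 / 5 + 32

def c_delta_to_f(dc):
    """Convert a temperature *difference* from °C to °F (no +32 offset)."""
    return dc * 9 / 5

fissure_c = -93   # temperature near the geyser fissures, in °C
delta_c = 115     # how much warmer than the surrounding surface, in °C

print(c_to_f(fissure_c))      # ≈ -135.4, matching the quoted -135 °F
print(c_delta_to_f(delta_c))  # 207.0, matching the quoted 207 °F difference
print(fissure_c - delta_c)    # surrounding surface ≈ -208 °C
```

Note that a temperature difference converts with the factor 9/5 only; adding the +32 offset is the classic error this distinction guards against.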

Europa

Europa has much indirect evidence for its sub-surface ocean. Models of how Europa is affected by tidal heating require a subsurface layer of liquid water in order to accurately reproduce the linear fracturing of the surface. Indeed, observations by the Galileo spacecraft of how Europa's magnetic field interacts with Jupiter's field strengthen the case for a liquid, rather than solid, layer; an electrically conductive fluid deep within Europa would explain these results. Observations from the Hubble Space Telescope in December 2012 appear to show an ice plume spouting from Europa's surface, which would immensely strengthen the case for a liquid subsurface ocean. As was the case for Enceladus, vapour geysers would allow for easy sampling of the liquid layer. Unfortunately, there appears to be little evidence that geysering is a frequent event on Europa, given the lack of water in the space near the moon.

Planetary protection

Forward contamination is prevented by sterilizing space probes sent to sensitive areas of the Solar System. Missions are classified depending on whether their destinations are of interest for the search for life, and whether there is any chance that Earth life could reproduce there.

NASA made these policies official with the issuance of Management Manual NMI-4-4-1, NASA Unmanned Spacecraft Decontamination Policy, on September 9, 1963. Prior to NMI-4-4-1, the same sterilization requirements were applied to all outgoing spacecraft regardless of their target. Difficulties in the sterilization of the Ranger probes sent to the Moon are the primary reason for NASA's change to assessing the likelihood of forward contamination on a target-by-target basis.

Some destinations, such as Mercury, need no precautions at all. Others, such as the Moon, require documentation but nothing more, while destinations such as Mars require sterilization of the rovers sent there.

Back contamination would be prevented by containment or quarantine. However, there have been no sample-returns thought to have any possibility of a back contamination risk since the Apollo missions. The Apollo regulations have been rescinded and new regulations have yet to be developed. See suggested precautions for sample-returns.

Crewed spacecraft

Crewed spacecraft are of particular concern for interplanetary contamination because of the impossibility of sterilizing a human to the same level as a robotic spacecraft; the chance of forward contamination is therefore higher than for a robotic mission. Humans typically host some hundred trillion microorganisms of ten thousand species in the human microbiome, which cannot be removed while preserving the life of the human. Containment seems the only option, but effective containment to the same standard as a robotic rover appears difficult to achieve with present-day technology. In particular, adequate containment in the event of a hard landing is a major challenge.

Human explorers may be potential carriers back to Earth of microorganisms acquired on Mars, if such microorganisms exist. Another issue is the contamination of the water supply by Earth microorganisms shed by humans in their stools, skin and breath, which could have a direct effect on the long-term human colonization of Mars.

Historical examples of measures taken to prevent contamination of the Moon include the anti-bacterial filter added to the Apollo Lunar Module from Apollo 13 onward. This was placed on the cabin relief valve to prevent contaminants from the cabin from being released into the lunar environment during depressurization of the crew compartment, prior to EVA.

The Moon

The Apollo 11 mission incited public concern about the possibility of microbes on the Moon, creating fears of a plague being brought back to Earth when the astronauts returned. NASA received thousands of letters from Americans concerned about the potential for back contamination.

As a testbed

The Moon has been suggested as a testbed for new technology to protect sites in the Solar System, and astronauts, from forward and back contamination. Currently, the Moon has no contamination restrictions because it is considered to be "not of interest" for prebiotic chemistry and origins of life. Analysis of the contamination left by the Apollo program astronauts could also yield useful ground truth for planetary protection models.

Non-contaminating exploration methods

Telerobotics exploration on Mars and Earth

One of the most reliable ways to reduce the risk of forward and back contamination during visits to extraterrestrial bodies is to use only robotic spacecraft. Humans in close orbit around the target planet could control equipment on the surface in real time via telepresence, so bringing many of the benefits of a surface mission, without its associated increased forward and back contamination risks.

Back contamination issues

Since the Moon is now generally considered to be free from life, the most likely source of contamination would be from Mars during either a Mars sample-return mission or as a result of a crewed mission to Mars. The possibility of new human pathogens, or environmental disruption due to back contamination, is considered to be of extremely low probability but cannot yet be ruled out.

NASA and ESA are actively developing a Mars Sample Return Program to return samples collected by the Perseverance Rover to Earth. The European Space Foundation report cites many advantages of a Mars sample-return. In particular, it would permit extensive analyses on Earth, without the size and weight constraints for instruments sent to Mars on rovers. These analyses could also be carried out without the communication delays for experiments carried out by Martian rovers. It would also make it possible to repeat experiments in multiple laboratories with different instruments to confirm key results.

Carl Sagan was the first to publicise back contamination issues that might follow from a Mars sample-return. In The Cosmic Connection (1973) he wrote:

Precisely because Mars is an environment of great potential biological interest, it is possible that on Mars there are pathogens, organisms which, if transported to the terrestrial environment, might do enormous biological damage.

Later in Cosmos (1980) Carl Sagan wrote:

Perhaps Martian samples can be safely returned to Earth. But I would want to be very sure before considering a returned-sample mission.

NASA and ESA hold similar views: their findings were that, with present-day technology, Martian samples can be safely returned to Earth provided the right precautions are taken.

Suggested precautions for sample-returns

NASA has already had experience with returning samples thought to represent a low back contamination risk when samples were returned for the first time by Apollo 11. At the time, it was thought that there was a low probability of life on the Moon, so the requirements were not very stringent. The precautions taken then were inadequate by current standards, however. The regulations used then have been rescinded, and new regulations and approaches for a sample-return would be needed.

Chain of contact

A sample-return mission would be designed to break the chain of contact between Mars and the exterior of the sample container, for instance by sealing the returned container inside another, larger container in the vacuum of space before it returns to Earth. To eliminate the risk of parachute failure, the capsule could fall at terminal velocity, with the impact cushioned by the capsule's thermal protection system. The sample container would be designed to withstand the force of the impact.
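The "no parachute" approach relies on the capsule's terminal velocity being survivable for the hardened sample container. A back-of-the-envelope sketch of the physics, with every parameter value hypothetical (none comes from an actual mission specification):

```python
import math

def terminal_velocity(mass_kg, drag_coeff, area_m2,
                      air_density=1.225, g=9.81):
    """Speed at which drag balances gravity: m*g = 0.5*rho*v^2*Cd*A."""
    return math.sqrt(2 * mass_kg * g / (air_density * drag_coeff * area_m2))

# Hypothetical capsule: 50 kg, blunt-body drag coefficient 0.8,
# ~0.2 m^2 frontal area, falling through sea-level air.
v = terminal_velocity(mass_kg=50, drag_coeff=0.8, area_m2=0.2)
print(f"{v:.0f} m/s")  # ≈ 71 m/s with these assumed numbers
```

A few tens of metres per second is a hard but engineerable impact, which is why the design burden shifts from the parachute to the crushable structure around the sample container.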

Receiving facility

Researchers working inside a BSL-4 laboratory, with air hoses providing positive air pressure to their suits

To receive, analyze and curate extraterrestrial soil samples, NASA has proposed to build a biohazard containment facility, tentatively known as the Mars Sample Return Receiving Facility (MSRRF). This future facility must be rated biosafety level 4 (BSL-4). While existing BSL-4 facilities deal primarily with fairly well-known organisms, a BSL-4 facility focused on extraterrestrial samples must plan its systems carefully in advance, mindful that there will be unforeseen issues during sample evaluation and curation that will require independent thinking and solutions.

The facility's systems must be able to contain unknown biohazards, since the sizes of any putative Martian microorganisms are unknown. With this in mind, additional requirements have been proposed: ideally the facility should filter particles of 0.01 μm or larger, and release of any particle 0.05 μm or larger is unacceptable under any circumstance.

The reason for the extremely small 0.01 μm limit is to account for gene transfer agents (GTAs), virus-like particles produced by some microorganisms that package random segments of DNA capable of horizontal gene transfer. These randomly incorporate segments of the host genome and can transfer them to other, evolutionarily distant hosts without killing the new host; in this way many archaea and bacteria can swap DNA with each other. This raises the possibility that Martian life, if it shares a common origin with Earth life in the distant past, could swap DNA with Earth microorganisms in the same way. In one experiment reported in 2010, researchers left GTAs (carrying DNA conferring antibiotic resistance) and marine bacteria overnight in natural conditions and found that by the next day up to 47% of the bacteria had incorporated the genetic material from the GTAs. The 0.05 μm limit, in turn, reflects the discovery of ultramicrobacteria as small as 0.2 μm across.

The BSL-4 containment facility must also double as a cleanroom to preserve the scientific value of the samples. A challenge is that, while it is relatively easy to contain the samples once returned to Earth, researchers will also want to remove parts of the sample and perform analyses. During all these handling procedures, the samples would need to be protected from Earthly contamination. A cleanroom is normally kept at a higher pressure than the external environment to keep contaminants out, while a biohazard laboratory is kept at a lower pressure to keep the biohazards in; combining the two in a single building requires compartmentalizing the specialized rooms. Suggested solutions include a triple-walled containment facility and extensive robotic handling of the samples.
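The pressure conflict described above can be resolved by nesting zones so that every leak path flows in a safe direction. A minimal sketch of one such cascade, with entirely hypothetical pressure values (in Pa relative to ambient) that are not from any published facility design:

```python
# Hypothetical nested pressure zones: the containment cabinet is more
# negative than the lab around it (hazards leak inward), while the clean
# sample enclosure inside the cabinet is more positive than the cabinet
# (Earthly particles are pushed away from the sample).
rooms = {
    "ambient": 0,
    "lab": -30,               # biohazard lab: below ambient
    "cabinet": -60,           # containment cabinet: below the lab
    "sample_enclosure": -45,  # clean enclosure: above the cabinet around it
}

# Air flows from higher to lower pressure, so any leak carries
# potential biohazards inward, toward the cabinet...
assert rooms["cabinet"] < rooms["lab"] < rooms["ambient"]
# ...while the sample's immediate enclosure still vents outward into
# the cabinet, keeping terrestrial contaminants off the sample.
assert rooms["sample_enclosure"] > rooms["cabinet"]
print("pressure cascade consistent")
```

The design choice is that "positive pressure" for the cleanroom function is defined relative to the zone immediately outside it, not relative to ambient, which is what lets both requirements hold at once.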

The facility would be expected to take 7 to 10 years from design to completion, and an additional two years is recommended for the staff to become accustomed to the facilities.

Dissenting views on back contamination

Robert Zubrin, from the Mars Society, maintains that the risk of back contamination is negligible. He supports this using an argument based on the possibility of transfer of life from Earth to Mars on meteorites.

Margaret Race has examined in detail the legal process of approval for a Mars sample-return. She found that under the National Environmental Policy Act (NEPA), which did not exist in the Apollo era, a formal environmental impact statement is likely to be required, along with public hearings during which all the issues would be aired openly. This process is likely to take up to several years to complete.

During this process, she found, the full range of worst-case accident scenarios, impacts, and project alternatives would be played out in the public arena. Other agencies, such as the Environmental Protection Agency and the Occupational Safety and Health Administration, might also get involved in the decision-making process.

The laws on quarantine would also need to be clarified, as the regulations for the Apollo program were rescinded. In the Apollo era, NASA delayed announcement of its quarantine regulations until the day Apollo was launched, bypassing the requirement for public debate, something that would likely not be tolerated today.

It is also probable that presidential directive NSC-25 would apply, requiring a review of large-scale alleged effects on the environment to be carried out after other domestic reviews, through a lengthy process leading eventually to presidential approval of the launch.

Apart from those domestic legal hurdles, there would be numerous international regulations and treaties to be negotiated in the case of a Mars sample-return, especially those relating to environmental protection and health. Race concluded that the public of necessity has a significant role to play in the development of the policies governing Mars sample-return.

Alternatives to sample-returns

Several exobiologists have suggested that a Mars sample-return is not necessary at this stage, and that it is better to focus more on in situ studies on the surface first. Although it is not their main motivation, this approach of course also eliminates back contamination risks.

Some of these exobiologists advocate more in situ studies followed by a sample-return in the near future. Others go as far as to advocate in situ study instead of a sample-return at the present state of understanding of Mars.

Their reasoning is that life on Mars is likely to be hard to find. Any present day life is likely to be sparse and occur in only a few niche habitats. Past life is likely to be degraded by cosmic radiation over geological time periods if exposed in the top few meters of the Mars surface. Also, only certain special deposits of salts or clays on Mars would have the capability to preserve organics for billions of years. So, they argue, there is a high risk that a Mars sample-return at our current stage of understanding would return samples that are no more conclusive about the origins of life on Mars or present day life than the Martian meteorite samples we already have.

Another consideration is the difficulty of keeping the sample completely free from Earth-life contamination during the return journey and during handling procedures on Earth. This could make it hard to show conclusively that any biosignatures detected do not result from contamination of the samples.

Instead, they advocate sending more sensitive instruments to the Martian surface on rovers. These could examine many different rocks and soil types and search for biosignatures on the surface, covering a wide range of materials that could not all be returned to Earth with current technology at reasonable cost.

A sample-return to Earth would then be considered at a later stage, once we have a reasonably thorough understanding of conditions on Mars, and possibly have already detected life there, either current or past life, through biosignatures and other in situ analyses.

Instruments under development for in situ analyses

  • NASA Marshall Space Flight Center is leading a research effort to develop a Miniaturized Variable Pressure Scanning Electron Microscope (MVP-SEM) for future lunar and Martian missions.[77]
  • Several teams, including ones led by Jonathan Rothberg and J. Craig Venter, are separately developing solutions for sequencing alien DNA directly on the Martian surface.
  • Gilbert Levin is working on updated versions of the Labeled Release instrument flown on Viking, including versions that rely on detecting chirality. This is of special interest because it could enable detection of life even if that life is not based on standard life chemistry.
  • The Urey Mars Organic and Oxidant Detector instrument for detection of biosignatures was due to be flown on ExoMars in 2018 but has since been descoped. It was designed with much higher sensitivity to biosignatures than any previous instrument.

Study and analyses from orbit

During the "Exploration Telerobotics Symposium" in 2012, experts on telerobotics from industry, NASA, and academia met to discuss telerobotics and its applications to space exploration. Among other issues, particular attention was given to Mars missions and a Mars sample-return.

They came to the conclusion that telerobotic approaches could permit direct study of the samples on the Mars surface via telepresence from Mars orbit, permitting rapid exploration and use of human cognition to take advantage of chance discoveries and feedback from the results obtained.

They found that telepresence exploration of Mars has many advantages. The astronauts have near real-time control of the robots, and can respond immediately to discoveries. It also prevents contamination both ways and has mobility benefits as well.

Finally, return of the sample to orbit has the advantage that it permits analysis of the sample without delay, to detect volatiles that may be lost during a voyage home.

Telerobotics exploration of Mars

Similar methods could be used to directly explore other biologically sensitive moons such as Europa, Titan, or Enceladus, once human presence in the vicinity becomes possible.

Forward contamination

The 2019 Beresheet incident

In August 2019, scientists reported that a capsule containing tardigrades (resilient microscopic animals) in a cryptobiotic state may have survived for a while on the Moon after the April 2019 crash landing of Beresheet, a failed Israeli lunar lander.

Science of morality

From Wikipedia, the free encyclopedia

Science of morality (also known as science of ethics or scientific ethics) may refer to various forms of ethical naturalism grounding morality and ethics in rational, empirical consideration of the natural world. It is sometimes framed as using the scientific approach to determine what is right and wrong, in contrast to the widespread belief that "science has nothing to say on the subject of human values".

Overview

Moral science may refer to the consideration of what is best for, and how to maximize the flourishing of, either particular individuals or all conscious creatures. It has been proposed that "morality" can be appropriately defined on the basis of fundamental premises necessary for any empirical, secular, or philosophical discussion and that societies can use the methods of science to provide answers to moral questions.

The norms advocated by moral scientists (e.g. rights to abortion, euthanasia, and drug liberalization under certain circumstances) would be founded upon the shifting and growing collection of human understanding. Even with science's admitted degree of ignorance, and the various semantic issues, moral scientists can meaningfully discuss things as being almost certainly "better" or "worse" for promoting flourishing.

History

In philosophy

Utilitarian Jeremy Bentham discussed some of the ways in which moral investigations are a science. He criticized deontological ethics for failing to recognize that, to really work, it needed to make the same presumptions as his science of morality, while pursuing rules that were to be obeyed in every situation – something that worried Bentham.

W. V. O. Quine advocated naturalizing epistemology by looking to natural sciences like psychology for a full explanation of knowledge. His work contributed to a resurgence of moral naturalism in the last half of the 20th century. Paul Kurtz, who believes that the careful, secular pursuit of normative rules is vital to society, coined the term eupraxophy to refer to his approach to normative ethics. Steven Pinker, Sam Harris, and Peter Singer believe that we learn what is right and wrong through reason and empirical methodology.

Maria Ossowska used the methods of science to understand the origins of moral norms.

Maria Ossowska thought that sociology was inextricably related to philosophical reflections on morality, including normative ethics. She proposed that science analyse: (a) existing social norms and their history, (b) the psychology of morality, and the way that individuals interact with moral matters and prescriptions, and (c) the sociology of morality.

The theory and methods of a normative science of morality are explicitly discussed in Joseph Daleiden's The Science of Morality: The Individual, Community, and Future Generations (1998). Daleiden's book, in contrast to Harris's, extensively discusses the relevant philosophical literature. In The Moral Landscape: How Science Can Determine Human Values, Sam Harris's goal is to show how moral truth can be backed by "science" – more specifically, by empirical knowledge, critical thinking, philosophy, and, most controversially, the scientific method.

Patricia Churchland argues that, even accepting David Hume's is–ought problem, the use of induction from premises and definitions remains a valid way of reasoning in life and science:

Our moral behavior, while more complex than the social behavior of other animals, is similar in that it represents our attempt to manage well in the existing social ecology. ... from the perspective of neuroscience and brain evolution, the routine rejection of scientific approaches to moral behavior based on Hume's warning against deriving ought from is seems unfortunate, especially as the warning is limited to deductive inferences. ... The truth seems to be that values rooted in the circuitry for caring—for well-being of self, offspring, mates, kin, and others—shape social reasoning about many issues: conflict resolutions, keeping the peace, defense, trade, resource distribution, and many other aspects of social life in all its vast richness.

Daleiden and Leonard Carmichael warn that science is probabilistic, and that certainty is not possible. One should therefore expect that moral prescriptions will change as humans gain understanding.

In futurism

Transhumanist philosophers such as David Pearce and Mark Alan Walker have extensively discussed the ethical implications of future technologies. Walker coined the term "biohappiness" to describe the idea of directly manipulating the biological roots of happiness in order to increase it. Pearce argues that suffering could eventually be eradicated entirely, stating that: "It is predicted that the world's last unpleasant experience will be a precisely dateable event." Proposed technological methods of overcoming the hedonic treadmill include wireheading (direct brain stimulation for uniform bliss), which undermines motivation and evolutionary fitness; designer drugs, offering sustainable well-being without side effects, though impractical for lifelong reliance; and genetic engineering, the most promising approach. Genetic recalibration through hyperthymia-promoting genes could raise hedonic set-points, fostering adaptive well-being, creativity, and productivity while maintaining responsiveness to stimuli. While scientifically achievable, this transformation requires careful ethical and societal considerations to navigate its profound implications.

On the opposite end of the spectrum, risks of astronomical suffering are possible futures in which vastly more suffering exists than has ever existed in Earth's history. Possible sources of these risks include artificial superintelligence, genetic engineering for maximum suffering, space colonization, and terraforming leading to an increase in wild-animal suffering.

Views in scientific morality

Training to promote good behaviour

The science of morality may aim to discover the best ways to motivate and shape individuals. Methods to accomplish this include instilling explicit virtues, building character strengths, and forming mental associations. These generally require some level of practical reason. James Rest suggested that abstract reasoning is also a factor in making moral judgements and emphasized that moral judgements alone do not predict moral behaviour: “Moral judgement may be closely related to advocacy behaviour, which in turn influences social institutions, which in turn creates a system of norms and sanctions that influences people’s behaviour.” Daleiden suggested that religions instill a practical sense of virtue and justice, right and wrong. They also effectively use art and myths to educate people about moral situations.

Role of government

Harris argues that moral science does not imply an "Orwellian future" with "scientists at every door". Instead, Harris imagines data about normative moral issues being shared in the same way as other sciences (e.g. peer-reviewed journals on medicine).

Daleiden specifies that government, like any organization, should have limited power. He says "centralization of power irrevocably in the hands of one person or an elite has always ultimately led to great evil for the human race. It was the novel experiment of democracy—a clear break with tradition—that ended the long tradition of tyranny.” He is also explicit that government should only use law to enforce the most basic, reasonable, proven and widely supported moral norms. In other words, there are a great many moral norms that should never be the task of the government to enforce.

Role of punishment

One author has argued that, to attain a society where people are motivated by conditioned self-interest, punishment must go hand-in-hand with reward. In this line of reasoning, prison remains necessary for many perpetrators of crimes even if libertarian free will is false, because punishment can still serve its purposes: it deters others from committing their own crimes, educates and reminds everyone about what the society stands for, incapacitates the criminal from doing more harm, goes some way toward relieving or repaying the victim, and corrects the criminal (see also recidivism). This author argues that any prison system should at least be pursuing those goals, and that it is an empirical question which sorts of punishment realize these goals most effectively, and how well various prison systems actually serve these purposes.

Research

The brain areas that are consistently involved when humans reason about moral issues have been investigated. The neural network underlying moral decisions overlaps with the network for representing others' intentions (i.e., theory of mind) and the network for representing others' (vicariously experienced) emotional states (i.e., empathy). This supports the notion that moral reasoning is related both to seeing things from other persons' points of view and to grasping others' feelings. These results provide evidence that the neural network underlying moral decisions is probably domain-global (i.e., there might be no such thing as a "moral module" in the human brain) and might be dissociable into cognitive and affective sub-systems.

An essential, shared component of moral judgment involves the capacity to detect morally salient content within a given social context. Recent research has implicated the salience network in this initial detection of moral content. The salience network responds to behaviourally salient events and may be critical for modulating downstream default and frontal control network interactions in the service of complex moral reasoning and decision-making processes. This suggests that moral cognition involves both bottom-up and top-down attentional processes, mediated by discrete large-scale brain networks and their interactions.

In universities

Moral sciences is offered at the degree level at Ghent University (as "an integrated empirical and philosophical study of values, norms and world views").

Other implications

Daleiden provides examples of how science can use empirical evidence to assess the effect that specific behaviours can have on the well-being of individuals and society with regard to various moral issues. He argues that science supports the decriminalization and regulation of drugs, euthanasia under some circumstances, and the permission of sexual behaviours that are not tolerated in some cultures (he cites homosexuality as an example). Daleiden further argues that in seeking to reduce human suffering, abortion should not only be permissible, but at times a moral obligation (as in the case of a mother of a potential child who would face the probability of much suffering). Like all moral claims in his book, however, these decisions remain grounded in, and contingent on, empirical evidence.

The ideas of cultural relativity, to Daleiden, do offer some lessons: investigators must be careful not to judge a person's behaviour without understanding its environmental context. An action may prove necessary, and more moral, once the circumstances are understood. However, Daleiden emphasizes that this does not mean all ethical norms or systems are equally effective at promoting flourishing, and he cites the equal treatment of women as an example of a reliably superior norm, wherever it is practiced.

Criticisms

The idea of a normative science of morality has met with many criticisms from scientists and philosophers. Critics include physicist Sean M. Carroll, who argues that morality cannot be part of science. He and other critics cite the widely held "fact-value distinction", holding that the scientific method cannot answer "moral" questions, although it can describe the norms of different cultures. In contrast, moral scientists defend the position that such a division between values and scientific facts ("moral relativism") is not only arbitrary and illusory, but also impedes progress toward taking action against documented cases of human rights violations in different cultures.

Stephen Jay Gould argued that science and religion occupy "non-overlapping magisteria". To Gould, science is concerned with questions of fact and theory, but not with meaning and morality – the magisteria of religion. In the same vein, Edward Teller proposed that politics decides what is right, whereas science decides what is true.

During a discussion on the role that naturalism might play in professions like nursing, the philosopher Trevor Hussey calls the popular view that science is unconcerned with morality "too simplistic". Although his main focus in the paper is naturalism in nursing, he goes on to explain that science can, at very least, be interested in morality at a descriptive level. He even briefly entertains the idea that morality could itself be a scientific subject, writing that one might argue "... that moral judgements are subject to the same kinds of rational, empirical examination as the rest of the world: they are a subject for science – although a difficult one. If this could be shown to be so, morality would be contained within naturalism. However, I will not assume the truth of moral realism here."

Stream of consciousness (psychology)

From Wikipedia, the free encyclopedia

The metaphor "stream of consciousness" suggests how thoughts seem to flow through the conscious mind. Research studies have shown that humans only experience one mental event at a time, as a fast-moving mind-stream. The full range of thoughts one can be aware of forms the content of this "stream".

The term was coined by Alexander Bain in 1855, when he wrote in The Senses and the Intellect, "The concurrence of Sensations in one common stream of consciousness (on the same cerebral highway) enables those of different senses to be associated as readily as the sensations of the same sense". However, the term is commonly credited instead to the man who popularized it: William James, often considered the father of American psychology, who used it in 1890 in The Principles of Psychology.

Buddhism

Early Buddhist scriptures describe the "stream of consciousness" (Pali: viññāna-sota), where it is referred to as the mindstream. The practice of mindfulness, which is about being aware moment-to-moment of one's subjective conscious experience, aids one in directly experiencing the "stream of consciousness" and in gradually cultivating self-knowledge and wisdom.

Buddhist teachings describe the continuous flow of the "stream of mental and material events" as including sensory experiences (i.e., seeing, hearing, smelling, tasting, touch sensations, or a thought relating to the past, present or future) as well as various mental events that are generated, namely, feelings, perceptions and intentions/behaviour. These mental events are also described as being influenced by other factors such as attachments and past conditioning. Further, the moment-by-moment manifestation of the "stream of consciousness" is described as being affected by physical laws, biological laws, psychological laws, volitional laws, and universal laws.

Proponents

In his lectures circa 1838–1839 Sir William Hamilton, 9th Baronet described "thought" as "a series of acts indissolubly connected"; this comes about because of what he asserted was a fourth "law of thought" known as the "law of reason and consequent":

"The logical significance of the law of Reason and Consequent lies in this, – That in virtue of it, thought is constituted into a series of acts all indissolubly connected; each necessarily inferring the other" (Hamilton 1860:61-62).

In this context the words "necessarily infer" are synonymous with "imply". In further discussion Hamilton identified "the law" with modus ponens; thus the act of "necessarily infer" detaches the consequent for purposes of becoming the (next) antecedent in a "chain" of connected inferences.

William James asserts the notion as follows:

"Consciousness, then, does not appear to itself chopped up in bits. Such words as 'chain' or 'train' do not describe it fitly as it presents itself in the first instance. It is nothing jointed; it flows. A 'river' or a 'stream' are the metaphors by which it is most naturally described. In talking of it hereafter let us call it the stream of thought, of consciousness, or of subjective life." (James 1890:239)

He was enormously skeptical about using introspection as a technique to understand the stream of consciousness. "The attempt at introspective analysis in these cases is in fact like seizing a spinning top to catch its motion, or trying to turn up the gas quickly enough to see how the darkness looks." However, the epistemological separation of two levels of analyses appears to be important in order to systematically understand the "stream of consciousness."

Bernard Baars has developed Global Workspace Theory which bears some resemblance to stream of consciousness.

Conceptually understanding what is meant by the "present moment," "the past" and "the future" can aid one to systematically understand the "stream of consciousness."

Criticism

Susan Blackmore challenged the concept of a stream of consciousness: "When I say that consciousness is an illusion I do not mean that consciousness does not exist. I mean that consciousness is not what it appears to be. If it seems to be a continuous stream of rich and detailed experiences, happening one after the other to a conscious person, this is the illusion." However, she also says that a good way to observe the "stream of consciousness" may be to calm the mind in meditation. Her criticism concerns the stream of perceptual data from the senses rather than consciousness itself, and it leaves unexplained why anything is conscious at all. Suggestions have also been made regarding the importance of separating "two levels of analyses" when attempting to understand the "stream of consciousness".

Baars is in agreement with these points. The continuity of the "stream of consciousness" may in fact be illusory, just as the continuity of a movie is illusory. Nevertheless, the seriality of mutually incompatible conscious events is well supported by objective research over some two centuries of experimental work. A simple illustration would be to try to be conscious of two interpretations of an ambiguous figure or word at the same time. When timing is precisely controlled, as in the case of the audio and video tracks of the same movie, seriality appears to be compulsory for potentially conscious events presented within the same 100 ms interval.

J. W. Dalton has criticized the global workspace theory on the grounds that it provides, at best, an account of the cognitive function of consciousness, and fails even to address the deeper problem of its nature, of what consciousness is, and of how any mental process whatsoever can be conscious: the so-called "hard problem of consciousness". Avshalom Elitzur has argued, however, that "while this hypothesis does not address the 'hard problem', namely, the very nature of consciousness, it constrains any theory that attempts to do so and provides important insights into the relation between consciousness and cognition" – insofar as any theory of consciousness is constrained by the brain's natural perceptual limitations.

New work by Richard Robinson shows promise in establishing the brain functions involved in this model and may help shed light on how we understand signs or symbols and reference these to our semiotic registers.

Daniel Kolak has argued extensively against the existence of a stream of consciousness, in the sense of someone having a continuous identity over time, in his book I Am You. Kolak describes three opposing philosophical views regarding the continuity of consciousness: closed individualism, empty individualism, and open individualism. Closed individualism is the default common-sense view of identity, in which one's identity consists of a line stretching across time – a view Kolak argues is incoherent. Empty individualism is the view that one's identity exists only for an infinitesimally small amount of time, so that a person has an entirely different identity from moment to moment. Kolak instead advocates open individualism, the view that everyone is in reality the same being and that the "self" doesn't actually exist at all, similar to anattā in Buddhist philosophy.

Derek Parfit is another philosopher who has challenged the idea of the existence of a continuous stream of consciousness over time. In his book Reasons and Persons, Parfit describes the teletransportation paradox thought experiment, which describes the difficulties in distinguishing one's future self from an entity that is merely a copy of oneself.

Literary technique

In literature, stream of consciousness writing is a literary device which seeks to portray an individual's point of view by giving the written equivalent of the character's thought processes, either in a loose interior monologue, or in connection to his or her sensory reactions to external occurrences. Stream-of-consciousness as a narrative device is strongly associated with the modernist movement. The term was first applied in a literary context, transferred from psychology, in The Egoist, April 1918, by May Sinclair, in relation to the early volumes of Dorothy Richardson's novel sequence Pilgrimage. Amongst other modernist novelists who used it are James Joyce in Ulysses (1922) and William Faulkner in The Sound and the Fury (1929).

Inner space

In science fiction, inner space refers to works of psychological science fiction that emphasize internal, mental, and emotional experiences over external adventure or technological speculation. It has been defined as "a category introduced to science fiction by representatives of the New Wave to designate internal, mental experiences as imaginary worlds with no connection to the real world", and it contrasts with traditional science fiction's fascination with outer space.

Works from this genre appeared as part of the emergence of the New Wave in science fiction in the 1960s and were popularized by English writer J. G. Ballard. Subsequent contributions by critics and writers such as Michael Moorcock, Pat Cadigan, and Greg Bear helped establish inner space as a recurring theme in science fiction discourse.

Rob Mayo wrote that the 1980s were a second "golden age" of inner space, associated with writers such as Pat Cadigan and Greg Bear; he also notes the movie Dreamscape (1984), which he calls "the first inner space film". The genre returned once again in the 2000s, with the movies The Cell (2000) and Inception (2010), as well as the video game Psychonauts (2005). He argues that Inception marked "the transition of inner space fiction from a marginal genre (SF literature) to a viable mainstream (Hollywood cinema)".

Transgenerational epigenetic inheritance

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Transgenerational_epigenetic_inheritance   ...