
Friday, August 9, 2019

Neuropsychology

From Wikipedia, the free encyclopedia

Neuropsychology is the study and characterization of the behavioral modifications that follow a neurological trauma or condition. It is both an experimental and clinical field of psychology that aims to understand how behavior and cognition are influenced by brain functioning and is concerned with the diagnosis and treatment of behavioral and cognitive effects of neurological disorders. Whereas classical neurology focuses on the pathology of the nervous system and classical psychology is largely divorced from it, neuropsychology seeks to discover how the brain correlates with the mind through the study of neurological patients. It thus shares concepts and concerns with neuropsychiatry and with behavioral neurology in general. The term neuropsychology has been applied to lesion studies in humans and animals. It has also been applied in efforts to record electrical activity from individual cells (or groups of cells) in higher primates (including some studies of human patients).

In practice, neuropsychologists tend to work in research settings (universities, laboratories or research institutions), clinical settings (medical hospitals or rehabilitation settings, often involved in assessing or treating patients with neuropsychological problems), or forensic settings or industry (often as clinical-trial consultants where CNS function is a concern).

History

Neuropsychology is a relatively new discipline within the field of psychology. The first textbook defining the field, Fundamentals of Human Neuropsychology, was initially published by Kolb and Whishaw in 1980. However, the history of its development can be traced back to the Third Dynasty in ancient Egypt, perhaps even earlier. There is much debate as to when societies started considering the functions of different organs. For many centuries, the brain was thought to be useless and was often discarded during burial processes and autopsies. As the field of medicine developed its understanding of human anatomy and physiology, different theories were developed as to why the body functioned the way it did. Bodily functions were often approached from a religious point of view, with abnormalities blamed on bad spirits and the gods. The brain has not always been considered the center of the functioning body. It has taken hundreds of years to develop our understanding of the brain and how it affects our behaviors.

Ancient Egypt

In ancient Egypt, writings on medicine date from the time of the priest Imhotep. These writings took a more scientific approach to medicine and disease, describing the brain, trauma, abnormalities, and remedies for the reference of future physicians. Despite this, Egyptians saw the heart, not the brain, as the seat of the soul.

Aristotle

In Aristotle's biology, the senses, perception, memory, dreams, and action form a connected system: impressions are stored in the seat of perception and linked by his Laws of Association (similarity, contrast, and contiguity).
 
Aristotle reinforced this focus on the heart, which originated in Egypt. He believed the heart to be in control of mental processes, and looked on the brain, due to its inert nature, as a mechanism for cooling the heat generated by the heart. He drew his conclusions from the empirical study of animals: their brains were cold to the touch, and contact with them did not trigger any movements, whereas the heart was warm and active, accelerating and slowing depending on mood. Such beliefs were upheld by many for years to come, persisting through the Middle Ages and the Renaissance until they began to falter in the 17th century in the face of further research. Aristotle's influence on the development of neuropsychology is evident in modern-day language: we "follow our hearts" and "learn by heart."

Hippocrates

Hippocrates viewed the brain as the seat of the soul. He drew a connection between the brain and behaviors of the body, writing: "The brain exercises the greatest power in the man." Apart from moving the focus from the heart as the "seat of the soul" to the brain, Hippocrates did not go into much detail about its actual functioning. However, by switching the attention of the medical community to the brain, his theory led to more scientific discovery of the organ responsible for our behaviors. For years to come, scientists were inspired to explore the functions of the body and to find concrete explanations for both normal and abnormal behaviors. Scientific discovery led them to believe that there were natural, organically occurring reasons to explain various functions of the body, and that these could all be traced back to the brain. Hippocrates also introduced the concept of the mind, which came to be widely seen as a function separate from the brain organ itself.

René Descartes

Philosopher René Descartes expanded upon this idea and is most widely known for his work on the mind-body problem. Descartes's ideas were often looked upon as overly philosophical and lacking in scientific foundation. He focused much of his anatomical experimentation on the brain, paying special attention to the pineal gland, which he argued was the actual "seat of the soul." Still deeply rooted in a spiritual outlook on the scientific world, he held that the body was mortal and the soul immortal, and that the pineal gland was the very place at which the mind interacted with the mortal, machine-like body. Descartes was convinced the mind had control over the behaviors of the body, but also that the body could have influence over the mind, a position referred to as dualism. This idea that the mind essentially controlled the body, while the body could resist or even influence other behaviors, was a major turning point in how many physiologists would look at the brain. The capabilities of the mind were observed to go far beyond simple reaction: it was rational and functioned in organized, thoughtful ways, far more complex, he believed, than anything in the animal world. These ideas, although disregarded by many and cast aside for years, led the medical community to expand its own ideas of the brain, to understand in new ways just how intricate its workings really were and how completely it affected daily life, and to consider which treatments would be most beneficial to people living with a dysfunctional mind. The mind-body problem, spurred by Descartes, continues to this day with many philosophical arguments both for and against his ideas. However controversial they were and remain, the fresh and well-thought-out perspective Descartes presented has had long-lasting effects on medicine, psychology and other disciplines, especially in its emphasis on separating the mind from the body in order to explain observable behaviors.

Thomas Willis

It was in the mid-17th century that another major contributor to the field of neuropsychology emerged. Thomas Willis studied at Oxford University and took a physiological approach to the brain and behavior. It was Willis who coined the words 'hemisphere' and 'lobe' when referring to the brain, and he was one of the earliest to use the words 'neurology' and 'psychology'. Rejecting the idea that humans were the only beings capable of rational thought, Willis looked at specialized structures of the brain. He theorized that higher structures accounted for complex functions, whereas lower structures were responsible for functions similar to those seen in other animals, consisting mostly of reactions and automatic responses. He was particularly interested in people who suffered from manic disorders and hysteria, and his research was among the first in which psychiatry and neurology came together to study individuals. Through his in-depth study of the brain and behavior, Willis concluded that automated responses such as breathing, heartbeat and various other motor activities were carried out within the lower region of the brain. Although much of his work has since been made obsolete, his ideas presented the brain as more complex than previously imagined, and led the way for future pioneers to understand and build upon his theories, especially when it came to looking at disorders and dysfunctions of the brain.

Franz Joseph Gall

Neuroanatomist and physiologist Franz Joseph Gall made major progress in understanding the brain. He theorized that personality was directly related to features and structures within the brain. However, Gall's major contribution to the field of neuroscience was his invention of phrenology. This new discipline looked at the brain as an organ of the mind, where the shape of the skull could ultimately determine one's intelligence and personality. The theory was like many circulating at the time, when many scientists were relating physical features of the face and body, head size, and anatomical structure to levels of intelligence; Gall, however, looked primarily at the brain. There was much debate over the validity of Gall's claims, however, because he was often found to be wrong in his predictions. He was once sent a cast of René Descartes' skull, and through his method of phrenology claimed that the subject must have had a limited capacity for reasoning and higher cognition. As controversial and false as many of Gall's claims were, his contributions to understanding cortical regions of the brain and localized activity continued to advance understanding of the brain, personality, and behavior. His work is considered crucial to laying a firm foundation for the field of neuropsychology, which would flourish over the next few decades.

Jean-Baptiste Bouillaud

Towards the late 19th century, the belief that the size of one's skull could determine one's level of intelligence was discarded as science and medicine moved forward. A physician by the name of Jean-Baptiste Bouillaud expanded upon the ideas of Gall and took a closer look at the idea that distinct cortical regions of the brain each have their own independent function. Bouillaud was specifically interested in speech, and wrote many publications on the anterior region of the brain being responsible for carrying out the act of speech, a discovery that stemmed from the research of Gall. He was also one of the first to use larger samples for research, although it took many years for that method to be accepted. By looking at over a hundred different case studies, Bouillaud came to discover that speech is produced and understood through different areas of the brain, and observations of people with brain damage made his theory more concrete. Bouillaud, along with many other pioneers of the time, made great advances within the field of neurology, especially when it came to localization of function. There is much debate as to who deserves the most credit for such discoveries, and people often go unmentioned, but Paul Broca is perhaps one of the most famous and well-known contributors to neuropsychology – often referred to as "the father" of the discipline.

Paul Broca

Inspired by the advances being made in the area of localized function within the brain, Paul Broca committed much of his study to the phenomenon of how speech is understood and produced. Through his work it was discovered, and later expanded upon, that we articulate speech via the left hemisphere. Broca's observations and methods are widely considered to be where neuropsychology really took form as a recognizable and respected discipline. Armed with the understanding that specific, independent areas of the brain are responsible for the articulation and understanding of speech, the brain was finally being acknowledged as the complex and highly intricate organ that it is. Broca was essentially the first to fully break away from the ideas of phrenology and delve deeper into a more scientific and psychological view of the brain.

Karl Spencer Lashley

Lashley's works and theories are summarized in his book Brain Mechanisms and Intelligence. His theory of the engram was the driving force for much of his research. An engram was believed to be the part of the brain where a specific memory was stored. Lashley continued to use the training/ablation method that Franz had taught him: he would train a rat to learn a maze and then make systematic lesions, removing sections of cortical tissue, to see whether the rat forgot what it had learned.

Through his research with the rats, he learned that forgetting depended on the amount of tissue removed and not on where it was removed from. He called this mass action, and he believed it was a general rule governing how brain tissue would respond, independent of the type of learning. We now know that mass action was a misinterpretation of his empirical results: running a maze requires multiple cortical areas, so small individual lesions alone do not impair the rats' performance much, whereas removing large sections takes out multiple cortical areas at once, affecting functions such as sight, motor coordination and memory and leaving the animal unable to run the maze properly.

Lashley also proposed that a portion of a functional area could carry out the role of the entire area, even when the rest of the area has been removed. He called this phenomenon equipotentiality. We know now that he was seeing evidence of plasticity in the brain. The brain has the spectacular ability for certain areas to take over the functions of other areas if those areas should fail or be removed, although not to the extent initially argued by Lashley.

Approaches

Experimental neuropsychology is an approach that uses methods from experimental psychology to uncover the relationship between the nervous system and cognitive function. The majority of work involves studying healthy humans in a laboratory setting, although a minority of researchers may conduct animal experiments. Human work in this area often takes advantage of specific features of our nervous system (for example that visual information presented to a specific visual field is preferentially processed by the cortical hemisphere on the opposite side) to make links between neuroanatomy and psychological function.
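
To make the divided visual field logic concrete, here is a minimal sketch with entirely synthetic data; the reaction times, effect size, and choice of test are my own illustration, not results from any study. Words flashed to the right visual field project first to the left hemisphere (typically language-dominant), so word-recognition responses there are often slightly faster.

```python
# Synthetic illustration only: all reaction times below are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_trials = 40

# Hypothetical word-recognition reaction times (ms) for one participant.
# Right-visual-field words reach the left hemisphere directly; left-visual-
# field words are assumed to need an extra interhemispheric transfer delay.
rt_right_field = rng.normal(loc=520, scale=40, size=n_trials)
rt_left_field = rng.normal(loc=545, scale=40, size=n_trials)

t, p = stats.ttest_ind(rt_right_field, rt_left_field)
print(f"mean RT, right field: {rt_right_field.mean():.0f} ms")
print(f"mean RT, left field:  {rt_left_field.mean():.0f} ms")
print(f"t = {t:.2f}, p = {p:.4f}")  # a reliable field advantage suggests lateralization
```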

Clinical neuropsychology is the application of neuropsychological knowledge to the assessment (see neuropsychological test and neuropsychological assessment), management, and rehabilitation of people who have suffered illness or injury (particularly to the brain) which has caused neurocognitive problems. In particular, clinical neuropsychologists bring a psychological viewpoint to treatment, to understand how such illness and injury may affect and be affected by psychological factors. They can also offer an opinion as to whether a person's difficulties reflect brain pathology, an emotional or other (potentially) reversible cause, or both. For example, a test might show that both patients X and Y are unable to name items that they have been exposed to within the past 20 minutes (indicating possible dementia). If patient Y can name some of them with further prompting (e.g. given a categorical clue such as being told that the item they could not name is a fruit), this allows a more specific diagnosis than simply dementia (Y appears to have the vascular type, which is due to brain pathology but is usually at least somewhat reversible). Clinical neuropsychologists often work in hospital settings in an interdisciplinary medical team; others work in private practice and may provide expert input into medico-legal proceedings.

Cognitive neuropsychology is a relatively new development and has emerged as a distillation of the complementary approaches of both experimental and clinical neuropsychology. It seeks to understand the mind and brain by studying people who have suffered brain injury or neurological illness. One model of neuropsychological functioning is known as functional localization. This is based on the principle that if a specific cognitive problem can be found after an injury to a specific area of the brain, it is possible that this part of the brain is in some way involved. However, there may be reason to believe that the link between mental functions and neural regions is not so simple. An alternative model of the link between mind and brain, such as parallel processing, may have more explanatory power for the workings and dysfunction of the human brain. Yet another approach investigates how the pattern of errors produced by brain-damaged individuals can constrain our understanding of mental representations and processes without reference to the underlying neural structure. A more recent but related approach is cognitive neuropsychiatry which seeks to understand the normal function of mind and brain by studying psychiatric or mental illness.

Connectionism is the use of artificial neural networks to model specific cognitive processes using what are considered to be simplified but plausible models of how neurons operate. Once trained to perform a specific cognitive task, these networks are often damaged or 'lesioned' to simulate brain injury or impairment, and the results are compared with the effects of brain injury in humans.
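
As a hedged illustration of the lesioning idea, the sketch below trains a tiny feed-forward network on the XOR task with plain NumPy and then zeroes out growing fractions of its first-layer weights. The task, architecture, and parameters are arbitrary choices of mine, not a model from the literature.

```python
# Toy connectionist "lesion" experiment: train a small network, then damage it.
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 units, trained by full-batch gradient descent.
W1 = rng.normal(scale=1.0, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=1.0, size=(8, 1)); b2 = np.zeros(1)
lr = 1.0
for _ in range(10000):
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    d2 = (out - y) * out * (1 - out)          # squared-error gradient at output
    d1 = (d2 @ W2.T) * h * (1 - h)            # backpropagated to hidden layer
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(0)

def accuracy(W1_, b1_, W2_, b2_):
    pred = sigmoid(sigmoid(X @ W1_ + b1_) @ W2_ + b2_) > 0.5
    return (pred == (y > 0.5)).mean()

print("intact accuracy:", accuracy(W1, b1, W2, b2))

# "Lesion": zero a random fraction of first-layer weights and re-test,
# loosely mimicking graded damage in connectionist injury simulations.
for frac in (0.25, 0.5, 0.75):
    W1_lesioned = W1.copy()
    W1_lesioned[rng.random(W1.shape) < frac] = 0.0
    print(f"lesion {frac:.0%} of hidden weights -> accuracy:",
          accuracy(W1_lesioned, b1, W2, b2))
```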

Functional neuroimaging uses specific neuroimaging technologies to take readings from the brain, usually when a person is doing a particular task, in an attempt to understand how the activation of particular brain areas is related to the task. In particular, the growth of methodologies to employ cognitive testing within established functional magnetic resonance imaging (fMRI) techniques to study brain-behavior relations is having a notable influence on neuropsychological research.

In practice these approaches are not mutually exclusive and most neuropsychologists select the best approach or approaches for the task to be completed.

Methods and tools

Standardized neuropsychological tests 
These tasks have been designed so that performance on the task can be linked to specific neurocognitive processes. These tests are typically standardized, meaning that they have been administered to a specific group (or groups) of individuals before being used in individual clinical cases. The data resulting from standardization are known as normative data. After these data have been collected and analyzed, they are used as the comparative standard against which individual performances can be compared. Examples of neuropsychological tests include: the Wechsler Memory Scale (WMS), the Wechsler Adult Intelligence Scale (WAIS), the Boston Naming Test, the Wisconsin Card Sorting Test, the Benton Visual Retention Test, and the Controlled Oral Word Association Test.
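
A brief sketch of how normative data are used in scoring: a raw score is converted to a z-score and percentile relative to the normative sample. The mean, standard deviation, and cutoff below are invented for illustration and are not real norms for any of the tests named above.

```python
# Hypothetical normative comparison; the norms here are made up.
from statistics import NormalDist

def score_vs_norms(raw_score: float, norm_mean: float, norm_sd: float):
    """Return (z-score, percentile) for a raw score against normative data."""
    z = (raw_score - norm_mean) / norm_sd
    percentile = NormalDist().cdf(z) * 100  # assumes normally distributed norms
    return z, percentile

# Hypothetical patient score against hypothetical norms (mean 100, SD 15).
z, pct = score_vs_norms(raw_score=82, norm_mean=100, norm_sd=15)
print(f"z = {z:.2f}, percentile = {pct:.1f}")
# Conventionally, z-scores below roughly -1.5 to -2 are flagged for review.
```
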
Brain scans 
The use of brain scans to investigate the structure or function of the brain is common, either simply as a way of better assessing brain injury with high-resolution pictures, or by examining the relative activations of different brain areas. Such technologies include fMRI (functional magnetic resonance imaging) and positron emission tomography (PET), which yield functional data, as well as MRI (magnetic resonance imaging) and computed axial tomography (CAT or CT), which yield structural data.
Global Brain Project 
Brain models based on the mouse and monkey have been developed from theoretical neuroscience work on working memory and attention, mapping brain activity using time constants validated by measurements of neuronal activity in various layers of the brain. These methods also map onto decision states of behavior in simple tasks that involve binary outcomes.
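
As a hedged, self-contained illustration of modeling binary-outcome decisions (my own toy example, not the project's actual model): a drift-diffusion process accumulates noisy evidence over time, governed by a time constant, until it reaches one of two decision bounds.

```python
# Toy drift-diffusion model of a binary decision; all parameters are invented.
import numpy as np

rng = np.random.default_rng(3)

def ddm_trial(drift=0.3, bound=1.0, dt=0.001, noise=1.0, max_t=5.0):
    """Simulate one decision: returns (choice, reaction time in seconds)."""
    x, t = 0.0, 0.0
    while abs(x) < bound and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()  # noisy evidence
        t += dt
    return (1 if x >= bound else 0), t

trials = [ddm_trial() for _ in range(500)]
choices = np.array([c for c, _ in trials])
rts = np.array([t for _, t in trials])
print(f"P(choice=1) = {choices.mean():.2f}, mean RT = {rts.mean():.2f} s")
```
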
Electrophysiology 
The use of electrophysiological measures designed to gauge the activation of the brain by recording the electrical or magnetic field produced by the nervous system. These may include electroencephalography (EEG) or magnetoencephalography (MEG).
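
A minimal sketch of the kind of summary such recordings yield, using a synthetic signal; the sampling rate and band edges follow common conventions but are my own choices here.

```python
# Estimate power in classic EEG frequency bands from a synthetic signal.
import numpy as np

fs = 256                      # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)  # 10 seconds of data
# Fake EEG: a 10 Hz alpha rhythm plus noise.
signal = (np.sin(2 * np.pi * 10 * t)
          + 0.5 * np.random.default_rng(2).normal(size=t.size))

freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
power = np.abs(np.fft.rfft(signal)) ** 2 / signal.size

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
for name, (lo, hi) in bands.items():
    band_power = power[(freqs >= lo) & (freqs < hi)].sum()
    print(f"{name:5s} ({lo}-{hi} Hz): {band_power:.1f}")
# The alpha band should dominate, matching the injected 10 Hz rhythm.
```
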
Experimental tasks 
The use of designed experimental tasks, often controlled by computer and typically measuring reaction time and accuracy on a particular task thought to be related to a specific neurocognitive process. Examples include the Cambridge Neuropsychological Test Automated Battery (CANTAB) and CNS Vital Signs (CNSVS).
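
For illustration only, here is a toy console version of such a task. Real batteries such as CANTAB use calibrated displays and response hardware; the stimulus-key mapping and trial counts below are arbitrary assumptions, and console input includes typing time.

```python
# Toy reaction-time/accuracy task; a sketch of the logic, not a real battery.
import random
import time

stimuli = {"LEFT": "a", "RIGHT": "l"}  # stimulus -> expected key (assumed mapping)
results = []

for trial in range(5):
    time.sleep(random.uniform(1.0, 3.0))  # random foreperiod before the stimulus
    stimulus, expected = random.choice(list(stimuli.items()))
    start = time.perf_counter()
    response = input(f"{stimulus}! press '{expected}' then Enter: ").strip().lower()
    rt = time.perf_counter() - start      # crude RT; includes typing time
    results.append((rt, response == expected))

mean_rt = sum(rt for rt, _ in results) / len(results)
accuracy = sum(ok for _, ok in results) / len(results)
print(f"mean RT: {mean_rt * 1000:.0f} ms, accuracy: {accuracy:.0%}")
```
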
Software products 
Researchers have mapped software use to neural activity in the brain through brain scans such as MRI. Applications based on neuropsychology are being used to influence behavior design and habit formation. An example of such a product is fooya, a mobile app for children that has been shown in randomized controlled studies to influence dietary preferences among children.

Thursday, August 8, 2019

Cyberterrorism

From Wikipedia, the free encyclopedia
 
Cyberterrorism is the use of the Internet to conduct violent acts that result in, or threaten, loss of life or significant bodily harm, in order to achieve political or ideological gains through threat or intimidation. It is also sometimes considered an act of Internet terrorism in which terrorist activities include deliberate, large-scale disruption of computer networks, especially of personal computers attached to the Internet, by means of tools such as computer viruses, computer worms, phishing, and other malicious software and hardware methods and programming scripts.

Cyberterrorism is a controversial term. Some authors opt for a very narrow definition, relating to the deployment, by known terrorist organizations, of disruption attacks against information systems for the primary purpose of creating alarm, panic, or physical disruption. Other authors prefer a broader definition, which includes cybercrime. Participation in a cyberattack affects the perceived terror threat even when it is not carried out with a violent approach. By some definitions, it might be difficult to distinguish which instances of online activity are cyberterrorism and which are cybercrime.

Cyberterrorism can also be defined as the intentional use of computers, networks, and the public internet to cause destruction and harm for personal objectives. Experienced cyberterrorists, who are very skilled in terms of hacking, can cause massive damage to government systems, hospital records, and national security programs, which might leave a country, community or organization in turmoil and in fear of further attacks. The objectives of such terrorists may be political or ideological, since this can be considered a form of terror.

There is much concern from government and media sources about potential damage that could be caused by cyberterrorism, and this has prompted efforts by government agencies such as the Federal Bureau of Investigation (FBI) and the Central Intelligence Agency (CIA) to put an end to cyber attacks and cyberterrorism.

There have been several major and minor instances of cyberterrorism. Al-Qaeda utilized the internet to communicate with supporters and even to recruit new members.[5] Estonia, a Baltic country that is constantly evolving in terms of technology, became a battleground for cyberterror in April 2007, after disputes regarding the removal of a Soviet-era WWII statue located in Estonia's capital, Tallinn.

Overview

There is debate over the basic definition of the scope of cyberterrorism. These definitions can be narrow, such as the use of the Internet to attack other systems on the Internet in a way that results in violence against persons or property. They can also be broad, including any form of Internet usage by terrorists as well as conventional attacks on information technology infrastructures. There is variation in qualification by motivation, targets, methods, and the centrality of computer use in the act. U.S. government agencies also use varying definitions, and none of these has so far attempted to introduce a standard that is binding outside of its own sphere of influence.

Depending on context, cyberterrorism may overlap considerably with cybercrime, cyberwar or ordinary terrorism. Eugene Kaspersky, founder of Kaspersky Lab, now feels that "cyberterrorism" is a more accurate term than "cyberwar". He states that "with today's attacks, you are clueless about who did it or when they will strike again. It's not cyber-war, but cyberterrorism." He also equates large-scale cyber weapons, such as the Flame Virus and NetTraveler Virus which his company discovered, to biological weapons, claiming that in an interconnected world, they have the potential to be equally destructive.

If cyberterrorism is treated similarly to traditional terrorism, then it only includes attacks that threaten property or lives, and can be defined as the leveraging of a target's computers and information, particularly via the Internet, to cause physical, real-world harm or severe disruption of infrastructure.

Many academics and researchers who specialize in terrorism studies suggest that cyberterrorism does not exist and is really a matter of hacking or information warfare. They disagree with labeling it terrorism because it is unlikely that electronic means could create fear, significant physical harm, or death in a population, considering current attack and protective technologies.

If death or physical damage that could cause human harm is considered a necessary part of the cyberterrorism definition, then there have been few identifiable incidents of cyberterrorism, although there has been much policy research and public concern. Modern terrorism and political violence are not easily defined, however, and some scholars assert that they are now "unbounded" and not exclusively concerned with physical damage.

There is an old saying that death or loss of property are the side products of terrorism; the main purpose of such incidents is to create terror in people's minds and harm bystanders. If any incident in cyberspace can create terror, it may be rightly called cyberterrorism. For those affected by such acts, the fears of cyberterrorism are quite real.

As with cybercrime in general, the threshold of knowledge and skill required to perpetrate acts of cyberterror has been steadily diminishing, thanks to freely available hacking suites and online courses. Additionally, the physical and virtual worlds are merging at an accelerated rate, making for many more targets of opportunity, as evidenced by such notable cyber attacks as Stuxnet, the Saudi petrochemical sabotage attempt of 2018, and others.

Defining cyberterrorism

Assigning a concrete definition to cyberterrorism can be hard, due to the difficulty of defining the term terrorism itself. Multiple organizations have created their own definitions, most of which are overly broad. There is also controversy concerning overuse of the term, hyperbole in the media and by security vendors trying to sell "solutions".

One way of understanding cyberterrorism involves the idea that terrorists could cause massive loss of life, worldwide economic chaos and environmental damage by hacking into critical infrastructure systems. The nature of cyberterrorism covers conduct involving computer or Internet technology that:
  1. is motivated by a political, religious or ideological cause
  2. is intended to intimidate a government or a section of the public to varying degrees
  3. seriously interferes with infrastructure
The term "cyberterrorism" can be used in a variety of different ways, but there are limits to its use. An attack on an Internet business can be labeled cyberterrorism, however when it is done for economic motivations rather than ideological it is typically regarded as cybercrime. Convention also limits the label "cyberterrorism" to actions by individuals, independent groups, or organizations. Any form of cyberwarfare conducted by governments and states would be regulated and punishable under international law.

The Technolytics Institute defines cyberterrorism as
"[t]he premeditated use of disruptive activities, or the threat thereof, against computers and/or networks, with the intention to cause harm or further social, ideological, religious, political or similar objectives. Or to intimidate any person in furtherance of such objectives."
The term first appears in defense literature, surfacing (as "cyber-terrorism") in reports by the U.S. Army War College as early as 1998.

The National Conference of State Legislatures, an organization of legislators created to help policymakers in the United States of America with issues such as the economy and homeland security, defines cyberterrorism as:
[T]he use of information technology by terrorist groups and individuals to further their agenda. This can include use of information technology to organize and execute attacks against networks, computer systems and telecommunications infrastructures, or for exchanging information or making threats electronically. Examples are hacking into computer systems, introducing viruses to vulnerable networks, web site defacing, Denial-of-service attacks, or terroristic threats made via electronic communication.
NATO defines cyberterrorism as "[a] cyberattack using or exploiting computer or communication networks to cause sufficient destruction or disruption to generate fear or to intimidate a society into an ideological goal".

The United States National Infrastructure Protection Center defined cyberterrorism as:
A criminal act perpetrated by the use of computers and telecommunications capabilities resulting in violence, destruction, and/or disruption of services to create fear by causing confusion and uncertainty within a given population, with the goal of influencing a government or population to conform to a political, social, or ideological agenda.
The FBI, another United States agency, defines "cyber terrorism" as “premeditated, politically motivated attack against information, computer systems, computer programs, and data which results in violence against non-combatant targets by subnational groups or clandestine agents”.

These definitions tend to share the view of cyberterrorism as politically and/or ideologically inclined. One area of debate is the difference between cyberterrorism and hacktivism. Hacktivism is "the marriage of hacking with political activism". Both actions are politically driven and involve using computers; however, cyberterrorism is primarily used to cause harm. It becomes an issue because acts of violence on the computer can be labeled either cyberterrorism or hacktivism.

Types of cyberterror capability

The following three levels of cyberterror capability are defined by the Monterey group:
  • Simple-Unstructured: The capability to conduct basic hacks against individual systems using tools created by someone else. The organization possesses little target-analysis, command-and-control, or learning capability.
  • Advanced-Structured: The capability to conduct more sophisticated attacks against multiple systems or networks and possibly to modify or create basic hacking tools. The organization possesses an elementary target-analysis, command-and-control, and learning capability.
  • Complex-Coordinated: The capability for a coordinated attack capable of causing mass disruption against integrated, heterogeneous defenses (including cryptography), the ability to create sophisticated hacking tools, and a highly capable target-analysis, command-and-control, and organizational learning capability.

Concerns

Cyberterrorism is becoming more and more prominent on social media today. As the Internet becomes more pervasive in all areas of human endeavor, individuals or groups can use the anonymity afforded by cyberspace to threaten citizens, specific groups (i.e. with membership based on ethnicity or belief), communities and entire countries, without the inherent threat of capture, injury, or death to the attacker that being physically present would bring. Many groups, such as Anonymous, use tools such as denial-of-service attacks to attack and censor groups that oppose them, creating many concerns for freedom and respect for differences of thought.

Many believe that cyberterrorism is an extreme threat to countries' economies, and fear an attack could potentially lead to another Great Depression. Several leaders agree that cyberterrorism poses the greatest threat among possible attacks on U.S. territory. Although natural disasters are also considered a top threat and have proven devastating to people and land, there is ultimately little that can be done to prevent such events from happening. Thus, the expectation is to focus more on preventative measures that will make Internet attacks impossible to execute.

As the Internet continues to expand, and computer systems continue to be assigned increased responsibility while becoming more complex and interdependent, sabotage or terrorism via the Internet may become a more serious threat and is possibly one of the top 10 events to "end the human race." People have much easier access to illegal involvement in cyberspace through the part of the Internet known as the Dark Web. The Internet of Things promises to further merge the virtual and physical worlds, which some experts see as a powerful incentive for states to use terrorist proxies in furtherance of objectives.

Dependence on the internet is rapidly increasing on a worldwide scale, creating a platform for international cyber terror plots to be formulated and executed as a direct threat to national security. For terrorists, cyber-based attacks have distinct advantages over physical attacks: they can be conducted remotely, anonymously, and relatively cheaply, and they do not require significant investment in weapons, explosives, or personnel. The effects can be widespread and profound, and incidents of cyberterrorism are likely to increase, conducted through denial-of-service attacks, malware, and other methods that are difficult to envision today. In one example, deaths linked to the Islamic State led to legal action against the online social networks Twitter, Google, and Facebook, which ultimately resulted in them being sued.

In an article about cyber attacks by Iran and North Korea, The New York Times observes, "The appeal of digital weapons is similar to that of nuclear capability: it is a way for an outgunned, outfinanced nation to even the playing field. 'These countries are pursuing cyberweapons the same way they are pursuing nuclear weapons,' said James A. Lewis, a computer security expert at the Center for Strategic and International Studies in Washington. 'It's primitive; it's not top of the line, but it's good enough and they are committed to getting it.'"

History

Public interest in cyberterrorism began in the late 1990s, when the term was coined by Barry C. Collin.[35] As 2000 approached, the fear and uncertainty about the millennium bug heightened, as did the potential for attacks by cyber terrorists. Although the millennium bug was by no means a terrorist attack or plot against the world or the United States, it did act as a catalyst in sparking the fears of a possibly large-scale devastating cyber-attack. Commentators noted that many of the facts of such incidents seemed to change, often with exaggerated media reports.

The high-profile terrorist attacks in the United States on September 11, 2001 and the ensuing War on Terror by the US led to further media coverage of the potential threats of cyberterrorism in the years following. Mainstream media coverage often discusses the possibility of a large attack making use of computer networks to sabotage critical infrastructures with the aim of putting human lives in jeopardy or causing disruption on a national scale either directly or by disruption of the national economy.

Authors such as Winn Schwartau and John Arquilla are reported to have had considerable financial success selling books which described what were purported to be plausible scenarios of mayhem caused by cyberterrorism. Many critics claim that these books were unrealistic in their assessments of whether the attacks described (such as nuclear meltdowns and chemical plant explosions) were possible. A common thread throughout what critics perceive as cyberterror-hype is that of non-falsifiability; that is, when the predicted disasters fail to occur, it only goes to show how lucky we've been so far, rather than impugning the theory. 

In 2016, for the first time ever, the Department of Justice charged Ardit Ferizi with cyberterrorism. He was accused of hacking into a military website and stealing the names, addresses, and other personal information of government and military personnel, and of selling it to ISIS.

On the other hand, it is also argued that, despite substantial studies on cyberterrorism, the body of literature is still unable to present a realistic estimate of the actual threat. For instance, in the case of a cyberterrorist attack on a public infrastructure such as a power plant or air traffic control through hacking, there is uncertainty as to its success because data concerning such phenomena are limited.

International attacks and response

Conventions

As of 2016 there have been seventeen conventions and major legal instruments that specifically deal with terrorist activities and can also be applied to cyber terrorism.
  • 1963: Convention on Offences and Certain Other Acts Committed on Board Aircraft
  • 1970: Convention for the Suppression of Unlawful Seizure of Aircraft
  • 1971: Convention for the Suppression of Unlawful Acts Against the Safety of Civil Aviation
  • 1973: Convention on the Prevention and Punishment of Crimes against Internationally Protected Persons
  • 1979: International Convention against the Taking of Hostages
  • 1980: Convention on the Physical Protection of Nuclear Material
  • 1988: Protocol for the Suppression of Unlawful Acts of Violence at Airports Serving International Civil Aviation
  • 1988: Protocol for the Suppression of Unlawful Acts against the Safety of Fixed Platforms Located on the Continental Shelf
  • 1988: Convention for the Suppression of Unlawful Acts against the Safety of Maritime Navigation
  • 1989: Supplementary to the Convention for the Suppression of Unlawful Acts against the Safety of Civil Aviation
  • 1991: Convention on the Marking of Plastic Explosives for the Purpose of Detection
  • 1997: International Convention for the Suppression of Terrorist Bombings
  • 1999: International Convention for the Suppression of the Financing of Terrorism
  • 2005: Protocol to the Convention for the Suppression of Unlawful Acts against the Safety of Maritime Navigation
  • 2005: International Convention for the Suppression of Acts of Nuclear Terrorism
  • 2010: Protocol Supplementary to the Convention for the Suppression of Unlawful Seizure of Aircraft
  • 2010: Convention on the Suppression of Unlawful Acts Relating to International Civil Aviation

Motivations for cyberattacks

There are many different motives for cyberattacks, with the majority being for financial reasons. However, there is increasing evidence that hackers are becoming more politically motivated. Cyberterrorists are aware that governments are reliant on the internet and have exploited this as a result. For example, Mohammad Bin Ahmad As-Sālim's piece '39 Ways to Serve and Participate in Jihad' discusses how an electronic jihad could disrupt the West through targeted hacks of American websites, and other resources seen as anti-Jihad, modernist, or secular in orientation (Denning, 2010; Leyden, 2007).

International institutions

As of 2016 the United Nations has only one agency that specializes in cyberterrorism: the International Telecommunication Union (ITU).

U.S. military/protections against cyberterrorism

The US Department of Defense (DoD) charged the United States Strategic Command with the duty of combating cyberterrorism. This is accomplished through the Joint Task Force-Global Network Operations, which is the operational component supporting USSTRATCOM in defense of the DoD's Global Information Grid. This is done by integrating GNO capabilities into the operations of all DoD computers, networks, and systems used by DoD combatant commands, services and agencies.

On November 2, 2006, the Secretary of the Air Force announced the creation of the Air Force's newest MAJCOM, the Air Force Cyber Command, which would be tasked to monitor and defend American interests in cyberspace. The plan was, however, replaced by the creation of Twenty-Fourth Air Force, which became active in August 2009 and would be a component of the planned United States Cyber Command.

On December 22, 2009, the White House named Howard Schmidt as its head of computer security, to coordinate U.S. government, military and intelligence efforts to repel hackers. He left the position in May 2012. Michael Daniel was appointed White House Coordinator of Cyber Security the same week and continued in the position during the second term of the Obama administration.

More recently, Obama signed an executive order enabling the US to impose sanctions on individuals or entities suspected of participating in cyber-related acts assessed to be possible threats to US national security, financial matters, or foreign policy. U.S. authorities indicted a man over 92 cyberterrorism hacking attacks on computers used by the Department of Defense. A Nebraska-based consortium apprehended four million hacking attempts in the course of eight weeks. In 2011 cyberterrorism attacks grew 20%.

Estonia and NATO

The Baltic state of Estonia was the target of a massive denial-of-service attack in April 2007 that ultimately rendered the country offline and shut it out of services dependent on Internet connectivity. The infrastructure of Estonia, including everything from online banking and mobile phone networks to government services and access to health care information, was disabled for a time. The tech-dependent state experienced severe turmoil, and there was a great deal of concern over the nature and intent of the attack.

The cyber attack was a result of an Estonian-Russian dispute over the removal of a bronze statue depicting a World War II-era Soviet soldier from the center of the capital, Tallinn. In the midst of its armed conflict with Russia, Georgia was likewise subject to sustained and coordinated attacks on its electronic infrastructure in August 2008. In both of these cases, circumstantial evidence points to coordinated Russian attacks, but attribution is difficult; though both countries blame Moscow for contributing to the cyber attacks, proof establishing legal culpability is lacking.

Estonia joined NATO in 2004, which prompted NATO to carefully monitor its member state's response to the attack. NATO also feared escalation and the possibility of cascading effects beyond Estonia's border to other NATO members. In 2008, directly as a result of the attacks, NATO opened a new center of excellence on cyberdefense to conduct research and training on cyber warfare in Tallinn.

The chaos resulting from the attacks in Estonia illustrated to the world the dependence countries had on information technology. This dependence then makes countries vulnerable to future cyber attacks and terrorism.

Republic of Korea

According to the 2016 Deloitte Asia-Pacific Defense Outlook, South Korea's 'Cyber Risk Score' was 884 out of 1,000, making it the most vulnerable country to cyber attacks in the Asia-Pacific region. Despite South Korea's high-speed internet and cutting-edge technology, its cyber security infrastructure is relatively weak. The 2013 South Korea cyberattack significantly damaged the Korean economy. In 2017, a ransomware attack harassed private companies and users, who experienced personal information leakage. Additionally, North Korean cyber attacks have put South Korea's national security at risk.

In response, the South Korean government's countermeasure is to protect the information security centres of the National Intelligence Service. Currently, 'cyber security' is one of the major goals of the NIS. Since 2013, South Korea has established policies related to national cyber security and has tried to prevent cyber crises via sophisticated investigation of potential threats. Meanwhile, scholars emphasise improving national consciousness of cyber attacks, as South Korea has already entered the so-called 'hyper-connected society'.

China

The Chinese Defense Ministry confirmed the existence of an online defense unit in May 2011. Composed of about thirty elite internet specialists, the so-called "Cyber Blue Team", or "Blue Army", is officially claimed to be engaged in cyber-defense operations, though there are fears the unit has been used to penetrate secure online systems of foreign governments.

Pakistan

The Pakistani government has also taken steps to curb the menace of cyberterrorism and extremist propaganda. The National Counter Terrorism Authority (NACTA) is working on joint programs with different NGOs and other cyber security organizations in Pakistan to combat this problem. Surf Safe Pakistan is one such example: people in Pakistan can now report extremist and terrorist-related content online through the Surf Safe Pakistan portal, a campaign for which NACTA provides the Federal Government's leadership. Separately, in March 2008 an al-Qaeda forum posted a training website with six training modules for learning cyberterrorism techniques.

Ukraine

A series of powerful cyber attacks began 27 June 2017 that swamped websites of Ukrainian organizations, including banks, ministries, newspapers and electricity firms.

Examples

A cyberterrorist operation can be carried out by anyone anywhere in the world, as it can be performed thousands of miles away from its target. An attack can cause serious damage to critical infrastructure, which may result in casualties.

Some attacks are conducted in furtherance of political and social objectives, as the following examples illustrate:
  • In 1996, a computer hacker allegedly associated with the White Supremacist movement temporarily disabled a Massachusetts ISP and damaged part of the ISP's record keeping system. The ISP had attempted to stop the hacker from sending out worldwide racist messages under the ISP's name. The hacker signed off with the threat: "you have yet to see true electronic terrorism. This is a promise."
  • In 1998, Spanish protesters bombarded the Institute for Global Communications (IGC) with thousands of bogus e-mail messages. E-mail was tied up and undeliverable to the ISP's users, and support lines were tied up with people who couldn't get their mail. The protestors also spammed IGC staff and member accounts, clogged their Web page with bogus credit card orders, and threatened to employ the same tactics against organizations using IGC services. They demanded that IGC stop hosting the Web site for the Euskal Herria Journal, a New York-based publication supporting Basque independence. Protestors said IGC supported terrorism because a section on the Web pages contained materials on the terrorist group ETA, which claimed responsibility for assassinations of Spanish political and security officials, and attacks on military installations. IGC finally relented and pulled the site because of the "mail bombings."
  • In 1998, ethnic Tamil guerrillas attempted to disrupt Sri Lankan embassies by sending large volumes of e-mail. The embassies received 800 e-mails a day over a two-week period. The messages read "We are the Internet Black Tigers and we're doing this to disrupt your communications." Intelligence authorities characterized it as the first known attack by terrorists against a country's computer systems.
  • During the Kosovo conflict in 1999, NATO computers were blasted with e-mail bombs and hit with denial-of-service attacks by hacktivists protesting the NATO bombings. In addition, businesses, public organizations and academic institutes received highly politicized virus-laden e-mails from a range of Eastern European countries, according to reports. Web defacements were also common. After the Chinese Embassy was accidentally bombed in Belgrade, Chinese hacktivists posted messages such as "We won't stop attacking until the war stops!" on U.S. government Web sites.
  • Since December 1997, the Electronic Disturbance Theater (EDT) has been conducting Web sit-ins against various sites in support of the Mexican Zapatistas. At a designated time, thousands of protestors point their browsers to a target site using software that floods the target with rapid and repeated download requests. EDT's software has also been used by animal rights groups against organizations said to abuse animals. Electrohippies, another group of hacktivists, conducted Web sit-ins against the WTO when they met in Seattle in late 1999. These sit-ins all require mass participation to have much effect, and thus are more suited to use by activists than by terrorists.
  • In 2000, a Japanese investigation revealed that the government was using software developed by computer companies affiliated with Aum Shinrikyo, the doomsday sect responsible for the sarin gas attack on the Tokyo subway system in 1995. "The government found 100 types of software programs used by at least 10 Japanese government agencies, including the Defense Ministry, and more than 80 major Japanese companies, including Nippon Telegraph and Telephone." Following the discovery, the Japanese government suspended use of Aum-developed programs out of concern that Aum-related companies may have compromised security by breaching firewalls, gaining access to sensitive systems or information, allowing invasion by outsiders, planting viruses that could be set off later, or planting malicious code that could cripple computer systems and key data systems.
  • In March 2013, The New York Times reported on a pattern of cyber attacks against U.S. financial institutions believed to be instigated by Iran as well as incidents affecting South Korean financial institutions that originate with the North Korean government.
  • In August 2013, media companies including The New York Times, Twitter and the Huffington Post lost control of some of their websites after hackers supporting the Syrian government breached the Australian Internet company that manages many major site addresses. The Syrian Electronic Army, a hacker group that has previously attacked media organisations that it considers hostile to the regime of Syrian president Bashar al-Assad, claimed credit for the Twitter and Huffington Post hacks in a series of Twitter messages. Electronic records showed that NYTimes.com, the only site with an hours-long outage, redirected visitors to a server controlled by the Syrian group before it went dark.
  • The website of Air Botswana was defaced by a group calling itself the "Pakistan Cyber Army".
  • Pakistani Cyber Army is the name taken by a group of hackers who are known for their defacement of websites, particularly Indian, Chinese, and Israeli companies and governmental organizations, claiming to represent Pakistani nationalist and Islamic interests. The group is thought to have been active since at least 2008, and maintains an active presence on social media, especially Facebook. Its members have claimed responsibility for the hijacking of websites belonging to Acer, BSNL, India's CBI, Central Bank, and the State Government of Kerala.
  • British hacker Kane Gamble, sentenced to two years in youth detention, posed as the CIA chief to access highly sensitive information. He also "cyber-terrorized" high-profile U.S. intelligence officials such as then-CIA chief John Brennan and Director of National Intelligence James Clapper. The judge said Gamble engaged in "politically motivated cyber terrorism."

Sabotage

Non-political acts of sabotage have caused financial and other damage. In 2000, disgruntled employee Vitek Boden caused the release of 800,000 litres of untreated sewage into waterways in Maroochy Shire, Australia.

More recently, in May 2007 Estonia was subjected to a mass cyber-attack in the wake of the removal of a Russian World War II war memorial from downtown Tallinn. The attack was a distributed denial-of-service attack in which selected sites were bombarded with traffic to force them offline; nearly all Estonian government ministry networks as well as two major Estonian bank networks were knocked offline; in addition, the political party website of Estonia's Prime Minister Andrus Ansip featured a counterfeit letter of apology from Ansip for removing the memorial statue. Despite speculation that the attack had been coordinated by the Russian government, Estonia's defense minister admitted he had no conclusive evidence linking cyber attacks to Russian authorities. Russia called accusations of its involvement "unfounded", and neither NATO nor European Commission experts were able to find any conclusive proof of official Russian government participation. In January 2008 a man from Estonia was convicted for launching the attacks against the Estonian Reform Party website and fined.

During the Russia-Georgia War, on 5 August 2008, three days before Georgia launched its invasion of South Ossetia, the websites for OSInform News Agency and OSRadio were hacked. The OSinform website at osinform.ru kept its header and logo, but its content was replaced by a feed to the Alania TV website content. Alania TV, a Georgian government-supported television station aimed at audiences in South Ossetia, denied any involvement in the hacking of the websites. Dmitry Medoyev, at the time the South Ossetian envoy to Moscow, claimed that Georgia was attempting to cover up information on events which occurred in the lead-up to the war. One such cyber attack caused the Parliament of Georgia and Georgian Ministry of Foreign Affairs websites to be replaced by images comparing Georgian president Mikheil Saakashvili to Adolf Hitler. Other attacks involved denials of service to numerous Georgian and Azerbaijani websites, such as when Russian hackers allegedly disabled the servers of the Azerbaijani Day.Az news agency.

In June 2019, Russia conceded that it is "possible" its electrical grid is under cyber-attack by the United States. The New York Times reported that American hackers from the United States Cyber Command planted malware potentially capable of disrupting the Russian electrical grid.

Website defacement and denial of service

In October 2007, the website of Ukrainian president Viktor Yushchenko was attacked by hackers. A radical Russian nationalist youth group, the Eurasian Youth Movement, claimed responsibility.

In 1999, hackers attacked NATO computers, flooding them with email and hitting them with a denial-of-service attack. The hackers were protesting the NATO bombing of the Chinese embassy in Belgrade. Businesses, public organizations and academic institutions were also bombarded with highly politicized virus-laden emails from other European countries.

In December 2018, Twitter warned of "unusual activity" from China and Saudi Arabia. A bug was detected in November that could have revealed the country code of users' phone numbers. Twitter said the bug could have had ties to "state-sponsored actors".

Transgenerational epigenetic inheritance

From Wikipedia, the free encyclopedia
 
Genetically identical mice with different DNA methylation patterns causing kinks in the tail of one but not the other.
 
Transgenerational epigenetic inheritance is the transmission of information from one generation of an organism to the next (i.e., parent–child transmission) that affects the traits of offspring without alteration of the primary structure of DNA (i.e., the sequence of nucleotides) – in other words, epigenetically. The less precise term "epigenetic inheritance" may be used to describe both cell–cell and organism–organism information transfer. Although these two levels of epigenetic inheritance are equivalent in unicellular organisms, they may have distinct mechanisms and evolutionary histories in multicellular organisms.

For some epigenetically influenced traits, the epigenetic marks can be induced by the environment and some marks are heritable, leading some to view epigenetics as a relaxation of the rejection of the inheritance of acquired characteristics (Lamarckism).

Epigenetic categories

Four general categories of epigenetic modification are known:
  1. self-sustaining feedback loops, in which an mRNA or protein product of a gene stimulates transcription of the gene, e.g. the Wor1 gene in Candida albicans (see the sketch after this list)
  2. structural templating in which structures are replicated using a template or scaffold structure on the parent; e.g. the orientation and architecture of cytoskeletal structures, cilia and flagella, prions, proteins that replicate by changing the structure of normal proteins to match their own
  3. chromatin marks, in which methyl or acetyl groups bind to DNA nucleotides or histones thereby altering gene expression patterns; e.g. Lcyc gene in Linaria vulgaris described below
  4. RNA silencing, in which small RNA strands interfere (RNAi) with the transcription of DNA or translation of mRNA; known only from a few studies, mostly in Caenorhabditis elegans

Inheritance of epigenetic marks

Epigenetic variation may take any of the four general forms described above, though others may yet be elucidated: self-sustaining feedback loops, structural templating, chromatin marks, and RNA-mediated pathways all modify epigenes at the level of individual cells. Epigenetic variation within multicellular organisms may be endogenous, generated by cell–cell signaling (e.g. during cell differentiation early in development), or exogenous, a cellular response to environmental cues.

Removal vs. retention

In sexually reproducing organisms, much of the epigenetic modification within cells is reset during meiosis (e.g. marks at the FLC locus controlling plant vernalization), though some epigenetic responses have been shown to be conserved (e.g. transposon methylation in plants). Differential inheritance of epigenetic marks due to underlying maternal or paternal biases in removal or retention mechanisms may lead to the assignment of epigenetic causation to some parent-of-origin effects in animals and plants.

Reprogramming

In mammals, epigenetic marks are erased during two phases of the life cycle: first, just after fertilization, and second, in the developing primordial germ cells, the precursors of future gametes. During fertilization the male and female gametes join in different cell cycle states and with different configurations of the genome. The epigenetic marks of the male are rapidly diluted. First, the protamines associated with male DNA are replaced with histones from the female's cytoplasm, most of which are acetylated, either because of a higher abundance of acetylated histones in the female's cytoplasm or through preferential binding of the male DNA to acetylated histones. Second, male DNA is systematically demethylated in many organisms, possibly through 5-hydroxymethylcytosine. However, some epigenetic marks, particularly maternal DNA methylation, can escape this reprogramming, leading to parental imprinting.

In the primordial germ cells (PGCs) there is a more extensive erasure of epigenetic information. However, some rare sites can evade even this erasure of DNA methylation. If epigenetic marks evade erasure during both zygotic and PGC reprogramming events, this could enable transgenerational epigenetic inheritance.

Recognition of the importance of epigenetic programming to the establishment and fixation of cell line identity during early embryogenesis has recently stimulated interest in the artificial removal of epigenetic programming. Epigenetic manipulations may allow for the restoration of totipotency in stem cells, or in cells more generally, thus broadening the scope of regenerative medicine.

Retention

Cellular mechanisms may allow for co-transmission of some epigenetic marks. During replication, DNA polymerases working on the leading and lagging strands are coupled by the DNA processivity factor proliferating cell nuclear antigen (PCNA), which has also been implicated in patterning and strand crosstalk that allows for copy fidelity of epigenetic marks. Work on histone modification copy fidelity has remained at the model stage, but early efforts suggest that modifications of new histones are patterned on those of the old histones and that new and old histones assort randomly between the two daughter DNA strands. With respect to transfer to the next generation, many marks are removed as described above. Emerging studies are finding patterns of epigenetic conservation across generations. For instance, centromeric satellites resist demethylation. The mechanism responsible for this conservation is not known, though some evidence suggests that methylation of histones may contribute. Dysregulation of promoter methylation timing, associated with dysregulated gene expression in the embryo, has also been identified.

Decay

Whereas the mutation rate in a given 100-base gene may be 10⁻⁷ per generation, epigenes may "mutate" several times per generation or may be fixed for many generations. This raises the question: do changes in epigene frequencies constitute evolution? Rapidly decaying epigenetic effects on phenotypes (i.e. those lasting less than three generations) may explain some of the residual variation in phenotypes after genotype and environment are accounted for. However, distinguishing these short-term effects from the effects of the maternal environment on early ontogeny remains a challenge.
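To make the rate contrast concrete, the following minimal Python sketch estimates how long an unstable epiallele persists compared with a DNA sequence variant. It is an illustration only: the epigenetic reversion rate is an assumed, arbitrary value, and only the 10⁻⁷ genetic rate comes from the text above.

  import random

  def generations_until_reversion(p_revert, max_gen=10_000):
      """Count the generations an epiallele persists before reverting."""
      for gen in range(1, max_gen + 1):
          if random.random() < p_revert:
              return gen
      return max_gen

  P_GENETIC = 1e-7     # per-generation mutation rate of the gene (from the text)
  P_EPIGENETIC = 0.3   # assumed per-generation reversion rate of an epiallele

  trials = 1000
  mean_epi = sum(generations_until_reversion(P_EPIGENETIC)
                 for _ in range(trials)) / trials
  print(f"epiallele persists ~{mean_epi:.1f} generations on average")
  print(f"expected persistence of the genetic variant: ~{1 / P_GENETIC:.0e} generations")

Under these assumptions the epiallele reverts within a handful of generations while the sequence variant persists on the order of ten million, which is why rapidly decaying epigenetic effects are hard to separate from maternal-environment effects.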

Contribution to phenotypes

The relative importance of genetic and epigenetic inheritance is subject to debate. Though hundreds of examples of epigenetic modification of phenotypes have been published, few studies have been conducted outside of the laboratory setting. Therefore, the interactions of genes and epigenes with the environment cannot yet be inferred, despite the central role of the environment in natural selection. Experimental methodologies for manipulating epigenetic mechanisms are nascent and will need rigorous demonstration before studies explicitly testing the relative contributions of genotype, environment, and epigenotype are feasible.

In plants

b1 paramutation in maize. The B' allele converts the B-I allele to a B'-like state after interaction in F1 heterozygotes. These converted alleles gain the ability to convert naive B-I alleles in subsequent generations, resulting in all progeny displaying the lightly pigmented phenotype.
 
Studies concerning transgenerational epigenetic inheritance in plants have been reported as early as the 1950s. One of the earliest and best characterized examples of this is b1 paramutation in maize. The b1 gene encodes a basic helix-loop-helix transcription factor that is involved in the anthocyanin production pathway. When the b1 gene is expressed, the plant accumulates anthocyanin within its tissues, leading to a purple coloration of those tissues. The B-I allele (for B-Intense) has high expression of b1, resulting in the dark pigmentation of the sheath and husk tissues, while the B' (pronounced B-prime) allele has low expression of b1, resulting in low pigmentation in those tissues. When homozygous B-I parents are crossed to homozygous B', the resultant F1 offspring all display low pigmentation, which is due to gene silencing of b1. Unexpectedly, when F1 plants are self-crossed, the resultant F2 generation all display low pigmentation and have low levels of b1 expression. Furthermore, when any F2 plant (including those that are genetically homozygous for B-I) is crossed to homozygous B-I, the offspring all display low pigmentation and low expression of b1. The lack of darkly pigmented individuals in the F2 progeny is an example of non-Mendelian inheritance, and further research has suggested that the B-I allele is converted to B' via epigenetic mechanisms. The B' and B-I alleles are considered to be epialleles because they are identical at the DNA sequence level but differ in the level of DNA methylation, siRNA production, and chromosomal interactions within the nucleus. Additionally, plants defective in components of the RNA-directed DNA-methylation pathway show an increased expression of b1 in B' individuals, similar to that of B-I; however, once these components are restored, the plant reverts to the low expression state. Although spontaneous conversion from B-I to B' has been observed, a reversion from B' to B-I (green to purple) has never been observed over 50 years and thousands of plants in both greenhouse and field experiments.
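The non-Mendelian crossing behavior described above can be captured in a short Python sketch. It is a deliberate simplification, assuming (as the text suggests) that conversion of B-I by B' in heterozygotes is complete and occurs before gamete formation:

  import itertools

  def paramutate(genotype):
      """In a B'/B-I heterozygote, B-I is converted to a B'-like state."""
      if "B'" in genotype and "B-I" in genotype:
          return ("B'", "B'")
      return genotype

  def phenotype(genotype):
      """Pigmentation follows the post-conversion allele state."""
      return "dark" if paramutate(genotype) == ("B-I", "B-I") else "light"

  def cross(p1, p2):
      """All offspring genotypes from a cross, after conversion in each parent."""
      g1, g2 = paramutate(p1), paramutate(p2)
      return sorted({tuple(sorted(pair)) for pair in itertools.product(g1, g2)})

  f1 = cross(("B-I", "B-I"), ("B'", "B'"))
  print("F1:", [(g, phenotype(g)) for g in f1])    # all light
  f2 = sorted({g for geno in f1 for g in cross(geno, geno)})
  print("F2:", [(g, phenotype(g)) for g in f2])    # all light, no 3:1 ratio
  print("F2 x B-I/B-I:", [(g, phenotype(g)) for g in cross(f2[0], ("B-I", "B-I"))])

Under Mendelian inheritance the F2 would show a 3:1 dark-to-light ratio; here every cross yields light plants, because any B-I allele that meets B' is itself converted.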

Examples of environmentally induced transgenerational epigenetic inheritance in plants have also been reported. In one case, rice plants that were exposed to drought-simulation treatments displayed increased tolerance to drought after 11 generations of exposure and propagation by single-seed descent, as compared to non-drought-treated plants. The differences in drought tolerance were linked to directional changes in DNA-methylation levels throughout the genome, suggesting that stress-induced heritable changes in DNA-methylation patterns may be important in adaptation to recurring stresses. In another study, plants that were exposed to moderate caterpillar herbivory over multiple generations displayed increased resistance to herbivory in subsequent generations (as measured by caterpillar dry mass) compared to plants lacking herbivore pressure. This increase in herbivore resistance persisted after a generation of growth without any herbivore exposure, suggesting that the response was transmitted across generations. The report concluded that components of the RNA-directed DNA-methylation pathway are involved in the increased resistance across generations.

In humans

A number of studies suggest the existence of transgenerational epigenetic inheritance in humans. These include studies of the Dutch famine of 1944–45, in which offspring born during the famine were smaller than those born the year before the famine, with effects that could last for two generations. Moreover, these offspring were found to have an increased risk of glucose intolerance in adulthood. Differential DNA methylation has been found in adult female offspring who had been exposed to famine in utero, but it is unknown whether these differences are present in their germline. It is hypothesized that inhibition of the PIM3 gene may have caused slower metabolism in later generations, but causation has not been proven, only correlation. The phenomenon is sometimes referred to as Dutch Hunger Winter Syndrome. Another study hypothesizes that epigenetic changes on the Y chromosome explain differences in lifespan among the male descendants of prisoners of war in the American Civil War.

The Överkalix study noted sex-specific effects: a greater body mass index (BMI) at 9 years in sons, but not daughters, of fathers who began smoking early. The paternal grandfather's food supply was linked only to the mortality risk ratio (RR) of grandsons, not granddaughters, while the paternal grandmother's food supply was associated only with the granddaughters' mortality risk ratio; a good food supply for the grandmother was associated with a twofold higher mortality (RR). This transgenerational inheritance was observed with exposure during the slow growth period (SGP), the time before the start of puberty, when environmental factors have a larger impact on the body. The ancestors' SGP in this study was set between the ages of 9–12 for boys and 8–10 for girls. The effect appeared with exposure during the SGP of both grandparents, or during the gestation period or infancy of the grandmothers, but not during either grandparent's puberty. The father's poor food supply and the mother's good food supply were associated with a lower risk of cardiovascular death.

The loss of genetic expression which results in Prader–Willi syndrome or Angelman syndrome has in some cases been found to be caused by epigenetic silencing of both alleles, one by an epigenetic change (or "epimutation") and the other by normal physiological imprinting, rather than by any genetic mutation. In all 19 informative cases, the epimutations causing these syndromes (together with physiological imprinting and therefore silencing of the other allele) were localized on a chromosome with a specific parental and grandparental origin. Specifically, the paternally derived chromosome carried an abnormal maternal mark at the SNURF-SNRPN locus, and this abnormal mark was inherited from the paternal grandmother.

Similarly, epimutations in the MLH1 gene have been found in two individuals with a phenotype of hereditary nonpolyposis colorectal cancer, but without any frank MLH1 mutation, which otherwise causes the disease. The same epimutations were also found in the spermatozoa of one of the individuals, indicating the potential for transmission to offspring.

A study has shown that childhood abuse (defined in this study as "sexual contact, severe physical abuse and/or severe neglect") leads to epigenetic modifications of glucocorticoid receptor expression, which plays a role in the activity of the HPA (hypothalamic-pituitary-adrenal) axis. Animal experiments have shown that such epigenetic changes depend on mother-infant interactions after birth. In a recent study investigating correlations between maternal stress in pregnancy and methylation in teenagers and their mothers, children of women who were abused during pregnancy were found to be significantly more likely than others to have methylated glucocorticoid-receptor genes, which in turn change the response to stress, leading to a higher susceptibility to anxiety.

Effects on fitness

Epigenetic inheritance may only affect fitness if it predictably alters a trait under selection. Evidence has been forwarded that environmental stimuli are important agents in the alteration of epigenes. Ironically, Darwinian evolution may act on these neo-Lamarckian acquired characteristics, as well as on the cellular mechanisms producing them (e.g. methyltransferase genes). Epigenetic inheritance may confer a fitness benefit to organisms that deal with environmental changes at intermediate timescales. Changes that cycle quickly are likely to be handled by DNA-encoded regulatory processes, because the probability that offspring will need to respond to changes multiple times during their lifespans is high. At the other extreme, for environmental changes that cycle over long periods, natural selection will act directly on the populations experiencing them; in these cases, if epigenetic priming of the next generation is deleterious to fitness over most of the interval (e.g. misinformation about the environment), these genotypes and epigenotypes will be lost. For intermediate time cycles, the probability of the offspring encountering a similar environment is sufficiently high for priming to be useful, yet without substantial selective pressure on individuals lacking a genetic architecture capable of responding to the environment. Naturally, the absolute lengths of short, intermediate, and long environmental cycles will depend on the trait, the length of epigenetic memory, and the generation time of the organism. Much of the interpretation of epigenetic fitness effects centers on the hypothesis that epigenes are important contributors to phenotypes, which remains to be resolved.
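The timescale argument can be illustrated with a toy calculation (an illustration only, not a model from the literature): suppose the environment flips between two states every k generations, and offspring inherit an epigenetic priming matching the parent's environment. The priming is correct whenever no flip occurred between the two generations:

  def match_probability(k):
      """Probability that the offspring's environment equals the parent's
      when the environment flips every k generations."""
      return 1 - 1 / k

  for k in (1, 2, 5, 20, 1000):
      print(f"flip every {k:>4} generations -> priming correct "
            f"{match_probability(k):.1%} of the time")

With k = 1 the priming is always wrong, so DNA-encoded plasticity is favored; with very large k a fixed, selected genotype suffices; at intermediate k the priming is usually right, which is where epigenetic inheritance plausibly pays off.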

Deleterious effects

Inherited epigenetic marks may be important for regulating key components of fitness. In plants, for instance, the Lcyc gene in Linaria vulgaris controls the symmetry of the flower. Linnaeus first described radially symmetric mutants, which arise when Lcyc is heavily methylated. Given the importance of floral shape to pollinators, methylation of Lcyc homologues (e.g. CYCLOIDEA) may have deleterious effects on plant fitness. In animals, numerous studies have shown that inherited epigenetic marks can increase susceptibility to disease. Transgenerational epigenetic influences are also suggested to contribute to disease, especially cancer, in humans. Tumor methylation patterns in gene promoters have been shown to correlate positively with familial history of cancer. Furthermore, methylation of the MSH2 gene is correlated with early-onset colorectal and endometrial cancers.

Putatively adaptive effects

Experimentally demethylated seeds of the model organism Arabidopsis thaliana have significantly higher mortality, stunted growth, delayed flowering, and lower fruit set, indicating that epigenes may increase fitness. Furthermore, environmentally induced epigenetic responses to stress have been shown to be inherited and to correlate positively with fitness. In animals, communal nesting changes mouse behavior, increasing parental care and social abilities, which are hypothesized to increase offspring survival and access to resources (such as food and mates), respectively.

Macroevolutionary patterns

Inherited epigenetic effects on phenotypes have been documented in bacteria, protists, fungi, plants, and animals. Though no systematic study of epigenetic inheritance has been conducted (most focus on model organisms), there is preliminary evidence that this mode of inheritance is more important in plants than in animals. The early differentiation of animal germlines is likely to preclude epigenetic marking occurring later in development, while in plants and fungi somatic cells may be incorporated into the germ line.

Life history patterns may also contribute to the occurrence of epigenetic inheritance. Sessile organisms, those with low dispersal capability, and those with simple behavior may benefit most from conveying information to their offspring via epigenetic pathways. Geographic patterns may also emerge, where highly variable and highly conserved environments might host fewer species with important epigenetic inheritance.

Controversies

Humans have long recognized that traits of the parents are often seen in offspring. This insight led to the practical application of selective breeding of plants and animals, but did not address the central question of inheritance: how are these traits conserved between generations, and what causes variation? Several positions have been held in the history of evolutionary thought.

Blending vs. particulate inheritance

Blending inheritance leads to the averaging out of every characteristic, which, as the engineer Fleeming Jenkin pointed out, makes evolution by natural selection impossible.
 
Addressing these related questions, scientists during the time of the Enlightenment largely argued for the blending hypothesis, in which parental traits were homogenized in the offspring, much like buckets of different colored paint being mixed together. Critics of Charles Darwin's On the Origin of Species pointed out that under this scheme of inheritance, variation would quickly be swamped by the majority phenotype. In the paint-bucket analogy, this would be seen by mixing two colors together and then mixing the resulting color with only one of the parent colors 20 times; the rare variant color would quickly fade.
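Jenkin's objection can be made quantitative with a two-line calculation (a worked version of the paint-bucket analogy above, not a population-genetic model): if the rare variant is blended 50/50 with the common type once, and the result is then back-mixed with the common type 20 more times, its contribution halves at every step:

  variant_fraction = 1.0      # start with pure variant "paint"
  for _ in range(21):         # one initial mix plus 20 back-mixes
      variant_fraction /= 2   # each blend halves the variant's share

  print(f"variant contribution after 21 mixes: {variant_fraction:.2e}")  # ~4.8e-07

After 21 halvings the variant contributes less than one part in two million, far too little for selection to act on under the blending scheme.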

Unknown to most of the European scientific community, a monk by the name of Gregor Mendel had resolved the question of how traits are conserved between generations through breeding experiments with pea plants. Charles Darwin thus did not know of Mendel's proposed "particulate inheritance" in which traits were not blended but passed to offspring in discrete units that we now call genes. Darwin came to reject the blending hypothesis even though his ideas and Mendel's were not unified until the 1930s, a period referred to as the modern synthesis.

Inheritance of innate vs. acquired characteristics

In his 1809 book, Philosophie Zoologique, Jean-Baptiste Lamarck recognized that each species experiences a unique set of challenges due to its form and environment. Thus, he proposed that the characters used most often would accumulate a "nervous fluid." Such acquired accumulations would then be transmitted to the individual's offspring. In modern terms, a nervous fluid transmitted to offspring would be a form of epigenetic inheritance.

Lamarckism, as this body of thought became known, was the standard explanation for change in species over time when Charles Darwin and Alfred Russel Wallace co-proposed a theory of evolution by natural selection in 1859. Responding to Darwin and Wallace's theory, a revised neo-Lamarckism attracted a small following of biologists, though the Lamarckian zeal was quenched in large part due to Weismann's famous experiment in which he cut off the tails of mice over several successive generations without having any effect on tail length. Thus the emergent consensus that acquired characteristics could not be inherited became canon.

Revision of evolutionary theory

Non-genetic variation and inheritance, however, proved to be quite common. Concurrent to the modern evolutionary synthesis (unifying Mendelian genetics and natural selection), C. H. Waddington was working to unify developmental biology and genetics. In so doing, he coined the word "epigenetic" to represent the ordered differentiation of embryonic cells into functionally distinct cell types despite having identical primary structure of their DNA. Waddington's epigenetics was sporadically discussed, becoming more of a catch-all for puzzling non-genetic heritable characters rather than advancing the body of inquiry. Consequently, the definition of Waddington's word has itself evolved, broadening beyond the subset of developmentally signaled, inherited cell specialization. 

Some scientists have questioned if epigenetic inheritance compromises the foundation of the modern synthesis. Outlining the central dogma of molecular biology, Francis Crick succinctly stated, "DNA is held in a configuration by histone[s] so that it can act as a passive template for the simultaneous synthesis of RNA and protein[s]. None of the detailed 'information' is in the histone." However, he closes the article stating, "this scheme explains the majority of the present experimental results!" Indeed, the emergence of epigenetic inheritance (in addition to advances in the study of evolutionary-development, phenotypic plasticity, evolvability, and systems biology) has strained the current framework of the modern evolutionary synthesis, and prompted the re-examination of previously dismissed evolutionary mechanisms.

There has been much critical discussion of mainstream evolutionary theory by Edward J. Steele, Robyn A. Lindley and colleagues, Fred Hoyle and N. Chandra Wickramasinghe, Yongsheng Liu, Denis Noble, John Mattick and others, arguing that logical inconsistencies, as well as Lamarckian inheritance effects involving direct DNA modifications and the indirect (i.e. epigenetic) transmissions just described, challenge conventional thinking in evolutionary biology and adjacent fields.

Social privilege

From Wikipedia, the free encyclopedia