Saturday, November 30, 2019

Nucleic acid double helix

From Wikipedia, the free encyclopedia
 
DNA double helix
Simplified representation of a double stranded DNA helix with coloured bases
 
Two complementary regions of nucleic acid molecules will bind and form a double helical structure held together by base pairs.
 
In molecular biology, the term double helix refers to the structure formed by double-stranded molecules of nucleic acids such as DNA. The double helical structure of a nucleic acid complex arises as a consequence of its secondary structure, and is a fundamental component in determining its tertiary structure. The term entered popular culture with the publication in 1968 of The Double Helix: A Personal Account of the Discovery of the Structure of DNA by James Watson.

The DNA double helix is a biopolymer of nucleic acid, held together by nucleotides which base pair together. In B-DNA, the most common double helical structure found in nature, the double helix is right-handed with about 10–10.5 base pairs per turn. The double helix structure of DNA contains a major groove and a minor groove. In B-DNA the major groove is wider than the minor groove. Given this difference in width, many proteins which bind to B-DNA do so through the wider major groove.

History

The double-helix model of DNA structure was first published in the journal Nature by James Watson and Francis Crick in 1953, (X,Y,Z coordinates in 1954) based upon the crucial X-ray diffraction image of DNA labeled as "Photo 51", from Rosalind Franklin in 1952, followed by her more clarified DNA image with Raymond Gosling, Maurice Wilkins, Alexander Stokes, and Herbert Wilson, and base-pairing chemical and biochemical information by Erwin Chargaff. The prior model was triple-stranded DNA.

The realization that the structure of DNA is that of a double-helix elucidated the mechanism of base pairing by which genetic information is stored and copied in living organisms and is widely considered one of the most important scientific discoveries of the 20th century. Crick, Wilkins, and Watson each received one third of the 1962 Nobel Prize in Physiology or Medicine for their contributions to the discovery. (Franklin, whose breakthrough X-ray diffraction data was used to formulate the DNA structure, died in 1958, and thus was ineligible to be nominated for a Nobel Prize.)

Nucleic acid hybridization

Hybridization is the process of complementary base pairs binding to form a double helix. Melting is the process by which the interactions between the strands of the double helix are broken, separating the two nucleic acid strands. These bonds are weak, easily separated by gentle heating, enzymes, or mechanical force. Melting occurs preferentially at certain points in the nucleic acid. T and A rich regions are more easily melted than C and G rich regions. Some base steps (pairs) are also susceptible to DNA melting, such as T A and T G. These mechanical features are reflected by the use of sequences such as TATA at the start of many genes to assist RNA polymerase in melting the DNA for transcription.

Strand separation by gentle heating, as used in polymerase chain reaction (PCR), is simple, providing the molecules have fewer than about 10,000 base pairs (10 kilobase pairs, or 10 kbp). The intertwining of the DNA strands makes long segments difficult to separate. The cell avoids this problem by allowing its DNA-melting enzymes (helicases) to work concurrently with topoisomerases, which can chemically cleave the phosphate backbone of one of the strands so that it can swivel around the other. Helicases unwind the strands to facilitate the advance of sequence-reading enzymes such as DNA polymerase.

Base pair geometry

Base pair geometries
 
The geometry of a base, or base pair step can be characterized by 6 coordinates: shift, slide, rise, tilt, roll, and twist. These values precisely define the location and orientation in space of every base or base pair in a nucleic acid molecule relative to its predecessor along the axis of the helix. Together, they characterize the helical structure of the molecule. In regions of DNA or RNA where the normal structure is disrupted, the change in these values can be used to describe such disruption.

For each base pair, considered relative to its predecessor, there are the following base pair geometries to consider:
  • Shear
  • Stretch
  • Stagger
  • Buckle
  • Propeller: rotation of one base with respect to the other in the same base pair.
  • Opening
  • Shift: displacement along an axis in the base-pair plane perpendicular to the first, directed from the minor to the major groove.
  • Slide: displacement along an axis in the plane of the base pair directed from one strand to the other.
  • Rise: displacement along the helix axis.
  • Tilt: rotation around the shift axis.
  • Roll: rotation around the slide axis.
  • Twist: rotation around the rise axis.
  • x-displacement
  • y-displacement
  • inclination
  • tip
  • pitch: the height per complete turn of the helix.
Rise and twist determine the handedness and pitch of the helix. The other coordinates, by contrast, can be zero. Slide and shift are typically small in B-DNA, but are substantial in A- and Z-DNA. Roll and tilt make successive base pairs less parallel, and are typically small. 
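
To make the relationship concrete, here is a minimal Python sketch (illustrative only, using assumed B-DNA-like average values of roughly 34.3° twist and 3.32 Å rise per step) showing how the rise and twist coordinates fix the number of base pairs per turn and the pitch:

```python
def helix_pitch(twist_deg: float, rise_angstrom: float) -> tuple[float, float]:
    """Return (base pairs per turn, pitch in angstroms) for a uniform helix."""
    bp_per_turn = 360.0 / twist_deg       # one complete turn spans 360 degrees
    pitch = bp_per_turn * rise_angstrom   # height gained over one complete turn
    return bp_per_turn, pitch

# Assumed B-DNA-like averages: ~34.3 degrees of twist and ~3.32 angstroms of rise per step
bp_per_turn, pitch = helix_pitch(34.3, 3.32)
print(f"{bp_per_turn:.1f} bp/turn, pitch {pitch:.1f} angstroms")
```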

Note that "tilt" has often been used differently in the scientific literature, referring to the deviation of the first, inter-strand base-pair axis from perpendicularity to the helix axis. This corresponds to slide between a succession of base pairs, and in helix-based coordinates is properly termed "inclination".

Helix geometries

At least three DNA conformations are believed to be found in nature, A-DNA, B-DNA, and Z-DNA. The B form described by James Watson and Francis Crick is believed to predominate in cells. It is 23.7 Å wide and extends 34 Å per 10 bp of sequence. The double helix makes one complete turn about its axis every 10.4–10.5 base pairs in solution. This frequency of twist (termed the helical pitch) depends largely on stacking forces that each base exerts on its neighbours in the chain. The absolute configuration of the bases determines the direction of the helical curve for a given conformation. 

A-DNA and Z-DNA differ significantly in geometry and dimensions from B-DNA, although they still form helical structures. It was long thought that the A form only occurs in dehydrated samples of DNA in the laboratory, such as those used in crystallographic experiments, and in hybrid pairings of DNA and RNA strands, but DNA dehydration does occur in vivo, and A-DNA is now known to have biological functions. Segments of DNA that cells have methylated for regulatory purposes may adopt the Z geometry, in which the strands turn about the helical axis the opposite way to A-DNA and B-DNA. There is also evidence of protein-DNA complexes forming Z-DNA structures. 

Other conformations are possible; A-DNA, B-DNA, C-DNA, E-DNA, L-DNA (the enantiomeric form of D-DNA), P-DNA, S-DNA, Z-DNA, etc. have been described so far. In fact, only the letters F, Q, U, V, and Y are now available to describe any new DNA structure that may appear in the future. However, most of these forms have been created synthetically and have not been observed in naturally occurring biological systems. There are also triple-stranded DNA forms and quadruplex forms such as the G-quadruplex and the i-motif.

The structures of A-, B-, and Z-DNA.
 
The helix axis of A-, B-, and Z-DNA.
 
Structural features of the three major forms of DNA

Geometry attribute          A-DNA               B-DNA               Z-DNA
Helix sense                 right-handed        right-handed        left-handed
Repeating unit              1 bp                1 bp                2 bp
Rotation/bp                 32.7°               34.3°               60° per 2 bp
bp/turn                     11                  10.5                12
Inclination of bp to axis   +19°                −1.2°               −9°
Rise/bp along axis          2.3 Å (0.23 nm)     3.32 Å (0.332 nm)   3.8 Å (0.38 nm)
Pitch/turn of helix         28.2 Å (2.82 nm)    33.2 Å (3.32 nm)    45.6 Å (4.56 nm)
Mean propeller twist        +18°                +16°                0°
Glycosyl angle              anti                anti                C: anti, G: syn
Sugar pucker                C3'-endo            C2'-endo            C: C2'-endo, G: C2'-exo
Diameter                    23 Å (2.3 nm)       20 Å (2.0 nm)       18 Å (1.8 nm)

Grooves

Major and minor grooves of DNA. The minor groove is a binding site for the dye Hoechst 33258.
 
Twin helical strands form the DNA backbone. Another double helix may be found by tracing the spaces, or grooves, between the strands. These voids are adjacent to the base pairs and may provide a binding site. As the strands are not directly opposite each other, the grooves are unequally sized. One groove, the major groove, is 22 Å wide and the other, the minor groove, is 12 Å wide. The narrowness of the minor groove means that the edges of the bases are more accessible in the major groove. As a result, proteins like transcription factors that bind to specific sequences in double-stranded DNA usually make contact with the sides of the bases exposed in the major groove. This situation varies in unusual conformations of DNA within the cell (see below), but the major and minor grooves are always named to reflect the differences in size that would be seen if the DNA were twisted back into the ordinary B form.

Non-double helical forms

Alternative non-helical models were briefly considered in the late 1970s as a potential solution to problems in DNA replication in plasmids and chromatin. However, the models were set aside in favor of the double-helical model due to subsequent experimental advances such as X-ray crystallography of DNA duplexes and later the nucleosome core particle, and the discovery of topoisomerases. Also, the non-double-helical models are not currently accepted by the mainstream scientific community.

Single-stranded nucleic acids (ssDNA) do not adopt a helical formation, and are described by models such as the random coil or worm-like chain.

Bending

DNA is a relatively rigid polymer, typically modelled as a worm-like chain. It has three significant degrees of freedom: bending, twisting, and compression, each of which imposes limits on what is possible with DNA within a cell. Twisting (torsional) stiffness is important for the circularisation of DNA and the orientation of DNA-bound proteins relative to each other, while bending (axial) stiffness is important for DNA wrapping, circularisation, and protein interactions. Compression-extension is relatively unimportant in the absence of high tension.

Persistence length, axial stiffness

Example sequences and their persistence lengths (B-DNA)

Sequence        Persistence length / base pairs
Random          154 ± 10
(CA)repeat      133 ± 10
(CAG)repeat     124 ± 10
(TATA)repeat    137 ± 10

DNA in solution does not take a rigid structure but is continually changing conformation due to thermal vibration and collisions with water molecules, which makes classical measures of rigidity impossible to apply. Hence, the bending stiffness of DNA is measured by the persistence length, defined as:
The length of DNA over which the time-averaged orientation of the polymer becomes uncorrelated by a factor of e.
This value may be measured directly using an atomic force microscope to image DNA molecules of various lengths. In aqueous solution, the average persistence length is 46–50 nm or 140–150 base pairs (the diameter of DNA is 2 nm), although it can vary significantly. This makes DNA a moderately stiff molecule.
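
To illustrate the definition, the following Python sketch (assuming a persistence length of 50 nm, within the range quoted above) evaluates the worm-like chain decorrelation exp(-s/P); at a separation of one persistence length the tangent correlation has dropped by a factor of e:

```python
import math

def tangent_correlation(contour_length_nm: float, persistence_length_nm: float = 50.0) -> float:
    """Time-averaged tangent-vector correlation, <cos(theta)> = exp(-s/P)."""
    return math.exp(-contour_length_nm / persistence_length_nm)

for s in (10, 50, 150):  # separations in nm; 50 nm is roughly 150 base pairs
    print(f"s = {s:3d} nm  correlation = {tangent_correlation(s):.3f}")
```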

The persistence length of a section of DNA is somewhat dependent on its sequence, and this can cause significant variation. The variation is largely due to base stacking energies and the residues which extend into the minor and major grooves.

Models for DNA bending

Stacking stability of base steps (B-DNA)

Step          Stacking ΔG / kcal mol−1
T A           -0.19
T G or C A    -0.55
C G           -0.91
A G or C T    -1.06
A A or T T    -1.11
A T           -1.34
G A or T C    -1.43
C C or G G    -1.44
A C or G T    -1.81
G C           -2.17

The entropic flexibility of DNA is remarkably consistent with standard polymer physics models, such as the Kratky-Porod worm-like chain model. Consistent with the worm-like chain model is the observation that bending DNA is also described by Hooke's law at very small (sub-piconewton) forces. However, for DNA segments less than the persistence length, the bending force is approximately constant and behaviour deviates from the worm-like chain predictions.

This effect results in unusual ease in circularising small DNA molecules and a higher probability of finding highly bent sections of DNA.

Bending preference

DNA molecules often have a preferred direction to bend, i.e., anisotropic bending. This is, again, due to the properties of the bases which make up the DNA sequence - a random sequence will have no preferred bend direction, i.e., isotropic bending. 

Preferred DNA bend direction is determined by the stability of stacking each base on top of the next. If unstable base stacking steps are always found on one side of the DNA helix then the DNA will preferentially bend away from that direction. As the bend angle increases, steric hindrance and the ability to roll the residues relative to each other also play a role, especially in the minor groove. A and T residues will preferentially be found in the minor grooves on the inside of bends. This effect is particularly seen in DNA-protein binding where tight DNA bending is induced, such as in nucleosome particles. See base step distortions above. 
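
As a rough illustration of how the tabulated stacking energies can be used, the Python sketch below sums the per-step ΔG values from the table above along a short sequence; this is only a toy scoring and ignores phasing with the helical repeat, groove geometry, and other structural effects:

```python
# Per-step stacking free energies (kcal/mol) from the B-DNA table above;
# complementary steps (e.g. TG and CA) share one value.
STACKING_DG = {
    "TA": -0.19, "TG": -0.55, "CA": -0.55, "CG": -0.91,
    "AG": -1.06, "CT": -1.06, "AA": -1.11, "TT": -1.11,
    "AT": -1.34, "GA": -1.43, "TC": -1.43, "CC": -1.44,
    "GG": -1.44, "AC": -1.81, "GT": -1.81, "GC": -2.17,
}

def total_stacking_energy(seq: str) -> float:
    """Sum stacking free energy over successive base steps of seq (5'->3')."""
    return sum(STACKING_DG[seq[i:i + 2]] for i in range(len(seq) - 1))

print(total_stacking_energy("TATAAA"))  # A/T-rich: weakly stacked, melts and bends more easily
print(total_stacking_energy("GCGCGC"))  # G/C-rich: strongly stacked
```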

DNA molecules with exceptional bending preference can become intrinsically bent. This was first observed in trypanosomatid kinetoplast DNA. Typical sequences which cause this contain stretches of 4-6 T and A residues separated by G and C rich sections which keep the A and T residues in phase with the minor groove on one side of the molecule. For example:


¦         ¦         ¦         ¦         ¦         ¦
GATTCCCAAAAATGTCAAAAAATAGGCAAAAAATGCCAAAAAATCCCAAAC

The intrinsically bent structure is induced by the 'propeller twist' of base pairs relative to each other, allowing unusual bifurcated hydrogen bonds between base steps. At higher temperatures this structure is denatured, and the intrinsic bend is lost.

All DNA which bends anisotropically has, on average, a longer persistence length and greater axial stiffness. This increased rigidity is required to prevent random bending which would make the molecule act isotropically.

Circularization

DNA circularization depends on both the axial (bending) stiffness and torsional (rotational) stiffness of the molecule. For a DNA molecule to successfully circularize it must be long enough to easily bend into the full circle and must have the correct number of bases so the ends are in the correct rotation to allow bonding to occur. The optimum length for circularization of DNA is around 400 base pairs (136 nm), with an integral number of turns of the DNA helix, i.e., multiples of 10.4 base pairs. Having a non-integral number of turns presents a significant energy barrier for circularization; for example, a 10.4 × 30 = 312 base pair molecule will circularize hundreds of times faster than a 10.4 × 30.5 ≈ 317 base pair molecule.
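
A small Python sketch of the arithmetic behind this, assuming the 10.4 bp helical repeat quoted above: a length that is an integral number of turns leaves no rotational offset between the ends, while a half-integral number leaves them maximally out of phase:

```python
HELICAL_REPEAT_BP = 10.4  # helical repeat assumed in the example above

def end_rotation_offset(length_bp: int) -> float:
    """Fraction of a turn (0 to 0.5) by which the two ends fail to line up."""
    fractional_turns = (length_bp / HELICAL_REPEAT_BP) % 1.0
    return min(fractional_turns, 1.0 - fractional_turns)

for n in (312, 317):
    print(f"{n} bp: {end_rotation_offset(n):.2f} of a turn out of phase")
```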

Stretching

Elastic stretching regime

Longer stretches of DNA are entropically elastic under tension. When DNA is in solution, it undergoes continuous structural variation due to the energy available in the thermal bath of the solvent: thermal vibration of the molecule combined with continual collisions with water molecules. For entropic reasons, more compact relaxed states are thermally accessible than stretched-out states, so DNA molecules are almost universally found in tangled, relaxed layouts. For this reason, a single molecule of DNA will stretch under a force, straightening it out. Using optical tweezers, the entropic stretching behavior of DNA has been studied and analyzed from a polymer physics perspective, and it has been found that DNA behaves largely like the Kratky-Porod worm-like chain model under physiologically accessible energy scales.
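
For illustration, here is a minimal Python sketch of the Marko-Siggia interpolation formula for the worm-like chain force-extension relation, a common choice for analysing optical-tweezers data (the choice of formula and the 50 nm persistence length are assumptions, not specified by the text above):

```python
# Marko-Siggia worm-like chain interpolation:
#   F(x) = (kB*T / P) * [ 1/(4*(1 - x/L)^2) - 1/4 + x/L ]
KB_T = 4.11e-21  # thermal energy at ~298 K, joules
P = 50e-9        # assumed persistence length, metres (~50 nm)

def wlc_force(extension_m: float, contour_length_m: float) -> float:
    """Entropic restoring force in newtons at a given end-to-end extension."""
    x = extension_m / contour_length_m
    return (KB_T / P) * (1.0 / (4.0 * (1.0 - x) ** 2) - 0.25 + x)

contour = 1e-6  # a hypothetical 1 micrometre piece of DNA
for frac in (0.25, 0.5, 0.9):
    print(f"x/L = {frac:.2f}  F = {wlc_force(frac * contour, contour) * 1e12:.2f} pN")
```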

Phase transitions under stretching

Under sufficient tension and positive torque, DNA is thought to undergo a phase transition with the bases splaying outwards and the phosphates moving to the middle. This proposed structure for overstretched DNA has been called P-form DNA, in honor of Linus Pauling who originally presented it as a possible structure of DNA.

Evidence from mechanical stretching of DNA in the absence of imposed torque points to a transition or transitions leading to further structures which are generally referred to as S-form DNA. These structures have not yet been definitively characterised due to the difficulty of carrying out atomic-resolution imaging in solution while under applied force although many computer simulation studies have been made.

Proposed S-DNA structures include those which preserve base-pair stacking and hydrogen bonding (GC-rich), while releasing extension by tilting, as well as structures in which partial melting of the base-stack takes place, while base-base association is nonetheless overall preserved (AT-rich). 

Periodic fracture of the base-pair stack with a break occurring once per three bp (therefore one out of every three bp-bp steps) has been proposed as a regular structure which preserves planarity of the base-stacking and releases the appropriate amount of extension, with the term "Σ-DNA" introduced as a mnemonic, with the three right-facing points of the Sigma character serving as a reminder of the three grouped base pairs. The Σ form has been shown to have a sequence preference for GNC motifs, which are believed under the GNC hypothesis to be of evolutionary importance.

Supercoiling and topology

Supercoiled structure of circular DNA molecules with low writhe. The helical aspect of the DNA duplex is omitted for clarity.
 
The B form of the DNA helix twists 360° per 10.4–10.5 bp in the absence of torsional strain. But many molecular biological processes can induce torsional strain. A DNA segment with excess or insufficient helical twisting is referred to, respectively, as positively or negatively supercoiled. DNA in vivo is typically negatively supercoiled, which facilitates the unwinding (melting) of the double helix required for RNA transcription.

Within the cell most DNA is topologically restricted. DNA is typically found in closed loops (such as plasmids in prokaryotes) which are topologically closed, or as very long molecules whose diffusion coefficients produce effectively topologically closed domains. Linear sections of DNA are also commonly bound to proteins or physical structures (such as membranes) to form closed topological loops. 

Francis Crick was one of the first to propose the importance of linking numbers when considering DNA supercoils. In a paper published in 1976, Crick outlined the problem as follows:
In considering supercoils formed by closed double-stranded molecules of DNA certain mathematical concepts, such as the linking number and the twist, are needed. The meaning of these for a closed ribbon is explained and also that of the writhing number of a closed curve. Some simple examples are given, some of which may be relevant to the structure of chromatin.
Analysis of DNA topology uses three values:
  • L = linking number - the number of times one DNA strand wraps around the other. It is an integer for a closed loop and constant for a closed topological domain.
  • T = twist - total number of turns in the double stranded DNA helix. This will normally tend to approach the number of turns that a topologically open double stranded DNA helix makes free in solution: number of bases/10.5, assuming there are no intercalating agents (e.g., ethidium bromide) or other elements modifying the stiffness of the DNA.
  • W = writhe - number of turns of the double stranded DNA helix around the superhelical axis
  • L = T + W and ΔL = ΔT + ΔW
Any change of T in a closed topological domain must be balanced by a change in W, and vice versa. This results in higher order structure of DNA. A circular DNA molecule with a writhe of 0 will be circular. If the twist of this molecule is subsequently increased or decreased by supercoiling then the writhe will be appropriately altered, making the molecule undergo plectonemic or toroidal superhelical coiling. 
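
A toy numerical sketch of this bookkeeping in Python (the plasmid size is hypothetical; the relaxed repeat of 10.5 bp per turn is the value used above): holding the linking number fixed while the twist relaxes forces a compensating writhe.

```python
RELAXED_REPEAT_BP = 10.5  # relaxed helical repeat used in the text above

def writhe(linking_number: float, length_bp: int) -> float:
    """W = L - T, taking T as the relaxed twist length_bp / 10.5."""
    relaxed_twist = length_bp / RELAXED_REPEAT_BP
    return linking_number - relaxed_twist

length = 4200                                   # hypothetical plasmid size, bp
relaxed_lk = round(length / RELAXED_REPEAT_BP)  # linking number of the fully relaxed circle
print(writhe(relaxed_lk, length))               # ~0: no supercoiling
print(writhe(relaxed_lk - 20, length))          # removing 20 links: writhe of about -20
```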

When the ends of a piece of double-stranded helical DNA are joined so that it forms a circle, the strands are topologically knotted. This means the single strands cannot be separated by any process that does not involve breaking a strand (such as heating). The task of un-knotting topologically linked strands of DNA falls to enzymes termed topoisomerases. These enzymes are dedicated to un-knotting circular DNA by cleaving one or both strands so that another double- or single-stranded segment can pass through. This un-knotting is required for the replication of circular DNA and for various types of recombination in linear DNA which have similar topological constraints.

The linking number paradox

For many years, the origin of residual supercoiling in eukaryotic genomes remained unclear. This topological puzzle was referred to by some as the "linking number paradox". However, when experimentally determined structures of the nucleosome displayed an over-twisted left-handed wrap of DNA around the histone octamer, this paradox was considered to be solved by the scientific community.

Scientometrics

From Wikipedia, the free encyclopedia
 
Scientometrics is the field of study which concerns itself with measuring and analysing scientific literature. Scientometrics is a sub-field of bibliometrics. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts. In practice there is a significant overlap between scientometrics and other scientific fields such as information systems, information science, science of science policy, sociology of science, and metascience. Critics have argued that over-reliance on scientometrics has created a system of perverse incentives, producing a publish or perish environment that leads to low quality research.

Historical development

Modern scientometrics is mostly based on the work of Derek J. de Solla Price and Eugene Garfield. The latter created the Science Citation Index and founded the Institute for Scientific Information, which is heavily used for scientometric analysis. A dedicated academic journal, Scientometrics, was established in 1978. The industrialization of science increased the quantity of publications and research outcomes, and the rise of computers allowed effective analysis of this data. While the sociology of science focused on the behavior of scientists, scientometrics focused on the analysis of publications. Accordingly, scientometrics is also referred to as the scientific and empirical study of science and its outcomes.

Later, around the turn of the century, evaluation and ranking of scientists and institutions came more into the spotlight. Based on bibliometric analysis of scientific publications and citations, the Academic Ranking of World Universities ("Shanghai ranking") was first published in 2004 by the Shanghai Jiao Tong University. Impact factors became an important tool to choose between different journals, and rankings such as the Academic Ranking of World Universities and the Times Higher Education World University Rankings (THE ranking) became a leading indicator of the status of universities. The h-index became an important indicator of the productivity and impact of the work of a scientist, although alternative author-level indicators have also been proposed.
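
As an example of one such author-level indicator, here is a minimal Python sketch of the h-index computation (the largest h such that the author has h papers with at least h citations each); the citation counts are invented:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that h papers have at least h citations each."""
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Invented citation counts for one author's papers
print(h_index([25, 8, 5, 4, 3, 1, 0]))  # -> 4
```
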
Around the same time, interest of governments in evaluating research for the purpose of assessing the impact of science funding increased. As the investments in scientific research were included as part of the U.S. American Recovery and Reinvestment Act of 2009 (ARRA), a major economic stimulus package, programs like STAR METRICS were set up to assess if the positive impact on the economy would actually occur.

Methods

Methods of research include qualitative, quantitative and computational approaches. The main foci of studies have been institutional productivity comparisons, institutional research rankings, journal rankings, establishing faculty productivity and tenure standards, assessing the influence of top scholarly articles, and developing profiles of top authors and institutions in terms of research performance.

One significant finding in the field is a principle of cost escalation, to the effect that achieving further findings at a given level of importance grows exponentially more costly in the expenditure of effort and resources. However, new algorithmic methods in search, machine learning and data mining are showing that this is not the case for many information retrieval and extraction-based problems. Related fields are the history of science and technology, the philosophy of science, and the sociology of scientific knowledge.

Journals in the field include Scientometrics, Journal of the American Society for Information Science and Technology, and Journal of Informetrics. The International Society for Scientometrics and Informetrics founded in 1993 is an association of professionals in the field.

Common scientometric indexes

Impact factor

The impact factor (IF) or journal impact factor (JIF) of an academic journal is a measure reflecting the yearly average number of citations to recent articles published in that journal. It is frequently used as a proxy for the relative importance of a journal within its field; journals with higher impact factors are often deemed to be more important than those with lower ones. The impact factor was devised by Eugene Garfield, the founder of the Institute for Scientific Information (ISI).
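
For concreteness, a small Python sketch of the standard two-year calculation: citations received in a census year to items the journal published in the previous two years, divided by the number of citable items published in those two years (the numbers below are invented):

```python
def impact_factor(citations_to_prev_two_years: int, citable_items_prev_two_years: int) -> float:
    """Two-year journal impact factor for a given census year."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Invented example: a journal's 2017-2018 items were cited 600 times during 2019,
# and the journal published 150 citable items over 2017-2018.
print(impact_factor(600, 150))  # -> 4.0
```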

Science Citation Index

The Science Citation Index (SCI) is a citation index originally produced by the Institute for Scientific Information (ISI) and created by Eugene Garfield. It was officially launched in 1964. It is now owned by Clarivate Analytics (previously the Intellectual Property and Science business of Thomson Reuters). The larger version (Science Citation Index Expanded) covers more than 8,500 notable and significant journals, across 150 disciplines, from 1900 to the present. These are alternatively described as the world's leading journals of science and technology, because of a rigorous selection process.

Acknowledgement index

An acknowledgement index (British English spelling) or acknowledgment index (American English spelling) is a method for indexing and analyzing acknowledgments in the scientific literature and, thus, quantifying the impact of acknowledgements. Typically, a scholarly article has a section in which the authors acknowledge entities such as funding, technical staff, colleagues, etc. that have contributed materials or knowledge or have influenced or inspired their work. Like a citation index, it measures influences on scientific work, but in a different sense: it measures institutional and economic influences as well as informal influences of individual people, ideas, and artifacts. Unlike the impact factor, it does not produce a single overall metric, but analyses the components separately. However, the total number of acknowledgements to an acknowledged entity can be measured, and so can the number of citations to the papers in which the acknowledgement appears. The ratio of this total number of citations to the total number of papers in which the acknowledged entity appears can be construed as the impact of that acknowledged entity.
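
A minimal Python sketch of the ratio described above, with invented counts: total citations to the papers acknowledging an entity, divided by the number of such papers:

```python
def acknowledgement_impact(citations_to_acknowledging_papers: int, acknowledging_papers: int) -> float:
    """Average citations per paper that acknowledges a given entity."""
    return citations_to_acknowledging_papers / acknowledging_papers

# Invented example: 20 papers acknowledge a funding body and gather 340 citations in total
print(acknowledgement_impact(340, 20))  # -> 17.0
```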

Criticisms

Critics have argued that over-reliance on scientometrics has created a system of perverse incentives, producing a publish or perish environment that leads to low quality research.

Stage theory

From Wikipedia, the free encyclopedia

Stage theories are based on the idea that elements in systems move through a pattern of distinct stages over time and that these stages can be described based on their distinguishing characteristics. Specifically, stages in cognitive development have a constant order of succession, later stages integrate the achievements of earlier stages, and each is characterized by a particular type of structure of mental processes which is specific to it. The time of appearance may vary to a certain extent depending upon environmental conditions.

"Stage theory" can also be applied beyond psychology to describe phenomena more generally where multiple phases lead to an outcome. The term "stage theory" can thus be applied to various scientific, sociological and business disciplines. In these contexts, stages may not be as rigidly defined, and it is possible for individuals within the multi-stage process to revert to earlier stages or skip some stages entirely.

Piaget's theory of cognitive development

Jean Piaget's theory consists of four stages: Sensorimotor: (birth to 2 years), Preoperations: (2 to 7 years), Concrete operations: (7 to 11 years), and Formal Operations: (11 to 16 years). Each stage has at least two substages, usually called early and fully.

Underlying assumptions

  • Each stage lays the foundation for the next.
  • Everyone goes through the stages in the same order.
  • Each stage is qualitatively different, meaning it is a change in nature, not just in quantity.
  • The child is an active learner; they have to work things out on their own rather than being told.

Sensorimotor stage (birth to 2 years)

This stage is represented when infants obtain some control over their surroundings by sensory and motor schemes.  Infants start to identify their actions and the consequences of their actions. 

A child comes into the world knowing almost nothing, but with innate potential. Infants use these potentials to explore and gain an understanding of themselves and the environment. They lack object permanence, which means they have little or no ability to conceive of things as existing outside their immediate vicinity. For example, when a barrier such as a piece of wood is placed in front of an object, an infant will believe that the object is nonexistent.

Object permanence

Study
  • Infants who have not yet grasped object permanence do not realize that an object continues to exist even when it is not visible at the moment.
  • When an object or toy is hidden from an infant they almost immediately lose interest and fail to search for the toy. This is common in infants that are eight months or younger. 
  • Children who are around eight months or older can form a mental representation of an object in their head, showing that they have attained object permanence (sensorimotor stage).

Sensorimotor play

This play does not provide a purpose other than sensation:
  • Swinging on a swing; enjoys the movement
  • Singing songs and simply playing with sounds such as 'Tra-la-la'; the sound's purpose is only the satisfaction it brings
  • Some play includes listening, tasting and smelling

Substages of Piaget's sensorimotor stage

Substage Age Piaget's Label Characteristics
1 Birth-1 month Reflexes Use of built-in schemes or reflexes such as sucking or looking; no imitation; no ability to integrate information from several senses.
2 1-4 months Primary circular reactions Accommodation of basic schemes (grasping, looking, sucking), as baby practices them endlessly. Beginning coordination of schemes from different senses, such as looking toward a sound; baby does not yet link bodily actions to some result outside the body.
3 4-8 months Secondary circular reactions Baby becomes much more aware of events outside his own body and makes them happen again, in a kind of trial-and-error learning. Imitation may occur, but only of schemes already in the baby's repertoire. Beginning understanding of the "object concept."
4 8-12 months Coordination of secondary schemes Clear intentional means-ends behavior. The baby not only goes after what she wants, but may combine two schemes to do so, such as knocking a pillow away to reach a toy. Imitation of novel behaviors occurs, as does transfer of information from one sense to another (cross-modal transfer).
5 12-18 months Tertiary circular reactions "Experimentation" begins, in which the infant tries out new ways of playing with or manipulating objects. Very active, very purposeful trial-and-error exploration.
6 18-24 months Beginning of representational thought Development of use of symbols to represent objects or events. Child understands that the symbol is separate from the object. Deferred imitation first occurs at this stage.

Preoperational stage (2 to 7 years)

Preoperational intelligence means the young child is capable of mental representations, but does not have a system for organizing this thinking (intuitive rather than logical thought). The child is egocentric, meaning they have problems distinguishing their own perceptions from the perceptions of others. A classic example is that a preoperational child will cover their eyes so they cannot see someone, and think that that person cannot see them either. 

The child also has rigid thinking, which involves the following:
  • Centration – a child will become completely fixed on one point, not allowing them to see the wider picture. For example, focusing only on the height of the container rather than both the height and width when determining what has the biggest volume.
  • State – can only concentrate on what something looks like at that time.
  • Appearance – focuses on how something appears rather than reality.
  • Lack of Reversibility – can not reverse the steps they have taken. Does not realize that one set of steps can be cancelled by another set of steps.
  • Lack of Conservation – not realizing that something can have the same properties even if it appears different.

Concrete operations (7 to 11 years)

  • Intelligence is now both symbolic and logical.
  • Acquires ‘operations’ = a set of general rules and strategies.
  • The most critical part of operations is realizing ‘reversibility’ = both physical and mental processes can be reversed and cancelled out by others.
The concrete operational child will overcome the aspects of rigidity apparent in a preoperational child. These are:
  • lack of reversibility
  • states
  • appearance
  • conservation
The tasks of concrete operations are:
  • Seriation – putting items (such as toys) in height order.
  • Classification – the difference between two similar items such as daisies and roses.
  • Conservation – realising that something can have the same properties, even if it appears different.
It is important to realise that operations and conservations do not develop at the same time. They develop gradually and are not an ‘all or nothing’ phenomenon. For example, the first to develop is number conservation followed by mass conservation, area conservation, liquid conservation and finally solid volume conservation. Thinking is not abstract. It is limited to concrete phenomena and the child’s own past experiences.

Formal operations (11 to 16 years)

  • Child is capable of formulating hypotheses and then testing them against reality.
  • Thinking is abstract, that is a child/adolescent can formulate all the possible outcomes before beginning the problem. They are also capable of deductive reasoning.

Limitations of Piaget's theory

A popular criticism is that Piaget underestimated the abilities of an infant. Studies have shown that they have more of a capacity in memory and understanding of objects than he believed. 

Neo-Piagetian and Post-Piagetian stage theories

Juan Pascual-Leone was the first to propose a neo-Piagetian stage theory. Since that time there have been several neo-Piagetian theories of cognitive development. Only the ones that cover at least infancy through adulthood are mentioned here. These include the theories of Robbie Case, Graeme Halford, Andreas Demetriou and Kurt W. Fischer. The theory of Michael Commons' model of hierarchical complexity is also relevant. The description of stages in these theories is more elaborate and focuses on underlying mechanisms of information processing rather than on reasoning as such. In fact, development in information processing capacity is invoked to explain the development of reasoning. More stages are described (as many as 15 stages), with 4 being added beyond the stage of Formal operations. Most stage sequences map onto one another. Post-Piagetian stages are free of content and context and are therefore very powerful and general.


Cognitive development

From Wikipedia, the free encyclopedia
 
Cognitive development is a field of study in neuroscience and psychology focusing on a child's development in terms of information processing, conceptual resources, perceptual skill, language learning, and other aspects of the developed adult brain and cognitive psychology. Qualitative differences between how a child processes their waking experience and how an adult processes their waking experience are acknowledged (such as object permanence, the understanding of logical relations, and cause-effect reasoning in school-age children). Cognitive development is defined in adult terms as the emergence of the ability to consciously cognize and to consciously understand and articulate that understanding. From an adult point of view, cognitive development can also be called intellectual development. Cognitive development is how a person perceives, thinks, and gains understanding of their world through the interaction of genetic and learning factors. There are four stages to cognitive information development: reasoning, intelligence, language, and memory. These stages start when the baby is about 18 months old; as they play with toys, listen to their parents speak, and watch TV, anything that catches their attention helps build their cognitive development.

Jean Piaget was a major force establishing this field, forming his "theory of cognitive development". Piaget proposed four stages of cognitive development: the sensorimotor, preoperational, concrete operational and formal operational periods. Many of Piaget's theoretical claims have since fallen out of favor. Still, his description of the most prominent changes in cognition with age is generally still accepted today (e.g., how early perception moves from being dependent on concrete, external actions to an abstract understanding of observable aspects of reality, and then to the discovery of underlying abstract rules and principles, usually starting in adolescence).

In recent years, however, alternative models have been advanced, including information-processing theory and neo-Piagetian theories of cognitive development, which aim to integrate Piaget's ideas with more recent models and concepts in developmental and cognitive science, theoretical cognitive neuroscience, and social-constructivist approaches. A major controversy in cognitive development has been "nature versus nurture", that is, the question of whether cognitive development is mainly determined by an individual's innate qualities ("nature") or by their personal experiences ("nurture"). However, it is now recognized by most experts that this is a false dichotomy: there is overwhelming evidence from biological and behavioral sciences that from the earliest points in development, gene activity interacts with events and experiences in the environment.

Historical origins: The history and theory of cognitive development

Jean Piaget is inextricably linked to cognitive development. It is clear in Piaget's writings that there are influences from many historical predecessors. A few that are worth mentioning are included in the following Historical Origins chart. It is intended to be a more inclusive list of researchers who have studied the processes of acquiring more complex ways of thinking as people grow and develop: 


Researcher DOB/death Contribution to cognitive development
Jean-Jacques Rousseau 1712–1778 Wrote Emile, or On Education (1762). He discusses childhood development as happening in three stages. First stage, up to age 12, the child is guided by their emotions and impulses. The second stage, ages 12–16, the child's reason starts to develop. In the third and final stage, age 16 and up, the child develops into an adult.
James Sully 1842–1923 Wrote several books on childhood development, including Studies of Childhood (1895) and Children's Ways[6] (1897). He used a detailed observational study method with the children. Contemporary research in child development actually repeats observations, and observational methods, summarized by Sully in Studies of Childhood, such as the mirror technique.
Lev Vygotsky 1896–1934 Area of specialty was developmental psychology. Main contribution is the somewhat controversial "zone of proximal development" (ZPD) which states that play should be children's main activity as this is their main source of development in terms of emotional, volitional, and cognitive development. ZPD is the link between children's learning and cognitive development.
Maria Montessori 1870–1952 She began her career working with mentally disabled children in 1897, then conducted observation and experimental research in elementary schools. Wrote The Discovery of the Child (1948). Discussed the Four Planes of Development: birth–6, 6–12, 12–18, and 18–24. The Montessori Method now has three developmentally-meaningful age groups: 2–2.5, 2.5–6, and 6–12. She was working on human behavior in older children but only published lecture notes on the subject.
Jean Piaget 1896–1980 Piaget was the first psychologist and philosopher to brand this type of study as "cognitive development". Other researchers, in multiple disciplines, had studied development in children before, but Piaget is often credited as being the first one to make a systematic study of cognitive development and gave it its name. His main contribution is the stage theory of child cognitive development. He also published his observational studies of cognition in children, and created a series of simple tests to reveal different cognitive abilities in children.
Lawrence Kohlberg 1927–1987 Wrote the theory of stages of moral development, which extended Piaget's findings of cognitive development and showed that they continue through the lifespan. Kohlberg's six stages follow Piaget's constructivist requirements in that stages can not be skipped and it is very rare to regress in stages. Notable works: Moral Stages and Moralization: The Cognitive-Development Approach[8] (1976) and Essays on Moral Development (1981)

Piaget's theory of cognitive development

Jean Piaget (1896–1980) believed that people move through stages of development that allow them to think in new, more complex ways.

Sensorimotor stage

The first stage in Piaget's stages of cognitive development is the sensorimotor stage. This stage lasts from birth to two years old. During this stage, behaviors lack a sense of thought and logic. Behaviors gradually move from acting upon inherited reflexes to interacting with the environment with a goal in mind and being able to represent the external world at the end.

The sensorimotor stage has been broken down into six sub stages that explain the gradual development of infants from birth to age 2. Once the child gains the ability to mentally represent reality, the child begins the transition to the preoperational stage of development.

Birth to one month

Each child is born with inherited reflexes that they use to gain knowledge and understanding about their environment. Examples of these reflexes include grasping and sucking.

1–4 months

Children repeat behaviors that happen unexpectedly because of their reflexes. For example, a child's finger comes in contact with the mouth and the child starts sucking on it. If the sensation is pleasurable to the child, then the child will attempt to recreate the behavior. Infants use their initial reflexes (grasping and sucking) to explore their environment and create schemes. Schemes are groups of similar actions or thoughts that are used repeatedly in response to the environment. Once a child begins to create schemes they use accommodation and assimilation to become progressively adapted to the world. Assimilation is when a child responds to a new event in a way that is consistent with an existing schema. For example, an infant may assimilate a new teddy bear into their putting things in their mouth scheme and use their reflexes to make the teddy bear go into their mouth. Accommodation is when a child either modifies an existing scheme or forms an entirely new schema to deal with a new object or event. For example, an infant may have to open his or her mouth wider than usual to accommodate the teddy bear's paw.

5–8 months

The child has an experience with an external stimulus that they find pleasurable, so they try to recreate that experience. For example, a child accidentally hits the mobile above the crib and likes to watch it spin. When it stops the child begins to grab at the object to make it spin again. In this stage, habits are formed from general schemes that the infant has created, but there is not yet, from the child's point of view, any differentiation between means and ends. Children also cannot focus on multiple tasks at once, and only focus on the task at hand. The child may create a habit of spinning the mobile in its crib, but they are still trying to find methods to reach the mobile in order to get it to spin in the way that they find pleasurable. Once there is another distraction (say the parent walks in the room) the baby will no longer focus on the mobile. Toys that respond to a child's actions should be given to infants to help foster their investigative instincts. For example, a toy plays a song when you push one button, and then a picture pops up if you push another button.

8–12 months

Behaviors will be displayed for a reason rather than by chance. They begin to understand that one action can cause a reaction. They also begin to understand object permanence, which is the realization that objects continue to exist when removed from view. For example: The baby wants a rattle but the blanket is in the way. The baby moves the blanket to get the rattle. Now that the infant can understand that the object still exists, they can differentiate between the object, and the experience of the object. According to psychologist David Elkind, "An internal representation of the absent object is the earliest manifestation of the symbolic function which develops gradually during the second year of life whose activities dominate the next stage of mental growth."

12–18 months

Actions occur deliberately with some variation. For example, a baby drums on a pot with a wooden spoon, then drums on the floor, then on the table.

18–24 months

Children begin to build mental symbols and start to participate in pretend play. For example, a child is mixing ingredients together but doesn't have a spoon, so they pretend to use one or use another object to replace the spoon. Symbolic thought is a representation of objects and events as mental entities or symbols, which helps foster cognitive development and the formation of imagination. According to Piaget, the infant begins to act upon intelligence rather than habit at this point. The end product is established after the infant has pursued the appropriate means. The means are formed from the schemes that are known by the child. The child is starting to learn how to use what it has learned in the first two years to develop and further explore their environment.

Preoperational stage

Lasts from 2 years of age until 6 or 7. It can be characterized in two somewhat different ways. In his early work, before he had developed his structuralist theory of cognition, Piaget described the child's thoughts during this period as being governed by principles such as egocentrism, animism and other similar constructs. Egocentrism is when a child can only see a certain situation his or her own way. One cannot comprehend that other people have other views and perceptions of scenarios. Animism is when an individual gives a lifeless object human-like qualities. An individual usually believes that this object has human emotions, thoughts and intentions. Once he had proposed his structuralist theory, Piaget characterized the preoperational child as lacking the cognitive structures possessed by the concrete operational child. The absence of these structures explains, in part, the behaviors Piaget had previously described as egocentric and animistic, for example, an inability to comprehend that another individual may have different emotional responses to similar experiences. During this stage children also become increasingly adept at using symbols as evidenced by the increase in playing and pretending.

Concrete operational stage

Lasts from 6 or 7 years until about 12 or 13. During this stage, the child's cognitive structures can be characterized by reality. Piaget argues that the same general principles can be discerned in a wide range of behaviors. One of the best-known achievements of this stage is conservation. In a typical conservation experiment a child is asked to judge whether or not two quantities are the same – such as two equal quantities of liquid in a short and tall glass. A preoperational child will typically judge the taller, thinner glass to contain more, while a concrete operational child will judge the amounts still to be the same. The ability to reason in this way reflects the development of a principle of conservation.

Formal operational stage

This stage lasts from 12 or 13 until adulthood, when people are advancing from logical reasoning with concrete examples to abstract examples. The need for concrete examples is no longer necessary because abstract thinking can be used instead. In this stage adolescents are also able to view themselves in the future and can picture the ideal life they would like to pursue. Some theorists believe the formal operational stage can be divided into two sub-categories: early formal operational and late formal operation thought. Early formal operational thoughts may be just fantasies, but as adolescents advance to late formal operational thought the life experiences they have encountered changes those fantasy thoughts to realistic thoughts.

Criticism

Many of Piaget's claims have fallen out of favor. For example, he claimed that young children cannot conserve numbers. However, further experiments showed that children did not really understand what was being asked of them. When the experiment is done with candies, and the children are asked which set they want rather than having to tell an adult which is more, they show no confusion about which group has more items.

Piaget's theory of cognitive development ends at the formal operational stage that is usually developed in early adulthood. It does not take into account later stages of adult cognitive development as described by, for example, Harvard University professor Robert Kegan.

Other theoretical perspectives on cognitive development

Lev Vygotsky's theory

Lev Vygotsky's (1896-1934) theory is based on social learning as the most important aspect of cognitive development. In Vygotsky's theory, adults are very important for young children's development. They help children learn through mediation, which is modeling and explaining concepts. Together, adults and children master concepts of their culture and activities. Vygotsky believed we acquire our complex mental activities through social learning. A significant part of Vygotsky's theory is based on the zone of proximal development, which he believes is where the most effective learning takes place. The zone of proximal development is what a child cannot accomplish alone but can accomplish with the help of an MKO (more knowledgeable other). Vygotsky also believed culture is a very important part of cognitive development, such as the language, writing and counting system used in that culture. Another aspect of Vygotsky's theory is private speech, which is when a person talks to themselves in order to help themselves problem solve. Scaffolding, or providing support to a child and then slowly removing support and allowing the child to do more on their own over time, is also an aspect of Vygotsky's theory.

Speculated core systems of cognition

Empiricists study how these skills may be learned in such a short time. The debate is over whether these systems are learned by general-purpose learning devices, or domain-specific cognition. Moreover, many modern cognitive developmental psychologists, recognizing that the term "innate" does not square with modern knowledge about epigenesis, neurobiological development, or learning, favor a non-nativist framework. Researchers who discuss "core systems" often speculate about differences in thinking and learning between proposed domains.

Researchers who posit a set of so-called "core domains" suggest that children have an innate sensitivity to specific kinds of patterns of information. Those commonly cited include:

Number

Infants appear to have two systems for dealing with numbers. One deals with small numbers, often called subitizing. Another deals with larger numbers in an approximate fashion.

Space

Very young children appear to have some skill in navigation. This basic ability to infer the direction and distance of unseen locations develops in ways that are not entirely clear. However, there is some evidence that it involves the development of complex language skills between 3 and 5 years. Also, there is evidence that this skill depends importantly on visual experience, because congenitally blind individuals have been found to have impaired abilities to infer new paths between familiar locations.

Visual perception

One of the original nativist versus empiricist debates was over depth perception. There is some evidence that children less than 72 hours old can perceive such complex things as biological motion. However, it is unclear how visual experience in the first few days contributes to this perception. There are far more elaborate aspects of visual perception that develop during infancy and beyond.

Essentialism

Young children seem to be predisposed to think of biological entities (e.g., animals and plants) in an essentialistic way. This means that they expect such entities (as opposed to, e.g., artifacts) to have many traits such as internal properties that are caused by some "essence" (such as, in our modern Western conceptual framework, the genome).

Language acquisition

A major, well-studied process and consequence of cognitive development is language acquisition. The traditional view was that this is the result of deterministic, human-specific genetic structures and processes. Other traditions, however, have emphasized the role of social experience in language learning. However, the relation of gene activity, experience, and language development is now recognized as incredibly complex and difficult to specify. Language development is sometimes separated into learning of phonology (systematic organization of sounds), morphology (structure of linguistic units—root words, affixes, parts of speech, intonation, etc.), syntax (rules of grammar within sentence structure), semantics (study of meaning), and discourse or pragmatics (relation between sentences). However, all of these aspects of language knowledge—which were originally posited by the linguist Noam Chomsky to be autonomous or separate—are now recognized to interact in complex ways.

Bilingualism

It wasn't until recently that bilingualism was accepted as a contributing factor to cognitive development. There have been a number of studies showing how bilingualism contributes to the executive function of the brain, which is the main center at which cognitive development happens. According to Bialystok in "Bilingualism and the Development of Executive Function: The Role of Attention", children who are bilingual have to actively filter through the two different languages to select the one they need to use, which in turn strengthens development in that center.

Whorf's hypothesis

Benjamin Whorf (1897–1941), while working as a student of Edward Sapir, posited that a person's thinking depends on the structure and content of their social group's language. In other words, it is the belief that language determines our thoughts and perceptions. For example, it used to be thought that the Greeks, who wrote left to right, thought differently than Egyptians since the Egyptians wrote right to left. Whorf's theory was so strict that he believed if a word is absent in a language, then the individual is unaware of the object's existence. This theory was played out in George Orwell's book, Animal Farm; the pig leaders slowly eliminated words from the citizen's vocabulary so that they were incapable of realizing what they were missing. The Whorfian hypothesis failed to recognize that people can still be aware of the concept or item, even though they lack efficient coding to quickly identify the target information.

Quine's bootstrapping hypothesis

Willard Van Orman Quine (1908–2000) argued that there are innate conceptual biases that enable the acquisition of language, concepts, and beliefs. Quine's theory follows nativist philosophical traditions, such as those of the European rationalist philosophers, for example Immanuel Kant.

Neo-Piagetian theories of cognitive development

Neo-Piagetian theories of cognitive development emphasize the role of information-processing mechanisms, such as attention control and working memory. They suggest that progression through Piagetian stages, or through other levels of cognitive development, is a function of the strengthening of control mechanisms and the enhancement of working-memory storage capacity.

Lev Vygotsky vs. Jean Piaget

Unlike Jean Piaget, who believed that development comes before learning, Vygotsky believed that learning comes before development and that one must learn first in order to develop into a functioning human being. Vygotsky's theory differs from Piaget's theory of cognitive development in four ways:

1. Vygotsky gives culture a larger role. Piaget held that cognitive development is the same across the world, while Vygotsky argued that culture makes cognitive development differ.
2. Social factors heavily influence cognitive development in Vygotsky's view. The environment and the parents a child has play a large role; the child learns within the zone of proximal development with help from their parents.
3. Vygotsky holds that language is central to cognitive development. While Piaget treats thought as primary, Vygotsky sees thought and language as initially separate but eventually coming together, and he emphasizes the role of inner speech in driving cognitive development.
4. Cognitive development is strongly influenced by adults. Children observe the adults in their lives and, through mediation and scaffolding, gain knowledge about their specific culture based on what those adults do.

Neuroscience

During development, especially the first few years of life, children show striking patterns of neural development and a high degree of neuroplasticity. As explained by the World Health Organization, neuroplasticity can be summed up in three points:

1. Any adaptive mechanism used by the nervous system to repair itself after injury.
2. Any means by which the nervous system can repair individually damaged central circuits.
3. Any means by which the central nervous system can adapt its capacity to new physiological conditions and environments.

The relation of brain development to cognitive development is extremely complex and, since the 1990s, has been a growing area of research.

Cognitive development and motor development may also be closely interrelated. When a person experiences a neurodevelopmental disorder and their cognitive development is disturbed, adverse effects are often seen in motor development as well. The cerebellum, the part of the brain most responsible for motor skills, has been shown to play a significant role in cognitive functions, just as the prefrontal cortex is important not only for cognitive abilities but also for the development of motor skills. Supporting this, functional neuroimaging shows close co-activation of the neocerebellum and the dorsolateral prefrontal cortex, and abnormalities are seen in both the cerebellum and the prefrontal cortex in the same developmental disorders. In this way, motor development and cognitive development are closely interrelated, and neither can operate at full capacity when the other is impaired or delayed.

Cultural influences

From cultural psychologists' view, minds and culture shape each other. In other words, culture can influence brain structures, which in turn influence our interpretation of the culture. The following examples reveal cultural variations in neural responses:

Figure-line task (Hedden et al., 2008)

Behavioral research has shown that one's strength in independent or interdependent tasks differs based on cultural context. In general, East Asian cultures are more interdependent whereas Western cultures are more independent. Hedden et al. assessed functional magnetic resonance imaging (fMRI) responses of East Asians and Americans while they performed independent (absolute) or interdependent (relative) tasks. The study showed that participants used regions of the brain associated with attentional control when they had to perform culturally incongruent tasks. In other words, the neural paths used for the same task differed between Americans and East Asians (Hedden et al., 2008).

Transcultural neuroimaging studies (Han S. and Northoff G., 2008)

Transcultural neuroimaging studies have demonstrated that one's cultural background can influence the neural activity that underlies both high-level (for example, social cognition) and low-level (for example, perception) cognitive functions. These studies show that groups from different cultures, or groups exposed to culturally different stimuli, differ in neural activity. For example, people with different cultural backgrounds showed differences in premotor cortex activity during mental calculation and in ventromedial prefrontal cortex (VMPFC) activity during trait judgements of one's mother. In conclusion, since differences were found in both high-level and low-level cognition, one can assume that the brain's activity is strongly and, at least in part, constitutionally shaped by its sociocultural context (Han S. and Northoff G., 2008).

Kobayashi et al., 2007

Kobayashi et al. compared American-English monolingual and Japanese-English bilingual children's brain responses while they inferred others' intentions through false-belief story and cartoon tasks. They found universal activation of the bilateral ventromedial prefrontal cortex in theory-of-mind tasks. However, American children showed greater activity in the left inferior frontal gyrus during the tasks, whereas Japanese children showed greater activity in the right inferior frontal gyrus during the Japanese theory-of-mind tasks. In conclusion, these examples suggest that the brain's neural activities are not universal but are culture-dependent.
