
String theory

In physics, string theory is a theoretical framework in which the point-like particles of particle physics are replaced by one-dimensional objects called strings. It describes how these strings propagate through space and interact with each other. On distance scales larger than the string scale, a string looks just like an ordinary particle, with its mass, charge, and other properties determined by the vibrational state of the string. In string theory, one of the many vibrational states of the string corresponds to the graviton, a quantum mechanical particle that carries the gravitational force. Thus string theory is a theory of quantum gravity.
 
String theory is a broad and varied subject that attempts to address a number of deep questions of fundamental physics. String theory has been applied to a variety of problems in black hole physics, early universe cosmology, nuclear physics, and condensed matter physics, and it has stimulated a number of major developments in pure mathematics. Because string theory potentially provides a unified description of gravity and particle physics, it is a candidate for a theory of everything, a self-contained mathematical model that describes all fundamental forces and forms of matter. Despite much work on these problems, it is not known to what extent string theory describes the real world or how much freedom the theory allows in the choice of its details.

String theory was first studied in the late 1960s as a theory of the strong nuclear force, before being abandoned in favor of quantum chromodynamics. Subsequently, it was realized that the very properties that made string theory unsuitable as a theory of nuclear physics made it a promising candidate for a quantum theory of gravity. The earliest version of string theory, bosonic string theory, incorporated only the class of particles known as bosons. It later developed into superstring theory, which posits a connection called supersymmetry between bosons and the class of particles called fermions. Five consistent versions of superstring theory were developed before it was conjectured in the mid-1990s that they were all different limiting cases of a single theory in eleven dimensions known as M-theory. In late 1997, theorists discovered an important relationship called the AdS/CFT correspondence, which relates string theory to another type of physical theory called a quantum field theory.

One of the challenges of string theory is that the full theory does not have a satisfactory definition in all circumstances. Another issue is that the theory is thought to describe an enormous landscape of possible universes, and this has complicated efforts to develop theories of particle physics based on string theory. These issues have led some in the community to criticize these approaches to physics and question the value of continued research on string theory unification.

Fundamentals

A wavy open segment and closed loop of string.
The fundamental objects of string theory are open and closed strings.

In the twentieth century, two theoretical frameworks emerged for formulating the laws of physics. The first is Albert Einstein's general theory of relativity, a theory that explains the force of gravity and the structure of space and time. The other is quantum mechanics, a completely different formulation that uses probability principles to describe physical phenomena. By the late 1970s, these two frameworks had proven to be sufficient to explain most of the observed features of the universe, from elementary particles to atoms to the evolution of stars and the universe as a whole.

In spite of these successes, there are still many problems that remain to be solved. One of the deepest problems in modern physics is the problem of quantum gravity. The general theory of relativity is formulated within the framework of classical physics, whereas the other fundamental forces are described within the framework of quantum mechanics. A quantum theory of gravity is needed in order to reconcile general relativity with the principles of quantum mechanics, but difficulties arise when one attempts to apply the usual prescriptions of quantum theory to the force of gravity. In addition to the problem of developing a consistent theory of quantum gravity, there are many other fundamental problems in the physics of atomic nuclei, black holes, and the early universe.

String theory is a theoretical framework that attempts to address these questions and many others. The starting point for string theory is the idea that the point-like particles of particle physics can also be modeled as one-dimensional objects called strings. String theory describes how strings propagate through space and interact with each other. In a given version of string theory, there is only one kind of string, which may look like a small loop or segment of ordinary string, and it can vibrate in different ways. On distance scales larger than the string scale, a string will look just like an ordinary particle, with its mass, charge, and other properties determined by the vibrational state of the string. In this way, all of the different elementary particles may be viewed as vibrating strings. In string theory, one of the vibrational states of the string gives rise to the graviton, a quantum mechanical particle that carries the gravitational force. Thus string theory is a theory of quantum gravity.

One of the main developments of the past several decades in string theory was the discovery of certain "dualities", mathematical transformations that identify one physical theory with another. Physicists studying string theory have discovered a number of these dualities between different versions of string theory, and this has led to the conjecture that all consistent versions of string theory are subsumed in a single framework known as M-theory.

Studies of string theory have also yielded a number of results on the nature of black holes and the gravitational interaction. There are certain paradoxes that arise when one attempts to understand the quantum aspects of black holes, and work on string theory has attempted to clarify these issues. In late 1997 this line of work culminated in the discovery of the anti-de Sitter/conformal field theory correspondence or AdS/CFT. This is a theoretical result which relates string theory to other physical theories which are better understood theoretically. The AdS/CFT correspondence has implications for the study of black holes and quantum gravity, and it has been applied to other subjects, including nuclear and condensed matter physics.

Since string theory incorporates all of the fundamental interactions, including gravity, many physicists hope that it fully describes our universe, making it a theory of everything. One of the goals of current research in string theory is to find a solution of the theory that reproduces the observed spectrum of elementary particles, with a small cosmological constant, containing dark matter and a plausible mechanism for cosmic inflation. While there has been progress toward these goals, it is not known to what extent string theory describes the real world or how much freedom the theory allows in the choice of details.

One of the challenges of string theory is that the full theory does not have a satisfactory definition in all circumstances. The scattering of strings is most straightforwardly defined using the techniques of perturbation theory, but it is not known in general how to define string theory nonperturbatively. It is also not clear whether there is any principle by which string theory selects its vacuum state, the physical state that determines the properties of our universe. These problems have led some in the community to criticize these approaches to the unification of physics and question the value of continued research on these problems.

Strings

Interaction in the quantum world: worldlines of point-like particles or a worldsheet swept up by closed strings in string theory.
 
The application of quantum mechanics to physical objects such as the electromagnetic field, which are extended in space and time, is known as quantum field theory. In particle physics, quantum field theories form the basis for our understanding of elementary particles, which are modeled as excitations in the fundamental fields.

In quantum field theory, one typically computes the probabilities of various physical events using the techniques of perturbation theory. Developed by Richard Feynman and others in the first half of the twentieth century, perturbative quantum field theory uses special diagrams called Feynman diagrams to organize computations. One imagines that these diagrams depict the paths of point-like particles and their interactions.

The starting point for string theory is the idea that the point-like particles of quantum field theory can also be modeled as one-dimensional objects called strings. The interaction of strings is most straightforwardly defined by generalizing the perturbation theory used in ordinary quantum field theory. At the level of Feynman diagrams, this means replacing the one-dimensional diagram representing the path of a point particle by a two-dimensional surface representing the motion of a string. Unlike in quantum field theory, string theory does not have a full non-perturbative definition, so many of the theoretical questions that physicists would like to answer remain out of reach.

In theories of particle physics based on string theory, the characteristic length scale of strings is assumed to be on the order of the Planck length, or 10⁻³⁵ meters, the scale at which the effects of quantum gravity are believed to become significant. On much larger length scales, such as the scales visible in physics laboratories, such objects would be indistinguishable from zero-dimensional point particles, and the vibrational state of the string would determine the type of particle. One of the vibrational states of a string corresponds to the graviton, a quantum mechanical particle that carries the gravitational force.
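
For concreteness, the Planck length is the unique combination of the reduced Planck constant ħ, Newton's constant G, and the speed of light c with units of length; a standard dimensional-analysis estimate (not specific to string theory) gives

    ℓ_P = √(ħG/c³) ≈ 1.6 × 10⁻³⁵ meters.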

The original version of string theory was bosonic string theory, but this version described only bosons, a class of particles which transmit forces between the matter particles, or fermions. Bosonic string theory was eventually superseded by theories called superstring theories. These theories describe both bosons and fermions, and they incorporate a theoretical idea called supersymmetry. This is a mathematical relation that exists in certain physical theories between the bosons and fermions. In theories with supersymmetry, each boson has a counterpart which is a fermion, and vice versa.

There are several versions of superstring theory: type I, type IIA, type IIB, and two flavors of heterotic string theory (SO(32) and E8×E8). The different theories allow different types of strings, and the particles that arise at low energies exhibit different symmetries. For example, the type I theory includes both open strings (which are segments with endpoints) and closed strings (which form closed loops), while types IIA, IIB and heterotic include only closed strings.

Extra dimensions

A tubular surface and corresponding one-dimensional curve.
An example of compactification: At large distances, a two-dimensional surface with one circular dimension looks one-dimensional.
 
In everyday life, there are three familiar dimensions of space: height, width and length. Einstein's general theory of relativity treats time as a dimension on par with the three spatial dimensions; in general relativity, space and time are not modeled as separate entities but are instead unified to a four-dimensional spacetime. In this framework, the phenomenon of gravity is viewed as a consequence of the geometry of spacetime.

In spite of the fact that the universe is well described by four-dimensional spacetime, there are several reasons why physicists consider theories in other dimensions. In some cases, by modeling spacetime in a different number of dimensions, a theory becomes more mathematically tractable, and one can perform calculations and gain general insights more easily. There are also situations where theories in two or three spacetime dimensions are useful for describing phenomena in condensed matter physics. Finally, there exist scenarios in which there could actually be more than four dimensions of spacetime which have nonetheless managed to escape detection.

One notable feature of string theories is that these theories require extra dimensions of spacetime for their mathematical consistency. In bosonic string theory, spacetime is 26-dimensional, while in superstring theory it is 10-dimensional, and in M-theory it is 11-dimensional. In order to describe real physical phenomena using string theory, one must therefore imagine scenarios in which these extra dimensions would not be observed in experiments.

Visualization of a complex mathematical surface with many convolutions and self intersections.
A cross section of a quintic Calabi–Yau manifold

Compactification is one way of modifying the number of dimensions in a physical theory. In compactification, some of the extra dimensions are assumed to "close up" on themselves to form circles. In the limit where these curled up dimensions become very small, one obtains a theory in which spacetime has effectively a lower number of dimensions. A standard analogy for this is to consider a multidimensional object such as a garden hose. If the hose is viewed from a sufficient distance, it appears to have only one dimension, its length. However, as one approaches the hose, one discovers that it contains a second dimension, its circumference. Thus, an ant crawling on the surface of the hose would move in two dimensions.
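
The garden hose picture can be made quantitative with a standard Kaluza–Klein sketch (in units where ħ = c = 1). A field on a circle of radius R must be periodic, so it decomposes into Fourier modes, and from the lower-dimensional point of view the n-th mode behaves as a particle of mass

    m_n² = m₀² + n²/R²,   n = 0, ±1, ±2, …

As R shrinks, every mode except n = 0 becomes extremely heavy, which is why a sufficiently small extra dimension escapes detection at the energies accessible to experiments.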

Compactification can be used to construct models in which spacetime is effectively four-dimensional. However, not every way of compactifying the extra dimensions produces a model with the right properties to describe nature. In a viable model of particle physics, the compact extra dimensions must be shaped like a Calabi–Yau manifold. A Calabi–Yau manifold is a special space which is typically taken to be six-dimensional in applications to string theory. It is named after mathematicians Eugenio Calabi and Shing-Tung Yau.

Another approach to reducing the number of dimensions is the so-called brane-world scenario. In this approach, physicists assume that the observable universe is a four-dimensional subspace of a higher dimensional space. In such models, the force-carrying bosons of particle physics arise from open strings with endpoints attached to the four-dimensional subspace, while gravity arises from closed strings propagating through the larger ambient space. This idea plays an important role in attempts to develop models of real world physics based on string theory, and it provides a natural explanation for the weakness of gravity compared to the other fundamental forces.

Dualities

A diagram indicating the relationships between M-theory and the five superstring theories.
A diagram of string theory dualities. Yellow arrows indicate S-duality. Blue arrows indicate T-duality.

One notable fact about string theory is that the different versions of the theory all turn out to be related in highly nontrivial ways. One of the relationships that can exist between different string theories is called S-duality. This is a relationship which says that a collection of strongly interacting particles in one theory can, in some cases, be viewed as a collection of weakly interacting particles in a completely different theory. Roughly speaking, a collection of particles is said to be strongly interacting if they combine and decay often and weakly interacting if they do so infrequently. Type I string theory turns out to be equivalent by S-duality to the SO(32) heterotic string theory. Similarly, type IIB string theory is related to itself in a nontrivial way by S-duality.

Another relationship between different string theories is T-duality. Here one considers strings propagating around a circular extra dimension. T-duality states that a string propagating around a circle of radius R is equivalent to a string propagating around a circle of radius 1/R in the sense that all observable quantities in one description are identified with quantities in the dual description. For example, a string has momentum as it propagates around a circle, and it can also wind around the circle one or more times. The number of times the string winds around a circle is called the winding number. If a string has momentum p and winding number n in one description, it will have momentum n and winding number p in the dual description. For example, type IIA string theory is equivalent to type IIB string theory via T-duality, and the two versions of heterotic string theory are also related by T-duality.
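
A minimal numerical sketch of this statement (illustrative, not from the text above), in string units where the string length is set to one so that the dual radius is 1/R as stated; only the momentum and winding contributions to the closed-string mass squared are kept, with oscillator terms omitted:

    # Momentum/winding part of the closed-string mass squared on a circle,
    # in string units: M^2 = (n/R)^2 + (w*R)^2 + oscillator terms (omitted).
    def mass_squared(n, w, R):
        return (n / R) ** 2 + (w * R) ** 2

    # T-duality check: exchanging momentum and winding while sending
    # R -> 1/R leaves every mass in the spectrum unchanged.
    R = 3.7
    for n in range(-2, 3):
        for w in range(-2, 3):
            assert abs(mass_squared(n, w, R) - mass_squared(w, n, 1 / R)) < 1e-12
    print("spectra agree under R -> 1/R with momentum and winding exchanged")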

In general, the term duality refers to a situation where two seemingly different physical systems turn out to be equivalent in a nontrivial way. Two theories related by a duality need not be string theories. For example, Montonen–Olive duality is an example of an S-duality relationship between quantum field theories. The AdS/CFT correspondence is an example of a duality which relates string theory to a quantum field theory. If two theories are related by a duality, it means that one theory can be transformed in some way so that it ends up looking just like the other theory. The two theories are then said to be dual to one another under the transformation. Put differently, the two theories are mathematically different descriptions of the same phenomena.

Branes

A pair of surfaces joined by wavy line segments.
Open strings attached to a pair of D-branes.
 
In string theory and other related theories, a brane is a physical object that generalizes the notion of a point particle to higher dimensions. For instance, a point particle can be viewed as a brane of dimension zero, while a string can be viewed as a brane of dimension one. It is also possible to consider higher-dimensional branes. In dimension p, these are called p-branes. The word brane comes from the word "membrane" which refers to a two-dimensional brane.

Branes are dynamical objects which can propagate through spacetime according to the rules of quantum mechanics. They have mass and can have other attributes such as charge. A p-brane sweeps out a (p+1)-dimensional volume in spacetime called its worldvolume. Physicists often study fields analogous to the electromagnetic field which live on the worldvolume of a brane.

In string theory, D-branes are an important class of branes that arise when one considers open strings. As an open string propagates through spacetime, its endpoints are required to lie on a D-brane. The letter "D" in D-brane refers to a certain mathematical condition on the system known as the Dirichlet boundary condition. The study of D-branes in string theory has led to important results such as the AdS/CFT correspondence, which has shed light on many problems in quantum field theory.

Branes are frequently studied from a purely mathematical point of view, and they are described as objects of certain categories, such as the derived category of coherent sheaves on a complex algebraic variety, or the Fukaya category of a symplectic manifold. The connection between the physical notion of a brane and the mathematical notion of a category has led to important mathematical insights in the fields of algebraic and symplectic geometry and representation theory.

M-theory

Prior to 1995, theorists believed that there were five consistent versions of superstring theory (type I, type IIA, type IIB, and two versions of heterotic string theory). This understanding changed in 1995 when Edward Witten suggested that the five theories were just special limiting cases of an eleven-dimensional theory called M-theory. Witten's conjecture was based on the work of a number of other physicists, including Ashoke Sen, Chris Hull, Paul Townsend, and Michael Duff. His announcement led to a flurry of research activity now known as the second superstring revolution.

Unification of superstring theories

A star-shaped diagram with the various limits of M-theory labeled at its six vertices.
A schematic illustration of the relationship between M-theory, the five superstring theories, and eleven-dimensional supergravity. The shaded region represents a family of different physical scenarios that are possible in M-theory. In certain limiting cases corresponding to the cusps, it is natural to describe the physics using one of the six theories labeled there.
 
In the 1970s, many physicists became interested in supergravity theories, which combine general relativity with supersymmetry. Whereas general relativity makes sense in any number of dimensions, supergravity places an upper limit on the number of dimensions. In 1978, work by Werner Nahm showed that the maximum spacetime dimension in which one can formulate a consistent supersymmetric theory is eleven. In the same year, Eugene Cremmer, Bernard Julia, and Joel Scherk of the École Normale Supérieure showed that supergravity not only permits up to eleven dimensions but is in fact most elegant in this maximal number of dimensions.

Initially, many physicists hoped that by compactifying eleven-dimensional supergravity, it might be possible to construct realistic models of our four-dimensional world. The hope was that such models would provide a unified description of the four fundamental forces of nature: electromagnetism, the strong and weak nuclear forces, and gravity. Interest in eleven-dimensional supergravity soon waned as various flaws in this scheme were discovered. One of the problems was that the laws of physics appear to distinguish between clockwise and counterclockwise, a phenomenon known as chirality. Edward Witten and others observed that this chirality property cannot be readily derived by compactifying from eleven dimensions.

In the first superstring revolution in 1984, many physicists turned to string theory as a unified theory of particle physics and quantum gravity. Unlike supergravity theory, string theory was able to accommodate the chirality of the standard model, and it provided a theory of gravity consistent with quantum effects. Another feature of string theory that many physicists were drawn to in the 1980s and 1990s was its high degree of uniqueness. In ordinary particle theories, one can consider any collection of elementary particles whose classical behavior is described by an arbitrary Lagrangian. In string theory, the possibilities are much more constrained: by the 1990s, physicists had argued that there were only five consistent supersymmetric versions of the theory.

Although there were only a handful of consistent superstring theories, it remained a mystery why there was not just one consistent formulation. However, as physicists began to examine string theory more closely, they realized that these theories are related in intricate and nontrivial ways. They found that a system of strongly interacting strings can, in some cases, be viewed as a system of weakly interacting strings. This phenomenon is known as S-duality. It was studied by Ashoke Sen in the context of heterotic strings in four dimensions and by Chris Hull and Paul Townsend in the context of the type IIB theory. Theorists also found that different string theories may be related by T-duality. This duality implies that strings propagating on completely different spacetime geometries may be physically equivalent.

At around the same time, as many physicists were studying the properties of strings, a small group of physicists was examining the possible applications of higher dimensional objects. In 1987, Eric Bergshoeff, Ergin Sezgin, and Paul Townsend showed that eleven-dimensional supergravity includes two-dimensional branes. Intuitively, these objects look like sheets or membranes propagating through the eleven-dimensional spacetime. Shortly after this discovery, Michael Duff, Paul Howe, Takeo Inami, and Kellogg Stelle considered a particular compactification of eleven-dimensional supergravity with one of the dimensions curled up into a circle. In this setting, one can imagine the membrane wrapping around the circular dimension. If the radius of the circle is sufficiently small, then this membrane looks just like a string in ten-dimensional spacetime. In fact, Duff and his collaborators showed that this construction reproduces exactly the strings appearing in type IIA superstring theory.

Speaking at a string theory conference in 1995, Edward Witten made the surprising suggestion that all five superstring theories were in fact just different limiting cases of a single theory in eleven spacetime dimensions. Witten's announcement drew together all of the previous results on S- and T-duality and the appearance of higher dimensional branes in string theory. In the months following Witten's announcement, hundreds of new papers appeared on the Internet confirming different parts of his proposal. Today this flurry of work is known as the second superstring revolution.

Initially, some physicists suggested that the new theory was a fundamental theory of membranes, but Witten was skeptical of the role of membranes in the theory. In a paper from 1996, Hořava and Witten wrote "As it has been proposed that the eleven-dimensional theory is a supermembrane theory but there are some reasons to doubt that interpretation, we will non-committally call it the M-theory, leaving to the future the relation of M to membranes." In the absence of an understanding of the true meaning and structure of M-theory, Witten has suggested that the M should stand for "magic", "mystery", or "membrane" according to taste, and the true meaning of the title should be decided when a more fundamental formulation of the theory is known.

Matrix theory

In mathematics, a matrix is a rectangular array of numbers or other data. In physics, a matrix model is a particular kind of physical theory whose mathematical formulation involves the notion of a matrix in an important way. A matrix model describes the behavior of a set of matrices within the framework of quantum mechanics.

One important example of a matrix model is the BFSS matrix model proposed by Tom Banks, Willy Fischler, Stephen Shenker, and Leonard Susskind in 1997. This theory describes the behavior of a set of nine large matrices. In their original paper, these authors showed, among other things, that the low energy limit of this matrix model is described by eleven-dimensional supergravity. These calculations led them to propose that the BFSS matrix model is exactly equivalent to M-theory. The BFSS matrix model can therefore be used as a prototype for a correct formulation of M-theory and a tool for investigating the properties of M-theory in a relatively simple setting.
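
Schematically, and up to conventions that vary between references, the bosonic part of the BFSS Hamiltonian involves nine Hermitian matrices Xᵢ and their conjugate momenta Pᵢ:

    H = Tr( ½ PᵢPᵢ − ¼ [Xᵢ, Xⱼ][Xᵢ, Xⱼ] ) + fermionic terms,   i, j = 1, …, 9,

where the commutator-squared term supplies the interactions. The proposal is that this quantum mechanics of matrices, with the matrices taken to be very large, captures M-theory.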

The development of the matrix model formulation of M-theory has led physicists to consider various connections between string theory and a branch of mathematics called noncommutative geometry. This subject is a generalization of ordinary geometry in which mathematicians define new geometric notions using tools from noncommutative algebra. In a paper from 1998, Alain Connes, Michael R. Douglas, and Albert Schwarz showed that some aspects of matrix models and M-theory are described by a noncommutative quantum field theory, a special kind of physical theory in which spacetime is described mathematically using noncommutative geometry. This established a link between matrix models and M-theory on the one hand, and noncommutative geometry on the other hand. It quickly led to the discovery of other important links between noncommutative geometry and various physical theories.

Black holes

In general relativity, a black hole is defined as a region of spacetime in which the gravitational field is so strong that no particle or radiation can escape. In the currently accepted models of stellar evolution, black holes are thought to arise when massive stars undergo gravitational collapse, and many galaxies are thought to contain supermassive black holes at their centers. Black holes are also important for theoretical reasons, as they present profound challenges for theorists attempting to understand the quantum aspects of gravity. String theory has proved to be an important tool for investigating the theoretical properties of black holes because it provides a framework in which theorists can study their thermodynamics.

Bekenstein–Hawking formula

In the branch of physics called statistical mechanics, entropy is a measure of the randomness or disorder of a physical system. This concept was studied in the 1870s by the Austrian physicist Ludwig Boltzmann, who showed that the thermodynamic properties of a gas could be derived from the combined properties of its many constituent molecules. Boltzmann argued that by averaging the behaviors of all the different molecules in a gas, one can understand macroscopic properties such as volume, temperature, and pressure. In addition, this perspective led him to give a precise definition of entropy as the natural logarithm of the number of different states of the molecules (also called microstates) that give rise to the same macroscopic features.
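
In symbols, Boltzmann's definition is

    S = k ln W,

where S is the entropy, k is a constant of proportionality now called Boltzmann's constant, and W is the number of microstates compatible with the observed macroscopic state.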

In the twentieth century, physicists began to apply the same concepts to black holes. In most systems such as gases, the entropy scales with the volume. In the 1970s, the physicist Jacob Bekenstein suggested that the entropy of a black hole is instead proportional to the surface area of its event horizon, the boundary beyond which matter and radiation is lost to its gravitational attraction. When combined with ideas of the physicist Stephen Hawking, Bekenstein's work yielded a precise formula for the entropy of a black hole. The Bekenstein–Hawking formula expresses the entropy S as

    S = kc³A/(4ħG)

where c is the speed of light, k is Boltzmann's constant, ħ is the reduced Planck constant, G is Newton's constant, and A is the surface area of the event horizon.
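
As an illustrative order-of-magnitude check (a sketch using standard values of the physical constants, not a calculation from the text above), the formula can be evaluated for a black hole of one solar mass:

    # Bekenstein-Hawking entropy, S = k c^3 A / (4 hbar G), for a
    # Schwarzschild black hole of one solar mass.
    import math

    c = 2.998e8        # speed of light, m/s
    G = 6.674e-11      # Newton's constant, m^3 kg^-1 s^-2
    hbar = 1.055e-34   # reduced Planck constant, J s
    k = 1.381e-23      # Boltzmann's constant, J/K
    M = 1.989e30       # one solar mass, kg

    r_s = 2 * G * M / c**2             # Schwarzschild radius, about 3 km
    A = 4 * math.pi * r_s**2           # horizon area, m^2
    S = k * c**3 * A / (4 * hbar * G)  # entropy, J/K

    print(f"S = {S:.2e} J/K, S/k = {S / k:.2e}")  # S/k is about 1e77

The resulting dimensionless entropy S/k of roughly 10⁷⁷ dwarfs that of any ordinary star, which is one way of seeing why the formula called for a microscopic explanation.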

Like any physical system, a black hole has an entropy defined in terms of the number of different microstates that lead to the same macroscopic features. The Bekenstein–Hawking entropy formula gives the expected value of the entropy of a black hole, but by the 1990s, physicists still lacked a derivation of this formula by counting microstates in a theory of quantum gravity. Finding such a derivation of this formula was considered an important test of the viability of any theory of quantum gravity such as string theory.

Derivation within string theory

In a paper from 1996, Andrew Strominger and Cumrun Vafa showed how to derive the Bekenstein–Hawking formula for certain black holes in string theory. Their calculation was based on the observation that D-branes—which look like fluctuating membranes when they are weakly interacting—become dense, massive objects with event horizons when the interactions are strong. In other words, a system of strongly interacting D-branes in string theory is indistinguishable from a black hole. Strominger and Vafa analyzed such D-brane systems and calculated the number of different ways of placing D-branes in spacetime so that their combined mass and charge is equal to a given mass and charge for the resulting black hole. Their calculation reproduced the Bekenstein–Hawking formula exactly, including the factor of 1/4. Subsequent work by Strominger, Vafa, and others refined the original calculations and gave the precise values of the "quantum corrections" needed to describe very small black holes.

The black holes that Strominger and Vafa considered in their original work were quite different from real astrophysical black holes. One difference was that Strominger and Vafa considered only extremal black holes in order to make the calculation tractable. These are defined as black holes with the lowest possible mass compatible with a given charge. Strominger and Vafa also restricted attention to black holes in five-dimensional spacetime with unphysical supersymmetry.

Although it was originally developed in this very particular and physically unrealistic context in string theory, the entropy calculation of Strominger and Vafa has led to a qualitative understanding of how black hole entropy can be accounted for in any theory of quantum gravity. Indeed, in 1998, Strominger argued that the original result could be generalized to an arbitrary consistent theory of quantum gravity without relying on strings or supersymmetry. In collaboration with several other authors in 2010, he showed that some results on black hole entropy could be extended to non-extremal astrophysical black holes.

AdS/CFT correspondence

One approach to formulating string theory and studying its properties is provided by the anti-de Sitter/conformal field theory (AdS/CFT) correspondence. This is a theoretical result which implies that string theory is in some cases equivalent to a quantum field theory. In addition to providing insights into the mathematical structure of string theory, the AdS/CFT correspondence has shed light on many aspects of quantum field theory in regimes where traditional calculational techniques are ineffective. The AdS/CFT correspondence was first proposed by Juan Maldacena in late 1997. Important aspects of the correspondence were elaborated in articles by Steven Gubser, Igor Klebanov, and Alexander Markovich Polyakov, and by Edward Witten. By 2010, Maldacena's article had over 7000 citations, becoming the most highly cited article in the field of high energy physics.

Overview of the correspondence

A disk tiled by triangles and quadrilaterals which become smaller and smaller near the boundary circle.

In the AdS/CFT correspondence, the geometry of spacetime is described in terms of a certain vacuum solution of Einstein's equation called anti-de Sitter space. In very elementary terms, anti-de Sitter space is a mathematical model of spacetime in which the notion of distance between points (the metric) is different from the notion of distance in ordinary Euclidean geometry. It is closely related to hyperbolic space, which can be viewed as a disk as illustrated on the left. This image shows a tessellation of a disk by triangles and squares. One can define the distance between points of this disk in such a way that all the triangles and squares are the same size and the circular outer boundary is infinitely far from any point in the interior.
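
This notion of distance is the standard Poincaré metric on the unit disk,

    ds² = 4(dx² + dy²) / (1 − x² − y²)²,

in which lengths are stretched more and more as x² + y² approaches 1, so the boundary circle is infinitely far from every interior point.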

One can imagine a stack of hyperbolic disks where each disk represents the state of the universe at a given time. The resulting geometric object is three-dimensional anti-de Sitter space. It looks like a solid cylinder in which any cross section is a copy of the hyperbolic disk. Time runs along the vertical direction in this picture. The surface of this cylinder plays an important role in the AdS/CFT correspondence. As with the hyperbolic plane, anti-de Sitter space is curved in such a way that any point in the interior is actually infinitely far from this boundary surface.

A cylinder formed by stacking copies of the disk illustrated in the previous figure.
Three-dimensional anti-de Sitter space is like a stack of hyperbolic disks, each one representing the state of the universe at a given time. The resulting spacetime looks like a solid cylinder.
 
This construction describes a hypothetical universe with only two space dimensions and one time dimension, but it can be generalized to any number of dimensions. Indeed, hyperbolic space can have more than two dimensions and one can "stack up" copies of hyperbolic space to get higher-dimensional models of anti-de Sitter space.

An important feature of anti-de Sitter space is its boundary (which looks like a cylinder in the case of three-dimensional anti-de Sitter space). One property of this boundary is that, within a small region on the surface around any given point, it looks just like Minkowski space, the model of spacetime used in nongravitational physics. One can therefore consider an auxiliary theory in which "spacetime" is given by the boundary of anti-de Sitter space. This observation is the starting point for the AdS/CFT correspondence, which states that the boundary of anti-de Sitter space can be regarded as the "spacetime" for a quantum field theory. The claim is that this quantum field theory is equivalent to a gravitational theory, such as string theory, in the bulk anti-de Sitter space in the sense that there is a "dictionary" for translating entities and calculations in one theory into their counterparts in the other theory. For example, a single particle in the gravitational theory might correspond to some collection of particles in the boundary theory. In addition, the predictions in the two theories are quantitatively identical so that if two particles have a 40 percent chance of colliding in the gravitational theory, then the corresponding collections in the boundary theory would also have a 40 percent chance of colliding.

Applications to quantum gravity

The discovery of the AdS/CFT correspondence was a major advance in physicists' understanding of string theory and quantum gravity. One reason for this is that the correspondence provides a formulation of string theory in terms of quantum field theory, which is well understood by comparison. Another reason is that it provides a general framework in which physicists can study and attempt to resolve the paradoxes of black holes.

In 1975, Stephen Hawking published a calculation which suggested that black holes are not completely black but emit a dim radiation due to quantum effects near the event horizon. At first, Hawking's result posed a problem for theorists because it suggested that black holes destroy information. More precisely, Hawking's calculation seemed to conflict with one of the basic postulates of quantum mechanics, which states that physical systems evolve in time according to the Schrödinger equation. This property is usually referred to as unitarity of time evolution. The apparent contradiction between Hawking's calculation and the unitarity postulate of quantum mechanics came to be known as the black hole information paradox.

The AdS/CFT correspondence resolves the black hole information paradox, at least to some extent, because it shows how a black hole can evolve in a manner consistent with quantum mechanics in some contexts. Indeed, one can consider black holes in the context of the AdS/CFT correspondence, and any such black hole corresponds to a configuration of particles on the boundary of anti-de Sitter space. These particles obey the usual rules of quantum mechanics and in particular evolve in a unitary fashion, so the black hole must also evolve in a unitary fashion, respecting the principles of quantum mechanics. In 2005, Hawking announced that the paradox had been settled in favor of information conservation by the AdS/CFT correspondence, and he suggested a concrete mechanism by which black holes might preserve information.

Applications to nuclear physics

A magnet levitating over a superconducting material.
A magnet levitating above a high-temperature superconductor. Today some physicists are working to understand high-temperature superconductivity using the AdS/CFT correspondence.
 
In addition to its applications to theoretical problems in quantum gravity, the AdS/CFT correspondence has been applied to a variety of problems in quantum field theory. One physical system that has been studied using the AdS/CFT correspondence is the quark–gluon plasma, an exotic state of matter produced in particle accelerators. This state of matter arises for brief instants when heavy ions such as gold or lead nuclei are collided at high energies. Such collisions cause the quarks that make up atomic nuclei to deconfine at temperatures of approximately two trillion kelvins, conditions similar to those present at around 10⁻¹¹ seconds after the Big Bang.

The physics of the quark–gluon plasma is governed by a theory called quantum chromodynamics, but this theory is mathematically intractable in problems involving the quark–gluon plasma. In an article appearing in 2005, Đàm Thanh Sơn and his collaborators showed that the AdS/CFT correspondence could be used to understand some aspects of the quark–gluon plasma by describing it in the language of string theory. By applying the AdS/CFT correspondence, Sơn and his collaborators were able to describe the quark–gluon plasma in terms of black holes in five-dimensional spacetime. The calculation showed that the ratio of two quantities associated with the quark–gluon plasma, the shear viscosity and volume density of entropy, should be approximately equal to a certain universal constant. In 2008, the predicted value of this ratio for the quark–gluon plasma was confirmed at the Relativistic Heavy Ion Collider at Brookhaven National Laboratory.
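
The universal constant in question is the value obtained by Kovtun, Son, and Starinets from black hole physics,

    η/s = ħ/(4πk),

where η is the shear viscosity, s is the entropy density, and k is Boltzmann's constant; the measured ratio for the quark–gluon plasma is close to this value, making it among the most "perfect" fluids known.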

Applications to condensed matter physics

The AdS/CFT correspondence has also been used to study aspects of condensed matter physics. Over the decades, experimental condensed matter physicists have discovered a number of exotic states of matter, including superconductors and superfluids. These states are described using the formalism of quantum field theory, but some phenomena are difficult to explain using standard field theoretic techniques. Some condensed matter theorists including Subir Sachdev hope that the AdS/CFT correspondence will make it possible to describe these systems in the language of string theory and learn more about their behavior.

So far some success has been achieved in using string theory methods to describe the transition of a superfluid to an insulator. A superfluid is a system of electrically neutral atoms that flows without any friction. Such systems are often produced in the laboratory using liquid helium, but recently experimentalists have developed new ways of producing artificial superfluids by pouring trillions of cold atoms into a lattice of criss-crossing lasers. These atoms initially behave as a superfluid, but as experimentalists increase the intensity of the lasers, they become less mobile and then suddenly transition to an insulating state. During the transition, the atoms behave in an unusual way. For example, the atoms slow to a halt at a rate that depends on the temperature and on Planck's constant, the fundamental parameter of quantum mechanics, which does not enter into the description of the other phases. This behavior has recently been understood by considering a dual description where properties of the fluid are described in terms of a higher dimensional black hole.

Phenomenology

In addition to being an idea of considerable theoretical interest, string theory provides a framework for constructing models of real world physics that combine general relativity and particle physics. Phenomenology is the branch of theoretical physics in which physicists construct realistic models of nature from more abstract theoretical ideas. String phenomenology is the part of string theory that attempts to construct realistic or semi-realistic models based on string theory.

Partly because of theoretical and mathematical difficulties and partly because of the extremely high energies needed to test these theories experimentally, there is so far no experimental evidence that would unambiguously point to any of these models being a correct fundamental description of nature. This has led some in the community to criticize these approaches to unification and question the value of continued research on these problems.

Particle physics

The currently accepted theory describing elementary particles and their interactions is known as the standard model of particle physics. This theory provides a unified description of three of the fundamental forces of nature: electromagnetism and the strong and weak nuclear forces. Despite its remarkable success in explaining a wide range of physical phenomena, the standard model cannot be a complete description of reality. This is because the standard model fails to incorporate the force of gravity and because of problems such as the hierarchy problem and the inability to explain the structure of fermion masses or dark matter. 

String theory has been used to construct a variety of models of particle physics going beyond the standard model. Typically, such models are based on the idea of compactification. Starting with the ten- or eleven-dimensional spacetime of string or M-theory, physicists postulate a shape for the extra dimensions. By choosing this shape appropriately, they can construct models roughly similar to the standard model of particle physics, together with additional undiscovered particles. One popular way of deriving realistic physics from string theory is to start with the heterotic theory in ten dimensions and assume that the six extra dimensions of spacetime are shaped like a six-dimensional Calabi–Yau manifold. Such compactifications offer many ways of extracting realistic physics from string theory. Other similar methods can be used to construct realistic or semi-realistic models of our four-dimensional world based on M-theory.

Cosmology

The Big Bang theory is the prevailing cosmological model for the universe from the earliest known periods through its subsequent large-scale evolution. Despite its success in explaining many observed features of the universe including galactic redshifts, the relative abundance of light elements such as hydrogen and helium, and the existence of a cosmic microwave background, there are several questions that remain unanswered. For example, the standard Big Bang model does not explain why the universe appears to be the same in all directions, why it appears flat on very large distance scales, or why certain hypothesized particles such as magnetic monopoles are not observed in experiments.

Currently, the leading candidate for a theory going beyond the Big Bang is the theory of cosmic inflation. Developed by Alan Guth and others in the 1980s, inflation postulates a period of extremely rapid accelerated expansion of the universe prior to the expansion described by the standard Big Bang theory. The theory of cosmic inflation preserves the successes of the Big Bang while providing a natural explanation for some of the mysterious features of the universe. The theory has also received striking support from observations of the cosmic microwave background, the radiation that has filled the sky since around 380,000 years after the Big Bang.

In the theory of inflation, the rapid initial expansion of the universe is caused by a hypothetical particle called the inflaton. The exact properties of this particle are not fixed by the theory but should ultimately be derived from a more fundamental theory such as string theory. Indeed, there have been a number of attempts to identify an inflaton within the spectrum of particles described by string theory, and to study inflation using string theory. While these approaches might eventually find support in observational data such as measurements of the cosmic microwave background, the application of string theory to cosmology is still in its early stages.

Connections to mathematics

In addition to influencing research in theoretical physics, string theory has stimulated a number of major developments in pure mathematics. Like many developing ideas in theoretical physics, string theory does not at present have a mathematically rigorous formulation in which all of its concepts can be defined precisely. As a result, physicists who study string theory are often guided by physical intuition to conjecture relationships between the seemingly different mathematical structures that are used to formalize different parts of the theory. These conjectures are later proved by mathematicians, and in this way, string theory serves as a source of new ideas in pure mathematics.

Mirror symmetry

A complex mathematical surface in three dimensions.
The Clebsch cubic is an example of a kind of geometric object called an algebraic variety. A classical result of enumerative geometry states that there are exactly 27 straight lines that lie entirely on this surface.
 
After Calabi–Yau manifolds had entered physics as a way to compactify extra dimensions in string theory, many physicists began studying these manifolds. In the late 1980s, several physicists noticed that given such a compactification of string theory, it is not possible to reconstruct uniquely a corresponding Calabi–Yau manifold. Instead, two different versions of string theory, type IIA and type IIB, can be compactified on completely different Calabi–Yau manifolds giving rise to the same physics. In this situation, the manifolds are called mirror manifolds, and the relationship between the two physical theories is called mirror symmetry.

Regardless of whether Calabi–Yau compactifications of string theory provide a correct description of nature, the existence of the mirror duality between different string theories has significant mathematical consequences. The Calabi–Yau manifolds used in string theory are of interest in pure mathematics, and mirror symmetry allows mathematicians to solve problems in enumerative geometry, a branch of mathematics concerned with counting the numbers of solutions to geometric questions.

Enumerative geometry studies a class of geometric objects called algebraic varieties which are defined by the vanishing of polynomials. For example, the Clebsch cubic illustrated on the right is an algebraic variety defined using a certain polynomial of degree three in four variables. A celebrated result of nineteenth-century mathematicians Arthur Cayley and George Salmon states that there are exactly 27 straight lines that lie entirely on such a surface.

Generalizing this problem, one can ask how many lines can be drawn on a quintic Calabi–Yau manifold, such as the one illustrated above, which is defined by a polynomial of degree five. This problem was solved by the nineteenth-century German mathematician Hermann Schubert, who found that there are exactly 2,875 such lines. In 1986, geometer Sheldon Katz proved that the number of curves, such as circles, that are defined by polynomials of degree two and lie entirely in the quintic is 609,250.

By the year 1991, most of the classical problems of enumerative geometry had been solved and interest in enumerative geometry had begun to diminish. The field was reinvigorated in May 1991 when physicists Philip Candelas, Xenia de la Ossa, Paul Green, and Linda Parks showed that mirror symmetry could be used to translate difficult mathematical questions about one Calabi–Yau manifold into easier questions about its mirror. In particular, they used mirror symmetry to show that a six-dimensional Calabi–Yau manifold can contain exactly 317,206,375 curves of degree three. In addition to counting degree-three curves, Candelas and his collaborators obtained a number of more general results for counting rational curves which went far beyond the results obtained by mathematicians.

Originally, these results of Candelas were justified on physical grounds. However, mathematicians generally prefer rigorous proofs that do not require an appeal to physical intuition. Inspired by physicists' work on mirror symmetry, mathematicians have therefore constructed their own arguments proving the enumerative predictions of mirror symmetry. Today mirror symmetry is an active area of research in mathematics, and mathematicians are working to develop a more complete mathematical understanding of mirror symmetry based on physicists' intuition. Major approaches to mirror symmetry include the homological mirror symmetry program of Maxim Kontsevich and the SYZ conjecture of Andrew Strominger, Shing-Tung Yau, and Eric Zaslow.

Monstrous moonshine

An equilateral triangle with a line joining each vertex to the midpoint of the opposite side
An equilateral triangle can be rotated through 120°, 240°, or 360°, or reflected in any of the three lines pictured without changing its shape.
 
Group theory is the branch of mathematics that studies the concept of symmetry. For example, one can consider a geometric shape such as an equilateral triangle. There are various operations that one can perform on this triangle without changing its shape. One can rotate it through 120°, 240°, or 360°, or one can reflect it in any of the three lines pictured. Each of these operations is called a symmetry, and the collection of these symmetries satisfies certain technical properties making it into what mathematicians call a group. In this particular example, the group is known as the dihedral group of order 6 because it has six elements. A general group may describe finitely many or infinitely many symmetries; if there are only finitely many symmetries, it is called a finite group.
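
A minimal computational sketch (illustrative, not part of the original text): the six symmetries can be realized as the permutations of the triangle's three vertices, and closure under composition, one of the technical properties defining a group, can be checked directly:

    # The symmetry group of an equilateral triangle, realized as the
    # permutations of its three vertices: 3 rotations (including the
    # identity) and 3 reflections, for 6 elements in total.
    from itertools import permutations

    elements = list(permutations(range(3)))

    def compose(p, q):
        # Apply q first, then p.
        return tuple(p[q[i]] for i in range(3))

    # Closure: composing any two symmetries yields another symmetry.
    assert all(compose(p, q) in elements for p in elements for q in elements)
    print(f"order of the group: {len(elements)}")  # prints 6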

Mathematicians often strive for a classification (or list) of all mathematical objects of a given type. It is generally believed that finite groups are too diverse to admit a useful classification. A more modest but still challenging problem is to classify all finite simple groups. These are finite groups which may be used as building blocks for constructing arbitrary finite groups in the same way that prime numbers can be used to construct arbitrary whole numbers by taking products. One of the major achievements of contemporary group theory is the classification of finite simple groups, a mathematical theorem which provides a list of all possible finite simple groups.

This classification theorem identifies several infinite families of groups as well as 26 additional groups which do not fit into any family. The latter groups are called the "sporadic" groups, and each one owes its existence to a remarkable combination of circumstances. The largest sporadic group, the so-called monster group, has over 10⁵³ elements, more than a thousand times the number of atoms in the Earth.

A graph of the j-function in the complex plane
 
A seemingly unrelated construction is the j-function of number theory. This object belongs to a special class of functions called modular functions, whose graphs form a certain kind of repeating pattern. Although this function appears in a branch of mathematics which seems very different from the theory of finite groups, the two subjects turn out to be intimately related. In the late 1970s, mathematicians John McKay and John Thompson noticed that certain numbers arising in the analysis of the monster group (namely, the dimensions of its irreducible representations) are related to numbers that appear in a formula for the j-function (namely, the coefficients of its Fourier series). This relationship was further developed by John Horton Conway and Simon Norton who called it monstrous moonshine because it seemed so far-fetched.
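
Concretely, the j-function has the Fourier expansion

    j(τ) = q⁻¹ + 744 + 196884q + 21493760q² + ⋯,   q = e^(2πiτ),

and McKay's observation was that 196884 = 1 + 196883, where 1 and 196883 are the dimensions of the two smallest irreducible representations of the monster group; the higher coefficients decompose similarly, for example 21493760 = 1 + 196883 + 21296876.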

In 1992, Richard Borcherds constructed a bridge between the theory of modular functions and finite groups and, in the process, explained the observations of McKay and Thompson. Borcherds' work used ideas from string theory in an essential way, extending earlier results of Igor Frenkel, James Lepowsky, and Arne Meurman, who had realized the monster group as the symmetries of a particular version of string theory. In 1998, Borcherds was awarded the Fields Medal for his work.

Since the 1990s, the connection between string theory and moonshine has led to further results in mathematics and physics. In 2010, physicists Tohru Eguchi, Hirosi Ooguri, and Yuji Tachikawa discovered connections between a different sporadic group, the Mathieu group M24, and a certain version of string theory. Miranda Cheng, John Duncan, and Jeffrey A. Harvey proposed a generalization of this moonshine phenomenon called umbral moonshine, and their conjecture was proved mathematically by Duncan, Michael Griffin, and Ken Ono. Witten has also speculated that the version of string theory appearing in monstrous moonshine might be related to a certain simplified model of gravity in three spacetime dimensions.

History

Early results

Some of the structures reintroduced by string theory arose for the first time much earlier as part of the program of classical unification started by Albert Einstein. The first person to add a fifth dimension to a theory of gravity was Gunnar Nordström in 1914, who noted that gravity in five dimensions describes both gravity and electromagnetism in four. Nordström attempted to unify electromagnetism with his theory of gravitation, which was however superseded by Einstein's general relativity in 1919. Thereafter, German mathematician Theodor Kaluza combined the fifth dimension with general relativity, and only Kaluza is usually credited with the idea. In 1926, the Swedish physicist Oskar Klein gave a physical interpretation of the unobservable extra dimension—it is wrapped into a small circle. Einstein introduced a non-symmetric metric tensor, while much later Brans and Dicke added a scalar component to gravity. These ideas would be revived within string theory, where they are demanded by consistency conditions. 

String theory was originally developed during the late 1960s and early 1970s as a never completely successful theory of hadrons, the subatomic particles like the proton and neutron that feel the strong interaction. In the 1960s, Geoffrey Chew and Steven Frautschi discovered that the mesons make families called Regge trajectories with masses related to spins in a way that was later understood by Yoichiro Nambu, Holger Bech Nielsen and Leonard Susskind to be the relationship expected from rotating strings. Chew advocated making a theory for the interactions of these trajectories that did not presume that they were composed of any fundamental particles, but would construct their interactions from self-consistency conditions on the S-matrix. The S-matrix approach was started by Werner Heisenberg in the 1940s as a way of constructing a theory that did not rely on the local notions of space and time, which Heisenberg believed break down at the nuclear scale. While the scale was off by many orders of magnitude, the approach he advocated was ideally suited for a theory of quantum gravity.

Working with experimental data, R. Dolen, D. Horn and C. Schmid developed some sum rules for hadron exchange. When a particle and antiparticle scatter, virtual particles can be exchanged in two qualitatively different ways. In the s-channel, the two particles annihilate to make temporary intermediate states that fall apart into the final state particles. In the t-channel, the particles exchange intermediate states by emission and absorption. In field theory, the two contributions add together, one giving a continuous background contribution, the other giving peaks at certain energies. In the data, it was clear that the peaks were stealing from the background—the authors interpreted this as saying that the t-channel contribution was dual to the s-channel one, meaning both described the whole amplitude and included the other.

The result was widely advertised by Murray Gell-Mann, leading Gabriele Veneziano to construct a scattering amplitude that had the property of Dolen–Horn–Schmid duality, later renamed world-sheet duality. The amplitude needed poles where the particles appear, on straight line trajectories, and there is a special mathematical function whose poles are evenly spaced on half the real line—the gamma function— which was widely used in Regge theory. By manipulating combinations of gamma functions, Veneziano was able to find a consistent scattering amplitude with poles on straight lines, with mostly positive residues, which obeyed duality and had the appropriate Regge scaling at high energy. The amplitude could fit near-beam scattering data as well as other Regge type fits, and had a suggestive integral representation that could be used for generalization. 
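
In modern notation, the amplitude Veneziano found is usually quoted as an Euler beta function (conventions for the trajectory intercept and slope vary between presentations):

    A(s, t) = Γ(−α(s)) Γ(−α(t)) / Γ(−α(s) − α(t)),   with α(s) = α(0) + α′s.

Because Γ(x) has poles exactly at x = 0, −1, −2, …, the amplitude has poles whenever α(s) or α(t) equals a non-negative integer: evenly spaced resonances lying on a straight-line Regge trajectory, appearing symmetrically in the s- and t-channels, which is the world-sheet duality described above.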

Over the next years, hundreds of physicists worked to complete the bootstrap program for this model, with many surprises. Veneziano himself discovered that for the scattering amplitude to describe the scattering of a particle that appears in the theory, an obvious self-consistency condition, the lightest particle must be a tachyon. Miguel Virasoro and Joel Shapiro found a different amplitude now understood to be that of closed strings, while Ziro Koba and Holger Nielsen generalized Veneziano's integral representation to multiparticle scattering. Veneziano and Sergio Fubini introduced an operator formalism for computing the scattering amplitudes that was a forerunner of world-sheet conformal theory, while Virasoro understood how to remove the poles with wrong-sign residues using a constraint on the states. Claud Lovelace calculated a loop amplitude, and noted that there is an inconsistency unless the dimension of the theory is 26. Charles Thorn, Peter Goddard and Richard Brower went on to prove that there are no wrong-sign propagating states in dimensions less than or equal to 26. 

In 1969–70, Yoichiro Nambu, Holger Bech Nielsen, and Leonard Susskind recognized that the theory could be given a description in space and time in terms of strings. The scattering amplitudes were derived systematically from the action principle by Peter Goddard, Jeffrey Goldstone, Claudio Rebbi, and Charles Thorn, giving a space-time picture to the vertex operators introduced by Veneziano and Fubini and a geometrical interpretation to the Virasoro conditions.

In 1971, Pierre Ramond added fermions to the model, which led him to formulate a two-dimensional supersymmetry to cancel the wrong-sign states. John Schwarz and André Neveu added another sector to the fermi theory a short time later. In the fermion theories, the critical dimension was 10. Stanley Mandelstam formulated a world sheet conformal theory for both the bose and fermi case, giving a two-dimensional field theoretic path-integral to generate the operator formalism. Michio Kaku and Keiji Kikkawa gave a different formulation of the bosonic string, as a string field theory, with infinitely many particle types and with fields taking values not on points, but on loops and curves. 

In 1974, Tamiaki Yoneya discovered that all the known string theories included a massless spin-two particle that obeyed the correct Ward identities to be a graviton. John Schwarz and Joel Scherk came to the same conclusion and made the bold leap to suggest that string theory was a theory of gravity, not a theory of hadrons. They reintroduced Kaluza–Klein theory as a way of making sense of the extra dimensions. At the same time, quantum chromodynamics was recognized as the correct theory of hadrons, shifting the attention of physicists and apparently leaving the bootstrap program in the dustbin of history.

String theory eventually made it out of the dustbin, but for the following decade all work on the theory was completely ignored. Still, the theory continued to develop at a steady pace thanks to the work of a handful of devotees. Ferdinando Gliozzi, Joel Scherk, and David Olive realized in 1977 that the original Ramond and Neveu–Schwarz strings were separately inconsistent and needed to be combined. The resulting theory did not have a tachyon, and was proven to have space-time supersymmetry by John Schwarz and Michael Green in 1984. The same year, Alexander Polyakov gave the theory a modern path integral formulation, and went on to develop conformal field theory extensively. In 1979, Daniel Friedan showed that the equations of motion of string theory, which are generalizations of the Einstein equations of general relativity, emerge from the renormalization group equations for the two-dimensional field theory. Schwarz and Green discovered T-duality, and constructed two superstring theories—IIA and IIB related by T-duality, and type I theories with open strings. The consistency conditions had been so strong that the entire theory was nearly uniquely determined, with only a few discrete choices.

First superstring revolution

In the early 1980s, Edward Witten discovered that most theories of quantum gravity could not accommodate chiral fermions like the neutrino. This led him, in collaboration with Luis Álvarez-Gaumé, to study violations of the conservation laws in gravity theories with anomalies, concluding that type I string theories were inconsistent. Green and Schwarz discovered a contribution to the anomaly that Witten and Álvarez-Gaumé had missed, which restricted the gauge group of the type I string theory to be SO(32). In coming to understand this calculation, Edward Witten became convinced that string theory was truly a consistent theory of gravity, and he became a high-profile advocate. Following Witten's lead, between 1984 and 1986, hundreds of physicists started to work in this field, and this is sometimes called the first superstring revolution.

During this period, David Gross, Jeffrey Harvey, Emil Martinec, and Ryan Rohm discovered heterotic strings. The gauge group of these closed strings was two copies of E8, and either copy could easily and naturally include the standard model. Philip Candelas, Gary Horowitz, Andrew Strominger and Edward Witten found that the Calabi–Yau manifolds are the compactifications that preserve a realistic amount of supersymmetry, while Lance Dixon and others worked out the physical properties of orbifolds, distinctive geometrical singularities allowed in string theory. Cumrun Vafa generalized T-duality from circles to arbitrary manifolds, creating the mathematical field of mirror symmetry. Daniel Friedan, Emil Martinec and Stephen Shenker further developed the covariant quantization of the superstring using conformal field theory techniques. David Gross and Vipul Periwal discovered that string perturbation theory was divergent. Stephen Shenker showed it diverged much faster than in field theory suggesting that new non-perturbative objects were missing. 

In the 1990s, Joseph Polchinski discovered that the theory requires higher-dimensional objects, called D-branes and identified these with the black-hole solutions of supergravity. These were understood to be the new objects suggested by the perturbative divergences, and they opened up a new field with rich mathematical structure. It quickly became clear that D-branes and other p-branes, not just strings, formed the matter content of the string theories, and the physical interpretation of the strings and branes was revealed—they are a type of black hole. Leonard Susskind had incorporated the holographic principle of Gerardus 't Hooft into string theory, identifying the long highly excited string states with ordinary thermal black hole states. As suggested by 't Hooft, the fluctuations of the black hole horizon, the world-sheet or world-volume theory, describes not only the degrees of freedom of the black hole, but all nearby objects too.

Second superstring revolution

In 1995, at the annual conference of string theorists at the University of Southern California (USC), Edward Witten gave a speech on string theory that in essence united the five string theories that existed at the time, and gave birth to a new 11-dimensional theory called M-theory. M-theory was also foreshadowed in the work of Paul Townsend at approximately the same time. The flurry of activity that began at this time is sometimes called the second superstring revolution.

During this period, Tom Banks, Willy Fischler, Stephen Shenker and Leonard Susskind formulated matrix theory, a full holographic description of M-theory using IIA D0 branes. This was the first definition of string theory that was fully non-perturbative and a concrete mathematical realization of the holographic principle. It is an example of a gauge-gravity duality and is now understood to be a special case of the AdS/CFT correspondence. Andrew Strominger and Cumrun Vafa calculated the entropy of certain configurations of D-branes and found agreement with the semi-classical answer for extreme charged black holes. Petr Hořava and Witten found the eleven-dimensional formulation of the heterotic string theories, showing that orbifolds solve the chirality problem. Witten noted that the effective description of the physics of D-branes at low energies is by a supersymmetric gauge theory, and found geometrical interpretations of mathematical structures in gauge theory that he and Nathan Seiberg had earlier discovered in terms of the location of the branes. 

In 1997, Juan Maldacena noted that the low energy excitations of a theory near a black hole consist of objects close to the horizon, which for extreme charged black holes looks like an anti-de Sitter space. He noted that in this limit the gauge theory describes the string excitations near the branes. So he hypothesized that string theory on a near-horizon extreme-charged black-hole geometry, an anti-de Sitter space times a sphere with flux, is equally well described by the low-energy limiting gauge theory, the N = 4 supersymmetric Yang–Mills theory. This hypothesis, which is called the AdS/CFT correspondence, was further developed by Steven Gubser, Igor Klebanov and Alexander Polyakov, and by Edward Witten, and it is now well-accepted. It is a concrete realization of the holographic principle, which has far-reaching implications for black holes, locality and information in physics, as well as the nature of the gravitational interaction. Through this relationship, string theory has been shown to be related to gauge theories like quantum chromodynamics and this has led to more quantitative understanding of the behavior of hadrons, bringing string theory back to its roots.
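
In the conventions usually quoted for this best-studied example of the correspondence, type IIB string theory on AdS₅ × S⁵ with N units of flux is dual to N = 4 super Yang–Mills theory with gauge group SU(N), and the parameters map as

    R⁴ / α′² = g²_YM N = λ,   with g²_YM = 4π g_s,

where R is the common radius of AdS₅ and S⁵, α′ sets the string length, g_s is the string coupling, and λ is the 't Hooft coupling of the gauge theory. Large λ, meaning a strongly coupled gauge theory, corresponds to weakly curved geometry on the string side, which is why the duality gives computational access to strong-coupling physics such as the behavior of hadrons.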

Criticism

Number of solutions

To construct models of particle physics based on string theory, physicists typically begin by specifying a shape for the extra dimensions of spacetime. Each of these different shapes corresponds to a different possible universe, or "vacuum state", with a different collection of particles and forces. String theory as it is currently understood has an enormous number of vacuum states, typically estimated to be around 10⁵⁰⁰, and these might be sufficiently diverse to accommodate almost any phenomena that might be observed at low energies.

Many critics of string theory have expressed concerns about the large number of possible universes described by string theory. In his book Not Even Wrong, Peter Woit, a lecturer in the mathematics department at Columbia University, has argued that the large number of different physical scenarios renders string theory vacuous as a framework for constructing models of particle physics. According to Woit,
The possible existence of, say, 10⁵⁰⁰ consistent different vacuum states for superstring theory probably destroys the hope of using the theory to predict anything. If one picks among this large set just those states whose properties agree with present experimental observations, it is likely there still will be such a large number of these that one can get just about whatever value one wants for the results of any new observation.
Some physicists believe this large number of solutions is actually a virtue because it may allow a natural anthropic explanation of the observed values of physical constants, in particular the small value of the cosmological constant. The anthropic principle is the idea that some of the numbers appearing in the laws of physics are not fixed by any fundamental principle but must be compatible with the evolution of intelligent life. In 1987, Steven Weinberg published an article in which he argued that the cosmological constant could not have been too large, or else galaxies and intelligent life would not have been able to develop. Weinberg suggested that there might be a huge number of possible consistent universes, each with a different value of the cosmological constant, and observations indicate a small value of the cosmological constant only because humans happen to live in a universe that has allowed intelligent life, and hence observers, to exist.

String theorist Leonard Susskind has argued that string theory provides a natural anthropic explanation of the small value of the cosmological constant. According to Susskind, the different vacuum states of string theory might be realized as different universes within a larger multiverse. The fact that the observed universe has a small cosmological constant is just a tautological consequence of the fact that a small value is required for life to exist. Many prominent theorists and critics have disagreed with Susskind's conclusions. According to Woit, "in this case [anthropic reasoning] is nothing more than an excuse for failure. Speculative scientific ideas fail not just when they make incorrect predictions, but also when they turn out to be vacuous and incapable of predicting anything."

Background independence

One of the fundamental properties of Einstein's general theory of relativity is that it is background independent, meaning that the formulation of the theory does not in any way privilege a particular spacetime geometry.

One of the main criticisms of string theory from early on is that it is not manifestly background independent. In string theory, one must typically specify a fixed reference geometry for spacetime, and all other possible geometries are described as perturbations of this fixed one. In his book The Trouble With Physics, physicist Lee Smolin of the Perimeter Institute for Theoretical Physics claims that this is the principal weakness of string theory as a theory of quantum gravity, saying that string theory has failed to incorporate this important insight from general relativity.

Others have disagreed with Smolin's characterization of string theory. In a review of Smolin's book, string theorist Joseph Polchinski writes
[Smolin] is mistaking an aspect of the mathematical language being used for one of the physics being described. New physical theories are often discovered using a mathematical language that is not the most suitable for them… In string theory it has always been clear that the physics is background-independent even if the language being used is not, and the search for more suitable language continues. Indeed, as Smolin belatedly notes, [AdS/CFT] provides a solution to this problem, one that is unexpected and powerful.
Polchinski notes that an important open problem in quantum gravity is to develop holographic descriptions of gravity which do not require the gravitational field to be asymptotically anti-de Sitter. Smolin has responded by saying that the AdS/CFT correspondence, as it is currently understood, may not be strong enough to resolve all concerns about background independence.

Sociology of science

Since the superstring revolutions of the 1980s and 1990s, string theory has become the dominant paradigm of high energy theoretical physics. Some string theorists have expressed the view that there does not exist an equally successful alternative theory addressing the deep questions of fundamental physics. In an interview from 1987, Nobel laureate David Gross made the following controversial comments about the reasons for the popularity of string theory:
The most important [reason] is that there are no other good ideas around. That's what gets most people into it. When people started to get interested in string theory they didn't know anything about it. In fact, the first reaction of most people is that the theory is extremely ugly and unpleasant, at least that was the case a few years ago when the understanding of string theory was much less developed. It was difficult for people to learn about it and to be turned on. So I think the real reason why people have got attracted by it is because there is no other game in town. All other approaches of constructing grand unified theories, which were more conservative to begin with, and only gradually became more and more radical, have failed, and this game hasn't failed yet.
Several other high-profile theorists and commentators have expressed similar views, suggesting that there are no viable alternatives to string theory.

Many critics of string theory have commented on this state of affairs. In his book criticizing string theory, Peter Woit views the status of string theory research as unhealthy and detrimental to the future of fundamental physics. He argues that the extreme popularity of string theory among theoretical physicists is partly a consequence of the financial structure of academia and the fierce competition for scarce resources. In his book The Road to Reality, mathematical physicist Roger Penrose expresses similar views, stating "The often frantic competitiveness that this ease of communication engenders leads to bandwagon effects, where researchers fear to be left behind if they do not join in." Penrose also claims that the technical difficulty of modern physics forces young scientists to rely on the preferences of established researchers, rather than forging new paths of their own. Lee Smolin expresses a slightly different position in his critique, claiming that string theory grew out of a tradition of particle physics which discourages speculation about the foundations of physics, while his preferred approach, loop quantum gravity, encourages more radical thinking. According to Smolin,
String theory is a powerful, well-motivated idea and deserves much of the work that has been devoted to it. If it has so far failed, the principal reason is that its intrinsic flaws are closely tied to its strengths—and, of course, the story is unfinished, since string theory may well turn out to be part of the truth. The real question is not why we have expended so much energy on string theory but why we haven't expended nearly enough on alternative approaches.
Smolin goes on to offer a number of prescriptions for how scientists might encourage a greater diversity of approaches to quantum gravity research.

Tuesday, January 1, 2019

Secularization

From Wikipedia, the free encyclopedia
Secularization (or secularisation) is the transformation of a society from close identification and affiliation with religious values and institutions toward nonreligious values and secular institutions. The secularization thesis refers to the belief that as societies progress, particularly through modernization and rationalization, religion loses its authority in all aspects of social life and governance. The term secularization is also used in the context of the lifting of the monastic restrictions from a member of the clergy.

Secularization refers to the historical process in which religion loses social and cultural significance. As a result of secularization the role of religion in modern societies becomes restricted. In secularized societies faith lacks cultural authority, and religious organizations have little social power.

Secularization has many levels of meaning, both as a theory and a historical process. Social theorists such as Karl Marx, Sigmund Freud, Max Weber, and Émile Durkheim, postulated that the modernization of society would include a decline in levels of religiosity. Study of this process seeks to determine the manner in which, or extent to which religious creeds, practices and institutions are losing social significance. Some theorists argue that the secularization of modern civilization partly results from our inability to adapt broad ethical and spiritual needs of mankind to the increasingly fast advance of the physical sciences.

In contrast to the “modernization” thesis, Christian Smith and others argue that secularization is promoted by intellectual and cultural elites to enhance their own status and influence. Smith believes intellectuals have an inherent tendency to be hostile to their native cultures, causing them to embrace secularism.

The term also has additional meanings, primarily historical and religious. Applied to church property, historically it refers to the seizure of monastic lands and buildings, such as Henry VIII's Dissolution of the Monasteries in England and the later acts during the French Revolution as well as by various anti-clerical enlightened absolutist European governments during the 18th and 19th centuries, which resulted in the expulsion and suppression of the religious communities which occupied them. The 19th-century Kulturkampf in Germany and Switzerland and similar events in many other countries also were expressions of secularization.

Still another form of secularization refers to the act of Prince-Bishops or holders of a position in a Monastic or Military Order - holding a combined religious and secular authority under the Catholic Church - who broke away and made themselves into completely secular (typically, Protestant) hereditary rulers. For example, Gotthard Kettler, the last Master of the Livonian Order, converted to Lutheranism, secularised (and took to himself) the lands of Semigallia and Courland which he had held on behalf of the order - which enabled him to marry and leave to his descendants the Duchy of Courland and Semigallia.

In the 1960s there was a shift toward secularization in Western Europe, North America, Australia, and New Zealand. This transformation was intertwined with major social factors: economic prosperity, youth rebelling against the rules and conventions of society, women's liberation, radical theology, and radical politics.

Background

Secularization is sometimes credited both to the cultural shifts in society following the emergence of rationality and the development of science as a substitute for superstition (Max Weber called this process the "disenchantment of the world") and to the changes made by religious institutions to compensate. At the most basic stages, this begins with a slow transition from oral traditions to a writing culture that diffuses knowledge. This first reduces the authority of clerics as the custodians of revealed knowledge. As the responsibility for education has moved from the family and community to the state, two consequences have arisen:
  • Collective conscience as defined by Durkheim is diminished
  • Fragmentation of communal activities leads to religion becoming more a matter of individual choice rather than an observed social obligation.
A major issue in the study of secularization is the extent to which certain trends such as decreased attendance at places of worship indicate a decrease in religiosity or simply a privatization of religious belief, where religious beliefs no longer play a dominant role in public life or in other aspects of decision making. 

The issue of secularization is discussed in various religious traditions. The government of Turkey is an often cited example, following the abolition of the Ottoman Caliphate and foundation of the Turkish republic in 1923. This established popular sovereignty in a secular republican framework, in opposition to a system whose authority is based on religion. As one of many examples of state modernization, this shows secularization and democratization as mutually reinforcing processes, relying on a separation of religion and state. In expressly secular states like India, it has been argued that the need was to legislate for toleration and respect between quite different religions, and likewise, the secularization of the West was a response to drastically violent intra-Christian feuds between Catholicism and Protestantism. Some have therefore argued that Western and Indian secularization is radically different in that it deals with autonomy from religious regulation and control. Considerations of both tolerance and autonomy are relevant to any secular state.

Definitions

C. John Sommerville (1998) outlined six uses of the term secularization in the scientific literature. The first five are more along the lines of 'definitions' while the sixth is more of a 'clarification of use':
  1. When discussing macro social structures, secularization can refer to differentiation: a process in which the various aspects of society, economic, political, legal, and moral, become increasingly specialized and distinct from one another.
  2. When discussing individual institutions, secularization can denote the transformation of a religious into a secular institution. Examples would be the evolution of institutions such as Harvard University from a predominantly religious institution into a secular institution (with a divinity school now housing the religious element illustrating differentiation).
  3. When discussing activities, secularization refers to the transfer of activities from religious to secular institutions, such as a shift in provision of social services from churches to the government.
  4. When discussing mentalities, secularization refers to the transition from ultimate concerns to proximate concerns. E.g., individuals in the West are now more likely to moderate their behavior in response to more immediately applicable consequences rather than out of concern for post-mortem consequences. This is a personal religious decline or movement toward a secular lifestyle.
  5. When discussing populations, secularization refers to broad patterns of societal decline in levels of religiosity as opposed to the individual-level secularization of (4) above. This understanding of secularization is also distinct from (1.) above in that it refers specifically to religious decline rather than societal differentiation.
  6. When discussing religion, secularization can only be used unambiguously to refer to religion in a generic sense. For example, a reference to Christianity is not clear unless one specifies exactly which denominations of Christianity are being discussed.
Abdel Wahab Elmessiri (2002) outlined two meanings of the term secularization:
  1. Partial Secularization: which is the common meaning of the word, and expresses "The separation between religion and state".
  2. Complete Secularization: this definition is not limited to the partial definition, but exceeds it to "The separation between all (religion, moral, and human) values, and (not just the state) but also to (the human nature in its public and private sides), so that the holiness is removed from the world, and this world is transformed into a usable matter that can be employed for the sake of the strong".

Sociological use and differentiation

As studied by sociologists, one of the major themes of secularization is that of "differentiation"—i.e., the tendency for areas of life to become more distinct and specialized as a society becomes modernized. European sociology, influenced by anthropology, was interested in the process of change from the so-called primitive societies to increasingly advanced societies. In the United States, the emphasis was initially on change as an aspect of progress, but Talcott Parsons refocused on society as a system immersed in a constant process of increased differentiation, which he saw as a process in which new institutions take over the tasks necessary in a society to guarantee its survival as the original monolithic institutions break up. This is a devolution from single, less differentiated institutions to an increasingly differentiated subset of institutions.

Following Parsons, this concept of differentiation has been widely applied. As phrased by José Casanova, this "core and the central thesis of the theory of secularization is the conceptualization of the process of societal modernization as a process of functional differentiation and emancipation of the secular spheres—primarily the state, the economy, and science—from the religious sphere and the concomitant differentiation and specialization of religion within its own newly found religious sphere". Casanova also describes this as the theory of "privatization" of religion, which he partially criticizes. While criticizing certain aspects of the traditional sociological theory of secularization, however, David Martin argues that the concept of social differentiation has been its "most useful element".

Current issues in secularization

At present, secularization as understood in the West is being debated in the sociology of religion. In his works Legitimacy of the Modern Age (1966) and The Genesis of the Copernican World (1975), Hans Blumenberg has rejected the idea of a historical continuity – fundamental to the so-called 'theorem of secularization'; the Modern age in his view represents an independent epoch opposed to Antiquity and the Middle Ages by a rehabilitation of human curiosity in reaction to theological absolutism. "Blumenberg targets Löwith's argument that progress is the secularization of Hebrew and Christian beliefs and argues to the contrary that the modern age, including its belief in progress, grew out of a new secular self-affirmation of culture against the Christian tradition." Wolfhart Pannenberg, a student of Löwith, has continued the debate against Blumenberg.

Charles Taylor in "A Secular Age" challenges what he calls 'the subtraction thesis' – that science leads to religion being subtracted from more and more areas of life.

Proponents of "secularization theory" demonstrate widespread declines in the prevalence of religious belief throughout the West, particularly in Europe. Some scholars (e.g., Rodney Stark, Peter Berger) have argued that levels of religiosity are not declining, while other scholars (e.g., Mark Chaves, N. J. Demerath) have countered by introducing the idea of neo-secularization, which broadens the definition of secularization to include the decline of religious authority and its ability to influence society. 

In other words, rather than using the proportion of irreligious apostates as the sole measure of secularity, neo-secularization argues that individuals increasingly look outside of religion for authoritative positions. Neo-secularizationists would argue that religion has diminishing authority on issues such as birth control, and argue that religion's authority is declining and secularization is taking place even if religious affiliation may not be declining in the United States (a debate still taking place).

Finally, some claim that demographic forces offset the process of secularization, and may do so to such an extent that individuals can consistently drift away from religion even as society becomes more religious. This is especially the case in societies like Israel (with the ultra-Orthodox and religious Zionists) where committed religious groups have several times the birth rate of seculars. The religious fertility effect operates to a greater or lesser extent in all countries, and is amplified in the West by religious immigration. For instance, even as native whites became more secular, London, England, has become more religious in the past 25 years as religious immigrants and their descendants have increased their share of the population.
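
As an arithmetic illustration of this last claim, consider a minimal two-group projection (the fertility and defection numbers below are invented for illustration; they are not taken from the studies cited). Even when a fixed fraction of every religious-born cohort drifts away, a large enough fertility gap makes the religious share of the population grow:

    # Toy model: "religious" vs. "secular" population sizes per generation.
    # Hypothetical parameters: fertility is children per person (half the
    # children-per-couple rate); `defection` is the fraction of each
    # religious-born cohort that becomes secular as adults.
    def religious_share(religious, secular, generations,
                        fert_religious=1.5, fert_secular=0.75,
                        defection=0.15):
        for _ in range(generations):
            born_religious = religious * fert_religious
            born_secular = secular * fert_secular
            leavers = born_religious * defection  # individuals drifting away
            religious = born_religious - leavers
            secular = born_secular + leavers
        return religious / (religious + secular)

    # Starting from a 50/50 split, the religious share still climbs:
    for g in range(5):
        print(g, round(religious_share(50.0, 50.0, g), 3))

Despite 15% of every religious-born generation leaving in this sketch, the religious share rises from 0.500 to roughly 0.567 after one generation and keeps climbing, which is the offsetting effect described above.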

Regional developments

United States

1870-1930. Christian Smith examined the secularization of American public life between 1870 and 1930. He noted that in 1870 a Protestant establishment thoroughly dominated American culture and its public institutions. By the turn of the 20th century, however, positivism had displaced the Baconian method (which had hitherto bolstered natural theology) and higher education had been thoroughly secularized. In the 1910s "legal realism" gained prominence, de-emphasizing the religious basis for law. That same decade publishing houses emerged that were independent of the Protestant establishment. During the 1920s secularization extended into popular culture and mass public education ceased to be under Protestant cultural influence. Although the general public was still highly religious during this time period, by 1930 the old Protestant establishment was in "shambles".

Key to understanding the secularization, Smith argues, was the rise of an elite intellectual class skeptical of religious orthodoxies and influenced by the European Enlightenment tradition. They consciously sought to displace a Protestant establishment they saw as standing in their way.

2008-2017. Annual Gallup polls from 2008 through 2015 showed that the fraction of Americans who did not identify with any particular religion steadily rose from 14.6% in 2008 to 19.6% in 2015. Over the same period, the fraction of Americans identifying as Christians sank from 80.1% to 75%. This trend continued until 2017, when 21.3% of Americans declared no religious identity. Given that non-Christian religions stayed roughly the same (at about 5% from 2008 to 2015), secularization seems to have affected primarily Christians.

Britain

History

In Britain, secularization came much later than in most of Western Europe. It began in the 1960s as part of a much larger social and cultural revolution. Until then the postwar years had seen a revival of religiosity in Britain. Sociologists and historians have engaged in vigorous debates over when it started, how fast it happened, and what caused it.

Sponsorship by royalty, aristocracy, and influential local gentry provided an important support-system for organized religion. The sponsorship faded away in the 20th century, as the local élites were no longer so powerful or so financially able to subsidize their favourite activities. In coal-mining districts, local collieries typically funded local chapels, but that ended as the industry grew distressed and the unionized miners rejected élite interference in their local affairs. This allowed secularizing forces to gain strength.

Recent developments

Data from the annual British Social Attitudes survey and the biennial European Social Survey suggest that the proportion of Britons who identify as Christian fell from 55% (in 1983) to 43% (in 2015). While members of non-Christian religions – principally Muslims and Hindus – quadrupled, the non-religious ("nones") now make up 53% of the British population. More than six in 10 “nones” were brought up as Christians, mainly Anglican or Catholic. Only 2% of “nones” were raised in religions other than Christian.

People who were brought up to practise a religion, but who now identify as having no religion, so-called "non-verts", had different "non-version" rates, namely 14% for Jews, 10% for Muslims and Sikhs and 6% for Hindus. The proportions of the non-religious who convert to a faith are small: 3% now identify as Anglicans, less than 0.5% convert to Catholicism, 2% join other Christian denominations and 2% convert to non-Christian faiths.

India

Hinduism, which is the dominant way of life in India, has been described as a 'culture and civilisation of ancient origin' that is 'intrinsically secular'. India, post-independence, has seen the emergence of an assertive secular state.

China

One traditional view of Chinese culture sees the teachings of Confucianism - influential over many centuries - as basically secular.

Chang Pao-min summarises perceived historical consequences of very early secularization in China:
The early secularization of Chinese society, which must be recognized as a sign of modernity [...] has ironically left China for centuries without a powerful and stable source of morality and law. All this simply means that the pursuit of wealth or power or simply the competition for survival can be and often has been ruthless without any sense of restraint. [...] Along with the early secularization of Chinese society which was equally early, the concomitant demise of feudalism and hereditary aristocracy, another remarkable development, transformed China earlier than any other country into a unitary system politically, with one single power center. It also rendered Chinese society much more egalitarian than Western Europe and Japan.
In this arguably secular setting, the Chinese Communist Party régime of the People's Republic of China (in power on the Chinese mainland from 1949) promoted deliberate secularization.

Arab world

Many countries in the Arab world show signs of increasing secularization. For instance, in Egypt, support for imposing sharia (Islamic law) fell from 84% in 2011 to 34% in 2016. Egyptians also pray less: among older Egyptians (55+), 90% prayed daily in 2011, while among the younger generation (age 18-24) that fraction was only 70%. By 2016 these numbers had fallen to less than 80% (55+) and under 40% (18-24); the other age groups were in between these values. In Lebanon and Morocco, the number of people listening to daily recitals of the Qur'an fell by half from 2011 to 2016. Some of these developments seem to be driven by need, e.g. by stagnating incomes which force women to contribute to household income and therefore to work. High living costs delay marriage and, as a consequence, seem to encourage pre-marital sex. However, in other countries, such as Algeria, Jordan, and Palestine, support for sharia and Islamist ideas seems to grow. Even in countries in which secularization is growing, there are backlashes. For instance, the president of Egypt, Abdel-Fattah al-Sisi, has banned hundreds of newspapers and websites that may provoke opposition.

Exegesis

From Wikipedia, the free encyclopedia
 
A Bible open to the Book of Isaiah
 
Exegesis (/ˌɛksɪˈdʒiːsɪs/; from the Greek ἐξήγησις from ἐξηγεῖσθαι, "to lead out") is a critical explanation or interpretation of a text, particularly a religious text. Traditionally the term was used primarily for work with the Bible; however, in modern usage "biblical exegesis" is used for greater specificity to distinguish it from any other broader critical text explanation.

Exegesis includes a wide range of critical disciplines: textual criticism is the investigation into the history and origins of the text, but exegesis may include the study of the historical and cultural backgrounds of the author, text, and original audience. Other analyses include classification of the type of literary genres presented in the text and analysis of grammatical and syntactical features in the text itself. 

The terms exegesis and hermeneutics have been used interchangeably.

Usage

One who practices exegesis is called an exegete (/ˌɛksɪˈdʒiːt/; from Greek ἐξηγητής). The plural of exegesis is exegeses (/ˌɛksɪˈdʒiːsiːz/). Adjectives are exegetic or exegetical (e.g., exegetical commentaries). In biblical exegesis, the opposite of exegesis (to draw out) is eisegesis (to draw in), in the sense of an eisegetic commentator "importing" or "drawing in" his or her own purely subjective interpretations into the text, unsupported by the text itself. Eisegesis is often used as a derogatory term.

Mesopotamian commentaries

The earliest examples, and also one of the largest corpora of text commentaries from the ancient world, come from Mesopotamia (modern Iraq) in the first millennium BCE. Known from over 860 manuscripts, the majority of which date to the period 700–100 BCE, these commentaries explore numerous types of texts, including literary works (such as the Babylonian Epic of Creation), medical treatises, magical texts, ancient dictionaries, and law collections (the Code of Hammurabi). Most of them, however, comment on divination treatises, in particular treatises that predict the future from the appearance and movement of celestial bodies on the one hand (Enūma Anu Enlil), and from the appearance of a sacrificed sheep's liver on the other (Bārûtu).

As with the majority of the thousands of texts from the ancient Near East that have survived to the present day, Mesopotamian text commentaries are written on clay tablets in cuneiform script. Text commentaries are written in the East Semitic language of Akkadian, but due to the influence of lexical lists written in Sumerian language on cuneiform scholarship, they often contain Sumerian words or phrases as well. 

Cuneiform commentaries are important because they provide information about Mesopotamian languages and culture that are not available elsewhere in the cuneiform record. To give but one example, the pronunciation of the cryptically written name of Gilgamesh, the hero of the Epic of Gilgamesh, was discovered in a cuneiform commentary on a medical text. However, the significance of cuneiform commentaries extends beyond the light they shed on specific details of Mesopotamian civilization. They open a window onto what the concerns of the Mesopotamian literate elite were when they read some of the most widely studied texts in the Mesopotamian intellectual tradition, a perspective that is important for “seeing things their way.” Finally, cuneiform commentaries are also the earliest examples of textual interpretation. It has been repeatedly argued that they influenced rabbinical exegesis.

The publication and interpretation of these texts began in the mid-nineteenth century, with the discovery of the royal Assyrian libraries at Nineveh, from which ca. 454 text commentaries have been recovered. The study of cuneiform commentaries is, however, far from complete. It is the subject of on-going research by the small, international community of scholars who specialize in the field of Assyriology.

Bible commentaries

A common published form of biblical exegesis is known as a Bible commentary and typically takes the form of a set of books, each of which is devoted to the exposition of one or two books of the Bible. Long books or those that contain much material either for theological or historical-critical speculation, such as Genesis or Psalms, may be split over two or three volumes. Some, such as the Four Gospels, may be multiple- or single-volume, while short books such as the deuterocanonical portions of Daniel, Esther, and Jeremiah (i.e. Book of Susanna, Prayer of Azariah, Bel and the Dragon, Additions to Esther, Baruch and the Epistle of Jeremiah), or the pastoral or Johannine epistles are often condensed into one volume. 

The form of each book may be identical or allow for variations in methodology between the many authors who collaborate to write a full commentary. Each book's commentary generally consists of a background and introductory section, followed by detailed commentary of the book pericope-by-pericope or verse-by-verse. Before the 20th century, a commentary would be written by a sole author, but today a publishing board will commission a team of scholars to write a commentary, with each volume being divided out among them.

A single commentary will generally attempt to give a coherent and unified view on the Bible as a whole, for example, from a Catholic or Reformed (Calvinist) perspective, or a commentary that focuses on textual criticism or historical criticism from a secular point of view. However, each volume will inevitably lean toward the personal emphasis of its author, and within any commentary there may be great variety in the depth, accuracy, and critical or theological strength of each volume.

Christianity

Views

The main Christian exegetical methods are historical-grammatical, historical criticism, revealed, and rational. 

The historical-grammatical method is a Christian hermeneutical method that strives to discover the Biblical author's original intended meaning in the text. It is the primary method of interpretation for many conservative Protestant exegetes who reject the historical-critical method to various degrees (from the complete rejection of historical criticism of some fundamentalist Protestants to the moderated acceptance of it in the Catholic Church since Pope Pius XII), in contrast to the overwhelming reliance on historical-critical interpretation, often to the exclusion of all other hermeneutics, in liberal Christianity.

Historical criticism, also known as the historical-critical method or higher criticism, is a branch of literary criticism that investigates the origins of ancient texts in order to understand "the world behind the text". This is done to discover the text's primitive or original meaning in its original historical context and its literal sense.

Revealed exegesis considers that the Holy Spirit inspired the authors of the scriptural texts, and so the words of those texts convey a divine revelation. In this view of exegesis, the principle of sensus plenior applies — that because of its divine authorship, the Bible has a "fuller meaning" than its human authors intended or could have foreseen. 

Rational exegesis bases its operation on the idea that the authors have their own inspiration (in this sense, synonymous with artistic inspiration), so their works are completely and utterly a product of the social environment and human intelligence of their authors.

Catholic

Catholic centres of biblical exegesis include:

Protestant

For more than a century, German universities such as Tübingen have had reputations as centers of exegesis; in the USA, the Divinity Schools of Chicago, Harvard and Yale became famous.

Robert A. Traina's book Methodical Bible Study is an example of Protestant Christian exegesis.

Judaism

Traditional Jewish forms of exegesis appear throughout rabbinic literature, which includes the Mishnah, the two Talmuds, and the midrash literature.

Jewish exegetes have the title mefarshim מפרשים‬ (commentators).

Midrash

The Midrash is a homiletic method of exegesis and a compilation of homiletic teachings or commentaries on the Tanakh (Hebrew Bible), a biblical exegesis of the Pentateuch and its paragraphs related to the Law or Torah, which also forms an object of analysis. It comprises the legal and ritual Halakha, the collective body of Jewish laws, and exegesis of the written Law; and the non-legalistic Aggadah, a compendium of Rabbinic homilies of the parts of the Pentateuch not connected with Law.
Biblical interpretation by the Tannaim and the Amoraim, which may be best designated as scholarly interpretations of the Midrash, was a product of natural growth and of great freedom in the treatment of the words of the Bible. However, it proved an obstacle to further development when, endowed with the authority of a sacred tradition in the Talmud and in the Midrash (collections edited subsequently to the Talmud), it became the sole source for the interpretation of the Bible among later generations. Traditional literature contains explanations that are in harmony with the wording and the context. It reflects evidence of linguistic sense, judgment, and an insight into the peculiarities and difficulties of the biblical text. But side by side with these elements of a natural and simple Bible exegesis, of value even today, the traditional literature contains an even larger mass of expositions removed from the actual meaning of the text.

Halakha and Aggadah

In the halakhic as well as in the haggadic exegesis the expounder endeavored not so much to seek the original meaning of the text as to find authority in some Bible passage for concepts and ideas, rules of conduct and teachings, for which he wished to have a biblical foundation. The talmudical hermeneutics form asmachta is defined as finding hints for a given law rather than basing it on the Bible text. To this were added, on the one hand, the belief that the words of the Bible had many meanings, and, on the other, the importance attached to the smallest portion, the slightest peculiarity of the text. Because of this move towards particularities the exegesis of the Midrash strayed further and further away from a natural and common-sense interpretation.

Midrash

Midrash exegesis was largely in the nature of homiletics, expounding the Bible not in order to investigate its actual meaning and to understand the documents of the past but to find religious edification, moral instruction, and sustenance for the thoughts and feelings of the present. The contrast between explanation of the literal sense and the Midrash, that did not follow the words, was recognized by the Tannaim and the Amoraim, although their idea of the literal meaning of a biblical passage may not be allowed by more modern standards. The above-mentioned tanna, Ishmael b. Elisha said, rejecting an exposition of Eliezer b. Hyrcanus: "Truly, you say to Scripture, 'Be silent while I am expounding!'" (Sifra on Lev. xiii. 49).

Tannaim

Tannaitic exegesis distinguishes principally between the actual deduction of a thesis from a Bible passage as a means of proving a point, and the use of such a passage as a mere mnemonic device – a distinction that was also made in a different form later in the Babylonian schools.

The Babylonian Amoraim were the first to use the expression "Peshaṭ" ("simple" or face value method) to designate the primary sense, contrasting it with the "Drash," the Midrashic exegesis. These two terms were later on destined to become important features in the history of Jewish Bible exegesis. In Babylonia was formulated the important principle that the Midrashic exegesis could not annul the primary sense. This principle subsequently became the watchword of commonsense Bible exegesis. How little it was known or recognized may be seen from the admission of Kahana, a Babylonian amora of the fourth century, that while at 18 years of age he had already learned the whole Mishnah, he had only heard of that principle a great many years later (Shab 63a).

Kahana's admission is characteristic of the centuries following the final redaction of the Talmud. The primary meaning is no longer considered, but it becomes more and more the fashion to interpret the text according to the meaning given to it in traditional literature. The ability and even the desire for original investigation of the text succumbed to the overwhelming authority of the Midrash. It was, therefore, providential that, just at the time when the Midrash was paramount, the close study of the text of the Bible, at least in one direction, was pursued with rare energy and perseverance by the Masorites, who set themselves to preserving and transmitting the pronunciation and correct reading of the text. By introducing punctuation (vowel-points and accents) into the biblical text, in the seventh century, they supplied that protecting hedge which, according to Rabbi Akiva's saying, the Masorah was to be for the words of the Bible. Punctuation, on the one hand, protected the tradition from being forgotten, and, on the other, was the precursor of an independent Bible science to be developed in a later age.

Mikra

The Mikra, the fundamental part of the national science, was the subject of the primary instruction. It was also divided into the three historic groups of the books of the Bible: the Pentateuch, the Prophets, and the Hagiographa, called in traditional Hebrew attribution the Torah (the Law or Teaching), the Nevi'im (the Prophets) and the Kethuvim (the Writings) respectively. The intelligent reading and comprehension of the text, arrived at by a correct division of the sentences and words, formed the course of instruction in the Bible. The scribes were also required to know the Targum, the Aramaic translation of the text. The Targum made possible an immediate comprehension of the text, but was continuously influenced by the exegesis taught in the schools. The synagogues were preeminently the centers for instruction in the Bible and its exegesis. The reading of the biblical text, which was combined with that of the Targum, served to widen the knowledge of the scholars learned in the first division of the national science. The scribes found the material for their discourses, which formed a part of the synagogue service, in the second division of the several branches of the tradition. The Haggadah, the third of these branches, was the source material for the sermon.

Jewish exegesis did not finish with the redaction of the Talmud, but continued during ancient times, the Middle Ages and the Renaissance; it remains a subject of study today. Jews have centres for exegetic studies around the world, in each community: they consider exegesis an important tool for the understanding of the Scriptures.

Indian philosophy

The Mimamsa school of Indian philosophy, also known as Pūrva Mīmāṃsā ("prior" inquiry, also Karma-Mīmāṃsā), in contrast to Uttara Mīmāṃsā ("posterior" inquiry, also Brahma-Mīmāṃsā), is strongly concerned with textual exegesis, and consequently gave rise to the study of philology and the philosophy of language. Its notion of shabda "speech" as indivisible unity of sound and meaning (signifier and signified) is due to Bhartrhari (7th century).

Islam

Tafsir (Arabic: تفسير‎, tafsīr, "interpretation") is the Arabic word for exegesis or commentary, usually of the Qur'an. An author of tafsīr is a mufassir (Arabic: 'مُفسر‎, mufassir, plural: Arabic: مفسرون‎, mufassirūn). 

Tafsir does not include esoteric or mystical interpretations, which are covered by the related word Ta'wil. The Shi'ite organization Ahlul Bayt Digital Islamic Library Project cites the Islamic prophet Muhammad as stating that the Qur'an has an inner meaning, and that this inner meaning conceals an even deeper inner meaning, in support of this view. Adherents of Sufism and Ilm al-Kalam pioneered this thought.

Zoroastrianism

Zoroastrian exegesis consists basically of the interpretation of the Avesta. However, the closest equivalent Iranian concept, zand, generally includes Pahlavi texts which were believed to derive from commentaries upon Avestan scripture, but whose extant form contains no Avestan passages. Zoroastrian exegesis differs from similar phenomena in many other religions in that it developed as part of a religious tradition which made little or no use of writing until well into the Sasanian era. This lengthy period of oral transmission has clearly helped to give the Middle Persian Zand its characteristic shape and has, in a sense, limited its scope. Although the later tradition makes a formal distinction between “Gathic” (gāhānīg), “legal” (dādīg), and perhaps “ritual” (hādag-mānsrīg) Avestan texts, there appear to be no significant differences in approach between the Pahlavi commentary on the Gathas and those on dādīg texts, such as the Vendīdād, the Hērbedestān and the Nērangestān. Since many 19th and 20th century works by Zoroastrians contain an element of exegesis, while on the other hand no exegetical literature in the strict sense of the word can be said to exist, the phenomenon of modern Zoroastrian exegesis as such will be discussed here, without detailed reference to individual texts.

In a secular context

Several universities, including the Sorbonne in Paris, Leiden University, and the Université Libre de Bruxelles (Free University of Brussels), put exegesis in a secular context, next to exegesis in a religious tradition. Secular exegesis is an element of the study of religion.

At Australian universities, the exegesis is part of practice-based doctorate projects. It is a scholarly text accompanying a film, literary text, etc. produced by the PhD candidate.

Lie point symmetry

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Lie_point_symmetry ...