
Monday, February 3, 2020

Being and Time

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Being_and_Time
 
Being and Time
[Cover of the first edition]
Author: Martin Heidegger
Original title: Sein und Zeit
Translators: John Macquarrie and Edward Robinson (1962); Joan Stambaugh (1996)
Country: Germany
Language: German
Subject: Being
Published: 1927 (in German); SCM Press (1962); State University of New York Press (1996); Harper Perennial Modern Thought (2008)
Pages: 589 (Macquarrie and Robinson translation); 482 (Stambaugh translation)
ISBN: 0-631-19770-2 (Blackwell edition); 978-1-4384-3276-2 (State University of New York Press edition)
Followed by: Kant and the Problem of Metaphysics

Being and Time (German: Sein und Zeit) is a 1927 book by the German philosopher Martin Heidegger, in which the author seeks to analyse the concept of Being. Heidegger maintains that this has fundamental importance for philosophy and that, since the time of the Ancient Greeks, philosophy has avoided the question, turning instead to the analysis of particular beings. Heidegger attempts to revive ontology through a reawakening of the question of the meaning of being. He approaches this through a fundamental ontology that is a preliminary analysis of the being of the being to whom the question of being is important, i.e., Dasein.

Heidegger wrote that Being and Time was made possible by his study of Edmund Husserl's Logical Investigations (1900–1901), and it is dedicated to Husserl "in friendship and admiration". Although Heidegger did not complete the project outlined in the introduction, Being and Time remains his most important work. It was immediately recognized as an original and groundbreaking philosophical work, and later became a focus of debates and controversy, and a profound influence on 20th-century philosophy, particularly existentialism, hermeneutics, deconstruction, and the enactivist approach to cognition. Being and Time has been described as the most influential version of existential philosophy, and Heidegger's achievements in the work have been compared to those of Immanuel Kant in the Critique of Pure Reason (1781) and Georg Wilhelm Friedrich Hegel in The Phenomenology of Spirit (1807) and Science of Logic (1812–1816). The work influenced philosophical treatises such as Jean-Paul Sartre's Being and Nothingness (1943).

Background

According to Heidegger's statement in Being and Time, the work was made possible by his study of Husserl's Logical Investigations (1900–1901). Being and Time was originally intended to consist of two major parts, each part consisting of three divisions. Heidegger was forced to prepare the book for publication when he had completed only the first two divisions of part one. The remaining divisions planned for Being and Time (particularly the divisions on time and being, Immanuel Kant, and Aristotle) were never published, although in many respects they were addressed in one form or another in Heidegger's other works. In terms of structure, Being and Time remains as it was when it first appeared in print; it consists of the lengthy two-part introduction, followed by Division One, the "Preparatory Fundamental Analysis of Dasein," and Division Two, "Dasein and Temporality." 

Summary


Being

Heidegger describes his project in the following way: "our aim in the following treatise is to work out the question of the sense of being and to do so concretely." Heidegger claims that traditional ontology has prejudicially overlooked this question, dismissing it on the grounds that being is the most universal and emptiest concept, one that is indefinable or obvious.

Instead Heidegger proposes to understand being itself, as distinguished from any specific entities (beings). "'Being' is not something like a being." Being, Heidegger claims, is "what determines beings as beings, that in terms of which beings are already understood." Heidegger is seeking to identify the criteria or conditions by which any specific entity can show up at all (see world disclosure).

If we grasp Being, we will clarify the meaning of being, or "sense" of being (Sinn des Seins), where by "sense" Heidegger means that "in terms of which something becomes intelligible as something." According to Heidegger, this sense of being precedes any notions of how or in what manner any particular being or beings exist, and is thus pre-scientific. Thus, in Heidegger's view, the question of the meaning of being would be an explanation of the understanding that precedes any other way of knowing, such as logic, theory, or a specific regional ontology. At the same time, there is no access to being other than via beings themselves—hence pursuing the question of being inevitably means questioning a being with regard to its being. Heidegger argues that a true understanding of being (Seinsverständnis) can only proceed by referring to particular beings, and that the best method of pursuing being must inevitably involve a kind of hermeneutic circle, that is (as he explains in his critique of prior work in the field of hermeneutics), it must rely upon repetitive yet progressive acts of interpretation. "The methodological sense of phenomenological description is interpretation."

Dasein

Thus the question Heidegger asks in the introduction to Being and Time is: what is the being that will give access to the question of the meaning of Being? Heidegger's answer is that it can only be that being for whom the question of Being is important, the being for whom Being matters. As this answer already indicates, the being for whom Being is a question is not a what, but a who. Heidegger calls this being Dasein (an ordinary German word literally meaning "being-there," i.e., existence), and the method pursued in Being and Time consists in the attempt to delimit the characteristics of Dasein, in order thereby to approach the meaning of Being itself through an interpretation of the temporality of Dasein. Dasein is not "man," but is nothing other than "man"—it is this distinction that enables Heidegger to claim that Being and Time is something other than philosophical anthropology.

Heidegger's account of Dasein passes through a dissection of the experiences of Angst and mortality, and then through an analysis of the structure of "care" as such. From there he raises the problem of "authenticity," that is, the potentiality or otherwise for mortal Dasein to exist fully enough that it might actually understand being. Heidegger is clear throughout the book that nothing makes certain that Dasein is capable of this understanding. 

Time

Finally, this question of the authenticity of individual Dasein cannot be separated from the "historicality" of Dasein. On the one hand, Dasein, as mortal, is "stretched along" between birth and death, and thrown into its world, that is, thrown into its possibilities, possibilities which Dasein is charged with the task of assuming. On the other hand, Dasein's access to this world and these possibilities is always via a history and a tradition—this is the question of "world historicality," and among its consequences is Heidegger's argument that Dasein's potential for authenticity lies in the possibility of choosing a "hero."

Thus, more generally, the outcome of the progression of Heidegger's argument is the thought that the being of Dasein is time. Nevertheless, Heidegger concludes his work with a set of enigmatic questions foreshadowing the necessity of a destruction (that is, a transformation) of the history of philosophy in relation to temporality—these were the questions to be taken up in the never completed continuation of his project:
The existential and ontological constitution of the totality of Dasein is grounded in temporality. Accordingly, a primordial mode of temporalizing of ecstatic temporality itself must make the ecstatic project of being in general possible. How is this mode of temporalizing of temporality to be interpreted? Is there a way leading from primordial time to the meaning of being? Does time itself reveal itself as the horizon of being?

Phenomenology in Heidegger and Husserl

Although Heidegger describes his method in Being and Time as phenomenological, the question of its relation to the phenomenology of Husserl is complex. The fact that Heidegger believes that ontology includes an irreducible hermeneutic (interpretative) aspect, for example, might be thought to run counter to Husserl's claim that phenomenological description is capable of a form of scientific positivity. On the other hand, however, several aspects of the approach and method of Being and Time seem to relate more directly to Husserl's work.

The central Husserlian concept of the directedness of all thought—intentionality—for example, while scarcely mentioned in Being and Time, has been identified by some with Heidegger's central notion of Sorge (cura, care or concern). However, for Heidegger, theoretical knowledge represents only one kind of intentional behaviour, and he asserts that it is grounded in more fundamental modes of behaviour and forms of practical engagement with the surrounding world. Whereas a theoretical understanding of things grasps them according to "presence," for example, this may conceal that our first experience of a being may be in terms of its being "ready-to-hand." Thus, for instance, when someone reaches for a tool such as a hammer, their understanding of the hammer is determined not by a theoretical understanding of its presence, but by the fact that it is something needed at the moment one wishes to hammer. Only a later understanding might come to contemplate the hammer as an object.

Hermeneutics

The total understanding of being results from an explication of the implicit knowledge of being that inheres in Dasein. Philosophy thus becomes a form of interpretation, but since there is no external reference point outside being from which to begin this interpretation, the question becomes to know in which way to proceed with this interpretation. This is the problem of the "hermeneutic circle," and the necessity for the interpretation of the meaning of being to proceed in stages: this is why Heidegger's technique in Being and Time is sometimes referred to as hermeneutical phenomenology.

Destructuring of metaphysics

As part of his ontological project, Heidegger undertakes a reinterpretation of previous Western philosophy. He wants to explain why and how theoretical knowledge came to seem like the most fundamental relation to being. This explanation takes the form of a destructuring (Destruktion) of the philosophical tradition, an interpretative strategy that reveals the fundamental experience of being at the base of previous philosophies that had become entrenched and hidden within the theoretical attitude of the metaphysics of presence. This use of the word Destruktion is meant to signify not a negative operation but rather a positive transformation or recovery.

In Being and Time Heidegger briefly undertakes a destructuring of the philosophy of René Descartes, but the second volume, which was intended to be a Destruktion of Western philosophy in all its stages, was never written. In later works Heidegger uses this approach to interpret the philosophies of Aristotle, Kant, Hegel, Plato, Nietzsche, and Hölderlin, among others.

Related work

Being and Time is the major achievement of Heidegger's early career, but he produced other important works during this period:
  • The publication in 1992 of the early lecture course, Platon: Sophistes (Plato's Sophist, 1924), made clear the way in which Heidegger's reading of Aristotle's Nicomachean Ethics was crucial to the formulation of the thought expressed in Being and Time.
  • The lecture course, Prolegomena zur Geschichte des Zeitbegriffs (History of the Concept of Time: Prolegomena, 1925), was something like an early version of Being and Time.
  • The lecture courses immediately following the publication of Being and Time, such as Die Grundprobleme der Phänomenologie (The Basic Problems of Phenomenology, 1927), and Kant und das Problem der Metaphysik (Kant and the Problem of Metaphysics, 1929), elaborated some elements of the destruction of metaphysics which Heidegger intended to pursue in the unwritten second part of Being and Time.
Although Heidegger did not complete the project outlined in Being and Time, later works explicitly addressed the themes and concepts of Being and Time. Most important among the works which do so are the following:
  • Heidegger's inaugural lecture upon his return to Freiburg, "Was ist Metaphysik?" (What Is Metaphysics?, 1929), was an important and influential clarification of what Heidegger meant by being, non-being, and nothingness.
  • Einführung in die Metaphysik (An Introduction to Metaphysics), a lecture course delivered in 1935, is identified by Heidegger, in his preface to the seventh German edition of Being and Time, as relevant to the concerns which the second half of the book would have addressed.
  • Beiträge zur Philosophie (Vom Ereignis) (Contributions to Philosophy [From Enowning], composed 1936–38, published 1989), a sustained attempt at reckoning with the legacy of Being and Time.
  • Zeit und Sein (Time and Being), a lecture delivered at the University of Freiburg on January 31, 1962. This was Heidegger's most direct confrontation with Being and Time. It was followed by a seminar on the lecture, which took place at Todtnauberg on September 11–13, 1962, a summary of which was written by Alfred Guzzoni. Both the lecture and the summary of the seminar are included in Zur Sache des Denkens (1969; translated as On Time and Being [New York: Harper & Row, 1972]).

Influence and reception

The critic George Steiner argues that Being and Time is a product of the crisis of German culture following Germany's defeat in World War I, similar in this respect to works such as Ernst Bloch's The Spirit of Utopia (1918), Oswald Spengler's The Decline of the West (1918), Franz Rosenzweig's The Star of Redemption (1921), Karl Barth's The Epistle to the Romans (1922), and Adolf Hitler's Mein Kampf (1925). Upon its publication, it was recognized as a groundbreaking philosophical work, with reviewers crediting Heidegger with "brilliance" and "genius". The book, which has been described as the "most influential version of existential philosophy", quickly became "the focus of debates and controversy". Heidegger claimed in the 1930s that commentators had attempted to show similarities between his views and those of Hegel in order to undermine the idea that Being and Time was an original work. In response, Heidegger maintained that his thesis that the essence of being is time is the opposite of Hegel's view that being is the essence of time. Karl Jaspers, writing in the first volume of his work Philosophy (1932), credited Heidegger with making essential points about "being in the world" and also about "existence and historicity".

Heidegger's work has been suggested as a possible influence on Herbert Marcuse's Hegel's Ontology and the Theory of Historicity (1932), though Marcuse later questioned the political implications of Heidegger's work. Jean-Paul Sartre, who wrote Being and Nothingness (1943) under the influence of Heidegger's work, has been said to have responded to Being and Time with "a sense of shock". Sartre's existentialism has been described as "a version and variant of the idiom and propositions" in Being and Time. Because of Heidegger's revival of the question of being, Being and Time also influenced other philosophers of Sartre's generation, and it altered the course of French philosophy. Maurice Merleau-Ponty argued in Phenomenology of Perception (1945) that Being and Time "springs from an indication given by Husserl and amounts to no more than an explicit account of the 'natürlicher Weltbegriff' or the 'Lebenswelt' which Husserl, towards the end of his life, identified as the central theme of phenomenology". Heidegger influenced psychoanalysis through Jacques Lacan, who quotes from Being and Time in a 1953 text.

The publication of the English translation of the work by John Macquarrie and Edward Robinson in 1962 helped to shape the way in which Heidegger's work was discussed in English. Gilles Deleuze's Difference and Repetition (1968) was influenced by Heidegger's Being and Time, though Deleuze replaces Heidegger's key terms of being and time with difference and repetition respectively. Frank Herbert's science fiction novel The Santaroga Barrier (1968) was loosely based on the ideas of Being and Time. The philosopher Lucien Goldmann argued in his posthumously published Lukacs and Heidegger: Towards a New Philosophy (1973) that the concept of reification as employed in Being and Time showed the strong influence of György Lukács' History and Class Consciousness (1923), though Goldmann's suggestion has been disputed. Being and Time influenced Alain Badiou's work Being and Event (1988). Roger Scruton writes that Being and Time is "the most complex of the many works inspired, directly or indirectly, by Kant's theory of time as 'the form of inner sense'." He considers Heidegger's language "metaphorical" and almost incomprehensible. Scruton suggests that this necessarily follows from the nature of Heidegger's phenomenological method. He finds Heidegger's "description of the world of phenomena" to be "fascinating, but maddeningly abstract". He suggests that much of Being and Time is a "description of a private spiritual journey" rather than genuine philosophy, and notes that Heidegger's assertions are unsupported by argument.

Stephen Houlgate compares Heidegger's achievements in Being and Time to those of Kant in the Critique of Pure Reason (1781) and Hegel in The Phenomenology of Spirit (1807) and Science of Logic (1812–1816). Simon Critchley calls the work Heidegger's magnum opus, and writes that it is impossible to understand developments in continental philosophy after Heidegger without understanding it. Dennis J. Schmidt praises the "range and subtlety" of Being and Time, and describes its importance by quoting a comment the writer Johann Wolfgang von Goethe made in a different context: "from here and today a new epoch of world history sets forth." Heidegger has become common background for the political movement concerned with protection of the environment, and his narrative of the history of Being frequently appears when capitalism, consumerism, and technology are thoughtfully opposed. Michael E. Zimmerman writes: "Because he criticized technological modernity's domineering attitude toward nature, and because he envisioned a postmodern era in which people would 'let things be,' Heidegger has sometimes been read as an intellectual forerunner of today's 'deep ecology' movement."

Being and Time also influenced the enactivist approach to cognition.

Brans–Dicke theory

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Brans%E2%80%93Dicke_theory
 
In theoretical physics, the Brans–Dicke theory of gravitation (sometimes called the Jordan–Brans–Dicke theory) is a theoretical framework to explain gravitation. It is a competitor of Einstein's theory of general relativity. It is an example of a scalar–tensor theory, a gravitational theory in which the gravitational interaction is mediated by a scalar field as well as the tensor field of general relativity. The gravitational constant G is not presumed to be constant but instead 1/G is replaced by a scalar field which can vary from place to place and with time.

The theory was developed in 1961 by Robert H. Dicke and Carl H. Brans building upon, among others, the earlier 1959 work of Pascual Jordan. At present, both Brans–Dicke theory and general relativity are generally held to be in agreement with observation. Brans–Dicke theory represents a minority viewpoint in physics.

Comparison with general relativity

Both Brans–Dicke theory and general relativity are examples of a class of relativistic classical field theories of gravitation, called metric theories. In these theories, spacetime is equipped with a metric tensor, $g_{ab}$, and the gravitational field is represented (in whole or in part) by the Riemann curvature tensor $R_{abcd}$, which is determined by the metric tensor.

All metric theories satisfy the Einstein equivalence principle, which in modern geometric language states that in a very small region (too small to exhibit measurable curvature effects), all the laws of physics known in special relativity are valid in local Lorentz frames. This implies in turn that metric theories all exhibit the gravitational redshift effect.

As in general relativity, the source of the gravitational field is considered to be the stress–energy tensor or matter tensor. However, the way in which the immediate presence of mass-energy in some region affects the gravitational field in that region differs from general relativity. So does the way in which spacetime curvature affects the motion of matter. In the Brans–Dicke theory, in addition to the metric, which is a rank-two tensor field, there is a scalar field, $\varphi$, which has the physical effect of changing the effective gravitational constant from place to place. (This feature was actually a key desideratum of Dicke and Brans; see the paper by Brans cited below, which sketches the origins of the theory.)

The field equations of Brans–Dicke theory contain a parameter, $\omega$, called the Brans–Dicke coupling constant. This is a true dimensionless constant which must be chosen once and for all. However, it can be chosen to fit observations. Such parameters are often called tuneable parameters. In addition, the present ambient value of the effective gravitational constant must be chosen as a boundary condition. General relativity contains no dimensionless parameters whatsoever, and therefore is easier to falsify (show whether false) than Brans–Dicke theory. Theories with tuneable parameters are sometimes deprecated on the principle that, of two theories which both agree with observation, the more parsimonious is preferable. On the other hand, it seems as though they are a necessary feature of some theories, such as the weak mixing angle of the Standard Model.

Brans–Dicke theory is "less stringent" than general relativity in another sense: it admits more solutions. In particular, exact vacuum solutions to the Einstein field equation of general relativity, augmented by the trivial scalar field $\varphi = 1$, become exact vacuum solutions in Brans–Dicke theory, but some spacetimes which are not vacuum solutions to the Einstein field equation become, with the appropriate choice of scalar field, vacuum solutions of Brans–Dicke theory. Similarly, an important class of spacetimes, the pp-wave metrics, are also exact null dust solutions of both general relativity and Brans–Dicke theory, but here too, Brans–Dicke theory allows additional wave solutions having geometries which are incompatible with general relativity.

Like general relativity, Brans–Dicke theory predicts light deflection and the precession of perihelia of planets orbiting the Sun. However, the precise formulas which govern these effects, according to Brans–Dicke theory, depend upon the value of the coupling constant $\omega$. This means that it is possible to set an observational lower bound on the possible value of $\omega$ from observations of the solar system and other gravitational systems. The value of $\omega$ consistent with experiment has risen with time. In 1973, $\omega > 5$ was consistent with known data. By 1981, $\omega > 30$ was consistent with known data. In 2003, evidence derived from the Cassini–Huygens experiment showed that the value of $\omega$ must exceed 40,000.
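To make the bound concrete: in Brans–Dicke theory the parametrized post-Newtonian (PPN) parameter $\gamma$ satisfies $\gamma = (1+\omega)/(2+\omega)$, so $1-\gamma = 1/(2+\omega)$, and a tight experimental limit on $|1-\gamma|$ translates directly into a large lower bound on $\omega$. The following minimal Python sketch shows the arithmetic; the deviation limit used is an assumed round number of Cassini-era magnitude, not the exact published uncertainty.

    # Toy conversion of a PPN gamma constraint into a lower bound on the
    # Brans-Dicke coupling omega, using gamma = (1 + omega) / (2 + omega),
    # equivalently 1 - gamma = 1 / (2 + omega).

    def omega_lower_bound(gamma_deviation):
        """Lower bound on omega given an upper limit on |1 - gamma|."""
        return 1.0 / gamma_deviation - 2.0

    # Assumed illustrative limit of order 2.5e-5 (roughly Cassini-era
    # precision); not the exact published number.
    print(omega_lower_bound(2.5e-5))   # ~40000, matching the bound above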

It is also often taught that general relativity is obtained from the Brans–Dicke theory in the limit $\omega \to \infty$. But Faraoni claims that this breaks down when the trace of the stress–energy tensor vanishes, i.e. $T = T^{a}{}_{a} = 0$; an example is the Campanelli–Lousto wormhole solution. Some have argued that only general relativity satisfies the strong equivalence principle.
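Schematically, the reason the limit can fail is the rate at which the scalar field approaches a constant; the following is a standard sketch of Faraoni's point under the usual asymptotic assumptions, not a result quoted from the article:

$$T \neq 0:\quad \varphi = \varphi_\infty + O\!\left(\frac{1}{\omega}\right), \qquad T = 0:\quad \varphi = \varphi_\infty + O\!\left(\frac{1}{\sqrt{\omega}}\right).$$

When $T = 0$ the scalar-field terms in the field equations therefore need not vanish as $\omega \to \infty$, and the Brans–Dicke solution need not reduce to the corresponding general-relativistic one.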

The field equations

The field equations of the Brans–Dicke theory are

$$\Box\varphi = \frac{8\pi}{3+2\omega}\,T,$$

$$G_{ab} = \frac{8\pi}{\varphi}\,T_{ab} + \frac{\omega}{\varphi^2}\left(\partial_a\varphi\,\partial_b\varphi - \tfrac{1}{2}\,g_{ab}\,\partial_c\varphi\,\partial^c\varphi\right) + \frac{1}{\varphi}\left(\nabla_a\nabla_b\varphi - g_{ab}\,\Box\varphi\right),$$

where
$\omega$ is the dimensionless Dicke coupling constant;
$g_{ab}$ is the metric tensor;
$G_{ab} = R_{ab} - \tfrac{1}{2}R\,g_{ab}$ is the Einstein tensor, a kind of average curvature;
$R_{ab}$ is the Ricci tensor, a kind of trace of the curvature tensor;
$R$ is the Ricci scalar, the trace of the Ricci tensor;
$T_{ab}$ is the stress–energy tensor;
$T = T^{a}{}_{a}$ is the trace of the stress–energy tensor;
$\varphi$ is the scalar field; and
$\Box$ is the Laplace–Beltrami operator or covariant wave operator, $\Box\varphi = \varphi^{;a}{}_{;a}$.
The first equation says that the trace of the stress–energy tensor acts as the source for the scalar field $\varphi$. Since electromagnetic fields contribute only a traceless term to the stress–energy tensor, this implies that in a region of spacetime containing only an electromagnetic field (plus the gravitational field), the right-hand side vanishes, and $\varphi$ obeys the (curved spacetime) wave equation. Therefore, changes in $\varphi$ propagate through electrovacuum regions; in this sense, we say that $\varphi$ is a long-range field.

The second equation describes how the stress–energy tensor and scalar field together affect spacetime curvature. The left hand side, the Einstein tensor, can be thought of as a kind of average curvature. It is a matter of pure mathematics that, in any metric theory, the Riemann tensor can always be written as the sum of the Weyl curvature (or conformal curvature tensor) plus a piece constructed from the Einstein tensor.

For comparison, the field equation of general relativity is simply

$$G_{ab} = 8\pi\,T_{ab}.$$

This means that in general relativity, the Einstein curvature at some event is entirely determined by the stress–energy tensor at that event; the other piece, the Weyl curvature, is the part of the gravitational field which can propagate as a gravitational wave across a vacuum region. But in the Brans–Dicke theory, the Einstein tensor is determined partly by the immediate presence of mass-energy and momentum, and partly by the long-range scalar field $\varphi$.

The vacuum field equations of both theories are obtained when the stress–energy tensor vanishes. This models situations in which no non-gravitational fields are present. 

The action principle

The following Lagrangian contains the complete description of the Brans–Dicke theory:

$$S = \frac{1}{16\pi}\int \mathrm{d}^4x\,\sqrt{-g}\left(\varphi R - \frac{\omega}{\varphi}\,\partial_a\varphi\,\partial^a\varphi\right) + \int \mathrm{d}^4x\,\sqrt{-g}\,\mathcal{L}_\mathrm{M},$$

where $g$ is the determinant of the metric, $\sqrt{-g}\,\mathrm{d}^4x$ is the four-dimensional volume form, and $\mathcal{L}_\mathrm{M}$ is the matter term or matter Lagrangian.

The matter term includes the contribution of ordinary matter (e.g. gaseous matter) and also electromagnetic fields. In a vacuum region, the matter term vanishes identically; the remaining term is the gravitational term. To obtain the vacuum field equations, we must vary the gravitational term in the Lagrangian with respect to the metric $g^{ab}$; this gives the second field equation above. When we vary with respect to the scalar field $\varphi$, we obtain the first field equation.

Note that, unlike for the general relativity field equations, the $\varphi\,g^{ab}\,\delta R_{ab}$ term does not vanish, as the result is not a total derivative. It can be shown that

$$\frac{\delta(\varphi R)}{\delta g^{ab}} = \varphi\,R_{ab} + g_{ab}\,\Box\varphi - \nabla_a\nabla_b\varphi.$$

To prove this result, use

$$\delta(\varphi R) = R\,\delta\varphi + \varphi\,R_{ab}\,\delta g^{ab} + \varphi\,g^{ab}\,\delta R_{ab}.$$

By evaluating the $\delta\Gamma$s in Riemann normal coordinates, 6 individual terms vanish. 6 further terms combine when manipulated using Stokes' theorem to provide the desired $\left(g_{ab}\,\Box\varphi - \nabla_a\nabla_b\varphi\right)\delta g^{ab}$.

For comparison, the Lagrangian defining general relativity is

$$S = \int \mathrm{d}^4x\,\sqrt{-g}\left(\frac{R}{16\pi G} + \mathcal{L}_\mathrm{M}\right).$$

Varying the gravitational term with respect to $g^{ab}$ gives the vacuum Einstein field equation $R_{ab} = 0$.
In both theories, the full field equations can be obtained by variations of the full Lagrangian.

Numerical relativity

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Numerical_relativity

Numerical relativity is one of the branches of general relativity that uses numerical methods and algorithms to solve and analyze problems. To this end, supercomputers are often employed to study black holes, gravitational waves, neutron stars and many other phenomena governed by Einstein's theory of general relativity. A currently active field of research in numerical relativity is the simulation of relativistic binaries and their associated gravitational waves. Other branches are also active.

Overview

A primary goal of numerical relativity is to study spacetimes whose exact form is not known. The spacetimes so found computationally can be fully dynamical, stationary, or static, and may contain matter fields or vacuum. In the case of stationary and static solutions, numerical methods may also be used to study the stability of the equilibrium spacetimes. In the case of dynamical spacetimes, the problem may be divided into the initial value problem and the evolution, each requiring different methods.

Numerical relativity is applied to many areas, such as cosmological models, critical phenomena, perturbed black holes and neutron stars, and the coalescence of black holes and neutron stars, for example. In any of these cases, Einstein's equations can be formulated in several ways that allow us to evolve the dynamics. While Cauchy methods have received a majority of the attention, characteristic and Regge calculus based methods have also been used. All of these methods begin with a snapshot of the gravitational fields on some hypersurface, the initial data, and evolve these data to neighboring hypersurfaces.
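As a schematic illustration of that Cauchy structure, the toy Python sketch below (a 1+1 scalar wave equation, not the Einstein equations; all parameter values are arbitrary) evolves initial data given on the $t = 0$ slice forward one hypersurface per time step:

    import numpy as np

    # Toy Cauchy evolution: u_tt = u_xx with a standard second-order
    # leapfrog stencil. The "initial data" are u and its time derivative
    # on the first slice; each step yields the field on the next slice.

    nx, dx = 201, 0.01
    dt = 0.5 * dx                          # Courant factor 0.5 for stability
    x = np.linspace(0.0, (nx - 1) * dx, nx)

    u = np.exp(-((x - 1.0) / 0.1) ** 2)    # initial data: Gaussian pulse
    u_prev = u.copy()                      # zero initial time derivative

    for _ in range(400):                   # evolve 400 slices forward
        u_next = np.empty_like(u)
        u_next[1:-1] = (2.0 * u[1:-1] - u_prev[1:-1]
                        + (dt / dx) ** 2 * (u[2:] - 2.0 * u[1:-1] + u[:-2]))
        u_next[0] = u_next[-1] = 0.0       # crude reflecting boundaries
        u_prev, u = u, u_next

    print(float(np.abs(u).max()))          # the pulse has split and propagated

Real codes differ in nearly every respect (constrained nonlinear systems, gauge conditions, three-dimensional grids), but the slice-to-slice update loop is the shared skeleton.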

As in all problems of numerical analysis, careful attention is paid to the stability and convergence of the numerical solutions. Accordingly, much attention is paid to the gauge conditions, coordinates, and various formulations of the Einstein equations, and to the effect they have on the ability to produce accurate numerical solutions.

Numerical relativity research is distinct from work on classical field theories as many techniques implemented in these areas are inapplicable in relativity. Many facets are however shared with large scale problems in other computational sciences like computational fluid dynamics, electromagnetics, and solid mechanics. Numerical relativists often work with applied mathematicians and draw insight from numerical analysis, scientific computation, partial differential equations, and geometry among other mathematical areas of specialization. 

History


Foundations in theory

Albert Einstein published his theory of general relativity in 1915. Like his earlier theory of special relativity, it described space and time as a unified spacetime, subject to what are now known as the Einstein field equations. These form a set of coupled nonlinear partial differential equations (PDEs). More than 100 years after the first publication of the theory, relatively few closed-form solutions are known for the field equations, and, of those, most are cosmological solutions that assume special symmetry to reduce the complexity of the equations.

The field of numerical relativity emerged from the desire to construct and study more general solutions to the field equations by approximately solving the Einstein equations numerically. A necessary precursor to such attempts was a decomposition of spacetime back into separated space and time. This was first published by Richard Arnowitt, Stanley Deser, and Charles W. Misner in the late 1950s in what has become known as the ADM formalism.[3] Although for technical reasons the precise equations formulated in the original ADM paper are rarely used in numerical simulations, most practical approaches to numerical relativity use a "3+1 decomposition" of spacetime into three-dimensional space and one-dimensional time that is closely related to the ADM formulation. This is because the ADM procedure reformulates the Einstein field equations into a constrained initial value problem that can be addressed using computational methodologies.
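Schematically, the 3+1 split writes the spacetime metric in terms of a lapse function $\alpha$, a shift vector $\beta^i$, and a spatial metric $\gamma_{ij}$ on each slice; the standard ADM line element, quoted here for orientation, is

$$ds^2 = -\alpha^2\,dt^2 + \gamma_{ij}\left(dx^i + \beta^i\,dt\right)\left(dx^j + \beta^j\,dt\right).$$

The lapse and shift are gauge quantities describing how successive slices are stacked and how coordinates are carried between them, which is one reason the choice of gauge conditions mentioned above matters so much in practice.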

At the time that ADM published their original paper, computer technology would not have supported numerical solution of their equations on any problem of any substantial size. The first documented attempt to solve the Einstein field equations numerically appears to be Hahn and Lindquist in 1964, followed soon thereafter by Smarr and by Eppley. These early attempts were focused on evolving Misner data in axisymmetry (also known as "2+1 dimensions"). At around the same time, Tsvi Piran wrote the first code that evolved a system with gravitational radiation using cylindrical symmetry. In this calculation Piran set the foundation for many of the concepts used today in evolving ADM equations, like "free evolution" versus "constrained evolution", which deal with the fundamental problem of treating the constraint equations that arise in the ADM formalism. Applying symmetry reduced the computational and memory requirements associated with the problem, allowing the researchers to obtain results on the supercomputers available at the time.

Early results

The first realistic calculations of rotating collapse were carried out in the early eighties by Richard Stark and Tsvi Piran in which the gravitational wave forms resulting from formation of a rotating black hole were calculated for the first time. For nearly 20 years following the initial results, there were fairly few other published results in numerical relativity, probably due to the lack of sufficiently powerful computers to address the problem. In the late 1990s, the Binary Black Hole Grand Challenge Alliance successfully simulated a head-on binary black hole collision. As a post-processing step the group computed the event horizon for the spacetime. This result still required imposing and exploiting axisymmetry in the calculations.

Some of the first documented attempts to solve the Einstein equations in three dimensions were focused on a single Schwarzschild black hole, which is described by a static and spherically symmetric solution to the Einstein field equations. This provides an excellent test case in numerical relativity because it does have a closed-form solution so that numerical results can be compared to an exact solution, because it is static, and because it contains one of the most numerically challenging features of relativity theory, a physical singularity. One of the earliest groups to attempt to simulate this solution was Anninos et al. in 1995. In their paper they point out that
"Progress in three dimensional numerical relativity has been impeded in part by lack of computers with sufficient memory and computational power to perform well resolved calculations of 3D spacetimes."
 

Maturation of the field

In the years that followed, not only did computers become more powerful, but also various research groups developed alternate techniques to improve the efficiency of the calculations. With respect to black hole simulations specifically, two techniques were devised to avoid problems associated with the existence of physical singularities in the solutions to the equations: (1) excision, and (2) the "puncture" method. In addition, the Lazarus group developed techniques for using early results from a short-lived simulation solving the nonlinear ADM equations to provide initial data for a more stable code based on linearized equations derived from perturbation theory. More generally, adaptive mesh refinement techniques, already used in computational fluid dynamics, were introduced to the field of numerical relativity.

Excision

In the excision technique, which was first proposed in the late 1990s, a portion of a spacetime inside the event horizon surrounding the singularity of a black hole is simply not evolved. In theory this should not affect the solution to the equations outside of the event horizon because of the principle of causality and properties of the event horizon (i.e. nothing physical inside the black hole can influence any of the physics outside the horizon). Thus if one simply does not solve the equations inside the horizon one should still be able to obtain valid solutions outside. One "excises" the interior by imposing ingoing boundary conditions on a boundary surrounding the singularity but inside the horizon. While the implementation of excision has been very successful, the technique has two minor problems. The first is that one has to be careful about the coordinate conditions. While physical effects cannot propagate from inside to outside, coordinate effects could. For example, if the coordinate conditions were elliptic, coordinate changes inside could instantly propagate out through the horizon. This then means that one needs hyperbolic-type coordinate conditions with characteristic velocities less than that of light for the propagation of coordinate effects (e.g., using harmonic coordinate conditions). The second problem is that as the black holes move, one must continually adjust the location of the excision region to move with the black hole.
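The one-sided-stencil idea behind excision can be illustrated with a toy 1D advection model (not a relativity code; all names and values below are illustrative): when every characteristic points into the excised region, the evolved points never need data from inside it.

    import numpy as np

    # Toy excision: u_t + a u_x = 0 with a < 0, so characteristics all move
    # leftward, "into" the excised region. An upwind (rightward one-sided)
    # stencil at the excision boundary therefore needs no interior data --
    # the analogue of needing only ingoing conditions inside the horizon.

    a = -1.0                       # advection speed: everything flows leftward
    nx, dx = 200, 0.01
    dt = 0.5 * dx                  # CFL factor 0.5
    x = np.linspace(0.0, (nx - 1) * dx, nx)

    i_exc = 40                     # points with index < i_exc are excised
    u = np.exp(-((x - 1.5) / 0.1) ** 2)

    for _ in range(300):
        u_new = u.copy()
        # Update only the evolved region; excised points are never read.
        u_new[i_exc:-1] = u[i_exc:-1] - a * dt / dx * (u[i_exc + 1:] - u[i_exc:-1])
        u = u_new

    print(float(np.abs(u[i_exc:]).max()))  # pulse has drained out through the boundary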

The excision technique was developed over several years including the development of new gauge conditions that increased stability and work that demonstrated the ability of the excision regions to move through the computational grid. The first stable, long-term evolution of the orbit and merger of two black holes using this technique was published in 2005.

Punctures

In the puncture method the solution is factored into an analytical part, which contains the singularity of the black hole, and a numerically constructed part, which is then singularity-free. This is a generalization of the Brill–Lindquist prescription for initial data of black holes at rest and can be generalized to the Bowen–York prescription for spinning and moving black hole initial data. Until 2005, all published usage of the puncture method required that the coordinate position of all punctures remain fixed during the course of the simulation. Of course black holes in proximity to each other will tend to move under the force of gravity, so the fact that the coordinate position of the puncture remained fixed meant that the coordinate systems themselves became "stretched" or "twisted," and this typically led to numerical instabilities at some stage of the simulation.

Breakthrough

In 2005 researchers demonstrated for the first time the ability to allow punctures to move through the coordinate system, thus eliminating some of the earlier problems with the method. This allowed accurate long-term evolutions of black holes. By choosing appropriate coordinate conditions and making crude analytic assumptions about the fields near the singularity (since no physical effects can propagate out of the black hole, the crudeness of the approximations does not matter), numerical solutions could be obtained for the problem of two black holes orbiting each other, as well as for accurate computation of the gravitational radiation (ripples in spacetime) emitted by them.

Lazarus project

The Lazarus project (1998–2005) was developed as a post-Grand Challenge technique to extract astrophysical results from short-lived full numerical simulations of binary black holes. It combined approximation techniques applied before the merger (post-Newtonian trajectories) and after it (perturbations of single black holes) with full numerical simulations that attempted to solve the field equations of general relativity. All previous attempts to numerically integrate on supercomputers the Einstein field equations describing the gravitational field around binary black holes had led to software failure before a single orbit was completed.

At the time, the Lazarus approach gave the best insight into the binary black hole problem and produced numerous and relatively accurate results, such as the radiated energy and angular momentum emitted in the final merger stage, the linear momentum radiated by unequal-mass holes, and the final mass and spin of the remnant black hole. The method also computed detailed gravitational waves emitted by the merger process and predicted that the collision of black holes is the most energetic single event in the Universe, releasing more energy in a fraction of a second in the form of gravitational radiation than an entire galaxy in its lifetime.

Adaptive mesh refinement

Adaptive mesh refinement (AMR) as a numerical method has roots that go well beyond its first application in the field of numerical relativity. Mesh refinement first appears in the numerical relativity literature in the 1980s, through the work of Choptuik in his studies of critical collapse of scalar fields. The original work was in one dimension, but it was subsequently extended to two dimensions. In two dimensions, AMR has also been applied to the study of inhomogeneous cosmologies, and to the study of Schwarzschild black holes. The technique has now become a standard tool in numerical relativity and has been used to study the merger of black holes and other compact objects in addition to the propagation of gravitational radiation generated by such astronomical events.
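In outline, AMR drives refinement with a local error or feature indicator. The Python sketch below (hypothetical thresholds, a single refinement level, illustration only) flags cells where the solution gradient is steep and subdivides them:

    import numpy as np

    # Minimal AMR-style refinement sketch: flag cells with large gradient,
    # then insert midpoints into flagged cells (one level of refinement).

    def flag_cells(u, dx, threshold):
        """Boolean mask of points whose local gradient exceeds threshold."""
        return np.abs(np.gradient(u, dx)) > threshold

    def refine(x, flags):
        """Insert a midpoint into each flagged cell."""
        new_x = []
        for i in range(len(x) - 1):
            new_x.append(x[i])
            if flags[i]:
                new_x.append(0.5 * (x[i] + x[i + 1]))
        new_x.append(x[-1])
        return np.array(new_x)

    x = np.linspace(0.0, 1.0, 101)
    u = np.tanh((x - 0.5) / 0.01)          # sharp feature near x = 0.5
    mask = flag_cells(u, x[1] - x[0], threshold=5.0)
    print(len(refine(x, mask)))            # extra points cluster around x = 0.5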

Recent developments

In the past few years, hundreds of research papers have been published, leading to a wide spectrum of mathematical relativity, gravitational wave, and astrophysical results for the orbiting black hole problem. The technique has been extended to astrophysical binary systems involving neutron stars and black holes, and to systems of multiple black holes. One of the most surprising predictions is that the merger of two black holes can give the remnant hole a speed of up to 4000 km/s, enough to allow it to escape from any known galaxy. The simulations also predict an enormous release of gravitational energy in the merger process, amounting to as much as 8% of the system's total rest mass.
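For a sense of scale, converting 8% of the rest mass of a representative binary into gravitational radiation (the 60-solar-mass total below is an assumed example, not a figure from the article) gives:

    # Rough energy scale of a black hole merger radiating 8% of its rest mass.
    M_SUN = 1.989e30          # solar mass in kg
    C = 2.998e8               # speed of light in m/s

    m_total = 60 * M_SUN      # hypothetical binary total mass
    e_rad = 0.08 * m_total * C**2
    print(f"{e_rad:.2e} J")   # ~8.6e47 J, radiated in a fraction of a second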

Gene

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Gene

Chromosome ...