Thursday, June 17, 2021

Speculative realism

From Wikipedia, the free encyclopedia

Speculative realism is a movement in contemporary Continental-inspired philosophy (also known as post-Continental philosophy) that defines itself loosely in its stance of metaphysical realism against its interpretation of the dominant forms of post-Kantian philosophy (or what it terms "correlationism").

Speculative realism takes its name from a conference held at Goldsmiths College, University of London in April 2007. The conference was moderated by Alberto Toscano of Goldsmiths College, and featured presentations by Ray Brassier of American University of Beirut (then at Middlesex University), Iain Hamilton Grant of the University of the West of England, Graham Harman of the American University in Cairo, and Quentin Meillassoux of the École Normale Supérieure in Paris. Credit for the name "speculative realism" is generally ascribed to Brassier, though Meillassoux had already used the term "speculative materialism" to describe his own position.

A second conference, entitled "Speculative Realism/Speculative Materialism", took place at the UWE Bristol on Friday 24 April 2009, two years after the original event at Goldsmiths. The line-up consisted of Ray Brassier, Iain Hamilton Grant, Graham Harman, and (in place of Meillassoux, who was unable to attend) Alberto Toscano.

Critique of correlationism

While often in disagreement over basic philosophical issues, the speculative realist thinkers have a shared resistance to what they interpret as philosophies of human finitude inspired by the tradition of Immanuel Kant.

What unites the four core members of the movement is an attempt to overcome both "correlationism" and "philosophies of access". In After Finitude, Meillassoux defines correlationism as "the idea according to which we only ever have access to the correlation between thinking and being, and never to either term considered apart from the other." Philosophies of access are any of those philosophies which privilege the human being over other entities. Both ideas represent forms of anthropocentrism.

All four of the core thinkers within speculative realism work to overturn these forms of philosophy which privilege the human being, favouring distinct forms of realism against the dominant forms of idealism in much of contemporary Continental philosophy.

Variations

While the core members of the speculative realist movement share the goal of overturning the dominant strands of post-Kantian thought in Continental philosophy, important differences separate them and their followers.

Speculative materialism

In his critique of correlationism, Quentin Meillassoux (who uses the term speculative materialism to describe his position) identifies two principles as the locus of Kant's philosophy. The first is the principle of correlation itself, which claims essentially that we can only know the correlate of Thought and Being; what lies outside that correlate is unknowable. The second is termed by Meillassoux the principle of factiality, which states that things could be otherwise than what they are. This principle is upheld by Kant in his defence of the thing-in-itself as unknowable but imaginable: we can imagine reality as being fundamentally different even if we never know such a reality. According to Meillassoux, the defence of both principles leads to "weak" correlationism (such as that of Kant and Husserl), while the rejection of the thing-in-itself leads to the "strong" correlationism of thinkers such as the late Ludwig Wittgenstein and the late Martin Heidegger. For these strong correlationists, it makes no sense to suppose that there is anything outside of the correlate of Thought and Being, and so the principle of factiality is eliminated in favour of a strengthened principle of correlation.

Meillassoux follows the opposite tactic in rejecting the principle of correlation for the sake of a bolstered principle of factiality in his post-Kantian return to Hume. By arguing in favour of such a principle, Meillassoux is led to reject the necessity not only of all physical laws of nature, but all logical laws except the Principle of Non-Contradiction (since eliminating this would undermine the Principle of Factiality which claims that things can always be otherwise than what they are). By rejecting the Principle of Sufficient Reason, there can be no justification for the necessity of physical laws, meaning that while the universe may be ordered in such and such a way, there is no reason it could not be otherwise. Meillassoux rejects the Kantian a priori in favour of a Humean a priori, claiming that the lesson to be learned from Hume on the subject of causality is that "the same cause may actually bring about 'a hundred different events' (and even many more)."

Object-oriented ontology

The central tenet of Graham Harman and Levi Bryant's object-oriented ontology (OOO) is that objects have been neglected in philosophy in favor of a "radical philosophy" that tries to "undermine" objects by saying that objects are the crusts to a deeper underlying reality, either in the form of monism or a perpetual flux, or those that try to "overmine" objects by saying that the idea of a whole object is a form of folk ontology. According to Harman, everything is an object, whether it be a mailbox, electromagnetic radiation, curved spacetime, the Commonwealth of Nations, or a propositional attitude; all things, whether physical or fictional, are equally objects. Sympathetic to panpsychism, Harman proposes a new philosophical discipline called "speculative psychology" dedicated to investigating the "cosmic layers of psyche" and "ferreting out the specific psychic reality of earthworms, dust, armies, chalk, and stone".

Harman defends a version of the Aristotelian notion of substance. Unlike Leibniz, for whom there were both substances and aggregates, Harman maintains that when objects combine, they create new objects. In this way, he defends an a priori metaphysics that claims that reality is made up only of objects and that there is no "bottom" to the series of objects. For Harman, an object is in itself an infinite recess, unknowable and inaccessible by any other thing. This leads to his account of what he terms "vicarious causality". Inspired by the occasionalists of medieval Islamic philosophy, Harman maintains that no two objects can ever interact save through the mediation of a "sensual vicar". There are two types of objects, then, for Harman: real objects and the sensual objects that allow for interaction. The former are the things of everyday life, while the latter are the caricatures that mediate interaction. For example, when fire burns cotton, Harman argues that the fire does not touch the essence of that cotton which is inexhaustible by any relation, but that the interaction is mediated by a caricature of the cotton which causes it to burn.

Transcendental materialism

Iain Hamilton Grant defends a position he calls transcendental materialism. He argues against what he terms "somatism", the philosophy and physics of bodies. In his Philosophies of Nature After Schelling, Grant tells a new history of philosophy from Plato onward based on the definition of matter. Aristotle distinguished between Form and Matter in such a way that Matter was invisible to philosophy, whereas Grant argues for a return to the Platonic Matter as not only the basic building blocks of reality, but the forces and powers that govern our reality. He traces this same argument to the post-Kantian German Idealists Johann Gottlieb Fichte and Friedrich Wilhelm Joseph Schelling, claiming that the distinction between Matter as substantive versus useful fiction persists to this day and that we should end our attempts to overturn Plato and instead attempt to overturn Kant and return to "speculative physics" in the Platonic tradition, that is, not a physics of bodies, but a "physics of the All".

Eugene Thacker has examined how the concept of "life itself" is both determined within regional philosophy and also how "life itself" comes to acquire metaphysical properties. His book After Life shows how the ontology of life operates by way of a split between "Life" and "the living," making possible a "metaphysical displacement" in which life is thought via another metaphysical term, such as time, form, or spirit: "Every ontology of life thinks of life in terms of something-other-than-life...that something-other-than-life is most often a metaphysical concept, such as time and temporality, form and causality, or spirit and immanence." Thacker traces this theme from Aristotle, to Scholasticism and mysticism/negative theology, to Spinoza and Kant, showing how this three-fold displacement is also alive in philosophy today (life as time in process philosophy and Deleuzianism, life as form in biopolitical thought, life as spirit in post-secular philosophies of religion). Thacker examines the relation of speculative realism to the ontology of life, arguing for a "vitalist correlation": "Let us say that a vitalist correlation is one that fails to conserve the correlationist dual necessity of the separation and inseparability of thought and object, self and world, and which does so based on some ontologized notion of 'life'." Ultimately Thacker argues for a skepticism regarding "life": "Life is not only a problem of philosophy, but a problem for philosophy."

Other thinkers have emerged within this group, united in their allegiance to what has been known as "process philosophy", rallying around such thinkers as Schelling, Bergson, Whitehead, and Deleuze, among others. A recent example is found in Steven Shaviro's book Without Criteria: Kant, Whitehead, Deleuze, and Aesthetics, which argues for a process-based approach that entails panpsychism as much as it does vitalism or animism. For Shaviro, it is Whitehead's philosophy of prehensions and nexus that offers the best combination of continental and analytical philosophy. Another recent example is found in Jane Bennett's book Vibrant Matter, which argues for a shift from human relations to things, to a "vibrant matter" that cuts across the living and non-living, human bodies and non-human bodies. Leon Niemoczynski, in his book Charles Sanders Peirce and a Religious Metaphysics of Nature, invokes what he calls "speculative naturalism" so as to argue that nature can afford lines of insight into its own infinitely productive "vibrant" ground, which he identifies as natura naturans.

Transcendental nihilism

In Nihil Unbound: Extinction and Enlightenment, Ray Brassier defends transcendental nihilism. He maintains that philosophy has avoided the traumatic idea of extinction, instead attempting to find meaning in a world conditioned by the very idea of its own annihilation. Thus Brassier critiques both the phenomenological and hermeneutic strands of continental philosophy as well as the vitalism of thinkers like Gilles Deleuze, who work to ingrain meaning in the world and stave off the "threat" of nihilism. Instead, drawing on thinkers such as Alain Badiou, François Laruelle, Paul Churchland and Thomas Metzinger, Brassier defends a view of the world as inherently devoid of meaning. That is, rather than avoiding nihilism, Brassier embraces it as the truth of reality. Brassier concludes from his readings of Badiou and Laruelle that the universe is founded on the nothing, but also that philosophy is the "organon of extinction," that it is only because life is conditioned by its own extinction that there is thought at all. Brassier then defends a radically anti-correlationist philosophy proposing that Thought is conjoined not with Being, but with Non-Being.

Controversy about the term

In an interview with Kronos magazine published in March 2011, Ray Brassier denied that there is any such thing as a "speculative realist movement" and firmly distanced himself from those who continue to attach themselves to the brand name:

The "speculative realist movement" exists only in the imaginations of a group of bloggers promoting an agenda for which I have no sympathy whatsoever: actor-network theory spiced with pan-psychist metaphysics and morsels of process philosophy. I don't believe the internet is an appropriate medium for serious philosophical debate; nor do I believe it is acceptable to try to concoct a philosophical movement online by using blogs to exploit the misguided enthusiasm of impressionable graduate students. I agree with Deleuze's remark that ultimately the most basic task of philosophy is to impede stupidity, so I see little philosophical merit in a "movement" whose most signal achievement thus far is to have generated an online orgy of stupidity.

Publications

Speculative realism has close ties to the journal Collapse, which published the proceedings of the inaugural conference at Goldsmiths and has featured numerous other articles by 'speculative realist' thinkers; as has the academic journal Pli, which is edited and produced by members of the Graduate School of the Department of Philosophy at the University of Warwick. The journal Speculations, founded in 2010 and published by Punctum Books, regularly features articles related to speculative realism. Edinburgh University Press publishes a book series called Speculative Realism.

Internet presence

Speculative realism is notable for its fast expansion via the Internet in the form of blogs. Websites have formed as resources for essays, lectures, and planned future books by those within the speculative realist movement. Many other blogs, as well as podcasts, have emerged with original material on speculative realism or expanding on its themes and ideas.

Argument from authority

From Wikipedia, the free encyclopedia

An argument from authority (argumentum ab auctoritate), also called an appeal to authority, or argumentum ad verecundiam, is a form of argument in which the opinion of an authority on a topic is used as evidence to support an argument. Some consider that it is used in a cogent form if all sides of a discussion agree on the reliability of the authority in the given context, and others consider it to always be a fallacy to cite an authority on the discussed topic as the primary means of supporting an argument.

Overview

Historically, opinion on the appeal to authority has been divided: it is listed as a non-fallacious argument as often as a fallacious argument in various sources, as some hold that it can be a strong or at least valid defeasible argument and others that it is weak or an outright fallacy.

Use in science

Scientific knowledge is best established by evidence and experiment rather than argued from authority, as authority has no place in science. Carl Sagan wrote of arguments from authority:

One of the great commandments of science is, "Mistrust arguments from authority." ... Too many such arguments have proved too painfully wrong. Authorities must prove their contentions like everybody else.

One example of the use of the appeal to authority in science dates to 1923, when leading American zoologist Theophilus Painter declared, based on poor data and conflicting observations he had made, that humans had 24 pairs of chromosomes. From the 1920s until 1956, scientists propagated this "fact" based on Painter's authority, despite subsequent counts totaling the correct number of 23. Even textbooks with photos showing 23 pairs incorrectly declared the number to be 24 based on the authority of the then-consensus of 24 pairs.

This seemingly established number generated confirmation bias among researchers, and "most cytologists, expecting to detect Painter's number, virtually always did so". Painter's "influence was so great that many scientists preferred to believe his count over the actual evidence", and scientists who obtained the accurate number modified or discarded their data to agree with Painter's count.

A more recent example involved the paper "When contact changes minds: An experiment on transmission of support for gay equality", published in 2014. The paper was a fraud based on forged data, yet concerns about it were ignored in many cases due to appeals to authority. One analysis of the affair notes that "Over and over again, throughout the scientific community and the media, LaCour's impossible-seeming results were treated as truth, in part because of the weight [the study's co-author] Green's name carried". One psychologist stated his reaction to the paper was "that's very surprising and doesn't fit with a huge literature of evidence. It doesn't sound plausible to me... [then I pull it up and] I see Don Green is an author. I trust him completely, so I'm no longer doubtful". The forger, LaCour, would use appeals to authority to defend his research: "if his responses sometimes seemed to lack depth when he was pressed for details, his impressive connections often allayed concerns", as one of his partners states "when he and I really had a disagreement, he would often rely on the kind of arguments where he'd basically invoke authority, right? He's the one with advanced training, and his adviser is this very high-powered, very experienced person...and they know a lot more than we do".

Much like the erroneous chromosome number taking decades to refute until microscopy made the error unmistakable, the one who would go on to debunk this paper "was consistently told by friends and advisers to keep quiet about his concerns lest he earn a reputation as a troublemaker", up until "the very last moment when multiple 'smoking guns' finally appeared", and he found that "There was almost no encouragement for him to probe the hints of weirdness he'd uncovered".

Appeal to false authority

This fallacy is used when a person appeals to a false authority as evidence for their claim. These fallacious arguments from authority are the result of citing a non-authority as an authority. The philosophers Irving Copi and Carl Cohen characterized it as a fallacy "when the appeal is made to parties having no legitimate claim to authority in the matter at hand".

An example of the fallacy of appealing to an authority in an unrelated field would be citing Albert Einstein as an authority for a determination on religion when his primary expertise was in physics.

It is also a fallacious ad hominem argument to argue that a person presenting statements lacks authority and thus their arguments do not need to be considered. As appeals to a perceived lack of authority, these types of argument are fallacious for much the same reasons as an appeal to authority.

Other related fallacious arguments assume that a person without status or authority is inherently reliable. For instance, the appeal to poverty is the fallacy of thinking that someone is more likely to be correct because they are poor. When an argument holds that a conclusion is likely to be true precisely because the one who holds or is presenting it lacks authority, it is a fallacious appeal to the common man.

Roots in cognitive bias

Arguments from authority that are based on the idea that a person should conform to the opinion of a perceived authority or authoritative group are rooted in psychological cognitive biases such as the Asch effect. In repeated and modified instances of the Asch conformity experiments, it was found that high-status individuals create a stronger likelihood of a subject agreeing with an obviously false conclusion, despite the subject normally being able to clearly see that the answer was incorrect.

Further, humans have been shown to feel strong emotional pressure to conform to authorities and majority positions. A repeat of the experiments by another group of researchers found that "Participants reported considerable distress under the group pressure", with 59% conforming at least once and agreeing with the clearly incorrect answer, whereas the incorrect answer was much more rarely given when no such pressures were present.

Further insight into the psychological basis of the fallacy as it relates to perceived authorities comes from the Milgram experiments, which demonstrated that people are more likely to go along with something when it is presented by an authority. In a variation of the study in which the researchers did not wear lab coats, thus reducing the perceived authority of the tasker, the obedience level dropped to 20% from the original rate, which had been higher than 50%. Obedience is encouraged by reminding the individual of what a perceived authority states and by showing them that their opinion goes against this authority.

Scholars have noted that certain environments can produce an ideal situation for these processes to take hold, giving rise to groupthink. In groupthink, individuals in a group feel inclined to minimize conflict and encourage conformity. Through an appeal to authority, a group member might present that opinion as a consensus and encourage the other group members to engage in groupthink by not disagreeing with this perceived consensus or authority. One paper about the philosophy of mathematics notes that, within academia,

If...a person accepts our discipline, and goes through two or three years of graduate study in mathematics, he absorbs our way of thinking, and is no longer the critical outsider he once was...If the student is unable to absorb our way of thinking, we flunk him out, of course. If he gets through our obstacle course and then decides that our arguments are unclear or incorrect, we dismiss him as a crank, crackpot, or misfit.

Corporate environments are similarly vulnerable to appeals to perceived authorities and experts leading to groupthink, as are governments and militaries.

Wednesday, June 16, 2021

Gravity

From Wikipedia, the free encyclopedia
Hammer and feather drop: astronaut David Scott (from mission Apollo 15) on the Moon enacting the legend of Galileo's gravity experiment

Gravity (from Latin gravitas 'weight'), or gravitation, is a natural phenomenon by which all things with mass or energy—including planets, stars, galaxies, and even light—are attracted to (or gravitate toward) one another. On Earth, gravity gives weight to physical objects, and the Moon's gravity causes the ocean tides. The gravitational attraction of the original gaseous matter present in the Universe caused it to begin coalescing and forming stars and caused the stars to group together into galaxies, so gravity is responsible for many of the large-scale structures in the Universe. Gravity has an infinite range, although its effects become weaker as objects get further away.

Gravity is most accurately described by the general theory of relativity (proposed by Albert Einstein in 1915), which describes gravity not as a force, but as a consequence of masses moving along geodesic lines in a curved spacetime caused by the uneven distribution of mass. The most extreme example of this curvature of spacetime is a black hole, from which nothing—not even light—can escape once past the black hole's event horizon. However, for most applications, gravity is well approximated by Newton's law of universal gravitation, which describes gravity as a force causing any two bodies to be attracted toward each other, with magnitude proportional to the product of their masses and inversely proportional to the square of the distance between them.

Gravity is the weakest of the four fundamental interactions of physics, approximately 10³⁸ times weaker than the strong interaction, 10³⁶ times weaker than the electromagnetic force and 10²⁹ times weaker than the weak interaction. As a consequence, it has no significant influence at the level of subatomic particles. In contrast, it is the dominant interaction at the macroscopic scale, and is the cause of the formation, shape and trajectory (orbit) of astronomical bodies.

Current models of particle physics imply that the earliest instance of gravity in the Universe, possibly in the form of quantum gravity, supergravity or a gravitational singularity, along with ordinary space and time, developed during the Planck epoch (up to 10⁻⁴³ seconds after the birth of the Universe), possibly from a primeval state, such as a false vacuum, quantum vacuum or virtual particle, in a currently unknown manner. Attempts to develop a theory of gravity consistent with quantum mechanics, a quantum gravity theory, which would allow gravity to be united in a common mathematical framework (a theory of everything) with the other three fundamental interactions of physics, are a current area of research.

History of gravitational theory

Ancient world

The ancient Greek philosopher Archimedes discovered the center of gravity of a triangle. He also postulated that if two equal weights did not have the same center of gravity, the center of gravity of the two weights together would be in the middle of the line that joins their centers of gravity.

The Roman architect and engineer Vitruvius, in De Architectura, postulated that the gravity of an object did not depend on its weight but on its "nature".

Scientific revolution

Modern work on gravitational theory began with the work of Galileo Galilei in the late 16th and early 17th centuries. In his famous (though possibly apocryphal) experiment dropping balls from the Tower of Pisa, and later with careful measurements of balls rolling down inclines, Galileo showed that gravitational acceleration is the same for all objects. This was a major departure from Aristotle's belief that heavier objects have a higher gravitational acceleration. Galileo postulated air resistance as the reason that objects with low density and a high surface area fall more slowly in an atmosphere. Galileo's work set the stage for the formulation of Newton's theory of gravity.

Newton's theory of gravitation

English physicist and mathematician, Sir Isaac Newton (1642–1727)

In 1687, English mathematician Sir Isaac Newton published Principia, which hypothesizes the inverse-square law of universal gravitation. In his own words, "I deduced that the forces which keep the planets in their orbs must [be] reciprocally as the squares of their distances from the centers about which they revolve: and thereby compared the force requisite to keep the Moon in her Orb with the force of gravity at the surface of the Earth; and found them answer pretty nearly." The equation is the following:

F = G m1m2/r²

where F is the force, m1 and m2 are the masses of the objects interacting, r is the distance between the centers of the masses, and G is the gravitational constant.
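As a quick, illustrative sketch (rounded textbook values, not figures quoted in the text above), the law can be evaluated directly; here it gives the approximate Earth-Moon attraction:

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r^2.

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def gravitational_force(m1, m2, r):
    """Magnitude of the attraction between two point masses, in newtons."""
    return G * m1 * m2 / r**2

# Example: Earth and Moon, with rounded values.
earth_mass = 5.972e24  # kg
moon_mass = 7.348e22   # kg
distance = 3.844e8     # mean Earth-Moon distance, m

print(gravitational_force(earth_mass, moon_mass, distance))  # ~2.0e20 N
```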

Newton's theory enjoyed its greatest success when it was used to predict the existence of Neptune based on motions of Uranus that could not be accounted for by the actions of the other planets. Calculations by both John Couch Adams and Urbain Le Verrier predicted the general position of the planet, and Le Verrier's calculations are what led Johann Gottfried Galle to the discovery of Neptune.

A discrepancy in Mercury's orbit pointed out flaws in Newton's theory. By the end of the 19th century, it was known that its orbit showed slight perturbations that could not be accounted for entirely under Newton's theory, but all searches for another perturbing body (such as a planet orbiting the Sun even closer than Mercury) had been fruitless. The issue was resolved in 1915 by Albert Einstein's new theory of general relativity, which accounted for the small discrepancy in Mercury's orbit. This discrepancy was the advance in the perihelion of Mercury of 42.98 arcseconds per century.

Although Newton's theory has been superseded by Albert Einstein's general relativity, most modern non-relativistic gravitational calculations are still made using Newton's theory because it is simpler to work with and it gives sufficiently accurate results for most applications involving sufficiently small masses, speeds and energies.

Equivalence principle

The equivalence principle, explored by a succession of researchers including Galileo, Loránd Eötvös, and Einstein, expresses the idea that all objects fall in the same way, and that the effects of gravity are indistinguishable from certain aspects of acceleration and deceleration. The simplest way to test the weak equivalence principle is to drop two objects of different masses or compositions in a vacuum and see whether they hit the ground at the same time. Such experiments demonstrate that all objects fall at the same rate when other forces (such as air resistance and electromagnetic effects) are negligible. More sophisticated tests use a torsion balance of a type invented by Eötvös. Satellite experiments, for example STEP, are planned for more accurate experiments in space.

Formulations of the equivalence principle include:

  • The weak equivalence principle: The trajectory of a point mass in a gravitational field depends only on its initial position and velocity, and is independent of its composition.
  • The Einsteinian equivalence principle: The outcome of any local non-gravitational experiment in a freely falling laboratory is independent of the velocity of the laboratory and its location in spacetime.
  • The strong equivalence principle, which requires both of the above.

General relativity

Two-dimensional analogy of spacetime distortion generated by the mass of an object. Matter changes the geometry of spacetime, this (curved) geometry being interpreted as gravity. White lines do not represent the curvature of space but instead represent the coordinate system imposed on the curved spacetime, which would be rectilinear in a flat spacetime.

In general relativity, the effects of gravitation are ascribed to spacetime curvature instead of a force. The starting point for general relativity is the equivalence principle, which equates free fall with inertial motion and describes free-falling inertial objects as being accelerated relative to non-inertial observers on the ground. In Newtonian physics, however, no such acceleration can occur unless at least one of the objects is being operated on by a force.

Einstein proposed that spacetime is curved by matter, and that free-falling objects are moving along locally straight paths in curved spacetime. These straight paths are called geodesics. Like Newton's first law of motion, Einstein's theory states that if a force is applied to an object, it will deviate from a geodesic. For instance, we are no longer following geodesics while standing because the mechanical resistance of the Earth exerts an upward force on us, and we are non-inertial on the ground as a result. This explains why moving along the geodesics in spacetime is considered inertial.

Einstein discovered the field equations of general relativity, which relate the presence of matter and the curvature of spacetime and are named after him. The Einstein field equations are a set of 10 simultaneous, non-linear, differential equations. The solutions of the field equations are the components of the metric tensor of spacetime. A metric tensor describes a geometry of spacetime. The geodesic paths for a spacetime are calculated from the metric tensor.
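In standard notation, the field equations take the compact form below, with the Einstein tensor on the left encoding curvature and the stress-energy tensor on the right encoding matter:

```latex
% Einstein field equations: G_{\mu\nu} is the Einstein tensor (built from
% the metric tensor and its first and second derivatives), \Lambda the
% cosmological constant, and T_{\mu\nu} the stress-energy tensor.
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```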

Solutions

Notable solutions of the Einstein field equations include:

  • The Schwarzschild solution, which describes spacetime surrounding a spherically symmetric non-rotating uncharged massive object.
  • The Reissner–Nordström solution, in which the central object has an electrical charge.
  • The Kerr solution for rotating massive objects.
  • The Kerr–Newman solution for charged, rotating massive objects.
  • The cosmological Friedmann–Lemaître–Robertson–Walker solution, which predicts the expansion of the Universe.

Tests

The tests of general relativity included the following:

  • General relativity accounts for the anomalous perihelion precession of Mercury.
  • The prediction that time runs slower at lower potentials (gravitational time dilation) has been confirmed by the Pound–Rebka experiment (1959), the Hafele–Keating experiment, and the GPS.
  • The prediction of the deflection of light was first confirmed by Arthur Stanley Eddington from his observations during the Solar eclipse of 29 May 1919. Eddington measured starlight deflections twice those predicted by Newtonian corpuscular theory, in accordance with the predictions of general relativity. However, his interpretation of the results was later disputed. More recent tests using radio interferometric measurements of quasars passing behind the Sun have more accurately and consistently confirmed the deflection of light to the degree predicted by general relativity.
  • The time delay of light passing close to a massive object was first identified by Irwin I. Shapiro in 1964 in interplanetary spacecraft signals.
  • Gravitational radiation has been indirectly confirmed through studies of binary pulsars. On 11 February 2016, the LIGO and Virgo collaborations announced the first observation of a gravitational wave.
  • Alexander Friedmann in 1922 found that Einstein equations have non-stationary solutions (even in the presence of the cosmological constant). In 1927 Georges Lemaître showed that static solutions of the Einstein equations, which are possible in the presence of the cosmological constant, are unstable, and therefore the static Universe envisioned by Einstein could not exist. Later, in 1931, Einstein himself agreed with the results of Friedmann and Lemaître. Thus general relativity predicted that the Universe had to be non-static—it had to either expand or contract. The expansion of the Universe discovered by Edwin Hubble in 1929 confirmed this prediction.
  • The theory's prediction of frame dragging was consistent with the recent Gravity Probe B results.
  • General relativity predicts that light should lose its energy when traveling away from massive bodies through gravitational redshift. This was verified on Earth and in the Solar System around 1960 (a rough estimate of the size of the effect is sketched after this list).
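As a rough, back-of-the-envelope sketch of the redshift and time-dilation entries above: in the weak-field limit, the fractional frequency shift of a photon climbing a height h is approximately gh/c². Using the height of the Harvard tower from the Pound–Rebka experiment:

```python
# Weak-field gravitational redshift: delta_f / f ≈ g*h / c^2.

g = 9.81     # surface gravity, m/s^2
c = 2.998e8  # speed of light, m/s
h = 22.5     # height of the Pound-Rebka tower, m

print(g * h / c**2)  # ~2.5e-15, the tiny fractional shift the experiment resolved
```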

Gravity and quantum mechanics

An open question is whether it is possible to describe the small-scale interactions of gravity with the same framework as quantum mechanics. General relativity describes large-scale bulk properties whereas quantum mechanics is the framework to describe the smallest scale interactions of matter. Without modifications these frameworks are incompatible.

One path is to describe gravity in the framework of quantum field theory (QFT), which has been successful in accurately describing the other fundamental interactions. The electromagnetic force arises from an exchange of virtual photons; the QFT description of gravity is, analogously, an exchange of virtual gravitons. This description reproduces general relativity in the classical limit. However, this approach fails at short distances of the order of the Planck length, where a more complete theory of quantum gravity (or a new approach to quantum mechanics) is required.

Specifics

Earth's gravity

An initially-stationary object that is allowed to fall freely under gravity drops a distance that is proportional to the square of the elapsed time. This image spans half a second and was captured at 20 flashes per second.

Every planetary body (including the Earth) is surrounded by its own gravitational field, which can be conceptualized with Newtonian physics as exerting an attractive force on all objects. Assuming a spherically symmetrical planet, the strength of this field at any given point above the surface is proportional to the planetary body's mass and inversely proportional to the square of the distance from the center of the body.

If an object with comparable mass to that of the Earth were to fall towards it, then the corresponding acceleration of the Earth would be observable.

The strength of the gravitational field is numerically equal to the acceleration of objects under its influence. The rate of acceleration of falling objects near the Earth's surface varies very slightly depending on latitude, surface features such as mountains and ridges, and perhaps unusually high or low sub-surface densities. For purposes of weights and measures, a standard gravity value is defined by the International Bureau of Weights and Measures, under the International System of Units (SI).

That value, denoted g, is g = 9.80665 m/s² (32.1740 ft/s²).

The standard value of 9.80665 m/s2 is the one originally adopted by the International Committee on Weights and Measures in 1901 for 45° latitude, even though it has been shown to be too high by about five parts in ten thousand. This value has persisted in meteorology and in some standard atmospheres as the value for 45° latitude even though it applies more precisely to latitude of 45°32'33".

Assuming the standardized value for g and ignoring air resistance, this means that an object falling freely near the Earth's surface increases its velocity by 9.80665 m/s (32.1740 ft/s or 22 mph) for each second of its descent. Thus, an object starting from rest will attain a velocity of 9.80665 m/s (32.1740 ft/s) after one second, approximately 19.62 m/s (64.4 ft/s) after two seconds, and so on, adding 9.80665 m/s (32.1740 ft/s) to each resulting velocity. Also, again ignoring air resistance, any and all objects, when dropped from the same height, will hit the ground at the same time.
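A minimal sketch of the arithmetic in the preceding paragraph, tabulating v = gt for the first few seconds of free fall:

```python
# Velocity of an object falling from rest, ignoring air resistance: v(t) = g*t.

g = 9.80665  # standard gravity, m/s^2

for t in range(1, 5):  # seconds of descent
    print(f"{t} s: {g * t:.2f} m/s")
# 1 s: 9.81 m/s, 2 s: 19.61 m/s, 3 s: 29.42 m/s, 4 s: 39.23 m/s
```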

According to Newton's 3rd Law, the Earth itself experiences a force equal in magnitude and opposite in direction to that which it exerts on a falling object. This means that the Earth also accelerates towards the object until they collide. Because the mass of the Earth is huge, however, the acceleration imparted to the Earth by this opposite force is negligible in comparison to the object's. If the object does not bounce after it has collided with the Earth, each of them then exerts a repulsive contact force on the other which effectively balances the attractive force of gravity and prevents further acceleration.
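A quick check of why the Earth's reaction is negligible: the mutual force is the same on both bodies, so each body's acceleration scales inversely with its mass (illustrative values):

```python
# Newton's third law: equal and opposite forces, very unequal accelerations.

G = 6.674e-11
earth_mass = 5.972e24   # kg
earth_radius = 6.371e6  # m
object_mass = 1.0       # kg, e.g. a dropped stone

force = G * earth_mass * object_mass / earth_radius**2  # ~9.8 N on each body
print(force / object_mass)  # the stone's acceleration: ~9.8 m/s^2
print(force / earth_mass)   # the Earth's acceleration: ~1.6e-24 m/s^2
```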

The force of gravity on Earth is the resultant (vector sum) of two forces: (a) the gravitational attraction in accordance with Newton's universal law of gravitation, and (b) the centrifugal force, which results from the choice of an earthbound, rotating frame of reference. The force of gravity is weakest at the equator because of the centrifugal force caused by the Earth's rotation and because points on the equator are furthest from the center of the Earth. The force of gravity varies with latitude and increases from about 9.780 m/s² at the Equator to about 9.832 m/s² at the poles.
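The size of the centrifugal term can be estimated directly; a small sketch assuming a spherical Earth and the sidereal rotation period:

```python
# Centrifugal acceleration at the equator: a = omega^2 * R.

import math

sidereal_day = 86164.1       # Earth's rotation period, s
omega = 2 * math.pi / sidereal_day
equatorial_radius = 6.378e6  # m

print(omega**2 * equatorial_radius)  # ~0.034 m/s^2
# Most of the ~0.052 m/s^2 equator-to-pole difference; the remainder comes
# from the equatorial bulge placing the equator farther from Earth's center.
```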

Equations for a falling body near the surface of the Earth

Under an assumption of constant gravitational attraction, Newton's law of universal gravitation simplifies to F = mg, where m is the mass of the body and g is a constant vector with an average magnitude of 9.81 m/s² on Earth. This resulting force is the object's weight. The acceleration due to gravity is equal to this g. An initially stationary object which is allowed to fall freely under gravity drops a distance which is proportional to the square of the elapsed time. The image on the right, spanning half a second, was captured with a stroboscopic flash at 20 flashes per second. During the first 1/20 of a second the ball drops one unit of distance (here, a unit is about 12 mm); by 2/20 it has dropped a total of 4 units; by 3/20, 9 units; and so on.
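A short sketch confirming the square-of-time pattern described above, with cumulative drops measured in units of the first interval's distance:

```python
# Free-fall distance from rest: d(t) = 0.5 * g * t^2, so cumulative drops at
# equal time intervals follow the squares 1, 4, 9, 16, ...

g = 9.80665  # m/s^2

d_first = 0.5 * g * (1 / 20) ** 2  # drop in the first 1/20 s: ~0.012 m (12 mm)
for n in range(1, 5):
    d = 0.5 * g * (n / 20) ** 2
    print(f"{n}/20 s: {round(d / d_first)} units")  # 1, 4, 9, 16
```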

Under the same constant gravity assumptions, the potential energy, Ep, of a body at height h is given by Ep = mgh (or Ep = Wh, with W meaning weight). This expression is valid only over small distances h from the surface of the Earth. Similarly, the expression for the maximum height reached by a vertically projected body with initial velocity v, h = v²/(2g), is useful for small heights and small initial velocities only.
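Both small-height formulas from this paragraph in one minimal sketch (illustrative inputs):

```python
# Constant-gravity approximations: Ep = m*g*h, and the maximum height of a
# body projected vertically at speed v: h_max = v^2 / (2*g).

g = 9.80665  # m/s^2

def potential_energy(m, h):
    return m * g * h       # joules; valid only for small heights h

def max_height(v):
    return v**2 / (2 * g)  # metres; valid only for small initial speeds

print(potential_energy(2.0, 10.0))  # 2 kg raised 10 m: ~196 J
print(max_height(20.0))             # thrown up at 20 m/s: ~20.4 m
```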

Gravity and astronomy

Gravity acts on stars that form the Milky Way.

The application of Newton's law of gravity has enabled the acquisition of much of the detailed information we have about the planets in the Solar System, the mass of the Sun, and details of quasars; even the existence of dark matter is inferred using Newton's law of gravity. Although we have not traveled to all the planets nor to the Sun, we know their masses. These masses are obtained by applying the laws of gravity to the measured characteristics of the orbit. In space an object maintains its orbit because of the force of gravity acting upon it. Planets orbit stars, stars orbit galactic centers, galaxies orbit a center of mass in clusters, and clusters orbit in superclusters. The force of gravity exerted on one object by another is directly proportional to the product of those objects' masses and inversely proportional to the square of the distance between them.
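The statement that masses are obtained from measured orbits can be made concrete with the Newtonian form of Kepler's third law: for a light body orbiting a much heavier one, M ≈ 4π²a³/(GT²). A sketch using the Earth's orbit to weigh the Sun (rounded values):

```python
# Central mass from an orbit's size and period: M ≈ 4*pi^2 * a^3 / (G * T^2),
# valid when the orbiting body is much lighter than the central one.

import math

G = 6.674e-11  # m^3 kg^-1 s^-2

def central_mass(a, T):
    return 4 * math.pi**2 * a**3 / (G * T**2)

au = 1.496e11   # Earth's orbital semi-major axis, m
year = 3.156e7  # Earth's orbital period, s

print(central_mass(au, year))  # ~2.0e30 kg, the mass of the Sun
```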

The earliest gravity (possibly in the form of quantum gravity, supergravity or a gravitational singularity), along with ordinary space and time, developed during the Planck epoch (up to 10⁻⁴³ seconds after the birth of the Universe), possibly from a primeval state (such as a false vacuum, quantum vacuum or virtual particle), in a currently unknown manner.

Gravitational radiation

The LIGO Hanford Observatory in Washington, US, where gravitational waves were first observed in September 2015.

General relativity predicts that energy can be transported out of a system through gravitational radiation. Any accelerating matter can create curvatures in the space-time metric, which is how gravitational radiation is transported away from the system. Co-orbiting objects, such as the Earth-Sun system, pairs of neutron stars, and pairs of black holes, can generate curvatures in space-time. Exploding supernovae are another astrophysical system predicted to lose energy in the form of gravitational radiation.

The first indirect evidence for gravitational radiation came through measurements of the Hulse–Taylor binary in 1973. This system consists of a pulsar and a neutron star in orbit around one another. Its orbital period has decreased since its initial discovery due to a loss of energy, which is consistent with the amount of energy lost to gravitational radiation. This research was awarded the Nobel Prize in Physics in 1993.

The first direct evidence for gravitational radiation was measured on 14 September 2015 by the LIGO detectors. The gravitational waves emitted during the collision of two black holes 1.3 billion light-years from Earth were measured. This observation confirmed the theoretical predictions of Einstein and others that such waves exist. It also opens the way for practical observation and understanding of the nature of gravity and events in the Universe, including the Big Bang. Neutron star and black hole formation also create detectable amounts of gravitational radiation. This research was awarded the Nobel Prize in Physics in 2017.

As of 2020, the gravitational radiation emitted by the Solar System is far too small to measure with current technology.

Speed of gravity

In December 2012, a research team in China announced that it had produced measurements of the phase lag of Earth tides during full and new moons which seem to prove that the speed of gravity is equal to the speed of light. This means that if the Sun suddenly disappeared, the Earth would keep orbiting the vacant point normally for 8 minutes, which is the time light takes to travel that distance. The team's findings were released in the Chinese Science Bulletin in February 2013.
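The 8-minute figure is simply the light travel time over one astronomical unit:

```python
# Light travel time from the Sun to the Earth: one astronomical unit / c.

au = 1.496e11  # m
c = 2.998e8    # m/s

print(au / c / 60)  # ~8.3 minutes
```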

In October 2017, the LIGO and Virgo detectors received gravitational wave signals within 2 seconds of gamma ray satellites and optical telescopes seeing signals from the same direction. This confirmed that the speed of gravitational waves was the same as the speed of light.

Anomalies and discrepancies

There are some observations that are not adequately accounted for, which may point to the need for better theories of gravity or perhaps be explained in other ways.

Rotation curve of a typical spiral galaxy: predicted (A) and observed (B). The discrepancy between the curves is attributed to dark matter.
  • Extra-fast stars: Stars in galaxies follow a distribution of velocities where stars on the outskirts are moving faster than they should according to the observed distributions of normal matter. Galaxies within galaxy clusters show a similar pattern. Dark matter, which would interact through gravitation but not electromagnetically, would account for the discrepancy. Various modifications to Newtonian dynamics have also been proposed.
  • Flyby anomaly: Various spacecraft have experienced greater acceleration than expected during gravity assist maneuvers.
  • Accelerating expansion: The metric expansion of space seems to be speeding up. Dark energy has been proposed to explain this. A recent alternative explanation is that the geometry of space is not homogeneous (due to clusters of galaxies) and that when the data are reinterpreted to take this into account, the expansion is not speeding up after all; however, this conclusion is disputed.
  • Anomalous increase of the astronomical unit: Recent measurements indicate that planetary orbits are widening faster than if this were solely through the Sun losing mass by radiating energy.
  • Extra energetic photons: Photons travelling through galaxy clusters should gain energy and then lose it again on the way out. The accelerating expansion of the Universe should stop the photons returning all the energy, but even taking this into account photons from the cosmic microwave background radiation gain twice as much energy as expected. This may indicate that gravity falls off faster than inverse-squared at certain distance scales.
  • Extra massive hydrogen clouds: The spectral lines of the Lyman-alpha forest suggest that hydrogen clouds are more clumped together at certain scales than expected and, like dark flow, may indicate that gravity falls off slower than inverse-squared at certain distance scales.

Mathematical universe hypothesis

From Wikipedia, the free encyclopedia

In physics and cosmology, the mathematical universe hypothesis (MUH), also known as the ultimate ensemble theory and struogony (from mathematical structure, Latin: struō), is a speculative "theory of everything" (TOE) proposed by cosmologist Max Tegmark.

Description

Tegmark's MUH is: Our external physical reality is a mathematical structure. That is, the physical universe is not merely described by mathematics, but is mathematics (specifically, a mathematical structure). Mathematical existence equals physical existence, and all structures that exist mathematically exist physically as well. Observers, including humans, are "self-aware substructures (SASs)". In any mathematical structure complex enough to contain such substructures, they "will subjectively perceive themselves as existing in a physically 'real' world".

The theory can be considered a form of Pythagoreanism or Platonism in that it proposes the existence of mathematical entities; a form of mathematical monism in that it denies that anything exists except mathematical objects; and a formal expression of ontic structural realism.

Tegmark claims that the hypothesis has no free parameters and is not observationally ruled out. Thus, he reasons, it is preferred over other theories-of-everything by Occam's Razor. Tegmark also considers augmenting the MUH with a second assumption, the computable universe hypothesis (CUH), which says that the mathematical structure that is our external physical reality is defined by computable functions.

The MUH is related to Tegmark's categorization of four levels of the multiverse. This categorization posits a nested hierarchy of increasing diversity, with worlds corresponding to different sets of initial conditions (level 1), physical constants (level 2), quantum branches (level 3), and altogether different equations or mathematical structures (level 4).

Reception

Andreas Albrecht of Imperial College in London called it a "provocative" solution to one of the central problems facing physics. Although he "wouldn't dare" go so far as to say he believes it, he noted that "it's actually quite difficult to construct a theory where everything we see is all there is".

Criticisms and responses

Definition of the ensemble

Jürgen Schmidhuber argues that "Although Tegmark suggests that '... all mathematical structures are a priori given equal statistical weight,' there is no way of assigning equal non-vanishing probability to all (infinitely many) mathematical structures." Schmidhuber puts forward a more restricted ensemble which admits only universe representations describable by constructive mathematics, that is, computer programs; e.g., the Global Digital Mathematics Library and Digital Library of Mathematical Functions, linked open data representations of formalized fundamental theorems intended to serve as building blocks for additional mathematical results. He explicitly includes universe representations describable by non-halting programs whose output bits converge after finite time, although the convergence time itself may not be predictable by a halting program, due to the undecidability of the halting problem.
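As a toy illustration of that last idea (a sketch, not Schmidhuber's own construction): the following program never halts, yet each digit of its output eventually stops changing, so the representation it describes converges after finite time even though no step certifies that a digit is final.

```python
# A non-halting program whose output converges: bisection for sqrt(2).
# Each printed digit eventually stabilizes, though the program never stops.

def sqrt2_approximations():
    lo, hi = 1.0, 2.0
    while True:                 # runs forever
        mid = (lo + hi) / 2
        if mid * mid < 2:
            lo = mid
        else:
            hi = mid
        yield f"{mid:.10f}"

approx = sqrt2_approximations()
for _ in range(30):
    print(next(approx))         # leading digits settle on 1.41421356...
```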

In response, Tegmark notes (sec. V.E) that a measure of free parameter variations of physical dimensions, constants, and laws over all universes, formalized in constructive mathematics, has not yet been constructed for the string theory landscape either, so this should not be regarded as a "show-stopper".

Consistency with Gödel's theorem

It has also been suggested that the MUH is inconsistent with Gödel's incompleteness theorem. In a three-way debate between Tegmark and fellow physicists Piet Hut and Mark Alford, the "secularist" (Alford) states that "the methods allowed by formalists cannot prove all the theorems in a sufficiently powerful system... The idea that math is 'out there' is incompatible with the idea that it consists of formal systems."

Tegmark's response (sec. VI.A.1) is to offer a new hypothesis "that only Gödel-complete (fully decidable) mathematical structures have physical existence. This drastically shrinks the Level IV multiverse, essentially placing an upper limit on complexity, and may have the attractive side effect of explaining the relative simplicity of our universe." Tegmark goes on to note that although conventional theories in physics are Gödel-undecidable, the actual mathematical structure describing our world could still be Gödel-complete, and "could in principle contain observers capable of thinking about Gödel-incomplete mathematics, just as finite-state digital computers can prove certain theorems about Gödel-incomplete formal systems like Peano arithmetic." In sec. VII he gives a more detailed response, proposing as an alternative to MUH the more restricted "Computable Universe Hypothesis" (CUH), which only includes mathematical structures that are simple enough that Gödel's theorem does not require them to contain any undecidable or uncomputable theorems. Tegmark admits that this approach faces "serious challenges", including (a) it excludes much of the mathematical landscape; (b) the measure on the space of allowed theories may itself be uncomputable; and (c) "virtually all historically successful theories of physics violate the CUH".

Observability

Stoeger, Ellis, and Kirchner note that in a true multiverse theory, "the universes are then completely disjoint and nothing that happens in any one of them is causally linked to what happens in any other one. This lack of any causal connection in such multiverses really places them beyond any scientific support". Ellis specifically criticizes the MUH, stating that an infinite ensemble of completely disconnected universes is "completely untestable, despite hopeful remarks sometimes made, see, e.g., Tegmark (1998)." Tegmark maintains that MUH is testable, stating that it predicts (a) that "physics research will uncover mathematical regularities in nature", and (b) by assuming that we occupy a typical member of the multiverse of mathematical structures, one could "start testing multiverse predictions by assessing how typical our universe is".

Plausibility of radical Platonism

The MUH is based on the radical Platonist view that math is an external reality (sec. V.C). However, Jannes argues that "mathematics is at least in part a human construction", on the basis that if it is an external reality, then it should be found in some other animals as well: "Tegmark argues that, if we want to give a complete description of reality, then we will need a language independent of us humans, understandable for non-human sentient entities, such as aliens and future supercomputers". Brian Greene argues similarly: "The deepest description of the universe should not require concepts whose meaning relies on human experience or interpretation. Reality transcends our existence and so shouldn't, in any fundamental way, depend on ideas of our making."

However, Jannes notes, there are many non-human entities, plenty of which are intelligent, and many of which can apprehend, memorise, compare and even approximately add numerical quantities. Several animals have also passed the mirror test of self-consciousness. But a few surprising examples of mathematical abstraction notwithstanding (for example, chimpanzees can be trained to carry out symbolic addition with digits, or the report of a parrot understanding a "zero-like concept"), all examples of animal intelligence with respect to mathematics are limited to basic counting abilities. He adds, "non-human intelligent beings should exist that understand the language of advanced mathematics. However, none of the non-human intelligent beings that we know of confirm the status of (advanced) mathematics as an objective language." In the paper "On Math, Matter and Mind" the secularist viewpoint examined argues (sec. VI.A) that math is evolving over time, that there is "no reason to think it is converging to a definite structure, with fixed questions and established ways to address them", and also that "The Radical Platonist position is just another metaphysical theory like solipsism... In the end the metaphysics just demands that we use a different language for saying what we already knew." Tegmark responds (sec. VI.A.1) that "The notion of a mathematical structure is rigorously defined in any book on Model Theory", and that non-human mathematics would only differ from our own "because we are uncovering a different part of what is in fact a consistent and unified picture, so math is converging in this sense." In his 2014 book on the MUH, Tegmark argues that the resolution is not that we invent the language of mathematics, but that we discover the structure of mathematics.

Coexistence of all mathematical structures

Don Page has argued that "At the ultimate level, there can be only one world and, if mathematical structures are broad enough to include all possible worlds or at least our own, there must be one unique mathematical structure that describes ultimate reality. So I think it is logical nonsense to talk of Level 4 in the sense of the co-existence of all mathematical structures." This means there can only be one mathematical corpus. Tegmark responds that "this is less inconsistent with Level IV than it may sound, since many mathematical structures decompose into unrelated substructures, and separate ones can be unified."

Consistency with our "simple universe"

Alexander Vilenkin comments (Ch. 19, p. 203) that "the number of mathematical structures increases with increasing complexity, suggesting that 'typical' structures should be horrendously large and cumbersome. This seems to be in conflict with the beauty and simplicity of the theories describing our world". He goes on to note (footnote 8, p. 222) that Tegmark's solution to this problem, the assigning of lower "weights" to the more complex structures seems arbitrary ("Who determines the weights?") and may not be logically consistent ("It seems to introduce an additional mathematical structure, but all of them are supposed to be already included in the set").

Occam's razor

Tegmark has been criticized as misunderstanding the nature and application of Occam's razor; Massimo Pigliucci reminds us that "Occam's razor is just a useful heuristic, it should never be used as the final arbiter to decide which theory is to be favored".

Entropy (information theory)

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Entropy_(information_theory) In info...