Saturday, October 3, 2020

Freedom of choice

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Freedom_of_choice

Freedom of choice describes an individual's opportunity and autonomy to perform an action selected from at least two available options, unconstrained by external parties.

In politics

In the abortion debate, for example, the term "freedom of choice" may emerge in defense of the position that a woman has a right to determine whether she will proceed with or terminate a pregnancy.  Similarly, other topics such as euthanasia, vaccination, contraception and same-sex marriage are sometimes discussed in terms of an assumed individual right of "freedom of choice". Some social issues, for example the New York "Soda Ban" have been both defended and opposed with reference to "freedom of choice".

In economics

The freedom of choice of which brand and flavor of soda to buy is related to market competition.

In microeconomics, freedom of choice is the freedom of economic agents to allocate their resources as they see fit, among the options (such as goods, services, or assets) that are available to them. It includes the freedom to engage in employment available to them.

Ratner et al., in 2008, cited the literature on libertarian paternalism which states that consumers do not always act in their own best interests. They attribute this phenomenon to factors such as emotion, cognitive limitations and biases, and incomplete information which they state may be remedied by various proposed interventions. They discuss providing consumers with information and decision tools, organizing and restricting their market options, and tapping emotions and managing expectations. Each of these, they state, could improve consumers' ability to choose.

However, economic freedom to choose ultimately depends upon market competition, since buyers' available options are usually the result of various factors controlled by sellers, such as overall quality of a product or a service and advertisement. In the event that a monopoly exists, the consumer no longer has the freedom to choose to buy from a different producer. As Friedrich Hayek pointed out:

Our freedom of choice in a competitive society rests on the fact that, if one person refuses to satisfy our wishes, we can turn to another. But if we face a monopolist we are at his absolute mercy...

— Friedrich Hayek, The Road to Serfdom, "Can planning free us from care?"

As exemplified in the above quote, libertarian thinkers are often strong advocates for increasing freedom of choice. One example of this is Milton Friedman's Free to Choose book and TV series.

There is no consensus as to whether an increase in economic freedom of choice leads to an increase in happiness. In one study, the Heritage Foundation's 2011 Index of Economic Freedom report showed a strong correlation between its Index of Economic Freedom and happiness in a country.

Measuring freedom of choice

The axiomatic-deductive approach has been used to address the issue of measuring the amount of freedom of choice (FoC) an individual enjoys. In a 1990 paper, Prasanta K. Pattanaik and Yongsheng Xu presented three conditions that a measurement of FoC should satisfy:

  1. Indifference between no-choice situations. Having only one option amounts to the same FoC, no matter what the option is.
  2. Strict monotonicity. Having two distinct options x and y amounts to more FoC than having only the option x.
  3. Independence. If a situation A has more FoC than B, by adding a new option x to both (not contained in A or B), A will still have more FoC than B.

They proved that cardinality is the only measurement that satisfies these axioms, a result they observed to be counter-intuitive, suggesting that one or more of the axioms should be reformulated. They illustrated this with the example that the option set "to travel by train" or "to travel by car" should yield more FoC than the option set "to travel by red car" or "to travel by blue car". Some suggestions have been made to solve this problem by reformulating the axioms, usually by incorporating concepts of preferences, or by rejecting the third axiom.
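The three axioms and the cardinality result can be sketched concretely. This is an illustrative toy, not Pattanaik and Xu's formalism: the `foc` function simply counts distinct options, and the assertions check the axioms on small example sets.

```python
# A minimal sketch of the cardinality measure of freedom of choice (FoC).
# It satisfies all three axioms, yet ranks intuitively different option
# sets as equal -- the counter-intuitive result discussed above.

def foc(options):
    """Cardinality measure: FoC is simply the number of distinct options."""
    return len(set(options))

# Axiom 1: indifference between no-choice situations.
assert foc({"bread"}) == foc({"caviar"})

# Axiom 2: strict monotonicity.
assert foc({"x", "y"}) > foc({"x"})

# Axiom 3: independence (adding the same new option preserves the ranking).
a, b = {"p", "q", "r"}, {"p"}
assert foc(a) > foc(b)
assert foc(a | {"new"}) > foc(b | {"new"})

# The counter-intuitive consequence: both sets below score the same,
# although the first offers a genuinely different mode of travel.
print(foc({"train", "car"}))          # 2
print(foc({"red car", "blue car"}))   # 2
```

Any measure that repairs the train/car example must break at least one of the three assertions above, which is why the proposed fixes reformulate an axiom or bring in preferences.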

Relationship with happiness

A 2006 study by Simona Botti and Ann L. McGill showed that, when subjects were presented with differentiated options and had the freedom to choose between them, their choice enhanced their satisfaction with positive and dissatisfaction with negative outcomes, relative to nonchoosers.

A 2010 study by Hazel Rose Markus and Barry Schwartz compiled a list of experiments about freedom of choice and argued that "too much choice can produce a paralyzing uncertainty, depression, and selfishness". Schwartz argues that people frequently experience regret due to the opportunity costs of not making an optimal decision and that, in some scenarios, people's overall satisfaction is higher when a difficult decision is made by another person rather than by themselves, even when the other person's choice is worse. Schwartz has written a book and given speeches criticizing the excess of options in modern society, though acknowledging that "some choice is better than none".

Many-worlds interpretation

From Wikipedia, the free encyclopedia
 
The quantum-mechanical "Schrödinger's cat" paradox according to the Many-Worlds interpretation. In this interpretation, every quantum event is a branch point; the cat is both alive and dead, even before the box is opened, but the "alive" and "dead" cats are in different branches of the universe, both of which are equally real, but which do not interact with each other.

The many-worlds interpretation (MWI) is an interpretation of quantum mechanics that asserts that the universal wavefunction is objectively real, and that there is no wavefunction collapse. This implies that all possible outcomes of quantum measurements are physically realized in some "world" or universe. In contrast to some other interpretations, such as the Copenhagen interpretation, the evolution of reality as a whole in MWI is rigidly deterministic. Many-worlds is also called the relative state formulation or the Everett interpretation, after physicist Hugh Everett, who first proposed it in 1957. Bryce DeWitt popularized the formulation and named it many-worlds in the 1960s and 1970s.

In many-worlds, the subjective appearance of wavefunction collapse is explained by the mechanism of quantum decoherence. Decoherence approaches to interpreting quantum theory have been widely explored and developed since the 1970s, and have become quite popular. MWI is now considered a mainstream interpretation along with the other decoherence interpretations, collapse theories (including the Copenhagen interpretation), and hidden variable theories such as Bohmian mechanics.

The many-worlds interpretation implies that there are very many universes, perhaps infinitely many. It is one of many multiverse hypotheses in physics and philosophy. MWI views time as a many-branched tree, wherein every possible quantum outcome is realised. This is intended to resolve some paradoxes of quantum theory, such as the EPR paradox and Schrödinger's cat, since every possible outcome of a quantum event exists in its own universe.

History

In 1952 Erwin Schrödinger gave a lecture in Dublin in which at one point he jocularly warned his audience that what he was about to say might "seem lunatic". He went on to assert that, while the equation that won him a Nobel Prize seems to describe several different histories, these are "not alternatives but all really happen simultaneously". This is the earliest known reference to many-worlds.

MWI originated in Everett's Princeton Ph.D. thesis "The Theory of the Universal Wavefunction", developed under his thesis advisor John Archibald Wheeler, a shorter summary of which was published in 1957 under the title "Relative State Formulation of Quantum Mechanics" (Wheeler contributed the title "relative state"; Everett originally called his approach the "Correlation Interpretation", where "correlation" refers to quantum entanglement). The phrase "many-worlds" is due to Bryce DeWitt, who was responsible for the wider popularisation of Everett's theory, which was largely ignored for a decade after publication.

Overview of the interpretation

The key idea of the many-worlds interpretation is that unitary quantum mechanics describes the whole universe. In particular, it describes a measurement as a unitary transformation, without using a collapse postulate, and describes observers as ordinary quantum-mechanical systems. This stands in sharp contrast to the Copenhagen interpretation, in which a measurement is a "primitive" concept not describable by quantum mechanics, the universe is divided into a quantum and a classical domain, and the collapse postulate is central. MWI's main conclusion is that the universe (or multiverse, in this context) is composed of a quantum superposition of an infinite or undefinable number of increasingly divergent, non-communicating parallel universes or quantum worlds.

The many-worlds interpretation makes essential use of decoherence to explain the measurement process and the emergence of a quasi-classical world. Wojciech H. Zurek, one of decoherence theory's pioneers, stated: "Under scrutiny of the environment, only pointer states remain unchanged. Other states decohere into mixtures of stable pointer states that can persist, and, in this sense, exist: They are einselected." Żurek emphasizes that his work does not depend on a particular interpretation.

The many-worlds interpretation shares many similarities with the decoherent histories interpretation, which also uses decoherence to explain the process of measurement or wavefunction collapse. MWI treats the other histories or worlds as real, since it regards the universal wavefunction as the "basic physical entity" or "the fundamental entity, obeying at all times a deterministic wave equation". Decoherent histories, on the other hand, needs only one of the histories (or worlds) to be real.

Several authors, including Wheeler, Everett and Deutsch, call many-worlds a theory, rather than just an interpretation. Everett argued that it was the "only completely coherent approach to explaining both the contents of quantum mechanics and the appearance of the world." Deutsch dismissed the idea that many-worlds is an "interpretation", saying that to call it that "is like talking about dinosaurs as an 'interpretation' of fossil records."

Formulation

In Everett's formulation, a measuring apparatus M and an object system S form a composite system, each of which prior to measurement exists in well-defined (but time-dependent) states. Measurement is regarded as causing M and S to interact. After S interacts with M, it is no longer possible to describe either system by an independent state. According to Everett, the only meaningful descriptions of each system are relative states: for example the relative state of S given the state of M or the relative state of M given the state of S. In DeWitt's formulation, the state of S after a sequence of measurements is given by a quantum superposition of states, each one corresponding to an alternative measurement history of S.

Schematic illustration of splitting as a result of a repeated measurement.

For example, consider the smallest possible truly quantum system S, as shown in the illustration. This describes for instance, the spin-state of an electron. Considering a specific axis (say the z-axis) the north pole represents spin "up" and the south pole, spin "down". The superposition states of the system are described by a sphere called the Bloch sphere. To perform a measurement on S, it is made to interact with another similar system M. After the interaction, the combined system can be regarded as a quantum superposition of two "alternative histories" of the original system S, one in which "up" was observed and the other in which "down" was observed. Each subsequent binary measurement (that is interaction with a system M) causes a similar split in the history tree. Thus after three measurements, the system can be regarded as a quantum superposition of 8 = 2 × 2 × 2 copies of the original system S.
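The branching bookkeeping described above can be sketched in a few lines. This is purely illustrative: it enumerates the history tree for binary ("up"/"down") measurements without modeling amplitudes or decoherence.

```python
# A toy sketch of the splitting picture: each binary measurement splits
# every existing history into an "up" branch and a "down" branch, so
# n measurements yield 2**n histories.

from itertools import product

def histories(n_measurements):
    """Enumerate all measurement histories after n binary measurements."""
    return [list(h) for h in product(["up", "down"], repeat=n_measurements)]

branches = histories(3)
print(len(branches))   # 8 = 2 * 2 * 2 copies of the original system S
print(branches[0])     # ['up', 'up', 'up']
```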

Relative state

In his 1957 doctoral dissertation, Everett proposed that rather than modeling an isolated quantum system subject to external observation, one could mathematically model an object as well as its observers as purely physical systems within the mathematical framework developed by Paul Dirac, John von Neumann and others, discarding altogether the ad hoc mechanism of wave function collapse.

Since Everett's original work, a number of similar formalisms have appeared in the literature. One is the relative state formulation. It makes two assumptions: first, the wavefunction is not simply a description of the object's state, but is entirely equivalent to the object—a claim it has in common with some other interpretations. Second, observation or measurement has no special laws or mechanics, unlike in the Copenhagen interpretation, which considers the wavefunction collapse a special kind of event that occurs as a result of observation. Instead, measurement in the relative state formulation is the consequence of a configuration change in an observer's memory described by the same basic wave physics as the object being modeled.

The many-worlds interpretation is DeWitt's popularisation of Everett, who had referred to the combined observer–object system as split by an observation, each split corresponding to the different or multiple possible outcomes of an observation. These splits generate a tree, as shown in the graphic above. Subsequently, DeWitt introduced the term "world" to describe a complete measurement history of an observer, which corresponds roughly to a single branch of that tree.

Under the many-worlds interpretation, the Schrödinger equation, or relativistic analog, holds all the time everywhere. An observation or measurement is modeled by applying the wave equation to the entire system comprising the observer and the object. One consequence is that every observation can be thought of as causing the combined observer–object's wavefunction to change into a quantum superposition of two or more non-interacting branches, or split into many "worlds". Since many observation-like events have happened and are constantly happening, there are an enormous and growing number of simultaneously existing states.

If a system is composed of two or more subsystems, the system's state will be a superposition of products of the subsystems' states. Each product of subsystem states in the overall superposition evolves over time independently of other products. Once the subsystems interact, their states have become correlated or entangled and can no longer be considered independent. In Everett's terminology each subsystem state was now correlated with its relative state, since each subsystem must now be considered relative to the other subsystems with which it has interacted.
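The distinction between independent (product) subsystem states and correlated (entangled) ones can be checked numerically. A minimal sketch, using the standard Schmidt-rank test for a two-qubit state (rank 1 means the state factorizes into a product; rank greater than 1 means the subsystems are entangled); the state names here are illustrative.

```python
# Product states factorize into a tensor product of subsystem states;
# after interaction the combined state may be entangled, i.e. not
# factorizable. The rank of the reshaped state vector (Schmidt rank)
# distinguishes the two cases.

import numpy as np

def schmidt_rank(state_2qubit):
    """Rank 1 means a product state; rank > 1 means entanglement."""
    return np.linalg.matrix_rank(state_2qubit.reshape(2, 2))

up = np.array([1.0, 0.0])
down = np.array([0.0, 1.0])

product_state = np.kron(up, down)                               # |up>|down>
bell_state = (np.kron(up, up) + np.kron(down, down)) / np.sqrt(2)

print(schmidt_rank(product_state))  # 1: subsystems still independent
print(schmidt_rank(bell_state))     # 2: each state is now relative to the other
```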

Properties

MWI removes the observer-dependent role in the quantum measurement process by replacing wavefunction collapse with quantum decoherence. Since the observer's role lies at the heart of most if not all "quantum paradoxes," this automatically resolves a number of problems, such as Schrödinger's cat thought experiment, the EPR paradox, von Neumann's "boundary problem", and even wave-particle duality.

Since the Copenhagen interpretation requires the existence of a classical domain beyond the one described by quantum mechanics, it has been criticized as inadequate for the study of cosmology. MWI was developed with the explicit goal of allowing quantum mechanics to be applied to the universe as a whole, making quantum cosmology possible.

MWI is a realist, deterministic, and local theory. It achieves this by removing wavefunction collapse, which is indeterministic and non-local, from the deterministic and local equations of quantum theory.

MWI (like other, broader multiverse theories) provides a context for the anthropic principle, which may provide an explanation for the fine-tuned universe.

MWI depends crucially on the linearity of quantum mechanics. If the final theory of everything is non-linear with respect to wavefunctions, then many-worlds is invalid. While quantum gravity or string theory may be non-linear in this respect, there is no evidence of this as yet.

Interpreting wavefunction collapse

As with the other interpretations of quantum mechanics, the many-worlds interpretation is motivated by behavior that can be illustrated by the double-slit experiment. When particles of light (or anything else) pass through the double slit, a calculation assuming wavelike behavior of light can be used to identify where the particles are likely to be observed. Yet when the particles are observed in this experiment, they appear as particles (i.e., at definite places) and not as non-localized waves.
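The "calculation assuming wavelike behavior" mentioned above amounts to summing the complex amplitudes contributed by the two slits. A minimal numerical sketch, with illustrative (not experimentally specific) geometry:

```python
# Summing the complex amplitudes from two slits predicts where particles
# are likely to be observed on the screen. Geometry is illustrative:
# wavelength wl, slit separation d, screen distance L.

import numpy as np

wl, d, L = 500e-9, 20e-6, 1.0          # metres
x = np.linspace(-0.1, 0.1, 5)          # positions on the screen

# Path lengths from each slit to each screen position.
r1 = np.sqrt(L**2 + (x - d / 2) ** 2)
r2 = np.sqrt(L**2 + (x + d / 2) ** 2)

k = 2 * np.pi / wl
amplitude = np.exp(1j * k * r1) + np.exp(1j * k * r2)
probability = np.abs(amplitude) ** 2   # interference pattern on the screen

# At the central maximum the two paths are equal, so the amplitudes add
# constructively and the probability is 4x that of a single slit.
print(round(probability[2], 3))        # 4.0
```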

Some versions of the Copenhagen interpretation of quantum mechanics proposed a process of "collapse" in which an indeterminate quantum system would probabilistically collapse down onto, or select, just one determinate outcome to "explain" this phenomenon of observation. Wavefunction collapse was widely regarded as artificial and ad hoc, so an alternative interpretation in which the behavior of measurement could be understood from more fundamental physical principles was considered desirable.

Everett's Ph.D. work provided such an interpretation. He argued that for a composite system—such as a subject (the "observer" or measuring apparatus) observing an object (the "observed" system, such as a particle)—the claim that either the observer or the observed has a well-defined state is meaningless; in modern parlance, the observer and the observed have become entangled: we can only specify the state of one relative to the other, i.e., the state of the observer and the observed are correlated after the observation is made. This led Everett to derive from the unitary, deterministic dynamics alone (i.e., without assuming wavefunction collapse) the notion of a relativity of states.

Everett noticed that the unitary, deterministic dynamics alone entailed that after an observation is made each element of the quantum superposition of the combined subject–object wavefunction contains two "relative states": a "collapsed" object state and an associated observer who has observed the same collapsed outcome; what the observer sees and the state of the object have become correlated by the act of measurement or observation. The subsequent evolution of each pair of relative subject–object states proceeds with complete indifference as to the presence or absence of the other elements, as if wavefunction collapse has occurred, which has the consequence that later observations are always consistent with the earlier observations. Thus the appearance of the object's wavefunction's collapse has emerged from the unitary, deterministic theory itself. (This answered Einstein's early criticism of quantum theory, that the theory should define what is observed, not for the observables to define the theory.) Since the wavefunction merely appears to have collapsed then, Everett reasoned, there was no need to actually assume that it had collapsed. And so, invoking Occam's razor, he removed the postulate of wavefunction collapse from the theory.

Testability

In 1985, David Deutsch proposed a variant of the Wigner's friend thought experiment as a test of many-worlds versus the Copenhagen interpretation. It consists of an experimenter (Wigner's friend) making a measurement on a quantum system in an isolated laboratory, and another experimenter (Wigner) who would make a measurement on the first one. According to the many-worlds theory, the first experimenter would end up in a macroscopic superposition of seeing one result of the measurement in one branch, and another result in another branch. The second experimenter could then interfere these two branches in order to test whether it is in fact in a macroscopic superposition or has collapsed into a single branch, as predicted by the Copenhagen interpretation. Since then Lockwood (1989), Vaidman and others have made similar proposals. These proposals require placing macroscopic objects in a coherent superposition and interfering them, a task now beyond experimental capability.

Probability and the Born rule

Since the many-worlds interpretation's inception, physicists have been puzzled about the role of probability in it. As put by Wallace, there are two facets to the question: the incoherence problem, which asks why we should assign probabilities at all to outcomes that are certain to occur in some worlds, and the quantitative problem, which asks why the probabilities should be given by the Born rule.

Everett tried to answer these questions in the paper that introduced many-worlds. To address the incoherence problem, he argued that an observer who makes a sequence of measurements on a quantum system will in general have an apparently random sequence of results in their memory, which justifies the use of probabilities to describe the measurement process. To address the quantitative problem, Everett proposed a derivation of the Born rule based on the properties that a measure on the branches of the wavefunction should have. His derivation has been criticized as relying on unmotivated assumptions. Since then several other derivations of the Born rule in the many-worlds framework have been proposed. There is no consensus on whether this has been successful.
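The quantitative problem can be made concrete with a qubit whose amplitudes are unequal. In the sketch below (illustrative, not any particular derivation), naive branch counting assigns each outcome weight 1/2, while the Born rule weights outcomes by squared amplitudes; a simulated long run of measurements tracks the Born weights, which is the empirical pattern any derivation must recover.

```python
# Contrast branch counting with the Born rule for a qubit
# |psi> = alpha|0> + beta|1> with |alpha|^2 = 0.8.

import random

alpha_sq = 0.8                     # Born weight of outcome "0"
random.seed(0)                     # reproducible run

n = 100_000
freq_0 = sum(random.random() < alpha_sq for _ in range(n)) / n

print("branch-counting weight:", 0.5)
print("Born weight:", alpha_sq)
print("simulated frequency of outcome 0:", round(freq_0, 2))  # ~0.8
```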

Frequentism

DeWitt and Graham and Farhi et al., among others, have proposed derivations of the Born rule based on a frequentist interpretation of probability. They try to show that in the limit of infinitely many measurements no worlds would have relative frequencies that didn't match the probabilities given by the Born rule, but these derivations have been shown to be mathematically incorrect.

Decision theory

A decision-theoretic derivation of the Born rule was produced by David Deutsch (1999) and refined by Wallace (2002–2009) and Saunders (2004). They consider an agent who takes part in a quantum gamble: the agent makes a measurement on a quantum system, branches as a consequence, and each of the agent's future selves receives a reward that depends on the measurement result. The agent uses decision theory to evaluate the price they would pay to take part in such a gamble, and concludes that the price is given by the utility of the rewards weighted according to the Born rule. Some reviews have been positive, although these arguments remain highly controversial; some theoretical physicists have taken them as supporting the case for parallel universes. For example, a New Scientist story on a 2007 conference about Everettian interpretations quoted physicist Andy Albrecht as saying, "This work will go down as one of the most important developments in the history of science." In contrast, the philosopher Huw Price, also attending the conference, found the Deutsch–Wallace–Saunders approach fundamentally flawed.

Symmetries and invariance

Żurek (2005) has produced a derivation of the Born rule based on the symmetries of entangled states; Schlosshauer and Fine argue that Żurek's derivation is not rigorous, as it does not define what probability is and has several unstated assumptions about how it should behave.

Charles Sebens and Sean M. Carroll, building on work by Lev Vaidman, proposed a similar approach based on self-locating uncertainty. In this approach, decoherence creates multiple identical copies of observers, who can assign credences to being on different branches using the Born rule. The Sebens–Carroll approach has been criticized by Adrian Kent, and Vaidman himself does not find it satisfactory.

The preferred basis problem

As originally formulated by Everett and DeWitt, the many-worlds interpretation had a privileged role for measurements: they determined which basis of a quantum system would give rise to the eponymous worlds. Without this the theory was ambiguous, as a quantum state can equally well be described (e.g.) as having a well-defined position or as being a superposition of two delocalised states. The assumption that the preferred basis to use is the one from a measurement of position results in worlds having objects in well-defined positions, instead of worlds with delocalised objects (which would be grossly incompatible with experiment). This special role for measurements is problematic for the theory, as it contradicts Everett and DeWitt's goal of having a reductionist theory and undermines their criticism of the ill-defined measurement postulate of the Copenhagen interpretation. This is known today as the preferred basis problem.

The preferred basis problem has been solved, according to Saunders and Wallace, among others, by incorporating decoherence in the many-worlds theory. In this approach, the preferred basis does not have to be postulated, but rather is identified as the basis stable under environmental decoherence. In this way measurements no longer play a special role; rather, any interaction that causes decoherence causes the world to split. Since decoherence is never complete, there will always remain some infinitesimal overlap between two worlds, making it arbitrary whether a pair of worlds has split or not. Wallace argues that this is not problematic: it only shows that worlds are not a part of the fundamental ontology, but rather of the emergent ontology, where these approximate, effective descriptions are routine in the physical sciences. Since in this approach the worlds are derived, it follows that they must be present in any other interpretation of quantum mechanics that does not have a collapse mechanism, such as Bohmian mechanics.

This approach to deriving the preferred basis has been criticized as creating a circularity with derivations of probability in the many-worlds interpretation, as decoherence theory depends on probability, and probability depends on the ontology derived from decoherence. Wallace contends that decoherence theory depends not on probability but only on the notion that one is allowed to do approximations in physics.

Reception

MWI's initial reception was overwhelmingly negative, with the notable exception of DeWitt. Wheeler made considerable efforts to formulate the theory in a way that would be palatable to Bohr, visited Copenhagen in 1956 to discuss it with him, and convinced Everett to visit as well, which happened in 1959. Nevertheless, Bohr and his collaborators completely rejected the theory. Everett left academia in 1956, never to return, and Wheeler eventually disavowed the theory.

One of MWI's strongest advocates is David Deutsch. According to Deutsch, the single photon interference pattern observed in the double slit experiment can be explained by interference of photons in multiple universes. Viewed this way, the single photon interference experiment is indistinguishable from the multiple photon interference experiment. In a more practical vein, in one of the earliest papers on quantum computing, he suggested that parallelism that results from MWI could lead to "a method by which certain probabilistic tasks can be performed faster by a universal quantum computer than by any classical restriction of it". Deutsch has also proposed that MWI will be testable (at least against "naive" Copenhagenism) when reversible computers become conscious via the reversible observation of spin.

Asher Peres was an outspoken critic of MWI. A section of his 1993 textbook had the title Everett's interpretation and other bizarre theories. Peres argued that the various many-worlds interpretations merely shift the arbitrariness or vagueness of the collapse postulate to the question of when "worlds" can be regarded as separate, and that no objective criterion for that separation can actually be formulated.

Some consider MWI unfalsifiable and hence unscientific because the multiple parallel universes are non-communicating, in the sense that no information can be passed between them. Others claim MWI is directly testable.

Victor J. Stenger remarked that Murray Gell-Mann's published work explicitly rejects the existence of simultaneous parallel universes. Collaborating with James Hartle, Gell-Mann had been, before his death, working toward the development of a more "palatable" post-Everett quantum mechanics. Stenger thought it fair to say that most physicists dismiss the many-worlds interpretation as too extreme, while noting it "has merit in finding a place for the observer inside the system being analyzed and doing away with the troublesome notion of wave function collapse".

Philosophers of science James Ladyman and Don Ross state that the MWI could be true, but that they do not embrace it. They note that no quantum theory is yet empirically adequate for describing all of reality, given its lack of unification with general relativity, and so they do not see a reason to regard any interpretation of quantum mechanics as the final word in metaphysics. They also suggest that the multiple branches may be an artifact of incomplete descriptions and of using quantum mechanics to represent the states of macroscopic objects. They argue that macroscopic objects are significantly different from microscopic objects in not being isolated from the environment, and that using quantum formalism to describe them lacks explanatory and descriptive power and accuracy.

Polls

A poll of 72 "leading quantum cosmologists and other quantum field theorists" conducted before 1991 by L. David Raub showed 58% agreement with "Yes, I think MWI is true".

Max Tegmark reports the result of a "highly unscientific" poll taken at a 1997 quantum mechanics workshop. According to Tegmark, "The many worlds interpretation (MWI) scored second, comfortably ahead of the consistent histories and Bohm interpretations."

In response to Sean M. Carroll's statement "As crazy as it sounds, most working physicists buy into the many-worlds theory", Michael Nielsen counters: "at a quantum computing conference at Cambridge in 1998, a many-worlder surveyed the audience of approximately 200 people... Many-worlds did just fine, garnering support on a level comparable to, but somewhat below, Copenhagen and decoherence." But Nielsen notes that it seemed most attendees found it to be a waste of time: Peres "got a huge and sustained round of applause…when he got up at the end of the polling and asked 'And who here believes the laws of physics are decided by a democratic vote?'"

A 2005 poll of fewer than 40 students and researchers, taken after a course on the interpretation of quantum mechanics at the Institute for Quantum Computing, University of Waterloo, found "Many Worlds (and decoherence)" to be the least favored.

A 2011 poll of 33 participants at an Austrian conference found 6 endorsed MWI, 8 "Information-based/information-theoretical", and 14 Copenhagen; the authors remark that MWI received a similar percentage of votes as in Tegmark's 1997 poll.

Debate whether the other worlds are real

Everett believed in the literal reality of the other quantum worlds. His son reported that he "never wavered in his belief over his many-worlds theory".

According to Martin Gardner, the "other" worlds of MWI have two different interpretations: real or unreal; he claimed that Stephen Hawking and Steven Weinberg both favour the unreal interpretation. Gardner also claimed that most physicists favour the unreal interpretation, whereas the "realist" view is supported only by MWI experts such as Deutsch and DeWitt. Hawking has said that "according to Feynman's idea", all other histories are as "equally real" as our own,  and Gardner reports Hawking saying that MWI is "trivially true". In a 1983 interview, Hawking also said he regarded MWI as "self-evidently correct" but was dismissive of questions about the interpretation of quantum mechanics, saying, "When I hear of Schrödinger's cat, I reach for my gun." In the same interview, he also said, "But, look: All that one does, really, is to calculate conditional probabilities—in other words, the probability of A happening, given B. I think that that's all the many worlds interpretation is. Some people overlay it with a lot of mysticism about the wave function splitting into different parts. But all that you're calculating is conditional probabilities." Elsewhere Hawking contrasted his attitude towards the "reality" of physical theories with that of his colleague Roger Penrose, saying, "He's a Platonist and I'm a positivist. He's worried that Schrödinger's cat is in a quantum state, where it is half alive and half dead. He feels that can't correspond to reality. But that doesn't bother me. I don't demand that a theory correspond to reality because I don't know what it is. Reality is not a quality you can test with litmus paper. All I'm concerned with is that the theory should predict the results of measurements. Quantum theory does this very successfully." For his own part, Penrose agrees with Hawking that QM applied to the universe implies MW, but he believes the lack of a successful theory of quantum gravity negates the claimed universality of conventional QM.

Speculative implications

Quantum suicide thought experiment

Quantum suicide is a thought experiment in quantum mechanics and the philosophy of physics. Purportedly, it can distinguish between the Copenhagen interpretation of quantum mechanics and the many-worlds interpretation by means of a variation of the Schrödinger's cat thought experiment, from the cat's point of view. Quantum immortality refers to the subjective experience of surviving quantum suicide.

Most experts believe that the experiment would not work in the real world, because the world with the surviving experimenter has a lower "measure" than the world prior to the experiment, making it less likely that the experimenter will go on to experience their survival.

Absurdly improbable timelines

DeWitt has stated that "[Everett, Wheeler and Graham] do not in the end exclude any element of the superposition. All the worlds are there, even those in which everything goes wrong and all the statistical laws break down."

Max Tegmark has affirmed that absurd or highly unlikely events are inevitable but rare under the MWI. To quote Tegmark, "Things inconsistent with the laws of physics will never happen—everything else will... it's important to keep track of the statistics, since even if everything conceivable happens somewhere, really freak events happen only exponentially rarely."

Ladyman and Ross state that, in general, many of the unrealized possibilities that are discussed in other scientific fields will not have counterparts in other branches, because they are in fact incompatible with the universal wavefunction.

Cosmological perturbation theory

From Wikipedia, the free encyclopedia

In physical cosmology, cosmological perturbation theory is the theory by which the evolution of structure is understood in the Big Bang model. It uses general relativity to compute the gravitational forces causing small perturbations to grow and eventually seed the formation of stars, quasars, galaxies and clusters. It only applies to situations in which the universe is predominantly homogeneous, such as during cosmic inflation and large parts of the Big Bang. The universe is believed to still be homogeneous enough that the theory is a good approximation on the largest scales, but on smaller scales more involved techniques, such as N-body simulations, must be used.

Because of the gauge invariance of general relativity, the correct formulation of cosmological perturbation theory is subtle. In particular, when describing an inhomogeneous spacetime there is often not a preferred coordinate choice. There are currently two distinct approaches to perturbation theory in classical general relativity:

  • gauge-invariant perturbation theory based on foliating a space-time with hyper-surfaces, and
  • 1+3 covariant gauge-invariant perturbation theory based on threading a space-time with frames.

Gauge-invariant perturbation theory

The gauge-invariant perturbation theory is based on developments by Bardeen (1980) and by Kodama and Sasaki (1984), building on the work of Lifshitz (1946). This is the standard approach to perturbation theory of general relativity for cosmology. It is widely used for the computation of anisotropies in the cosmic microwave background radiation as part of the physical cosmology program, and focuses on predictions arising from linearisations that preserve gauge invariance with respect to Friedmann-Lemaître-Robertson-Walker (FLRW) models. The approach draws heavily on Newtonian-like analogues and usually takes as its starting point the FLRW background, around which perturbations are developed. It is non-local and coordinate-dependent but gauge-invariant, as the resulting linear framework is built from a specified family of background hyper-surfaces which are linked by gauge-preserving mappings to foliate the space-time. Although intuitive, this approach does not deal well with the nonlinearities natural to general relativity.
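As an illustrative sketch in standard notation (conventions vary between authors and this is not drawn from the text above), scalar perturbations about a flat FLRW background in conformal time $\tau$ can be written as

```latex
ds^2 = a^2(\tau)\Big[ -(1+2A)\,d\tau^2 + 2\,\partial_i B\,d\tau\,dx^i
      + \big( (1-2\psi)\,\delta_{ij} + 2\,\partial_i \partial_j E \big)\,dx^i dx^j \Big],
```

from which Bardeen's gauge-invariant potentials are constructed as

```latex
\Phi = A + \mathcal{H}(B - E') + (B - E')', \qquad
\Psi = \psi - \mathcal{H}(B - E'),
```

where $\mathcal{H} = a'/a$ and primes denote derivatives with respect to $\tau$. Under a linear coordinate (gauge) transformation the individual metric perturbations $A$, $B$, $\psi$, $E$ all shift, but the combinations $\Phi$ and $\Psi$ are unchanged, which is what makes them suitable observables in this formalism.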

1+3 covariant gauge-invariant perturbation theory

In relativistic cosmology using the Lagrangian threading dynamics of Ehlers (1971) and Ellis (1971), it is usual to use the gauge-invariant covariant perturbation theory developed by Hawking (1966) and by Ellis and Bruni (1989). Here, rather than starting with a background and perturbing away from it, one starts with full general relativity and systematically reduces the theory to one that is linear around a particular background. The approach is local, covariant, and gauge-invariant, but can be non-linear, because it is built around the local comoving observer frame (see frame bundle), which is used to thread the entire space-time. This approach produces differential equations of just the right order needed to describe the true physical degrees of freedom, so no non-physical gauge modes exist. It is usual to express the theory in a coordinate-free manner. For applications of kinetic theory, because one is required to use the full tangent bundle, it becomes convenient to use the tetrad formulation of relativistic cosmology. The application of this approach to the computation of anisotropies in the cosmic microwave background radiation requires the linearization of the full relativistic kinetic theory developed by Thorne (1980) and by Ellis, Matravers and Treciokas (1983).

Gauge freedom and frame fixing

In relativistic cosmology there is a freedom associated with the choice of threading frame; this frame choice is distinct from the choice associated with coordinates. Picking the frame is equivalent to fixing the choice of timelike world lines mapped into each other. This reduces the gauge freedom but does not fix the gauge; the theory remains gauge-invariant under the remaining gauge freedoms. In order to fix the gauge, a specification of correspondences between the time surfaces in the real (perturbed) universe and the background universe is required, along with the correspondences between points on the initial spacelike surfaces in the background and in the real universe. This is the link between the gauge-invariant perturbation theory and the gauge-invariant covariant perturbation theory. Gauge invariance is only guaranteed if the choice of frame coincides exactly with that of the background; usually this is trivial to ensure, because physical frames have this property.

Newtonian-like equations

Newtonian-like equations emerge from perturbative general relativity with the choice of the Newtonian gauge; the Newtonian gauge provides the direct link between the variables typically used in the gauge-invariant perturbation theory and those arising from the more general gauge-invariant covariant perturbation theory.
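As a sketch under standard conventions (not taken from the text above), in the Newtonian (longitudinal) gauge the scalar-perturbed FLRW metric reads

```latex
ds^2 = a^2(\tau)\left[ -(1+2\Phi)\,d\tau^2 + (1-2\Psi)\,\delta_{ij}\,dx^i dx^j \right],
```

and on sub-horizon scales the linearized Einstein equations reduce to a Poisson-like equation,

```latex
\nabla^2 \Phi \simeq 4\pi G\, a^2\, \bar{\rho}\,\delta ,
```

with $\Phi = \Psi$ in the absence of anisotropic stress. Here $\delta = \delta\rho/\bar{\rho}$ is the density contrast; the potential $\Phi$ then plays the role of the familiar Newtonian gravitational potential of structure-formation theory, which is the "direct link" referred to above.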

Critical realism (philosophy of the social sciences)

From Wikipedia, the free encyclopedia

Critical realism is a philosophical approach to understanding science developed by Roy Bhaskar (1944–2014). It combines a general philosophy of science (transcendental realism) with a philosophy of social science (critical naturalism). It specifically opposes forms of empiricism and positivism by viewing science as concerned with identifying causal mechanisms. Also, in the context of social science it argues that scientific investigation can lead directly to critique of social arrangements and institutions, in a similar manner to the work of Karl Marx. In the last decades of the twentieth century it also stood against various forms of 'postmodernism'. It is one of a range of types of philosophical realism, as well as forms of realism advocated within social science such as analytic realism and subtle realism.

Contemporary critical realism

Overview

Bhaskar developed a general philosophy of science that he described as transcendental realism and a special philosophy of the human sciences that he called critical naturalism. The two terms were combined by other authors to form the umbrella term critical realism.

Transcendental realism attempts to establish that in order for scientific investigation to take place, the object of that investigation must have real, manipulable, internal mechanisms that can be actualized to produce particular outcomes. This is what we do when we conduct experiments. This stands in contrast to empiricist scientists' claim that all scientists can do is observe the relationship between cause and effect and impose meaning. Whilst empiricism, and positivism more generally, locate causal relationships at the level of events, critical realism locates them at the level of the generative mechanism, arguing that causal relationships are irreducible to empirical constant conjunctions of David Hume's doctrine; in other words, a constant conjunctive relationship between events is neither sufficient nor even necessary to establish a causal relationship.

The implication of this is that science should be understood as an ongoing process in which scientists improve the concepts they use to understand the mechanisms that they study. It should not, in contrast to the claim of empiricists, be about the identification of a coincidence between a postulated independent variable and dependent variable. Positivism/falsificationism are also rejected due to the observation that it is highly plausible that a mechanism will exist but either a) go unactivated, b) be activated, but not perceived, or c) be activated, but counteracted by other mechanisms, which results in its having unpredictable effects. Thus, non-realisation of a posited mechanism cannot (in contrast to the claim of some positivists) be taken to signify its non-existence. Falsificationism can be viewed at the statement level (naive falsificationism) or at the theorem level (more common in practice). In this way, the two approaches can be reconciled to some extent.

Critical naturalism argues that the transcendental realist model of science is equally applicable to both the physical and the human worlds. However, when we study the human world we are studying something fundamentally different from the physical world and must, therefore, adapt our strategy to studying it. Critical naturalism, therefore, prescribes social scientific methods which seek to identify the mechanisms producing social events, but with a recognition that these are in a much greater state of flux than those of the physical world (as human structures change much more readily than those of, say, a leaf). In particular, we must understand that human agency is made possible by social structures that themselves require the reproduction of certain actions/pre-conditions. Further, the individuals that inhabit these social structures are capable of consciously reflecting upon, and changing, the actions that produce them—a practice that is in part facilitated by social scientific research.

Critical realism has become an influential movement in British sociology and social science in general as a reaction to, and reconciliation of, postmodern critiques.

Developments

Since Bhaskar made the first big steps in popularising the theory of critical realism in the 1970s, it has become one of the major strands of social scientific method, rivalling positivism/empiricism, and post-structuralism/relativism/interpretivism.

After his development of critical realism, Bhaskar went on to develop a philosophical system he calls dialectical critical realism, which is most clearly outlined in his weighty book, Dialectic: The Pulse of Freedom.

An accessible introduction to Bhaskar's writings was written by Andrew Collier. Andrew Sayer has written accessible texts on critical realism in social science. Danermark et al. have also produced an accessible account. Margaret Archer is associated with this school, as is the ecosocialist writer Peter Dickens.

David Graeber relies on critical realism, which he understands as a form of 'heraclitean' philosophy, emphasizing flux and change over stable essences, in his anthropological book on the concept of value, Toward an anthropological theory of value: the false coin of our own dreams.

Recently, attention has turned to the challenge of implementing critical realism in applied social research. An edited volume examined the use of critical realism for studying organizations (Edwards, O'Mahoney, and Vincent 2014). Other authors (Fletcher 2016, Parr 2015, Bunt 2018, Hoddy 2018) have discussed which specific research methodologies and methods are conducive (or not) to research guided by critical realism as a philosophy of science.

In economics

Heterodox economists like Tony Lawson, Lars Pålsson Syll, Frederic Lee or Geoffrey Hodgson are trying to work the ideas of critical realism into economics, especially the dynamic idea of macro-micro interaction.

According to critical realist economists, the central aim of economic theory is to provide explanations in terms of hidden generative structures. This position combines transcendental realism with a critique of mainstream economics. It argues that mainstream economics (i) relies excessively on deductivist methodology, (ii) embraces an uncritical enthusiasm for formalism, and (iii) believes in strong conditional predictions in economics despite repeated failures.

The world that mainstream economists study is the empirical world. But this world is "out of phase" (Lawson) with the underlying ontology of economic regularities. The mainstream view is thus a limited reality because empirical realists presume that the objects of inquiry are solely "empirical regularities"—that is, objects and events at the level of the experienced.

The critical realist views the domain of real causal mechanisms as the appropriate object of economic science, whereas the positivist view is that the reality is exhausted in empirical, i.e. experienced reality. Tony Lawson argues that economics ought to embrace a "social ontology" to include the underlying causes of economic phenomena.

Marxism

A development of Bhaskar's critical realism lies at the ontological root of contemporary streams of Marxist political and economic theory. The realist philosophy described by Bhaskar in A Realist Theory of Science is compatible with Marx's work in that it differentiates between an intransitive reality, which exists independently of human knowledge of it, and the socially produced world of science and empirical knowledge. This dualist logic is clearly present in the Marxian theory of ideology, according to which social reality may be very different from its empirically observable surface appearance. Notably, Alex Callinicos has argued for a 'critical realist' ontology in the philosophy of social science and explicitly acknowledges Bhaskar's influence (while also rejecting the latter's 'spiritualist turn' in his later work). The relationship between critical realist philosophy and Marxism has also been discussed in an article co-authored by Bhaskar and Callinicos and published in the Journal of Critical Realism.

In international relations theory

Since 2000, critical realist philosophy has also been increasingly influential in the field of international relations (IR) theory. Patrick Thaddeus Jackson has called it 'all the rage' in the field. Bob Jessop, Colin Wight, Milja Kurki, Jonathan Joseph and Hidemi Suganami have all published major works on the utility of beginning IR research from a critical realist social ontology—an ontology they all credit Roy Bhaskar with originating.

Ecological economics

The British ecological economist Clive Spash holds the opinion that critical realism offers a thorough basis—as a philosophy of science—for the theoretical foundation of ecological economics. He therefore uses a critical realist lens for conducting research in (ecological) economics.

Other scholars also base ecological economics on a critical realist foundation, such as Leigh Price from Rhodes University.

Materialism

From Wikipedia, the free encyclopedia

Materialism is a form of philosophical monism that holds that matter is the fundamental substance in nature, and that all things, including mental states and consciousness, are results of material interactions. According to philosophical materialism, mind and consciousness are by-products or epiphenomena of material processes (such as the biochemistry of the human brain and nervous system), without which they cannot exist. This concept directly contrasts with idealism, where mind and consciousness are first-order realities to which matter is subject and material interactions are secondary.

Materialism is closely related to physicalism—the view that all that exists is ultimately physical. 

Philosophical physicalism has evolved from materialism with the theories of the physical sciences to incorporate more sophisticated notions of physicality than mere ordinary matter (e.g. spacetime, physical energies and forces, and dark matter). Thus, the term physicalism is preferred over materialism by some, while others use the terms as if they were synonymous.

Philosophies contradictory to materialism or physicalism include idealism, pluralism, dualism, panpsychism, and other forms of monism.

Overview

In 1748, the French physician and philosopher Julien Offray de La Mettrie espoused a materialistic definition of the human soul in L'Homme Machine

Materialism belongs to the class of monist ontology, and is thus different from ontological theories based on dualism or pluralism. For singular explanations of the phenomenal reality, materialism would be in contrast to idealism, neutral monism, and spiritualism. It can also contrast with phenomenalism, vitalism, and dual-aspect monism. Its materiality can, in some ways, be linked to the concept of determinism, as espoused by Enlightenment thinkers.

Despite the large number of philosophical schools and subtle nuances between many, all philosophies are said to fall into one of two primary categories, defined in contrast to each other: idealism and materialism. The basic proposition of these two categories pertains to the nature of reality—the primary distinction between them is the way they answer two fundamental questions: "what does reality consist of?" and "how does it originate?" To idealists, spirit or mind or the objects of mind (ideas) are primary, and matter secondary. To materialists, matter is primary, and mind or spirit or ideas are secondary—the product of matter acting upon matter.

The materialist view is perhaps best understood in its opposition to the doctrines of immaterial substance applied to the mind historically by René Descartes; however, by itself materialism says nothing about how material substance should be characterized. In practice, it is frequently assimilated to one variety of physicalism or another.

Modern philosophical materialists extend the definition of matter to include other scientifically observable entities such as energy, forces and the curvature of space; however, philosophers such as Mary Midgley suggest that the concept of "matter" is elusive and poorly defined.

During the 19th century, Karl Marx and Friedrich Engels extended the concept of materialism to elaborate a materialist conception of history centered on the roughly empirical world of human activity (practice, including labor) and the institutions created, reproduced or destroyed by that activity. They also developed dialectical materialism, by taking Hegelian dialectics, stripping them of their idealist aspects, and fusing them with materialism (see Modern philosophy).

Non-reductive materialism

Materialism is often associated with reductionism, according to which the objects or phenomena individuated at one level of description, if they are genuine, must be explicable in terms of the objects or phenomena at some other level of description—typically, at a more reduced level.

Non-reductive materialism explicitly rejects this notion, however, taking the material constitution of all particulars to be consistent with the existence of real objects, properties or phenomena not explicable in the terms canonically used for the basic material constituents. Jerry Fodor argued for this view, according to which empirical laws and explanations in "special sciences" like psychology or geology are invisible from the perspective of basic physics.

Early history

Before Common Era

Materialism developed, possibly independently, in several geographically separated regions of Eurasia during what Karl Jaspers termed the Axial Age (c. 800–200 BC).

In ancient Indian philosophy, materialism developed around 600 BC with the works of Ajita Kesakambali, Payasi, Kanada and the proponents of the Cārvāka school of philosophy. Kanada became one of the early proponents of atomism. The Nyaya-Vaisesika school (c. 600–100 BC) developed one of the earliest forms of atomism (although their proofs of the existence of God and their positing that consciousness was not material preclude labelling them as materialists). Buddhist atomism and the Jaina school continued the atomic tradition.

Ancient Greek atomists like Leucippus, Democritus and Epicurus prefigure later materialists. The Latin poem De Rerum Natura by Lucretius (99 – c. 55 BC) reflects the mechanistic philosophy of Democritus and Epicurus. According to this view, all that exists is matter and void, and all phenomena result from different motions and conglomerations of base material particles called atoms (literally 'indivisibles').

De Rerum Natura provides mechanistic explanations for phenomena such as erosion, evaporation, wind, and sound. Famous principles like "nothing can touch body but body" first appeared in the works of Lucretius. Democritus and Epicurus, however, did not hold to a monist ontology, since they held to the ontological separation of matter and space (i.e. space being "another kind" of being), indicating that the definition of materialism is wider than the scope of this article.

Early Common Era

Wang Chong (27 – c. 100 AD) was a Chinese thinker of the early Common Era said to be a materialist. Later Indian materialist Jayaraashi Bhatta (6th century) in his work Tattvopaplavasimha ('The upsetting of all principles') refuted the Nyāya Sūtra epistemology. The materialistic Cārvāka philosophy appears to have died out some time after 1400; when Madhavacharya compiled Sarva-darśana-samgraha ('a digest of all philosophies') in the 14th century, he had no Cārvāka (or Lokāyata) text to quote from or refer to.

In early 12th-century al-Andalus, Arabian philosopher Ibn Tufail (a.k.a. Abubacer) wrote discussions on materialism in his philosophical novel, Hayy ibn Yaqdhan (Philosophus Autodidactus), while vaguely foreshadowing the idea of a historical materialism.

Modern philosophy

Thomas Hobbes (1588–1679) and Pierre Gassendi (1592–1665) represented the materialist tradition in opposition to the attempts of René Descartes (1596–1650) to provide the natural sciences with dualist foundations. There followed the materialist and atheist abbé Jean Meslier (1664–1729), along with the works of the French materialists: Julien Offray de La Mettrie, German-French Baron d'Holbach (1723–1789), Denis Diderot (1713–1784), and other French Enlightenment thinkers. In England, John "Walking" Stewart (1747–1822) insisted on seeing matter as endowed with a moral dimension, which had a major impact on the philosophical poetry of William Wordsworth (1770–1850).

In late modern philosophy, German atheist anthropologist Ludwig Feuerbach would signal a new turn in materialism through his book The Essence of Christianity (1841), which presented a humanist account of religion as the outward projection of man's inward nature. Feuerbach introduced anthropological materialism, a version of materialism that views materialist anthropology as the universal science.

Feuerbach's variety of materialism would go on to heavily influence Karl Marx,[13] who in the late 19th century elaborated the concept of historical materialism—the basis for what Marx and Friedrich Engels outlined as scientific socialism:

The materialist conception of history starts from the proposition that the production of the means to support human life and, next to production, the exchange of things produced, is the basis of all social structure; that in every society that has appeared in history, the manner in which wealth is distributed and society divided into classes or orders is dependent upon what is produced, how it is produced, and how the products are exchanged. From this point of view, the final causes of all social changes and political revolutions are to be sought, not in men's brains, not in men's better insights into eternal truth and justice, but in changes in the modes of production and exchange. They are to be sought, not in the philosophy, but in the economics of each particular epoch.

— Friedrich Engels, Socialism: Scientific and Utopian (1880)

Through his Dialectics of Nature (1883), Engels later developed a "materialist dialectic" philosophy of nature; a worldview that would be given the title dialectical materialism by Georgi Plekhanov, the father of Russian Marxism. In early 20th-century Russian philosophy, Vladimir Lenin further developed dialectical materialism in his book Materialism and Empirio-criticism (1909), which connected the political conceptions put forth by his opponents to their anti-materialist philosophies.

A more naturalist-oriented materialist school of thought that developed in the middle of the 19th century (also in Germany) was German materialism, members of which included Ludwig Büchner, Jacob Moleschott, and Carl Vogt.

Contemporary history

Analytic philosophy

Contemporary analytic philosophers (e.g. Daniel Dennett, Willard Van Orman Quine, Donald Davidson, and Jerry Fodor) operate within a broadly physicalist or scientific materialist framework, producing rival accounts of how best to accommodate the mind, including functionalism, anomalous monism, identity theory, and so on.

Scientific materialism is often synonymous with, and has typically been described as being, a reductive materialism. In the early 21st century, Paul and Patricia Churchland advocated a radically contrasting position (at least, in regards to certain hypotheses): eliminative materialism. Eliminative materialism holds that some mental phenomena simply do not exist at all, and that talk of those mental phenomena reflects a totally spurious "folk psychology" and introspection illusion. A materialist of this variety might believe that a concept like "belief" simply has no basis in fact (e.g. the way folk science speaks of demon-caused illnesses).

With reductive materialism being at one end of a continuum (our theories will reduce to facts) and eliminative materialism on the other (certain theories will need to be eliminated in light of new facts), revisionary materialism is somewhere in the middle.

Continental philosophy

Contemporary continental philosopher Gilles Deleuze has attempted to rework and strengthen classical materialist ideas. Contemporary theorists such as Manuel DeLanda, working with this reinvigorated materialism, have come to be classified as new materialist in persuasion. New materialism has now become its own specialized subfield of knowledge, with courses being offered on the topic at major universities, as well as numerous conferences, edited collections and monographs devoted to it.

Jane Bennett's book Vibrant Matter (2010) has been particularly instrumental in bringing theories of monist ontology and vitalism back into a critical theoretical fold dominated by poststructuralist theories of language and discourse. Scholars such as Mel Y. Chen and Zakiyyah Iman Jackson, however, have critiqued this body of new materialist literature for neglecting the materiality of race and gender in particular.

Métis scholar Zoe Todd, as well as Mohawk (Bear Clan, Six Nations) and Anishinaabe scholar Vanessa Watts, query the colonial orientation of the race for a "new" materialism. Watts in particular describes the tendency to regard matter as a subject of feminist or philosophical care as too invested in the reanimation of a Eurocentric tradition of inquiry at the expense of an Indigenous ethic of responsibility. Other scholars, such as Helene Vosters, echo their concerns and have questioned whether there is anything particularly "new" about this so-called "new materialism," as Indigenous and other animist ontologies have attested to what might be called the "vibrancy of matter" for centuries. Still others, such as Thomas Nail, have critiqued "vitalist" versions of new materialism for their depoliticizing "flat ontology" and for being ahistorical in nature.

Quentin Meillassoux proposed speculative materialism, a post-Kantian return to David Hume which is also based on materialist ideas.

Defining 'matter'

The nature and definition of matter—like other key concepts in science and philosophy—have occasioned much debate:

  • Is there a single kind of matter (hyle) that everything is made of, or multiple kinds?
  • Is matter a continuous substance capable of expressing multiple forms (hylomorphism); or a number of discrete, unchanging constituents (atomism)?
  • Does it have intrinsic properties (substance theory) or is it lacking them (prima materia)?

One challenge to the traditional concept of matter as tangible 'stuff' came with the rise of field physics in the 19th century. Relativity shows that matter and energy (including the spatially distributed energy of fields) are interchangeable. This enables the ontological view that energy is prima materia and matter is one of its forms. On the other hand, the Standard Model of particle physics uses quantum field theory to describe all interactions. On this view it could be said that fields are prima materia and the energy is a property of the field.
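The interchangeability claim here is the familiar mass-energy equivalence of special relativity: for a body of rest mass $m$ and momentum $p$,

```latex
E = mc^2 \quad \text{(at rest)}, \qquad E^2 = (pc)^2 + (mc^2)^2 \quad \text{(in general)},
```

so that rest mass is one form that energy can take, which is what motivates the ontological reading of energy as prima materia sketched above.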

According to the dominant cosmological model, the Lambda-CDM model, less than 5% of the universe's energy density is made up of the "matter" described by the Standard Model, and the majority of the universe is composed of dark matter and dark energy, with little agreement among scientists about what these are made of.

With the advent of quantum physics, some scientists believed the concept of matter had merely changed, while others believed the conventional position could no longer be maintained. For instance Werner Heisenberg said, "The ontology of materialism rested upon the illusion that the kind of existence, the direct 'actuality' of the world around us, can be extrapolated into the atomic range. This extrapolation, however, is impossible…atoms are not things." Likewise, some philosophers feel that these dichotomies necessitate a switch from materialism to physicalism. Others use the terms materialism and physicalism interchangeably.

The concept of matter has changed in response to new scientific discoveries. Thus materialism has no definite content independent of the particular theory of matter on which it is based. According to Noam Chomsky, any property can be considered material, if one defines matter such that it has that property.

Physicalism

George Stack distinguishes between materialism and physicalism:

In the twentieth century, physicalism has emerged out of positivism. Physicalism restricts meaningful statements to physical bodies or processes that are verifiable or in principle verifiable. It is an empirical hypothesis that is subject to revision and, hence, lacks the dogmatic stance of classical materialism. Herbert Feigl defended physicalism in the United States and consistently held that mental states are brain states and that mental terms have the same referent as physical terms. The twentieth century has witnessed many materialist theories of the mental, and much debate surrounding them.

However, not all conceptions of physicalism are tied to verificationist theories of meaning or direct realist accounts of perception. Rather, physicalists believe that no "element of reality" is missing from the mathematical formalism of our best description of the world. "Materialist" physicalists also believe that the formalism describes fields of insentience. In other words, the intrinsic nature of the physical is non-experiential.

Criticism and alternatives

From scientists

Rudolf Peierls, a physicist who played a major role in the Manhattan Project, rejected materialism: "The premise that you can describe in terms of physics the whole function of a human being…including knowledge and consciousness, is untenable. There is still something missing."

Erwin Schrödinger said, "Consciousness cannot be accounted for in physical terms. For consciousness is absolutely fundamental. It cannot be accounted for in terms of anything else."

Werner Heisenberg, who came up with the uncertainty principle, wrote, "The ontology of materialism rested upon the illusion that the kind of existence, the direct 'actuality' of the world around us, can be extrapolated into the atomic range. This extrapolation, however, is impossible.… Atoms are not things."

Quantum mechanics

Some 20th-century physicists (e.g., Eugene Wigner and Henry Stapp), as well as modern-day physicists and science writers (e.g., Stephen Barr, Paul Davies, and John Gribbin), have argued that materialism is flawed in light of certain scientific findings in physics, such as quantum mechanics and chaos theory. According to Gribbin and Davies (1991):

Then came our Quantum theory, which totally transformed our image of matter. The old assumption that the microscopic world of atoms was simply a scaled-down version of the everyday world had to be abandoned. Newton's deterministic machine was replaced by a shadowy and paradoxical conjunction of waves and particles, governed by the laws of chance, rather than the rigid rules of causality. An extension of the quantum theory goes beyond even this; it paints a picture in which solid matter dissolves away, to be replaced by weird excitations and vibrations of invisible field energy. Quantum physics undermines materialism because it reveals that matter has far less "substance" than we might believe. But another development goes even further by demolishing Newton's image of matter as inert lumps. This development is the theory of chaos, which has recently gained widespread attention.

— Paul Davies and John Gribbin, The Matter Myth, Chapter 1: "The Death of Materialism"

Digital physics

The objections of Davies and Gribbin are shared by proponents of digital physics, who view information, rather than matter, as fundamental. The physicist John Archibald Wheeler, a proponent of digital physics, wrote, "all matter and all things physical are information-theoretic in origin and this is a participatory universe." Their objections were also shared by some founders of quantum theory, such as Max Planck, who wrote:

As a man who has devoted his whole life to the most clear headed science, to the study of matter, I can tell you as a result of my research about atoms this much: There is no matter as such. All matter originates and exists only by virtue of a force which brings the particle of an atom to vibration and holds this most minute solar system of the atom together. We must assume behind this force the existence of a conscious and intelligent Mind. This Mind is the matrix of all matter.

— Max Planck, Das Wesen der Materie, 1944

James Jeans concurred with Planck, saying, "The Universe begins to look more like a great thought than like a great machine. Mind no longer appears to be an accidental intruder into the realm of matter."

Religious and spiritual views

According to Constantin Gutberlet, writing in the Catholic Encyclopedia (1911), materialism, defined as "a philosophical system which regards matter as the only reality in the world…denies the existence of God and the soul." In this view, materialism could be perceived as incompatible with world religions that ascribe existence to immaterial objects. Materialism may be conflated with atheism; according to Friedrich A. Lange (1892), "Diderot has not always in the Encyclopædia expressed his own individual opinion, but it is just as true that at its commencement he had not yet got as far as Atheism and Materialism."

Most of Hinduism and transcendentalism regard all matter as an illusion, or maya, blinding humans from knowing the truth. Transcendental experiences like the perception of Brahman are considered to destroy the illusion.

Joseph Smith, the founder of the Latter Day Saint movement, taught: "There is no such thing as immaterial matter. All spirit is matter, but it is more fine or pure, and can only be discerned by purer eyes; We cannot see it; but when our bodies are purified we shall see that it is all matter." This spirit element is believed to always have existed and to be co-eternal with God.

Mary Baker Eddy, the founder of the Christian Science movement, denied the existence of matter on the basis of the allness of Mind (which she regarded as a synonym for God).

Philosophical objections

In the Critique of Pure Reason, Immanuel Kant argued against materialism in defending his transcendental idealism (as well as offering arguments against subjective idealism and mind–body dualism). However, in his refutation of idealism, Kant also argues that change and time require an enduring substrate.

Postmodern and poststructuralist thinkers also express skepticism about any all-encompassing metaphysical scheme. The philosopher Mary Midgley argues that materialism is a self-refuting idea, at least in its eliminative materialist form.

Varieties of idealism

Arguments for idealism, such as those of Hegel and Berkeley, often take the form of arguments against materialism; indeed, the idealism of Berkeley was called immaterialism. Matter can be argued to be redundant, as in bundle theory, and mind-independent properties can, in turn, be reduced to subjective percepts. Berkeley presents an example of the latter by pointing out that it is impossible to gather direct evidence of matter, as there is no direct experience of matter; all that is experienced is perception, whether internal or external. As such, the existence of matter can only be assumed from the apparent (perceived) stability of perceptions; it finds no evidence in direct experience.

If matter and energy are seen as necessary to explain the physical world, but incapable of explaining mind, dualism results. Emergence, holism and process philosophy seek to ameliorate the perceived shortcomings of traditional (especially mechanistic) materialism without abandoning materialism entirely.

Materialism as methodology

Some critics object to materialism as part of an overly skeptical, narrow or reductivist approach to theorizing, rather than to the ontological claim that matter is the only substance. Particle physicist and Anglican theologian John Polkinghorne objects to what he calls promissory materialism—claims that materialistic science will eventually succeed in explaining phenomena it has not so far been able to explain. Polkinghorne prefers "dual-aspect monism" to materialism.

Some scientific materialists have been criticized for failing to provide clear definitions for what constitutes matter, leaving the term materialism without any definite meaning. Noam Chomsky states that since the concept of matter may be affected by new scientific discoveries, as has happened in the past, scientific materialists are being dogmatic in assuming the opposite.


Operator (computer programming)

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Operator_(computer_programmin...