
Sunday, May 13, 2018

Einstein's thought experiments

From Wikipedia, the free encyclopedia

A hallmark of Einstein's career was his use of visualized thought experiments (Gedanken-Experimente) as a fundamental tool for understanding physical issues and for elucidating his concepts to others. Einstein's thought experiments took diverse forms. In his youth, he mentally chased beams of light. For special relativity, he employed moving trains and flashes of lightning to explain his most penetrating insights. For general relativity, he considered a person falling off a roof, accelerating elevators, blind beetles crawling on curved surfaces and the like. In his debates with Niels Bohr on the nature of reality, he proposed imaginary devices intended to show, at least in concept, how the Heisenberg uncertainty principle might be evaded. In a profound contribution to the literature on quantum mechanics, Einstein considered two particles briefly interacting and then flying apart so that their states are correlated, anticipating the phenomenon known as quantum entanglement.

Introduction

A thought experiment is a logical argument or mental model cast within the context of an imaginary (hypothetical or even counterfactual) scenario. A scientific thought experiment, in particular, may examine the implications of a theory, law, or set of principles with the aid of fictive and/or natural particulars (demons sorting molecules, cats whose lives hinge upon a radioactive disintegration, men in enclosed elevators) in an idealized environment (massless trapdoors, absence of friction). They describe experiments that, except for some specific and necessary idealizations, could conceivably be performed in the real world.[1]

As opposed to physical experiments, thought experiments do not report new empirical data. They can only provide conclusions based on deductive or inductive reasoning from their starting assumptions. Thought experiments invoke particulars that are irrelevant to the generality of their conclusions. It is the invocation of these particulars that gives thought experiments their experiment-like appearance. A thought experiment can always be reconstructed as a straightforward argument, without the irrelevant particulars. John D. Norton, a well-known philosopher of science, has noted that "a good thought experiment is a good argument; a bad thought experiment is a bad argument."[2]

When effectively used, the irrelevant particulars that convert a straightforward argument into a thought experiment can act as "intuition pumps" that stimulate readers' ability to apply their intuitions to their understanding of a scenario.[3] Thought experiments have a long history. Perhaps the best known in the history of modern science is Galileo's demonstration that falling objects must fall at the same rate regardless of their masses. This has sometimes been taken to be an actual physical demonstration, involving his climbing up the Leaning Tower of Pisa and dropping two heavy weights off it. In fact, it was a logical demonstration described by Galileo in Discorsi e dimostrazioni matematiche (1638).[4]

Einstein had a highly visual understanding of physics. His work in the patent office "stimulated [him] to see the physical ramifications of theoretical concepts." These aspects of his thinking style inspired him to fill his papers with vivid practical detail making them quite different from, say, the papers of Lorentz or Maxwell. This included his use of thought experiments.[5]:26–27;121–127

Special relativity

Pursuing a beam of light

Late in life, Einstein recalled
...a paradox upon which I had already hit at the age of sixteen: If I pursue a beam of light with the velocity c (velocity of light in a vacuum), I should observe such a beam of light as an electromagnetic field at rest though spatially oscillating. There seems to be no such thing, however, neither on the basis of experience nor according to Maxwell's equations. From the very beginning it appeared to me intuitively clear that, judged from the standpoint of such an observer, everything would have to happen according to the same laws as for an observer who, relative to the earth, was at rest. For how should the first observer know or be able to determine, that he is in a state of fast uniform motion? One sees in this paradox the germ of the special relativity theory is already contained.[p 1]:52–53

Einstein's thought experiment as a 16-year-old student

Einstein's recollections of his youthful musings are widely cited because of the hints they provide of his later great discovery. However, Norton has noted that Einstein's reminiscences were probably colored by a half-century of hindsight. Norton lists several problems with Einstein's recounting, both historical and scientific:[6]
1. At 16 years old and a student at the Gymnasium in Aarau, Einstein would have had the thought experiment in late 1895 to early 1896. But various sources note that Einstein did not learn Maxwell's theory until 1898, in university.[6][7]
2. The second issue is that a 19th century aether theorist would have had no difficulties with the thought experiment. Einstein's statement, "...there seems to be no such thing...on the basis of experience," would not have counted as an objection, but would have represented a mere statement of fact, since no one had ever traveled at such speeds.
3. An aether theorist would have regarded "...nor according to Maxwell's equations" as simply representing a misunderstanding on Einstein's part. Unfettered by any notion that the speed of light represents a cosmic limit, the aether theorist would simply have set velocity equal to c, noted that yes indeed, the light would appear to be frozen, and then thought no more of it.[6]
The thought experiment was not incompatible with aether theories; rather, the youthful Einstein appears to have reacted to the scenario out of an intuitive sense of wrongness. He felt that the laws of optics should obey the principle of relativity. As he grew older, his early thought experiment acquired deeper levels of significance: Einstein felt that Maxwell's equations should be the same for all observers in inertial motion. From Maxwell's equations, one can deduce a single speed of light, and nothing in this computation depends on an observer's speed. Einstein sensed a conflict between Newtonian mechanics and the constant speed of light determined by Maxwell's equations.[5]:114–115
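The computation alluded to here can be made concrete: in vacuum, Maxwell's equations yield a wave equation whose propagation speed is 1/√(μ₀ε₀), with no reference to any observer's motion. A minimal numeric check, using standard SI values for the vacuum constants:

```python
import math

# Vacuum permeability and permittivity (SI values)
mu_0 = 4 * math.pi * 1e-7    # H/m (pre-2019 defined value, adequate here)
eps_0 = 8.8541878128e-12     # F/m

# Maxwell's equations in vacuum give a wave equation whose speed is
# c = 1 / sqrt(mu_0 * eps_0); nothing in the derivation depends on the
# motion of the observer.
c = 1.0 / math.sqrt(mu_0 * eps_0)
print(f"c = {c:.6e} m/s")    # ~2.998e8 m/s
```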

Regardless of the historical and scientific issues described above, Einstein's early thought experiment was part of the repertoire of test cases that he used to check on the viability of physical theories. Norton suggests that the real importance of the thought experiment was that it provided a powerful objection to emission theories of light, which Einstein had worked on for several years prior to 1905.[6][7][8]

Magnet and conductor

In the very first paragraph of Einstein's seminal 1905 work introducing special relativity, he writes:
It is known that the application of Maxwell's electrodynamics, as ordinarily conceived at the present time, to moving bodies, leads to asymmetries which don't seem to be connected with the phenomena. Let us, for example, think of the mutual action between a magnet and a conductor. The observed phenomenon in this case depends only on the relative motion of the conductor and the magnet, while according to the usual conception, a strict distinction must be made between the cases where the one or the other of the bodies is in motion. If, for example, the magnet moves and the conductor is at rest, then an electric field of certain energy-value is produced in the neighbourhood of the magnet, which excites a current in those parts of the field where a conductor exists. But if the magnet be at rest and the conductor be set in motion, no electric field is produced in the neighbourhood of the magnet, but an electromotive force is produced in the conductor which corresponds to no energy per se; however, this causes – equality of the relative motion in both considered cases is assumed – an electric current of the same magnitude and the same course, as the electric force in the first case.[p 2]

Magnet and conductor thought experiment

This opening paragraph recounts well-known experimental results obtained by Michael Faraday in 1831. These experiments exhibited what appeared to be two different phenomena: the motional EMF generated when a wire moves through a magnetic field (see Lorentz force), and the transformer EMF generated by a changing magnetic field (due to the Maxwell–Faraday equation).[8][9][10]:135–157 James Clerk Maxwell himself drew attention to this fact in his 1861 paper On Physical Lines of Force. In the latter half of Part II of that paper, Maxwell gave a separate physical explanation for each of the two phenomena.[p 3]

Although Einstein calls the asymmetry "well-known", there is no evidence that any of Einstein's contemporaries considered the distinction between motional EMF and transformer EMF to be in any way odd or pointing to a lack of understanding of the underlying physics. Maxwell, for instance, had repeatedly discussed Faraday's laws of induction, stressing that the magnitude and direction of the induced current was a function only of the relative motion of the magnet and the conductor, without being bothered by the clear distinction between conductor-in-motion and magnet-in-motion in the underlying theoretical treatment.[10]:135–138

Yet Einstein's reflection on this experiment represented the decisive moment in his long and tortuous path to special relativity. Although the equations describing the two scenarios are entirely different, there is no measurement that can distinguish whether the magnet is moving, the conductor is moving, or both.[9]
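The frame-independence of the measurable outcome can be illustrated with a small sketch. In the magnet's rest frame, a carrier of charge q moving with the wire feels the magnetic force qv × B; in the conductor's rest frame (to first order in v/c), an induced electric field E′ = v × B exerts the force qE′. The numbers below are illustrative, not taken from any real experiment:

```python
q = 1.602e-19            # carrier charge (C); all numbers illustrative
v = (10.0, 0.0, 0.0)     # relative velocity of magnet and conductor (m/s)
B = (0.0, 0.0, 0.5)      # magnet's field at the wire (T)

def cross(a, b):
    """3-vector cross product."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

# Magnet at rest, conductor moving: magnetic Lorentz force q (v x B).
F_motional = tuple(q * comp for comp in cross(v, B))

# Conductor at rest, magnet moving: in the low-velocity limit an electric
# field E' = v x B appears in the conductor's frame, exerting q E'.
E_induced = cross(v, B)
F_transformer = tuple(q * comp for comp in E_induced)

# Two different theoretical descriptions, one identical measurable force
# (and hence one identical induced current).
print(F_motional == F_transformer)
```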

In a 1920 review on the Fundamental Ideas and Methods of the Theory of Relativity (unpublished), Einstein related how disturbing he found this asymmetry:
The idea that these two cases should essentially be different was unbearable to me. According to my conviction, the difference between the two could only lie in the choice of the point of view, but not in a real difference.[p 4]:20
Einstein needed to extend the relativity of motion that he perceived between magnet and conductor in the above thought experiment to a full theory. For years, however, he did not know how this might be done. The exact path that Einstein took to resolve this issue is unknown. We do know, however, that Einstein spent several years pursuing an emission theory of light, encountering difficulties that eventually led him to give up the attempt.[9]
Gradually I despaired of the possibility of discovering the true laws by means of constructive efforts based on known facts. The longer and more desperately I tried, the more I came to the conviction that only the discovery of a universal formal principle could lead us to assured results.[p 1]:49
That decision ultimately led to his development of special relativity as a theory founded on two postulates of which he could be sure.[9] Expressed in contemporary physics vocabulary, his postulates were as follows:[note 1]
1. The laws of physics take the same form in all inertial frames.
2. In any given inertial frame, the velocity of light c is the same whether the light be emitted by a body at rest or by a body in uniform motion.
Einstein's wording of the second postulate was one with which nearly all theorists of his day could agree. His wording is a far more intuitive form of the second postulate than the stronger version frequently encountered in popular writings and college textbooks.[12][note 2]
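Taken together, the two postulates lead to the relativistic composition of velocities, under which light emitted from a moving source still travels at c. A small sketch of the collinear composition law (u + v)/(1 + uv/c²):

```python
c = 299_792_458.0  # speed of light (m/s)

def compose(u, v):
    """Relativistic composition of collinear velocities: speeds u and v
    combine to (u + v) / (1 + u*v/c**2), not to the Galilean sum u + v."""
    return (u + v) / (1.0 + u * v / c**2)

# Light (speed c) emitted by a source itself moving at half the speed of
# light still travels at exactly c:
print(compose(c, 0.5 * c))

# Everyday speeds reduce to the Galilean sum to superb accuracy:
print(compose(30.0, 40.0))   # ~70 m/s
```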

Trains, embankments, and lightning flashes

The topic of how Einstein arrived at special relativity has been a fascinating one to many scholars, and it is not hard to understand why: A lowly twenty-six-year-old patent officer (third class), largely self-taught in physics and completely divorced from mainstream research, nevertheless produced in his miracle year of 1905 four extraordinary works, only one of which (his paper on Brownian motion) appeared related to anything he had ever published before.[7]

Einstein's paper, On the Electrodynamics of Moving Bodies, is a polished work that bears few traces of its gestation. Documentary evidence concerning the development of the ideas that went into it consists of, quite literally, only two sentences in a handful of preserved early letters, and various later historical remarks by Einstein himself, some of them known only second-hand and at times contradictory.[7]


Train and embankment thought experiment

With regard to the relativity of simultaneity, Einstein's 1905 paper develops the concept vividly by carefully considering the basics of how time may be disseminated through the exchange of signals between clocks.[14] In his popular work, Relativity: The Special and General Theory, Einstein translates the formal presentation of his paper into a thought experiment using a train, a railway embankment, and lightning flashes. The essence of the thought experiment is as follows:
  • Observer M stands on an embankment, while observer M' rides on a rapidly traveling train. At the precise moment that M and M' coincide in their positions, lightning strikes points A and B equidistant from M and M'.
  • Light from these two flashes reaches M at the same time, from which M concludes that the bolts were synchronous.
  • The combination of Einstein's first and second postulates implies that, despite the rapid motion of the train relative to the embankment, M' measures exactly the same speed of light as does M. Since M' was equidistant from A and B when lightning struck, the fact that M' receives light from B before light from A means that to M', the bolts were not synchronous. Instead, the bolt at B struck first.[p 5]:29–31
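The steps above can be checked by Lorentz-transforming the two strike events into the train frame. The train speed below is illustrative; units are chosen so that c = 1:

```python
import math

c = 1.0     # units with c = 1
v = 0.6     # train speed relative to the embankment (illustrative)
gamma = 1.0 / math.sqrt(1.0 - v**2 / c**2)

# Embankment frame: both bolts strike at t = 0, at A (x = -1) and B (x = +1),
# equidistant from M, with whom M' coincides at that instant.
strike_A = (0.0, -1.0)   # (t, x)
strike_B = (0.0, +1.0)

def to_train_frame(t, x):
    """Lorentz transformation into the frame in which M' is at rest."""
    return gamma * (t - v * x / c**2), gamma * (x - v * t)

tA, _ = to_train_frame(*strike_A)
tB, _ = to_train_frame(*strike_B)
print(tA, tB)   # tB < tA: for M', the bolt at B struck first
```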
A routine supposition among historians of science is that, in accordance with the analysis given in his 1905 special relativity paper and in his popular writings, Einstein discovered the relativity of simultaneity by thinking about how clocks could be synchronized by light signals.[14] The Einstein synchronization convention was originally developed by telegraphers in the mid-19th century. The dissemination of precise time was an increasingly important topic during this period. Trains needed accurate time to schedule use of track, cartographers needed accurate time to determine longitude, while astronomers and surveyors dared to consider the worldwide dissemination of time to accuracies of thousandths of a second.[15]:132–144;183–187 Following this line of argument, Einstein's position in the patent office, where he specialized in evaluating electromagnetic and electromechanical patents, would have exposed him to the latest developments in time technology, which would have guided him in his thoughts towards understanding the relativity of simultaneity.[15]:243–263

However, all of the above is supposition. In later recollections, when Einstein was asked about what inspired him to develop special relativity, he would mention his riding a light beam and his magnet and conductor thought experiments. He would also mention the importance of the Fizeau experiment and the observation of stellar aberration. "They were enough", he said.[16] He never mentioned thought experiments about clocks and their synchronization.[14]

The routine analyses of the Fizeau experiment and of stellar aberration, which treat light as Newtonian corpuscles, do not require relativity. But problems arise if one considers light as waves traveling through an aether, problems that are resolved by applying the relativity of simultaneity. It is entirely possible, therefore, that Einstein arrived at special relativity through a different path than that commonly assumed: through his examination of Fizeau's experiment and stellar aberration.[14]

We therefore do not know just how important clock synchronization and the train and embankment thought experiment were to Einstein's development of the concept of the relativity of simultaneity. We do know, however, that the train and embankment thought experiment was the preferred means whereby he chose to teach this concept to the general public.[p 5]:29–31

General relativity

Falling painters and accelerating elevators

In his unpublished 1920 review, Einstein related the genesis of his thoughts on the equivalence principle:
When I was busy (in 1907) writing a summary of my work on the theory of special relativity for the Jahrbuch für Radioaktivität und Elektronik [Yearbook for Radioactivity and Electronics], I also had to try to modify the Newtonian theory of gravitation such as to fit its laws into the theory. While attempts in this direction showed the practicability of this enterprise, they did not satisfy me because they would have had to be based upon unfounded physical hypotheses. At that moment I got the happiest thought of my life in the following form: In an example worth considering, the gravitational field has a relative existence only in a manner similar to the electric field generated by magneto-electric induction. Because for an observer in free-fall from the roof of a house there is during the fall—at least in his immediate vicinity—no gravitational field. Namely, if the observer lets go of any bodies, they remain relative to him, in a state of rest or uniform motion, independent of their special chemical or physical nature. The observer, therefore, is justified in interpreting his state as being "at rest."[p 4]:20–21
The realization "startled" Einstein, and inspired him to begin an eight-year quest that led to what is considered to be his greatest work, the theory of general relativity. Over the years, the story of the falling man has become an iconic one, much embellished by other writers. In most retellings of Einstein's story, the falling man is identified as a painter. In some accounts, Einstein was inspired after he witnessed a painter falling from the roof of a building adjacent to the patent office where he worked. This version of the story leaves unanswered the question of why Einstein might consider his observation of such an unfortunate accident to represent the happiest thought in his life.[5]:145


A thought experiment used by Einstein to illustrate the equivalence principle

Einstein later refined his thought experiment to consider a man inside a large enclosed chest or elevator falling freely in space. While in free fall, the man would consider himself weightless, and any loose objects that he emptied from his pockets would float alongside him. Then Einstein imagined a rope attached to the roof of the chamber. A powerful "being" of some sort begins pulling on the rope with constant force. The chamber begins to move "upwards" with a uniformly accelerated motion. Within the chamber, all of the man's perceptions are consistent with his being in a uniform gravitational field. Einstein asked, "Ought we to smile at the man and say that he errs in his conclusion?" Einstein answered no. Rather, the thought experiment provided "good grounds for extending the principle of relativity to include bodies of reference which are accelerated with respect to each other, and as a result we have gained a powerful argument for a generalised postulate of relativity."[p 5]:75–79 [5]:145–147
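The chamber scenario can be sketched numerically. In the inertial frame, the floor accelerates upward while a released object coasts; yet the fall time measured inside the chamber matches a drop under a uniform gravitational field of strength g = a. A simple Euler integration, with illustrative numbers:

```python
import math

# Inertial-frame description: the chamber floor accelerates upward at a,
# while a released object floats inertially (no forces act on it).
a = 9.8      # chamber's proper acceleration (m/s^2), chosen to mimic g
h0 = 2.0     # object's initial height above the floor (m)

dt, t = 1e-4, 0.0
x_floor, v_floor = 0.0, 0.0
while x_floor < h0:              # until the floor reaches the object
    x_floor += v_floor * dt
    v_floor += a * dt
    t += dt

# Seen from inside the chamber, this is simply a drop from height h0
# under uniform gravity g = a, with fall time sqrt(2*h0/g):
print(t, math.sqrt(2 * h0 / a))
```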

Through this thought experiment, Einstein addressed an issue so well known that scientists rarely worried about it or considered it puzzling: Objects have "gravitational mass," which determines the force with which they are attracted to other objects. Objects also have "inertial mass," which determines the relationship between the force applied to an object and how much it accelerates. Newton had pointed out that, even though they are defined differently, gravitational mass and inertial mass always seem to be equal. But until Einstein, no one had conceived a good explanation as to why this should be so. From the correspondence revealed by his thought experiment, Einstein concluded that "it is impossible to discover by experiment whether a given system of coordinates is accelerated, or whether...the observed effects are due to a gravitational field." This correspondence between gravitational mass and inertial mass is the equivalence principle.[5]:147

An extension to his accelerating observer thought experiment allowed Einstein to deduce that "rays of light are propagated curvilinearly in gravitational fields."[p 5]:83–84 [5]:190
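The deduction follows because a light ray crossing the accelerating chamber horizontally falls by ½g(L/c)² relative to the chamber during the crossing time L/c; by the equivalence principle, the same bending must occur in a uniform gravitational field. A quick estimate with illustrative numbers:

```python
# A light ray enters an elevator of width L horizontally. While it crosses,
# the elevator (accelerating at g) gains height, so relative to the elevator
# the ray sags by dy = (1/2) * g * (L/c)**2.
g = 9.8          # m/s^2
c = 2.998e8      # m/s
L = 10.0         # chamber width (m), illustrative

t_cross = L / c
dy = 0.5 * g * t_cross**2
print(dy)        # ~5.5e-15 m: the effect is minute, which is why light
                 # bending is unobservable in terrestrial laboratories
```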

Quantum mechanics

Background: Einstein and the quantum

Many myths have grown up about Einstein's relationship with quantum mechanics. Freshman physics students are aware that Einstein explained the photoelectric effect and introduced the concept of the photon. But students who have grown up with the photon may not be aware of how revolutionary the concept was for his time. The best-known factoids about Einstein's relationship with quantum mechanics are his statement, "God does not play dice" and the indisputable fact that he just didn't like the theory in its final form. This has led to the general impression that, despite his initial contributions, Einstein was out of touch with quantum research and played at best a secondary role in its development.[17]:1–4 Concerning Einstein's estrangement from the general direction of physics research after 1925, his well-known scientific biographer, Abraham Pais, wrote:
Einstein is the only scientist to be justly held equal to Newton. That comparison is based exclusively on what he did before 1925. In the remaining 30 years of his life he remained active in research but his fame would be undiminished, if not enhanced, had he gone fishing instead.[18]:43
In hindsight, we know that Pais was incorrect in his assessment.

Einstein was arguably the greatest single contributor to the "old" quantum theory.[17][note 3]
  • In his 1905 paper on light quanta,[p 6] Einstein created the quantum theory of light. His proposal that light exists as tiny packets (photons) was so revolutionary, that even such major pioneers of quantum theory as Planck and Bohr refused to believe that it could be true.[17]:70–79;282–284 [note 4] Bohr, in particular, was a passionate disbeliever in light quanta, and repeatedly argued against them until 1925, when he yielded in the face of overwhelming evidence for their existence.[20]
  • In his 1906 theory of specific heats, Einstein was the first to realize that quantized energy levels explained the specific heat of solids.[p 7] In this manner, he found a rational justification for the third law of thermodynamics (i.e. the entropy of any system approaches zero as the temperature approaches absolute zero[note 5]): at very cold temperatures, atoms in a solid don't have enough thermal energy to reach even the first excited quantum level, and so cannot vibrate.[17]:141–148
  • Einstein proposed the wave-particle duality of light. In 1909, using a rigorous fluctuation argument based on a thought experiment and drawing on his previous work on Brownian motion, he predicted the emergence of a "fusion theory" that would combine the two views.[17]:136–140 [p 8] [p 9] Basically, he demonstrated that the Brownian motion experienced by a mirror in thermal equilibrium with black body radiation would be the sum of two terms, one due to the wave properties of radiation, the other due to its particulate properties.[2]
  • Although Planck is justly hailed as the father of quantum mechanics, his derivation of the law of black-body radiation rested on fragile ground, since it required ad hoc assumptions of an unreasonable character.[note 6] In his 1916 theory of radiation, Einstein was the first to create a completely general explanation.[p 10] This paper, well-known for broaching the possibility of stimulated emission (the basis of the laser), changed the nature of the evolving quantum theory by introducing the fundamental role of random chance.[17]:181–192
  • In 1924, Einstein received a short manuscript by an unknown Indian professor, Satyendra Nath Bose, outlining a new method of deriving the law of blackbody radiation.[note 7] Einstein was intrigued by Bose's peculiar method of counting the number of distinct ways of putting photons into the available states, a method of counting that Bose apparently did not realize was unusual. Einstein, however, understood that Bose's counting method implied that photons are, in a deep sense, indistinguishable. He translated the paper into German and had it published. Einstein then followed Bose's paper with an extension to Bose's work which predicted Bose-Einstein condensation, one of the fundamental research topics of condensed matter physics.[17]:215–240
  • While trying to develop a mathematical theory of light which would fully encompass its wavelike and particle-like aspects, Einstein developed the concept of "ghost fields". A guiding wave obeying Maxwell's classical laws would propagate following the normal laws of optics, but would not transmit any energy. This guiding wave, however, would govern the appearance of quanta of energy hν on a statistical basis, so that the appearance of these quanta would be proportional to the intensity of the interference radiation. These ideas became widely known in the physics community, and through Born's work in 1926, later became a key concept in the modern quantum theory of radiation and matter.[17]:193–203 [note 8]
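The 1906 specific-heat argument listed above can be made quantitative. In Einstein's model, all atoms of the solid vibrate at a single frequency ν, and the molar heat capacity (here in units of the classical Dulong–Petit value 3R) falls to zero as the temperature drops, because atoms can no longer reach the first excited level. A sketch, with an illustrative Einstein temperature:

```python
import math

def einstein_heat_capacity(T, theta_E):
    """Heat capacity of the 1906 Einstein solid, in units of 3R.
    theta_E = h*nu/k_B is the Einstein temperature (model parameter)."""
    x = theta_E / T
    return (x**2) * math.exp(x) / (math.exp(x) - 1.0)**2

theta = 300.0   # illustrative Einstein temperature (K)
for T in (1000.0, 300.0, 30.0):
    print(T, einstein_heat_capacity(T, theta))
# High T: approaches 1 (the classical Dulong-Petit value 3R);
# low T: approaches 0, consistent with the third law of thermodynamics.
```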
Thus, before 1925, Einstein originated most of the key concepts of quantum theory: light quanta, wave-particle duality, the fundamental randomness of physical processes, the concept of indistinguishability, and the probability density interpretation of the wave equation. In addition, Einstein can arguably be considered the father of solid state physics and condensed matter physics.[21] He provided a correct derivation of the blackbody radiation law and sparked the notion of the laser.

What of after 1925? In 1935, working with two younger colleagues, Einstein issued a final challenge to quantum mechanics, attempting to show that it could not represent a final solution.[p 12] Despite the questions raised by this paper, it made little or no difference to how physicists employed quantum mechanics in their work. Of this paper, Pais was to write:
The only part of this article that will ultimately survive, I believe, is this last phrase [i.e. "No reasonable definition of reality could be expected to permit this" where "this" refers to the instantaneous transmission of information over a distance], which so poignantly summarizes Einstein's views on quantum mechanics in his later years....This conclusion has not affected subsequent developments in physics, and it is doubtful that it ever will.[11]:454–457
In contrast to Pais' negative assessment, this paper, outlining the EPR paradox, is currently among the top ten papers published in Physical Review, and is the centerpiece of the development of quantum information theory,[22] which has been termed the "third quantum revolution."[23] [note 9]

Einstein's light box

Einstein did not like the direction in which quantum mechanics had turned after 1925. Although excited by Heisenberg's matrix mechanics, Schroedinger's wave mechanics, and Born's clarification of the meaning of the Schroedinger wave equation (i.e. that the absolute square of the wave function is to be interpreted as a probability density), his instincts told him that something was missing.[5]:326–335 In a letter to Born, he wrote:
Quantum mechanics is very impressive. But an inner voice tells me that it is not yet the real thing. The theory produces a good deal but hardly brings us closer to the secret of the Old One.[11]:440–443
The Solvay Debates between Bohr and Einstein began in dining-room discussions at the Fifth Solvay International Conference on Electrons and Photons in 1927. Einstein's issue with the new quantum mechanics was not just that, with the probability interpretation, it rendered invalid the notion of rigorous causality. After all, as noted above, Einstein himself had introduced random processes in his 1916 theory of radiation. Rather, by defining and delimiting the maximum amount of information obtainable in a given experimental arrangement, the Heisenberg uncertainty principle denied the existence of any knowable reality in terms of a complete specification of the momenta and description of individual particles, an objective reality that would exist whether or not we could ever observe it.[5]:325–326 [11]:443–446

Over dinner, during after-dinner discussions, and at breakfast, Einstein debated with Bohr and his followers on the question whether quantum mechanics in its present form could be called complete. Einstein illustrated his points with increasingly clever thought experiments intended to prove that position and momentum could in principle be simultaneously known to arbitrary precision. For example, one of his thought experiments involved sending a beam of electrons through a shuttered screen, recording the positions of the electrons as they struck a photographic screen. Bohr and his allies would always be able to counter Einstein's proposal, usually by the end of the same day.[5]:344–347

On the final day of the conference, Einstein revealed that the uncertainty principle was not the only aspect of the new quantum mechanics that bothered him. Quantum mechanics, at least in the Copenhagen interpretation, appeared to allow action at a distance, the ability for two separated objects to communicate at speeds greater than light. By 1928, the consensus was that Einstein had lost the debate, and even his closest allies during the Fifth Solvay Conference, for example Louis de Broglie, conceded that quantum mechanics appeared to be complete.[5]:346–347


Einstein's light box

At the Sixth Solvay International Conference on Magnetism (1930), Einstein came armed with a new thought experiment. This involved a box with a shutter that operated so quickly, it would allow only one photon to escape at a time. The box would first be weighed exactly. Then, at a precise moment, the shutter would open, allowing a photon to escape. The box would then be re-weighed. The well-known relationship between mass and energy E = mc² would allow the energy of the particle to be precisely determined. With this gadget, Einstein believed that he had demonstrated a means to obtain, simultaneously, a precise determination of the energy of the photon as well as its exact time of departure from the system.[5]:346–347 [11]:446–448
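The weighing step is a serious idealization: by E = mc², a single optical photon changes the box's mass by only a few times 10⁻³⁶ kg. A quick estimate, with an illustrative wavelength:

```python
h = 6.626e-34          # Planck constant (J s)
c = 2.998e8            # speed of light (m/s)
wavelength = 500e-9    # a green photon, illustrative choice

E = h * c / wavelength     # photon energy, ~4e-19 J
dm = E / c**2              # mass the box loses, by E = m c^2
print(E, dm)               # dm ~ 4.4e-36 kg: far beyond any real balance
```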

Bohr was shaken by this thought experiment. Unable to think of a refutation, he went from one conference participant to another, trying to convince them that Einstein's thought experiment couldn't be true, that if it were true, it would literally mean the end of physics. After a sleepless night, he finally worked out a response which, ironically, depended on Einstein's general relativity.[5]:348–349 Consider the illustration of Einstein's light box:[11]:446–448
1. After emitting a photon, the loss of weight causes the box to rise in the gravitational field.
2. The observer returns the box to its original height by adding weights until the pointer points to its initial position. It takes a certain amount of time t for the observer to perform this procedure. How long it takes depends on the strength of the spring and on how well-damped the system is. If undamped, the box will bounce up and down forever. If over-damped, the box will return to its original position sluggishly (See Damped spring-mass system).[note 10]
3. The longer that the observer allows the damped spring-mass system to settle, the closer the pointer will come to its equilibrium position. At some point, the observer will conclude that his setting of the pointer to its initial position is within an allowable tolerance. There will be some residual error Δq in returning the pointer to its initial position. Correspondingly, there will be some residual error Δm in the weight measurement.
4. Adding the weights imparts a momentum p to the box which can be measured with an accuracy Δp delimited by Δp Δq ≈ h. It is clear that Δp < gt Δm, where g is the gravitational acceleration. Plugging in yields gt Δm Δq > h.
5. General relativity informs us that while the box has been at a height different from its original height, it has been ticking at a rate different from its original rate. The gravitational red shift formula informs us that there will be an uncertainty Δt = (gt/c²) Δq in the determination of t₀, the emission time of the photon.
6. Hence, c² Δm Δt = ΔE Δt > h. The accuracy with which the energy of the photon is measured restricts the precision with which its moment of emission can be measured, in accordance with the Heisenberg uncertainty principle.
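Bohr's chain of estimates can be collected into a single short derivation (a compact restatement of steps 3–6 above, using the same symbols):

```latex
\begin{align*}
\Delta p\,\Delta q &\approx h
  && \text{pointer position--momentum uncertainty (step 4)}\\
\Delta p &< g t\,\Delta m
  && \text{momentum imparted during the weighing time } t\\
\Rightarrow\quad g t\,\Delta m\,\Delta q &> h \\
\Delta t &= \frac{g t}{c^{2}}\,\Delta q
  && \text{gravitational red shift of the box's clock (step 5)}\\
\Rightarrow\quad \Delta E\,\Delta t = c^{2}\Delta m\,\Delta t
  = g t\,\Delta m\,\Delta q &> h
  && \text{step 6}
\end{align*}
```

The last line follows by substituting Δt into c²Δm Δt, which reproduces the product gt Δm Δq already bounded below by h.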
After his last attempt at finding a loophole around the uncertainty principle was refuted, Einstein quit searching for inconsistencies in quantum mechanics. Instead, he turned to other aspects of quantum mechanics with which he was uncomfortable, in particular his critique of action at a distance. His next paper on quantum mechanics foreshadowed his later paper on the EPR paradox.[11]:448

Einstein was gracious in his defeat. The following September, Einstein nominated Heisenberg and Schrödinger for the Nobel Prize, stating, "I am convinced that this theory undoubtedly contains a part of the ultimate truth."[11]:448

EPR Paradox

Both Bohr and Einstein were subtle men. Einstein tried very hard to show that quantum mechanics was inconsistent; Bohr, however, was always able to counter his arguments. But in his final attack Einstein pointed to something so deep, so counterintuitive, so troubling, and yet so exciting, that at the beginning of the twenty-first century it has returned to fascinate theoretical physicists. Bohr’s only answer to Einstein’s last great discovery—the discovery of entanglement—was to ignore it.

Einstein's fundamental dispute with quantum mechanics wasn't about whether God rolled dice, whether the uncertainty principle allowed simultaneous measurement of position and momentum, or even whether quantum mechanics was complete. It was about reality. Does a physical reality exist independent of our ability to observe it? To Bohr and his followers, such questions were meaningless. All that we can know are the results of measurements and observations. It makes no sense to speculate about an ultimate reality that exists beyond our perceptions.[5]:460–461

Einstein's beliefs had evolved over the years from those that he had held when he was young, when, as a logical positivist heavily influenced by his reading of David Hume and Ernst Mach, he had rejected such unobservable concepts as absolute time and space. Einstein believed:[5]:460–461
1. A reality exists independent of our ability to observe it.
2. Objects are located at distinct points in spacetime and have their own independent, real existence. In other words, he believed in separability and locality.
3. Although at a superficial level, quantum events may appear random, at some ultimate level, strict causality underlies all processes in nature.

EPR paradox thought experiment. (top) The total wave function of a particle pair spreads from the collision point. (bottom) Observation of one particle collapses the wave function.

Einstein considered that realism and locality were fundamental underpinnings of physics. After leaving Nazi Germany and settling in Princeton at the Institute for Advanced Study, Einstein began writing up a thought experiment that he had been mulling over since attending a lecture by Léon Rosenfeld in 1933. Since the paper was to be in English, Einstein enlisted the help of the 46-year-old Boris Podolsky, a fellow who had moved to the Institute from Caltech; he also enlisted the help of the 26-year-old Nathan Rosen, also at the Institute, who did much of the math. The result of their collaboration was the four-page EPR paper, which in its title asked the question Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?[5]:448–450 [p 12]

After seeing the paper in print, Einstein found himself unhappy with the result. His clear conceptual visualization had been buried under layers of mathematical formalism.[5]:448–450

Einstein's thought experiment involved two particles that have collided or which have been created in such a way that they have properties which are correlated. The total wave function for the pair links the positions of the particles as well as their linear momenta.[5]:450–453 [22] The figure depicts the spreading of the wave function from the collision point. However, observation of the position of the first particle allows us to determine precisely the position of the second particle no matter how far the pair have separated. Likewise, measuring the momentum of the first particle allows us to determine precisely the momentum of the second particle. "In accordance with our criterion for reality, in the first case we must consider the quantity P as being an element of reality, in the second case the quantity Q is an element of reality."[p 12]
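A standard reconstruction of the correlated state the argument relies on (the explicit form appears in the EPR paper itself, not in the text above) is

```latex
\Psi(x_{1},x_{2})
  = \int_{-\infty}^{\infty} e^{(i/\hbar)\,p\,(x_{1}-x_{2}+x_{0})}\,dp
  \;\propto\; \delta(x_{1}-x_{2}+x_{0}),
```

a simultaneous eigenstate of the two commuting observables x₁ − x₂ (with value −x₀) and p₁ + p₂ (with value 0). Measuring x₁ therefore fixes x₂ exactly, and measuring p₁ fixes p₂ = −p₁, which is why either the position Q or the momentum P of the second particle can be predicted with certainty without touching it.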

Einstein concluded that the second particle, which we have never directly observed, must have at any moment a position that is real and a momentum that is real. Quantum mechanics does not account for these features of reality. Therefore, quantum mechanics is not complete.[5]:451 It is known, from the uncertainty principle, that position and momentum cannot be measured at the same time. But even though their values can only be determined in distinct contexts of measurement, can they both be definite at the same time? Einstein concluded that the answer must be yes.[22]

The only alternative, claimed Einstein, would be to assert that measuring the first particle instantaneously affected the reality of the position and momentum of the second particle.[5]:451 "No reasonable definition of reality could be expected to permit this."[p 12]

Bohr was stunned when he read Einstein's paper and spent more than six weeks framing his response, which he gave exactly the same title as the EPR paper.[p 16] The EPR paper forced Bohr to make a major revision in his understanding of complementarity in the Copenhagen interpretation of quantum mechanics.[22]

Prior to EPR, Bohr had maintained that disturbance caused by the act of observation was the physical explanation for quantum uncertainty. In the EPR thought experiment, however, Bohr had to admit that "there is no question of a mechanical disturbance of the system under investigation." On the other hand, he noted that the two particles were one system described by one quantum function. Furthermore, the EPR paper did nothing to dispel the uncertainty principle.[11]:454–457 [note 11]

Later commentators have questioned the strength and coherence of Bohr's response. As a practical matter, however, physicists for the most part did not pay much attention to the debate between Bohr and Einstein, since the opposing views did not affect one's ability to apply quantum mechanics to practical problems, but only affected one's interpretation of the quantum formalism. If they thought about the problem at all, most working physicists tended to follow Bohr's leadership.[22][26][27]

So stood the situation for nearly 30 years. Then, in 1964, John Stewart Bell made the groundbreaking discovery that Einstein's local realist world view made experimentally verifiable predictions that would be in conflict with those of quantum mechanics. Bell's discovery shifted the Einstein–Bohr debate from philosophy to the realm of experimental physics. Bell's theorem showed that, for any local realist formalism, there exist limits on the predicted correlations between pairs of particles in an experimental realization of the EPR thought experiment. In 1972, the first experimental tests were carried out. Successive experiments improved the accuracy of observation and closed loopholes. To date, it is virtually certain that local realist theories have been falsified.[28]
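The kind of limit Bell derived can be illustrated with a short numerical sketch. The measurement angles and the CHSH combination below are standard textbook choices, not taken from the source; quantum mechanics predicts the correlation E(a, b) = −cos(a − b) for a spin-singlet pair measured along directions a and b, and this violates the local-realist bound |S| ≤ 2:

```python
import math

def E(a, b):
    """Quantum-mechanical correlation for a singlet pair
    measured along directions at angles a and b (radians)."""
    return -math.cos(a - b)

# Standard CHSH angle choices (assumed for illustration):
a, a2 = 0.0, math.pi / 2          # Alice's two settings
b, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two settings

# CHSH combination: any local realist theory obeys |S| <= 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print(S)  # -2.8284..., i.e. |S| = 2*sqrt(2) > 2
```

The value 2√2 is the maximum quantum violation (Tsirelson's bound); experiments of the kind begun in 1972 measure exactly this sort of correlation combination.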

So Einstein was wrong. But it has several times been the case that Einstein's "mistakes" have foreshadowed and provoked major shifts in scientific research. Such, for instance, has been the case with his proposal of the cosmological constant, which Einstein considered his greatest blunder, but which currently is being actively investigated for its possible role in the accelerating expansion of the universe. In his Princeton years, Einstein was virtually shunned as he pursued the unified field theory. Nowadays, innumerable physicists pursue Einstein's dream for a "theory of everything."[29]

The EPR paper did not prove quantum mechanics to be incorrect. What it did prove was that quantum mechanics, with its "spooky action at a distance," is completely incompatible with commonsense understanding.[30] Furthermore, the effect predicted by the EPR paper, quantum entanglement, has inspired approaches to quantum mechanics different from the Copenhagen interpretation, and has been at the forefront of major technological advances in quantum computing, quantum encryption, and quantum information theory.[31]

Testing the God Hypothesis

This article was originally published in Fair Observer.

In my 2007 book God: The Failed Hypothesis; How Science Shows That God Does Not Exist, I applied the scientific process of hypothesis testing to the question of God. The common objection I heard was that the existence of God is not a scientific hypothesis. Let me explain why I say it is.

The scientific method is not limited to what professional scientists do but can be applied to any question that relates to observations. The brain does not have the capacity to save the time, direction, and energy of each photon that hits the eyes. Instead it operates on a simplified picture of objects, be they rocks, trees, or people, assigning them general properties that do not encompass every detail.

That is, we make models. Science merely rationalizes the procedure, communicating by precise speech and writing among individuals who then attempt to reach an agreement on what they all have seen and how best to represent their collective observations. What are called scientific theories are just models.

The God Model

Religion carries out a similar process, although one in which agreement is generally asserted by authority rather than by a consensus of objective, unbiased observations. From humanity’s earliest days, gods have been imagined who possessed attributes that people could understand and to which they could relate.

Gods and spirits took the form of the objects of experience: the sun, Earth, moon, animals, and humans. The gods of the ancient Egyptians had the form of animals. The gods of the ancient Greeks had the form of imperfect but immortal humans. The God of Judaism, Christianity, and Islam took the form of a powerful, autocratic, male king enthroned high above his subjects.

Each of these god models developed from the culture of the day. If the process continued to today, everyone would worship the shopping mall. In fact, many of the megachurches in America today are located in shopping malls.

By dealing in terms of models of gods that are based on human conceptions, we avoid the objection that the “true” God may lie beyond our limited cognitive capabilities. When we demonstrate that a particular god is rejected by the evidence, we are not proving that all gods, conceivable or inconceivable, do not exist. We are simply showing beyond a reasonable doubt that a god with explicit hypothesized attributes described by the model does not exist. Belief aside, at the very minimum the fact that a specific god model may be inconsistent with the evidence is cause enough to disregard that model in the practices of everyday life.

The exact relationship between the elements of scientific models and whatever true reality lies out there is not of major concern to most scientists, or should not be anyway. When scientists have a model that describes their measurements, is consistent with other established models, makes successful predictions, and can be put to practical use, what else do they need?

The model works fine in not only describing observations but in enabling practical applications. It makes absolutely no difference whether or not an electron is “real” when we apply the model of electrons flowing in an electronic circuit to design some high-tech device. Whatever the intrinsic reality, the model describes what we observe, and those observations are real enough.

Similarly, it does not matter from a practical standpoint whether the “real” God resembles any of the gods whose empirical consequences we have examined and modeled. People do not worship abstractions. They worship a God with qualities they can comprehend. The most common example of a god model is a personal God who answers prayers. This god model has not been confirmed in numerous controlled experiments on the efficacy of prayer. It follows that a religious person is wasting her time praying for some favor of such a God.

If praying worked, the effects would be objectively observed. They are not. Let me then summarize the god models that are inconsistent with scientific observations.

Inconsistent Gods
  • A personal God who has given humans immortal souls fails to agree with the empirical facts that human thoughts, memories, and personalities are governed by physical processes in the brain, which dissolves upon death. No nonphysical or extra-physical powers of “mind” can be found and no evidence exists for an afterlife.
  • A personal God whose interactions with humans include miraculous interventions such as those reported in scriptures is contradicted by the lack of independent evidence for the alleged miraculous events.
  • A cosmic God who fine-tuned the laws and constants of physics for life, in particular human life, fails to agree with the fact that the universe is not congenial to human life, being tremendously wasteful of time, space, and matter from the human perspective. It also fails to agree with the fact that the universe is mostly composed of particles in random motion, with complex structures such as galaxies forming less than four percent of the total mass of the universe.
  • A personal God who communicates directly with humans by means of revelation fails to agree with the fact that no scientifically verifiable new information has ever been transmitted while many wrong and harmful doctrines have been asserted by this means. No claimed revelation contains information that could not have been already in the head of the person making the claim. Furthermore, physical evidence now conclusively demonstrates that some of the most important biblical narratives, such as the Exodus, never took place.
  • A personal God who is the source of morality and human values does not exist since the evidence shows that humans define morals and values for themselves. This is not “relative morality.” Believers and nonbelievers alike agree on a common set of morals and values. Even the most devout decide for themselves what is good and what is bad and even judge much of what is approved in scriptures as immoral, such as genocide, slavery, and the oppression of women. Nonbelievers behave no less morally than believers.
  • A personal God who is omniscient, omnibenevolent, and omnipotent does not exist because it is logically inconsistent with the existence of evil, in particular, gratuitous suffering (standard problem of evil).
What If?

The existence of the God worshiped by most Jews, Christians, and Muslims not only lacks supporting empirical evidence but is even contradicted by such evidence. However, it need not have turned out that way. Things might have been different, and this is important to understand as it justifies the use of science to address the God question and refutes the frequently heard statement that science can say nothing about God. If scientific observations had confirmed at least one model god, those believers who make that statement would quickly change their tune. Even the most skeptical atheists would have to come around and admit that there might be some chance that God exists. This has not happened.

Consider the following hypothetical events that, had they occurred, would have favored the God hypothesis. Readers are invited to think of their own similar “might have been” scenarios. While not necessarily proving the existence of God, they would at least lend traditional beliefs a credence they currently lack.

Hypothetical Observations
  • Evidence was found that falsified evolution. Fossils might have been discovered that were inexplicably out of sequence. Life forms might not have all been based on the same genetic scheme. Transitional species might not have been observed. As actually thought at the time of Darwin, the age of the sun could have proved too short for evolution. The discovery of nuclear energy changed that, showing that, fueled by nuclear fusion, the sun will last ten billion years–ample time for life to evolve.
  • Human memories and thoughts might have provided evidence that cannot be plausibly accounted for by known physical processes. Science might have confirmed exceptional powers of the mind that could not be plausibly explained physically.
  • Science might have uncovered convincing evidence for an afterlife. For example, a person who had been declared dead by every means known to science might return to life with detailed stories of an afterlife that were later verified. For example, she might meet Jimmy Hoffa who tells her where to find his body.
  • Similarly, any claim of a revelation obtained during a mystical trance could contain scientifically verifiable information that the subject could not possibly have known.
  • Physical and historical evidence might have been found for the miraculous events and the important narratives of the scriptures. For example, Roman records might have been found for an earthquake in Judea at the time of a certain crucifixion ordered by Pontius Pilate. Noah’s Ark might have been discovered. The Shroud of Turin might have contained genetic material with no Y-chromosomes. Since the image is that of a man with a beard, this would confirm he was born of a virgin. Or, the genetic material might contain a novel form of coding molecule not found in any other living organism. This would have proven an alien (if not divine) origin of the enshrouded being.
  • The universe might have been found to be so congenial to human life that it must have been created with human life in mind. Humans might have been able to move from planet to planet, just as easily as they now move from continent to continent, and be able to survive on every planet – even in space – without life support.
  • Natural events might follow some moral law, rather than morally neutral mathematical laws. For example, lightning might strike only the wicked; people who behave badly might fall sick more often; nuns would always survive plane crashes.
  • Believers might have had a higher moral sense than nonbelievers and other measurably superior qualities. For example, the jails might be filled with atheists while all believers live happy, prosperous, contented lives surrounded by loving families and pets.
  • Miracles might have been observed. For example, prayers might have been answered; an arm or a leg might have been regenerated through faith healing.
But none of this has happened. Indeed, the opposite is true in some cases, such as an abnormally low number of atheists in jail. Every claim of a supernatural event has proved false. The hypothesis of God is not confirmed by the evidence. Indeed, that hypothesis is strongly contradicted by the observations of our senses and the instruments of science.

Epistemology

From Wikipedia, the free encyclopedia
Epistemology (/ɪˌpɪstɪˈmɒlədʒi/; from Greek ἐπιστήμη, epistēmē, meaning 'knowledge', and λόγος, logos, meaning 'logical discourse') is the branch of philosophy concerned with the theory of knowledge.[1]

Epistemology studies the nature of knowledge, justification, and the rationality of belief. Much of the debate in epistemology centers on four areas: (1) the philosophical analysis of the nature of knowledge and how it relates to such concepts as truth, belief, and justification,[2][3] (2) various problems of skepticism, (3) the sources and scope of knowledge and justified belief, and (4) the criteria for knowledge and justification. Epistemology addresses such questions as "What makes justified beliefs justified?",[4] "What does it mean to say that we know something?"[5] and fundamentally "How do we know that we know?"[6]

The term "epistemology" was first used by Scottish philosopher James Frederick Ferrier in 1854.[a] However, according to Brett Warren, King James VI of Scotland had previously personified this philosophical concept as the character Epistemon in 1591.[8]

Epistemon

In a philosophical dialogue, King James VI of Scotland penned the character Epistemon as the personification of a philosophical concept, to debate whether the ancient religious perceptions of witchcraft should be punished in a politically fueled Christian society. The arguments King James poses through Epistemon are based on theological reasoning regarding society's beliefs, while his opponent Philomathes takes a philosophical stance on society's legal aspects but seeks to obtain greater knowledge from Epistemon, whose name is Greek for 'scientist'. This approach signified a philomath seeking greater knowledge through epistemology with the use of theology. King James used the dialogue to educate society on various concepts, including the history and etymology of the subjects debated.[8]

Etymology

The word epistemology is derived from the ancient Greek epistēmē meaning "knowledge" and the suffix -logy, meaning "logical discourse" (derived from the Greek word logos meaning "discourse"). J.F. Ferrier coined epistemology on the model of 'ontology', to designate that branch of philosophy which aims to discover the meaning of knowledge, and called it the 'true beginning' of philosophy. The word is equivalent to the concept Wissenschaftslehre, which was used by German philosophers Johann Fichte and Bernard Bolzano for different projects before it was taken up again by Husserl. French philosophers then gave the term épistémologie a narrower meaning as 'theory of knowledge [théorie de la connaissance].' E.g., Émile Meyerson opened his Identity and Reality, written in 1908, with the remark that the word 'is becoming current' as equivalent to 'the philosophy of the sciences.'[9]

Knowledge

In mathematics, it is known that 2 + 2 = 4, but there is also knowing how to add two numbers, and knowing a person (e.g., oneself), place (e.g., one's hometown), thing (e.g., cars), or activity (e.g., addition). Some philosophers think there is an important distinction between "knowing that" (know a concept), "knowing how" (understand an operation), and "acquaintance-knowledge" (know by relation), with epistemology being primarily concerned with the first of these.[10]

While these distinctions are not explicit in English, they are defined explicitly in other languages (N.B. some languages related to English have been said to retain these verbs, e.g. Scots: "wit" and "ken"). In French, Portuguese, Spanish, German and Dutch, to know (a person) is translated using connaître, conhecer, conocer and kennen (both German and Dutch) respectively, whereas to know (how to do something) is translated using savoir, saber (both Portuguese and Spanish), wissen and weten. Modern Greek has the verbs γνωρίζω (gnorízo) and ξέρω (kséro). Italian has the verbs conoscere and sapere and the nouns for knowledge are conoscenza and sapienza. German has the verbs wissen and kennen. Wissen implies knowing a fact, kennen implies knowing in the sense of being acquainted with and having a working knowledge of; there is also a noun derived from kennen, namely Erkennen, which has been said to imply knowledge in the form of recognition or acknowledgment. The verb itself implies a process: you have to go from one state to another, from a state of "not-erkennen" to a state of true erkennen. This verb seems to be the most appropriate for describing the "episteme" in one of the modern European languages, hence the German name "Erkenntnistheorie". The theoretical interpretation and significance of these linguistic issues remains controversial.

In his paper On Denoting and his later book Problems of Philosophy Bertrand Russell stressed the distinction between "knowledge by description" and "knowledge by acquaintance". Gilbert Ryle is also credited with stressing the distinction between knowing how and knowing that in The Concept of Mind. In Personal Knowledge, Michael Polanyi argues for the epistemological relevance of knowledge how and knowledge that; using the example of the act of balance involved in riding a bicycle, he suggests that the theoretical knowledge of the physics involved in maintaining a state of balance cannot substitute for the practical knowledge of how to ride, and that it is important to understand how both are established and grounded. This position is essentially Ryle's, who argued that a failure to acknowledge the distinction between knowledge that and knowledge how leads to infinite regress.

In recent times, epistemologists including Sosa, Greco, Kvanvig, Zagzebski and Duncan Pritchard have argued that epistemology should evaluate people's "properties" (i.e., intellectual virtues) and not just the properties of propositions or of propositional mental attitudes.[citation needed]

Belief

In common speech, a "statement of belief" is typically an expression of faith or trust in a person, power or other entity. While epistemology includes such traditional views, it is also concerned with what we believe. This includes 'the' truth, and everything else we accept as 'true' for ourselves from a cognitive point of view.

Truth

The truth of a belief is not a prerequisite for holding it. On the other hand, if something is actually known, then it categorically cannot be false. For example, if a person believes that a bridge is safe enough to support them, and attempts to cross it, but the bridge then collapses under their weight, it could be said that they believed that the bridge was safe but that their belief was mistaken. It would not be accurate to say that they knew that the bridge was safe, because plainly it was not. By contrast, if the bridge actually supported their weight, then the person might say that they had believed the bridge was safe, whereas now, after proving it to themself (by crossing it), they know it was safe.

Epistemologists argue over whether belief is the proper truth-bearer. Some would rather describe knowledge as a system of justified true propositions, and others as a system of justified true sentences. Plato, in his Gorgias, argues that belief is the most commonly invoked truth-bearer.[11]

Justification

In the Theaetetus, Socrates considers a number of theories as to what knowledge is, the last being that knowledge is true belief "with an account" (meaning explained or defined in some way). According to the theory that knowledge is justified true belief, in order to know that a given proposition is true, one must not only believe the relevant true proposition, but one must also have a good reason for doing so. One implication of this would be that no one would gain knowledge just by believing something that happened to be true. For example, an ill person with no medical training, but with a generally optimistic attitude, might believe that he will recover from his illness quickly. Nevertheless, even if this belief turned out to be true, the patient would not have known that he would get well since his belief lacked justification.

The definition of knowledge as justified true belief was widely accepted until the 1960s. At this time, a paper written by the American philosopher Edmund Gettier provoked major widespread discussion.

Gettier problem


Euler diagram representing a definition of knowledge.

Edmund Gettier is best known for a short paper entitled 'Is Justified True Belief Knowledge?' published in 1963, which called into question the theory of knowledge that had been dominant among philosophers for thousands of years.[12] This in turn called into question the actual value of philosophy if such an obvious and easy counterexample to a major theory could exist without anyone noticing it for thousands of years. In a few pages, Gettier argued that there are situations in which one's belief may be justified and true, yet fail to count as knowledge. That is, Gettier contended that while justified belief in a true proposition is necessary for that proposition to be known, it is not sufficient. As in the diagram, a true proposition can be believed by an individual (purple region) but still not fall within the "knowledge" category (yellow region).

According to Gettier, there are certain circumstances in which one does not have knowledge, even when all of the above conditions are met. Gettier proposed two thought experiments, which have come to be known as "Gettier cases", as counterexamples to the classical account of knowledge.

One of the cases involves two men, Smith and Jones, who are awaiting the results of their applications for the same job. Each man has ten coins in his pocket. Smith has excellent reasons to believe that Jones will get the job and, furthermore, knows that Jones has ten coins in his pocket (he recently counted them). From this Smith infers, "the man who will get the job has ten coins in his pocket." However, Smith is unaware that he also has ten coins in his own pocket. Furthermore, Smith, not Jones, is going to get the job. While Smith has strong evidence to believe that Jones will get the job, he is wrong. Smith has a justified true belief that the man who will get the job has ten coins in his pocket; however, according to Gettier, Smith does not know that the man who will get the job has ten coins in his pocket, because Smith's belief is "...true by virtue of the number of coins in Jones's pocket, while Smith does not know how many coins are in Smith's pocket, and bases his belief...on a count of the coins in Jones's pocket, whom he falsely believes to be the man who will get the job." (see[12] p. 122.)

These cases fail to be knowledge because the subject's belief is justified, but only happens to be true by virtue of luck. In other words, he made the correct choice (believing that the man who will get the job has ten coins in his pocket) for the wrong reasons. This example is similar to those often given when discussing belief and truth, wherein a person's belief of what will happen can coincidentally be correct without his or her having the actual knowledge to base it on.

Responses to Gettier

The responses to Gettier have been varied. Usually, they have involved substantial attempts to provide a definition of knowledge different from the classical one, either by recasting knowledge as justified true belief with some additional fourth condition, or proposing a completely new set of conditions, disregarding the classical ones entirely.
Infallibilism, indefeasibility
In one response to Gettier, the American philosopher Richard Kirkham has argued that the only definition of knowledge that could ever be immune to all counterexamples is the infallibilist one.[13] To qualify as an item of knowledge, goes the theory, a belief must not only be true and justified, the justification of the belief must necessitate its truth. In other words, the justification for the belief must be infallible.

Yet another possible candidate for the fourth condition of knowledge is indefeasibility. Defeasibility theory maintains that there should be no overriding or defeating truths for the reasons that justify one's belief. For example, suppose that person S believes he saw Tom Grabit steal a book from the library and uses this to justify the claim that Tom Grabit stole a book from the library. A possible defeater or overriding proposition for such a claim could be a true proposition like, "Tom Grabit's identical twin Sam is currently in the same town as Tom." When no defeaters of one's justification exist, a subject would be epistemologically justified.

The Indian philosopher B. K. Matilal has drawn on the Navya-Nyāya fallibilism tradition to respond to the Gettier problem. Nyaya theory distinguishes between knowing p and knowing that one knows p—these are different events, with different causal conditions. The second level is a sort of implicit inference that usually immediately follows the episode of knowing p (knowledge simpliciter). The Gettier case is examined by referring to a view of Gangesha Upadhyaya (late 12th century), who takes any true belief to be knowledge; thus a true belief acquired through a wrong route may, on this view, be regarded as knowledge simpliciter. The question of justification arises only at the second level, when one considers the knowledgehood of the acquired belief. Initially there is no uncertainty, so the belief stands as a true belief. But at the very next moment, when the hearer is about to embark upon the venture of knowing whether he knows p, doubts may arise. "If, in some Gettier-like cases, I am wrong in my inference about the knowledgehood of the given occurrent belief (for the evidence may be pseudo-evidence), then I am mistaken about the truth of my belief – and this is in accordance with Nyaya fallibilism: not all knowledge-claims can be sustained."[14]
Reliabilism
Reliabilism has been a significant line of response to the Gettier problem among philosophers, originating with work by Alvin Goldman in the 1960s. According to reliabilism, a belief is justified (or otherwise supported in such a way as to count towards knowledge) only if it is produced by processes that typically yield a sufficiently high ratio of true to false beliefs. In other words, this theory states that a true belief counts as knowledge only if it is produced by a reliable belief-forming process. Examples of reliable processes include: standard perceptual processes, remembering, good reasoning, and introspection.[15]
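The reliabilist condition lends itself to a simple numerical gloss. The following toy model is an editorial illustration only (the function names and the 0.9 threshold are assumptions, not drawn from the literature): a true belief counts as knowledge just when the process that produced it has a sufficiently high truth ratio.

```python
# Toy model of the reliabilist condition (illustrative; threshold is assumed).
def is_reliable(true_outputs, false_outputs, threshold=0.9):
    """A process is reliable if it yields a sufficiently high
    ratio of true beliefs to total beliefs produced."""
    total = true_outputs + false_outputs
    return total > 0 and true_outputs / total >= threshold

def counts_as_knowledge(belief_is_true, process_true, process_false):
    # Reliabilist condition: truth of the belief plus reliability
    # of the process that formed it.
    return belief_is_true and is_reliable(process_true, process_false)

# Standard perception, mostly accurate: its true outputs count as knowledge.
print(counts_as_knowledge(True, 95, 5))    # True
# A lucky guess from an unreliable process is true belief, not knowledge.
print(counts_as_knowledge(True, 40, 60))   # False
```

The model also makes the Gettier-style worry visible: Henry's vision scores as reliable, so the model wrongly counts his barn belief as knowledge, which is exactly the objection raised by the façades case below.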
Reliabilism has been challenged by Gettier cases. Another argument in the same spirit as the Gettier cases (although it was not presented in Gettier's short article) is the case of Henry and the barn façades. In the thought experiment, a man, Henry, is driving along and sees a number of buildings that resemble barns. Based on his perception of one of these, he concludes that he has just seen a barn. While the building he saw really was a barn, and his perception of it was accurate, all the other barn-like buildings he saw were façades. Theoretically, Henry does not know that he has seen a barn, despite his belief being true and despite its being formed on the basis of a reliable process (i.e. his vision), since he only acquired his true belief by accident.[16]
Other responses
Robert Nozick has offered the following definition of knowledge: S knows that P if and only if:
  • P;
  • S believes that P;
  • if P were false, S would not believe that P;
  • if P were true, S would believe that P.[17]
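Writing the subjunctive ("were … would …") conditional as $\Box\!\rightarrow$, a notation common in the literature on counterfactuals (the compact symbolism here is an editorial gloss, not Nozick's own), the four conditions can be stated as:

```latex
% Requires amsmath and amssymb
\begin{align*}
&\text{(1)}\quad P \\
&\text{(2)}\quad S \text{ believes that } P \\
&\text{(3)}\quad \neg P \;\Box\!\rightarrow\; \neg(S \text{ believes that } P) \\
&\text{(4)}\quad P \;\Box\!\rightarrow\; S \text{ believes that } P
\end{align*}
```

Conditions (3) and (4) are the "tracking" conditions: the belief must vary counterfactually with the truth of P, not merely coincide with it.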
Nozick argues that the third of these conditions serves to address cases of the sort described by Gettier. Nozick further claims this condition addresses a case of the sort described by D. M. Armstrong:[18] A father believes his daughter innocent of committing a particular crime, both because of faith in his baby girl and (now) because he has seen presented in the courtroom a conclusive demonstration of his daughter's innocence. His belief via the method of the courtroom satisfies the four subjunctive conditions, but his faith-based belief does not. If his daughter were guilty, he would still believe her innocent, on the basis of faith in his daughter; this would violate the third condition.

The British philosopher Simon Blackburn has criticized this formulation by suggesting that we do not want to accept as knowledge beliefs which, while they "track the truth" (as Nozick's account requires), are not held for appropriate reasons. He says that "we do not want to award the title of knowing something to someone who is only meeting the conditions through a defect, flaw, or failure, compared with someone else who is not meeting the conditions."[19] In addition, externalist accounts of knowledge, such as Nozick's, are often forced to reject closure in cases where it is intuitively valid.

Timothy Williamson has advanced a theory of knowledge according to which knowledge is not justified true belief plus some extra condition(s), but primary. In his book Knowledge and its Limits, Williamson argues that the concept of knowledge cannot be broken down into a set of other concepts through analysis—instead, it is sui generis. Thus, though knowledge requires justification, truth, and belief, the word "knowledge" can't be, according to Williamson's theory, accurately regarded as simply shorthand for "justified true belief".

Alvin Goldman writes in his Causal Theory of Knowing that in order for knowledge to truly exist there must be a causal chain between the proposition and the belief of that proposition.

Externalism and internalism

A central debate about the nature of justification is a debate between epistemological externalists on the one hand, and epistemological internalists on the other.

Externalists hold that factors deemed "external", meaning outside of the psychological states of those who gain knowledge, can be conditions of justification. For example, an externalist response to the Gettier problem is to say that, in order for a justified true belief to count as knowledge, there must be a link or dependency between the belief and the state of the external world. Usually this is understood to be a causal link. Such causation, to the extent that it is "outside" the mind, would count as an external, knowledge-yielding condition. Internalists, on the other hand, assert that all knowledge-yielding conditions are within the psychological states of those who gain knowledge.

Though Descartes himself was not familiar with the internalist/externalist debate, many point to him as an early example of the internalist path to justification. He wrote that, because the only method by which we perceive the external world is through our senses, and because the senses are not infallible, we should not consider our concept of knowledge to be infallible. The only way to find anything that could be described as "indubitably true", he advocates, would be to see things "clearly and distinctly".[20] He argued that if there is an omnipotent, good being who made the world, then it's reasonable to believe that people are made with the ability to know. However, this does not mean that man's ability to know is perfect. God gave man the ability to know, but not omniscience. Descartes said that man must use his capacities for knowledge correctly and carefully through methodological doubt.[21] The dictum "Cogito ergo sum" (I think, therefore I am) is also commonly associated with Descartes' theory, because in his own methodological doubt, doubting everything he previously knew in order to start from a blank slate, the first thing that he could not logically bring himself to doubt was his own existence: "I do not exist" would be a contradiction in terms; the act of saying that one does not exist assumes that someone must be making the statement in the first place. Though Descartes could doubt his senses, his body, and the world around him, he could not deny his own existence, because he was able to doubt and must exist in order to do so. Even if some "evil genius" were to be deceiving him, he would have to exist in order to be deceived. This one sure point provided him with what he would call his Archimedean point, in order to further develop his foundation for knowledge. Simply put, Descartes' epistemological justification depended upon his indubitable belief in his own existence and his clear and distinct knowledge of God.[22]

Value problem

We generally assume that knowledge is more valuable than mere true belief. If so, what is the explanation? A formulation of the value problem in epistemology first occurs in Plato's Meno. Socrates points out to Meno that a man who knew the way to Larissa could lead others there correctly. But so, too, could a man who had true beliefs about how to get there, even if he had not gone there or had any knowledge of Larissa. Socrates says that it seems that both knowledge and true opinion can guide action. Meno then wonders why knowledge is valued more than true belief, and why knowledge and true belief are different. Socrates responds that knowledge is more valuable than mere true belief because it is tethered, or justified. Justification, or working out the reason for a true belief, locks down true belief.[23]

The problem is to identify what (if anything) makes knowledge more valuable than mere true belief, or that makes knowledge more valuable than a more minimal conjunction of its components, such as justification, safety, sensitivity, statistical likelihood, and anti-Gettier conditions, on a particular analysis of knowledge that conceives of knowledge as divided into components (to which knowledge-first epistemological theories, which posit knowledge as fundamental, are notable exceptions).[24] The value problem reemerged in the philosophical literature on epistemology in the twenty-first century following the rise of virtue epistemology in the 1980s, partly because of the obvious link to the concept of value in ethics.[25]

The value problem has been presented as an argument against epistemic reliabilism by philosophers including Linda Zagzebski, Wayne Riggs and Richard Swinburne. Zagzebski analogizes the value of knowledge to the value of espresso produced by an espresso maker: "The liquid in this cup is not improved by the fact that it comes from a reliable espresso maker. If the espresso tastes good, it makes no difference if it comes from an unreliable machine."[26] For Zagzebski, the value of knowledge deflates to the value of mere true belief. She assumes that reliability in itself has no value or disvalue, but Goldman and Olsson disagree. They point out that Zagzebski's conclusion rests on the assumption of veritism: all that matters is the acquisition of true belief.[27] To the contrary, they argue that a reliable process for acquiring a true belief adds value to the mere true belief by making it more likely that future beliefs of a similar kind will be true. By analogy, having a reliable espresso maker that produced a good cup of espresso would be more valuable than having an unreliable one that luckily produced a good cup because the reliable one would more likely produce good future cups compared to the unreliable one.

The value problem is important to assessing the adequacy of theories of knowledge that conceive of knowledge as consisting of true belief and other components. According to Kvanvig, an adequate account of knowledge should resist counterexamples and allow an explanation of the value of knowledge over mere true belief. Should a theory of knowledge fail to do so, it would prove inadequate.[28]

One of the more influential responses to the problem is that knowledge is not particularly valuable and is not what ought to be the main focus of epistemology. Instead, epistemologists ought to focus on other mental states, such as understanding.[29] Advocates of virtue epistemology have argued that the value of knowledge comes from an internal relationship between the knower and the mental state of believing.[24]

Acquiring knowledge

A priori and a posteriori knowledge

The nature of this distinction has been disputed by various philosophers; however, the terms may be roughly defined as follows:
  • A priori knowledge is knowledge that is known independently of experience (that is, it is non-empirical, or arrived at beforehand, usually by reason).
  • A posteriori knowledge is knowledge that is known by experience (that is, it is empirical, or arrived at afterward).
A priori knowledge is a way of gaining knowledge without the need of experience. In Bruce Russell's article "A Priori Justification and Knowledge"[30] he says that it is "knowledge based on a priori justification," (1) which relies on intuition and the nature of these intuitions. A priori knowledge is often contrasted with a posteriori knowledge, which is knowledge gained by experience. One way to see the difference between the two is through an example. Bruce Russell gives two propositions and asks the reader which one he believes more. Option A: All crows are birds. Option B: All crows are black. If you believe option A, then you are a priori justified in believing it, because you don't have to see a crow to know it's a bird. If you believe option B, then you are a posteriori justified in believing it, because you have seen many crows and therefore know they are black. He goes on to say that it doesn't matter whether the statement is true or not; what matters is which of the two you believe.

The idea of a priori knowledge is that it is based on intuition or rational insights. Laurence BonJour says in his article "The Structure of Empirical Knowledge",[31] that a "rational insight is an immediate, non-inferential grasp, apprehension or 'seeing' that some proposition is necessarily true." (3) Going back to the crow example, by Laurence BonJour's definition the reason you would believe in option A is because you have an immediate knowledge that a crow is a bird, without ever experiencing one.

Evolutionary psychology takes a novel approach to the problem. It says that there is an innate predisposition for certain types of learning. "Only small parts of the brain resemble a tabula rasa; this is true even for human beings. The remainder is more like an exposed negative waiting to be dipped into a developer fluid."[32]

Analytic–synthetic distinction

Immanuel Kant, in his Critique of Pure Reason, drew a distinction between "analytic" and "synthetic" propositions. He contended that some propositions are such that we can know them to be true just by understanding their meaning. For example, consider, "My father's brother is my uncle." We can know it to be true solely by virtue of our understanding what its terms mean. Philosophers call such propositions "analytic". Synthetic propositions, on the other hand, have distinct subjects and predicates. An example would be, "My father's brother has black hair." Kant held that mathematical propositions, along with the foundational principles of natural science, are synthetic a priori: they are necessarily true, yet their truth cannot be known merely by analyzing the concepts involved.

The American philosopher Willard Van Orman Quine, in his Two Dogmas of Empiricism, famously challenged the distinction, arguing that the two have a blurry boundary. Some contemporary philosophers have offered more defensible accounts of the distinction.[33]

Branches or schools of thought

Historical

The historical study of philosophical epistemology is the historical study of efforts to gain philosophical understanding or knowledge of the nature and scope of human knowledge.[34] Since efforts to get that kind of understanding have a history, the questions philosophical epistemology asks today about human knowledge are not necessarily the same as they once were.[34] But that does not mean that philosophical epistemology is itself a historical subject, or that it pursues only or even primarily historical understanding.[34]

Empiricism

In philosophy, empiricism is generally a theory of knowledge focusing on the role of experience, especially experience based on perceptual observations by the senses. Certain forms treat all knowledge as empirical,[citation needed] while some regard disciplines such as mathematics and logic as exceptions.[citation needed]

There are many variants of empiricism, with positivism, realism, and common sense being among the most commonly expounded. But central to all empiricist epistemologies is the notion of the epistemologically privileged status of sense data.

Idealism

Many idealists believe that knowledge is primarily (at least in some areas) acquired by a priori processes or is innate—for example, in the form of concepts not derived from experience. The relevant theoretical processes often go by the name "intuition".[35] The relevant theoretical concepts may purportedly be part of the structure of the human mind (as in Kant's theory of transcendental idealism), or they may be said to exist independently of the mind (as in Plato's theory of Forms).

Rationalism

In contrast to empiricism and idealism, which centre respectively on the epistemologically privileged status of sense data (the empirical) and on the primacy of reason (the theoretical), modern rationalism adds a third "system of thinking" (as Gaston Bachelard has termed these areas) and holds that all three are of equal importance: the empirical, the theoretical, and the abstract. For Bachelard, rationalism makes equal reference to all three systems of thinking.

Constructivism

Constructivism is a view in philosophy according to which all "knowledge is a compilation of human-made constructions",[36] "not the neutral discovery of an objective truth".[37] Whereas objectivism is concerned with the "object of our knowledge", constructivism emphasises "how we construct knowledge".[38] Constructivism proposes new definitions for knowledge and truth that form a new paradigm, based on inter-subjectivity instead of the classical objectivity, and on viability instead of truth. Piagetian constructivism, however, believes in objectivity—constructs can be validated through experimentation. The constructivist point of view is pragmatic;[39] as Vico said: "The norm of the truth is to have made it."

Pragmatism

Pragmatism is an empiricist epistemology formulated by Charles Sanders Peirce, William James, and John Dewey, which understands truth as that which is practically applicable in the world. Peirce formulates the maxim: 'Consider what effects, that might conceivably have practical bearings, we conceive the object of our conception to have. Then, our conception of these effects is the whole of our conception of the object.'[40] This suggests that we are to analyse ideas and objects in the world for their practical value. This is in contrast to any correspondence theory of truth which holds that what is true is what corresponds to an external reality. William James suggests that through a pragmatist epistemology 'Theories thus become instruments, not answers to enigmas in which we can rest.'[41] A more contemporary understanding of pragmatism was developed by the philosopher Richard Rorty, who proposed that values were historically contingent and dependent upon their utility within a given historical period.[42]

Regress problem

The regress problem is the problem of providing a complete logical foundation for human knowledge. The traditional way of supporting a rational argument is to appeal to other rational arguments, typically using chains of reason and rules of logic. A classic example that goes back to Aristotle is deducing that Socrates is mortal. We have a logical rule that says All humans are mortal and an assertion that Socrates is human, and we deduce that Socrates is mortal. In this example, how do we know that Socrates is human? Presumably we apply other rules, such as: All born from human females are human. This then leaves open the question of how we know that all those born from human females are human. This is the regress problem: how can we eventually terminate a logical argument with some statement(s) that do not require further justification but can still be considered rational and justified?
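The chain of deductions above can be made explicit with a small sketch (the rule base and function names here are invented for illustration): each claim is expanded into the premises that justify it, and the expansion bottoms out only in statements that themselves still await justification.

```python
# Illustrative sketch of the regress: a tiny rule base mapping each claim
# to the premises that justify it.
RULES = {
    "Socrates is mortal": ["Socrates is human", "all humans are mortal"],
    "Socrates is human": ["Socrates was born from a human female",
                          "all born from human females are human"],
}

def justification_chain(claim, depth=0, max_depth=5):
    """Recursively expand a claim into its justifying premises.

    The statements returned are those with no supporting rule: exactly
    the points at which the regress problem asks for a stopping place."""
    if depth >= max_depth or claim not in RULES:
        return [claim]          # an as-yet-unjustified stopping point
    chain = []
    for premise in RULES[claim]:
        chain.extend(justification_chain(premise, depth + 1, max_depth))
    return chain

leaves = justification_chain("Socrates is mortal")
# Every leaf is a statement that itself still requires justification.
print(leaves)
```

Foundationalists would treat some of these leaves as basic beliefs needing no further rule; infinitists would let the rule base grow without end; coherentists would allow the expansion to loop back into the web of beliefs.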
As John Pollock stated:
... to justify a belief one must appeal to a further justified belief. This means that one of two things can be the case. Either there are some beliefs that we can be justified for holding, without being able to justify them on the basis of any other belief, or else for each justified belief there is an infinite regress of (potential) justification [the nebula theory]. On this theory there is no rock bottom of justification. Justification just meanders in and out through our network of beliefs, stopping nowhere.[43]
The apparent impossibility of completing an infinite chain of reasoning is thought by some to support skepticism. It is also the impetus for Descartes' famous dictum: I think, therefore I am. Descartes was looking for some logical statement that could be true without appeal to other statements.

Response to the regress problem

Many epistemologists studying justification have attempted to argue for various types of chains of reasoning that can escape the regress problem.
  • Foundationalism
Foundationalists respond to the regress problem by asserting that certain "foundations" or "basic beliefs" support other beliefs but do not themselves require justification from other beliefs. These beliefs might be justified because they are self-evident, infallible, or derive from reliable cognitive mechanisms. Perception, memory, and a priori intuition are often considered to be possible examples of basic beliefs.

The chief criticism of foundationalism is that if a belief is not supported by other beliefs, accepting it may be arbitrary or unjustified.[44]
  • Coherentism
Another response to the regress problem is coherentism, which is the rejection of the assumption that the regress proceeds according to a pattern of linear justification. To avoid the charge of circularity, coherentists hold that an individual belief is justified circularly by the way it fits together (coheres) with the rest of the belief system of which it is a part. This theory has the advantage of avoiding the infinite regress without claiming special, possibly arbitrary status for some particular class of beliefs. Yet, since a system can be coherent while also being wrong, coherentists face the difficulty of ensuring that the whole system corresponds to reality. Additionally, most logicians agree that any argument that is circular is trivially valid. That is, to be illuminating, arguments must be linear with conclusions that follow from stated premises.

However, Warburton writes in 'Thinking from A to Z', "Circular arguments are not invalid; in other words, from a logical point of view there is nothing intrinsically wrong with them. However, they are, when viciously circular, spectacularly uninformative. (Warburton 1996)."
  • Foundherentism
A position known as "foundherentism", advanced by Susan Haack, is meant to be a unification of foundationalism and coherentism. One component of this theory is what is called the "analogy of the crossword puzzle." Whereas, for example, infinitists regard the regress of reasons as "shaped" like a single line, Susan Haack has argued that it is more like a crossword puzzle, with multiple lines mutually supporting each other.[45]
  • Infinitism
An alternative resolution to the regress problem is known as "infinitism". Infinitists take the infinite series to be merely potential, in the sense that an individual may have indefinitely many reasons available to them, without having consciously thought through all of these reasons when the need arises. This position is motivated in part by the desire to avoid what is seen as the arbitrariness and circularity of its chief competitors, foundationalism and coherentism.

Indian pramana

Indian philosophical schools such as the Hindu Nyaya, and Carvaka, and later, the Jain and Buddhist philosophical schools, developed an epistemological tradition which is termed "pramana" independently of the Western philosophical tradition. Pramana can be translated as "instrument of knowledge" and refers to various means or sources of knowledge which were held to be reliable by Indian philosophers. Each school of Indian philosophy had its own theories about which pramanas were valid means to knowledge and which were unreliable (and why).[46] A Vedic text, Taittirīya Āraṇyaka (c. 9th–6th centuries BCE), lists "four means of attaining correct knowledge": smṛti ("tradition" or "scripture"), pratyakṣa ("perception"), aitihya ("communication by one who is expert", or "tradition"), and anumāna ("reasoning" or "inference").[47][48]
In the Indian traditions, the most widely discussed pramanas are: Pratyakṣa (perception), Anumāṇa (inference), Upamāṇa (comparison and analogy), Arthāpatti (postulation, derivation from circumstances), Anupalabdi (non-perception, negative/cognitive proof) and Śabda (word, testimony of past or present reliable experts). While the Nyaya school (beginning with the Nyāya Sūtras of Gotama, between the 6th century BCE and the 2nd century CE[49][50]) was a proponent of realism and supported four pramanas (perception, inference, comparison/analogy and testimony), the Buddhist epistemologists (Dignaga and Dharmakirti) generally accepted only perception and inference.

The theory of knowledge of the Buddha in the early Buddhist texts has been interpreted as a form of pragmatism as well as a form of correspondence theory.[51] Likewise, the Buddhist philosopher Dharmakirti has been interpreted as holding either a form of pragmatism or a correspondence theory, on account of his view that what is true is what has effective power (arthakriya).[52][53] The Buddhist Madhyamika school's theory of emptiness (shunyata), meanwhile, has been interpreted as a form of philosophical skepticism.[54]

The main Jain contribution to epistemology has been their theory of "many-sidedness" or "multi-perspectivism" (Anekantavada), which says that since the world is multifaceted, any single viewpoint is limited (naya — a partial standpoint).[55] This has been interpreted as a kind of pluralism or perspectivism.[56][57] According to Jain epistemology, none of the pramanas gives absolute or perfect knowledge, since they are each limited points of view.

The Carvaka school of materialists accepted only the pramana of perception and hence was one of the earliest forms of empiricism.[58] There was also another school of philosophical skepticism, the Ajñana.

Skepticism

Skepticism is a position that questions the validity of some or all of human knowledge. Skepticism does not refer to any one specific school of philosophy; rather, it is a thread that runs through many philosophical discussions of epistemology. The first well-known Greek skeptic was Socrates, who claimed that his only knowledge was that he knew nothing with certainty. In Indian philosophy, Sanjaya Belatthiputta was a famous skeptic, and the Buddhist Madhyamika school has been seen as taking up a form of skepticism. Descartes' most famous inquiry into mind and body also began as an exercise in skepticism. Descartes began by questioning the validity of all knowledge and looking for some fact that was irrefutable. In so doing, he came to his famous dictum: I think, therefore I am.
Foundationalism and the other responses to the regress problem are essentially defenses against skepticism. Similarly, the pragmatism of William James can be viewed as a coherentist defense against skepticism. James discarded conventional philosophical views of truth and defined truth to be based on how well a concept works in a specific context rather than objective rational criteria. The philosophy of Logical Positivism and the work of philosophers such as Kuhn and Popper can be viewed as skepticism applied to what can truly be considered scientific knowledge.[59]
