
Thursday, March 25, 2021

Chemical revolution

From Wikipedia, the free encyclopedia
 
Geoffroy's 1718 Affinity Table: at the head of each column is a chemical species with which all the species below can combine. Some historians have defined this table as being the start of the chemical revolution.

The chemical revolution, also called the first chemical revolution, was the early modern reformulation of chemistry that culminated in the law of conservation of mass and the oxygen theory of combustion. During the 19th and 20th centuries, this transformation was credited to the work of the French chemist Antoine Lavoisier (the "father of modern chemistry"). However, recent work on the history of early modern chemistry considers the chemical revolution to consist of gradual changes in chemical theory and practice that emerged over a period of two centuries. The so-called scientific revolution took place during the sixteenth and seventeenth centuries, whereas the chemical revolution took place during the seventeenth and eighteenth centuries.

Primary factors

Several factors led to the first chemical revolution. First, there were the forms of gravimetric analysis that emerged from alchemy and the new kinds of instruments developed in medical and industrial contexts. In these settings, chemists increasingly challenged hypotheses that had been presented by the ancient Greeks. For example, chemists began to assert that all structures were composed of more than the four elements of the Greeks or the eight elements of the medieval alchemists. The Irish alchemist Robert Boyle laid the foundations for the Chemical Revolution with his mechanical corpuscular philosophy, which in turn relied heavily on the alchemical corpuscular theory and experimental method dating back to pseudo-Geber.

Earlier works by chemists such as Jan Baptist van Helmont helped to shift belief away from the theory that air was a single element toward the view that air was a mixture of distinct kinds of gases. Van Helmont's data analysis also suggests that he had a general understanding of the law of conservation of mass in the 17th century. Furthermore, work by Jean Rey in the early 17th century with metals such as tin and lead, and their oxidation in the presence of air and water, helped pinpoint the contribution and existence of oxygen in the oxidation process.

Other factors included new experimental techniques and the discovery of 'fixed air' (carbon dioxide) by Joseph Black in the middle of the 18th century. This discovery was particularly important because it empirically proved that 'air' did not consist of only one substance and because it established 'gas' as an important experimental substance. Nearer the end of the 18th century, the experiments by Henry Cavendish and Joseph Priestley further proved that air is not an element and is instead composed of several different gases. Lavoisier also translated the names of chemical substances into a new nomenclatural language more appealing to scientists of the nineteenth century. Such changes took place in an atmosphere in which the industrial revolution increased public interest in learning and practicing chemistry. When describing the task of reinventing chemical nomenclature, Lavoisier attempted to harness the new centrality of chemistry by making the rather hyperbolic claim that:

We must clean house thoroughly, for they have made use of an enigmatical language peculiar to themselves, which in general presents one meaning for the adepts and another meaning for the vulgar, and at the same time contains nothing that is rationally intelligible either for the one or for the other.

Precision instruments

Much of the reasoning behind Antoine Lavoisier being named the "father of modern chemistry" and the start of the chemical revolution lay in his ability to mathematize the field, pushing chemistry to use the experimental methods utilized in other "more exact sciences." Lavoisier changed the field of chemistry by keeping meticulous balance sheets in his research, attempting to show that through the transformation of chemical species the total amount of substance was conserved. Lavoisier used instrumentation for thermometric and barometric measurements in his experiments, and collaborated with Pierre Simon de Laplace in the invention of the calorimeter, an instrument for measuring heat changes in a reaction. In attempting to dismantle phlogiston theory and implement his own theory of combustion, Lavoisier utilized multiple apparatuses. These included a red-hot iron gun barrel designed to have water run through it and decompose, and an alteration of the apparatus which added a pneumatic trough at one end, a thermometer, and a barometer. The precision of his measurements was a requirement for convincing opponents of his theories about water as a compound, and the instrumentation used in his research was of his own design.
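
To illustrate the kind of bookkeeping Lavoisier's balance sheets embodied, the following minimal Python sketch checks a mass balance for the decomposition of mercury calx (mercuric oxide) into mercury and oxygen. The figures come from modern molar masses and an arbitrary 100-gram sample, not from Lavoisier's own notebooks; they are given only as an illustration of the conservation principle.

# Illustrative mass balance for decomposing mercury calx (mercuric oxide).
# Modern molar masses are used; these are not Lavoisier's measurements.
M_HG = 200.59            # g/mol, mercury
M_O = 16.00              # g/mol, oxygen (atomic)
M_HGO = M_HG + M_O       # g/mol, mercuric oxide

sample = 100.0                       # grams of calx decomposed (illustrative)
mass_hg = sample * M_HG / M_HGO      # mercury recovered
mass_o = sample * M_O / M_HGO        # oxygen gas released

print(round(mass_hg, 2), round(mass_o, 2))   # 92.61 7.39
print(round(mass_hg + mass_o, 2))            # 100.0, the mass put in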

Despite having precise measurements for his work, Lavoisier faced a large amount of opposition in his research. Proponents of phlogiston theory, such as Keir and Priestley, claimed that the demonstration of facts applied only to raw phenomena, and that interpretation of those facts did not imply accuracy in theories. They stated that Lavoisier was attempting to impose order on observed phenomena, whereas a secondary source of validity would be required to give definitive proof of the composition of water and the non-existence of phlogiston.

Antoine Lavoisier

The latter stages of the revolution were fuelled by the 1789 publication of Lavoisier's Traité Élémentaire de Chimie (Elements of Chemistry). Beginning with this publication and others to follow, Lavoisier synthesised the work of others and coined the term "oxygen". Antoine Lavoisier represented the chemical revolution not only in his publications, but also in the way he practiced chemistry. Lavoisier's work was characterized by his systematic determination of weights and his strong emphasis on precision and accuracy. While it has been postulated that the law of conservation of mass was discovered by Lavoisier, this claim has been refuted by the scientist Marcellin Berthelot. Henry Guerlac has suggested earlier use of the law of conservation of mass, noting that the scientist Jan Baptist van Helmont had implicitly applied the methodology in his work in the 16th and 17th centuries. Earlier references to the law of conservation of mass and its use were made by Jean Rey in 1630. Although the law of conservation of mass was not explicitly discovered by Lavoisier, his work with a wider array of materials than most scientists had available at the time allowed him to greatly expand the boundaries of the principle and its fundamentals.

Lavoisier also contributed to chemistry a method of understanding combustion and respiration and proof of the composition of water by decomposition into its constituent parts. He explained the theory of combustion, and challenged the phlogiston theory with his views on caloric. The Traité incorporates notions of a "new chemistry" and describes the experiments and reasoning that led to his conclusions. Like Newton's Principia, which was the high point of the Scientific Revolution, Lavoisier's Traité can be seen as the culmination of the Chemical Revolution.

Lavoisier's work was not immediately accepted, and it took several decades for it to gain momentum. This transition was aided by the work of Jöns Jakob Berzelius, who came up with a simplified shorthand to describe chemical compounds based on John Dalton's theory of atomic weights. Many people credit Lavoisier and his overthrow of phlogiston theory as the traditional chemical revolution, with Lavoisier marking the beginning of the revolution and John Dalton marking its culmination.

Méthode de nomenclature chimique

Antoine Lavoisier, in a collaborative effort with Louis Bernard Guyton de Morveau, Claude Louis Berthollet, and Antoine François de Fourcroy, published Méthode de nomenclature chimique in 1787. This work established a terminology for the "new chemistry" which Lavoisier was creating, focusing on a standardized set of terms, the establishment of new elements, and experimental work. Méthode established 55 elements, defined as substances that could not be broken down into simpler composite parts at the time of publication. By introducing new terminology into the field, Lavoisier encouraged other chemists to adopt his theories and practices in order to use his terms and stay current in chemistry.

Traité élémentaire de chimie

One of Lavoisier's main influences was Étienne Bonnot, abbé de Condillac. Condillac's approach to scientific research, which was the basis of Lavoisier's approach in Traité, was to demonstrate that human beings could create a mental representation of the world using gathered evidence. In Lavoisier's preface to Traité, he states:

It is a maxim universally admitted in geometry, and indeed in every branch of knowledge, that, in the progress of investigation, we should proceed from known facts to what is unknown. ... In this manner, from a series of sensations, observations, and analyses, a successive train of ideas arises, so linked together, that an attentive observer may trace back to a certain point the order and connection of the whole sum of human knowledge.

Lavoisier clearly ties his ideas in with those of Condillac, seeking to reform the field of chemistry. His goal in Traité was to associate the field with direct experience and observation, rather than assumption. His work defined a new foundation for the basis of chemical ideas and set a direction for the future course of chemistry.

Humphry Davy

Humphry Davy was an English chemist and a professor of chemistry at the Royal Institution in London in the early 1800s. There he performed experiments that cast doubt upon some of Lavoisier's key ideas, such as the acidity of oxygen and the idea of a caloric element. Davy was able to show that acidity was not due to the presence of oxygen, using muriatic acid (hydrochloric acid) as proof. He also proved that the compound oxymuriatic acid contained no oxygen and was instead an element, which he named chlorine. Through his use of electric batteries at the Royal Institution, Davy first isolated chlorine, followed by the isolation of elemental iodine in 1813. Using the batteries Davy was also able to isolate the elements sodium and potassium. From these experiments Davy concluded that the forces that join chemical elements together must be electrical in nature. Davy also argued against the idea that caloric was an immaterial fluid, contending instead that heat was a type of motion.

John Dalton

John Dalton was an English chemist who developed the idea of an atomic theory of chemical elements. Dalton's atomic theory assumed that each element had unique atoms associated with and specific to that element. This was in opposition to Lavoisier's definition of elements, which was that elements are substances that chemists could not break down further into simpler parts. Dalton's idea also differed from the corpuscular theory of matter, which held that all atoms were the same and which had been a supported theory since the 17th century. To help support his idea, Dalton worked on defining the relative weights of atoms in chemicals in his work New System of Chemical Philosophy, published in 1808. His text showed calculations to determine the relative atomic weights of Lavoisier's different elements based on experimental data pertaining to the relative amounts of different elements in chemical combinations. Dalton argued that elements would combine in the simplest form possible. Water was known to be a combination of hydrogen and oxygen, so Dalton believed water to be a binary compound containing one hydrogen atom and one oxygen atom.
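
As a rough illustration of Dalton's reasoning (not his exact figures), the Python sketch below combines his "simplest form possible" assumption that water is a binary compound of one hydrogen and one oxygen atom with the roughly 1:7 hydrogen-to-oxygen mass ratio then reported for water, yielding oxygen's relative weight on a scale where hydrogen is 1.

# Illustration of Dalton-style reasoning, with hydrogen taken as the unit weight.
# The 1:7 mass ratio is an approximate historical value, used here only to
# show the arithmetic; it is not a modern measurement.
mass_ratio_o_to_h = 7.0    # grams of oxygen per gram of hydrogen in water
atoms_h = 1                # Dalton's binary assumption: water = HO
atoms_o = 1
relative_weight_o = mass_ratio_o_to_h * atoms_h / atoms_o
print(relative_weight_o)   # 7.0 on Dalton's early scale; the modern value is
                           # about 16, because water is actually H2O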

Dalton was able to accurately compute the relative quantities of gases in atmospheric air. He used the specific gravities of azotic (nitrogen), oxygenous, carbonic acid (carbon dioxide), and hydrogenous gases, as well as aqueous vapor, as determined by Lavoisier and Davy, to calculate the proportional weight of each as a percentage of a whole volume of atmospheric air. Dalton determined that atmospheric air contains 75.55% azotic gas, 23.32% oxygenous gas, 1.03% aqueous vapor, and 0.10% carbonic acid gas.

Jöns Jacob Berzelius

Jöns Jacob Berzelius was a Swedish chemist who studied medicine at the University of Uppsala and was a professor of chemistry in Stockholm. He drew on the ideas of both Davy and Dalton to create an electrochemical view of how elements combined together. Berzelius classified elements into two groups, electronegative and electropositive, depending on which pole of a galvanic battery they were released from when decomposed. He created a scale of charge with oxygen being the most electronegative element and potassium the most electropositive. This scale signified that some elements had positive and negative charges associated with them, and the position of an element on this scale, together with the element's charge, determined how that element combined with others. Berzelius's work on electrochemical atomic theory was published in 1818 as Essai sur la théorie des proportions chimiques et sur l'influence chimique de l'électricité. He also introduced a new chemical nomenclature into chemistry by representing elements with letters and abbreviations, such as O for oxygen and Fe for iron. Combinations of elements were represented as sequences of these symbols, and the number of atoms was represented at first by superscripts and later by subscripts.
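
The symbolic notation Berzelius introduced lends itself to a simple mechanical reading: each element symbol is followed by an optional count of atoms. The toy Python parser below is only a sketch of that convention, using modern inline counts in place of Berzelius's superscripts, and is not a reconstruction of his own usage.

import re

# Toy parser for formulas written with Berzelius-style element symbols,
# e.g. "Fe2O3": a symbol (one capital letter, optionally one lowercase
# letter) followed by an optional atom count.
def parse_formula(formula):
    counts = {}
    for symbol, number in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        counts[symbol] = counts.get(symbol, 0) + (int(number) if number else 1)
    return counts

print(parse_formula("H2O"))    # {'H': 2, 'O': 1}
print(parse_formula("Fe2O3"))  # {'Fe': 2, 'O': 3}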

Naïve realism

From Wikipedia, the free encyclopedia
Naïve realism argues we perceive the world directly

In philosophy of perception and philosophy of mind, naïve realism (also known as direct realism, perceptual realism, or common sense realism) is the idea that the senses provide us with direct awareness of objects as they really are. When referred to as direct realism, naïve realism is often contrasted with indirect realism.

According to the naïve realist, the objects of perception are not merely representations of external objects, but are in fact those external objects themselves. The naïve realist is typically also a metaphysical realist, holding that these objects continue to obey the laws of physics and retain all of their properties regardless of whether or not there is anyone to observe them. They are composed of matter, occupy space, and have properties, such as size, shape, texture, smell, taste and colour, that are usually perceived correctly. The indirect realist, by contrast, holds that the objects of perception are simply representations of reality based on sensory inputs, and thus adheres to the primary/secondary quality distinction in ascribing properties to external objects.

In addition to indirect realism, naïve realism can also be contrasted with some forms of idealism, which claim that no world exists apart from mind-dependent ideas, and some forms of philosophical skepticism, which say that we cannot trust our senses or prove that we are not radically deceived in our beliefs; that our conscious experience is not of the real world but of an internal representation of the world.

Overview

The naïve realist is generally committed to the following views:

  • Metaphysical realism: There exists a world of material objects, which exist independently of being perceived, and which have properties such as shape, size, color, mass, and so on independently of being perceived
  • Empiricism: Some statements about these objects can be known to be true through sensory experience
  • Naïve realism: By means of our senses, we perceive the world directly, and pretty much as it is, meaning that our claims to have knowledge of it are justified

Contemporary analytic philosophers who have defended direct realism include, for example, Hilary Putnam, John McDowell, Galen Strawson, John R. Searle, and John L. Pollock.

Searle, for instance, disputes the popular assumption that "we can only directly perceive our own subjective experiences, but never objects and states of affairs in the world themselves". According to Searle, it has influenced many thinkers to reject direct realism. But Searle contends that the rejection of direct realism is based on a bad argument: the argument from illusion, which in turn relies on vague assumptions on the nature or existence of "sense data". Various sense data theories were deconstructed in 1962 by the British philosopher J. L. Austin in a book titled Sense and Sensibilia.

Talk of sense data has largely been replaced today by talk of representational perception in a broader sense, and scientific realists typically take perception to be representational and therefore assume that indirect realism is true. But the assumption is philosophical, and arguably little prevents scientific realists from assuming direct realism to be true. In a blog-post on "Naive realism and color realism", Hilary Putnam sums up with the following words: "Being an apple is not a natural kind in physics, but it is in biology, recall. Being complex and of no interest to fundamental physics isn't a failure to be "real". I think green is as real as applehood."

The direct realist claims that the experience of a sunset, for instance, is the real sunset that we directly experience. The indirect realist claims that our relation to reality is indirect, so the experience of a sunset is a subjective representation of what really is radiation as described by physics. But the direct realist does not deny that the sunset is radiation; the experience has a hierarchical structure, and the radiation is part of what amounts to the direct experience.

Simon Blackburn has argued that whatever positions they may take in books, articles or lectures, naive realism is the view of "philosophers when they are off-duty."

Scientific realism and naïve perceptual realism

Many philosophers claim that naïve realism in the philosophy of perception is incompatible with scientific realism in the philosophy of science. Scientific realism states that the universe contains just those properties that feature in a scientific description of it, which would mean that secondary qualities like color are not real per se, and that all that exists are certain wavelengths which are reflected by physical objects because of their microscopic surface texture.

John Locke notably held that the world only contains the primary qualities that feature in a corpuscularian scientific account of the world, and that secondary qualities are in some sense subjective and depend for their existence upon the presence of some perceiver who can observe the objects.

One should add, however, that naïve realism does not necessarily claim that reality is only what we see, hear, etc. Likewise, scientific realism does not claim that reality is only what can be described by fundamental physics. It follows that the relevant distinction to make is not between naïve and scientific realism but between direct and indirect realism.

Influence in psychology

Naïve realism in philosophy has also inspired work on visual perception in psychology. The leading direct realist theorist in psychology was J. J. Gibson. Other psychologists were heavily influenced by this approach, including William Mace, Claire Michaels, Edward Reed, Robert Shaw, and Michael Turvey. More recently, Carol Fowler has promoted a direct realist approach to speech perception.

Objectivity (science)

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Objectivity_(science)

Objectivity in science is an attempt to uncover truths about the natural world by eliminating personal biases, emotions, and false beliefs. It is often linked to observation as part of the scientific method. It is thus intimately related to the aim of testability and reproducibility. To be considered objective, the results of measurement must be communicated from person to person, and then demonstrated for third parties, as an advance in a collective understanding of the world. Such demonstrable knowledge has ordinarily conferred demonstrable powers of prediction or technology.

The problem of philosophical objectivity is contrasted with personal subjectivity, sometimes exacerbated by the overgeneralization of a hypothesis to the whole. For example, Newton's law of universal gravitation appears to be the norm for the attraction between celestial bodies, but it was later superseded by the more general theory of relativity.

History

The scientific method was argued for by Enlightenment philosopher Francis Bacon, rose to popularity with the discoveries of Isaac Newton and his followers, and continued into later eras. In the early eighteenth century, there existed an epistemic virtue in science which has been called truth-to-nature. This ideal was practiced by Enlightenment naturalists and scientific atlas-makers, and involved active attempts to eliminate any idiosyncrasies in their representations of nature in order to create images thought best to represent "what truly is." Judgment and skill were deemed necessary in order to determine the "typical", "characteristic", "ideal", or "average." In practicing truth-to-nature naturalists did not seek to depict exactly what was seen; rather, they sought a reasoned image.

In the latter half of the nineteenth century, objectivity in science was born when a new practice of mechanical objectivity appeared. "'Let nature speak for itself' became the watchword of a new brand of scientific objectivity." It was at this time that idealized representations of nature, which were previously seen as a virtue, came to be seen as a vice. Scientists began to see it as their duty to actively restrain themselves from imposing their own projections onto nature. The aim was to liberate representations of nature from subjective human interference, and to achieve this, scientists began using self-registering instruments, cameras, wax molds, and other technological devices.

In the twentieth century trained judgment supplemented mechanical objectivity as scientists began to recognize that, in order for images or data to be of any use, scientists needed to be able to see scientifically; that is, to interpret images or data and identify and group them according to particular professional training, rather than to simply depict them mechanically. Since the latter half of the nineteenth century, objectivity has come to involve a combination of trained judgment and mechanical objectivity.

Objectivity in measurement

Another methodological aspect is the avoidance of bias, which can involve cognitive bias, cultural bias, or sampling bias. Methods for avoiding or overcoming such biases include random sampling and double-blind trials. However, objectivity in measurement can be unobtainable in certain circumstances. Even the most quantitative social sciences such as economics employ measures that are constructs (conventions, to employ the term coined by Pierre Duhem).
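
As a minimal sketch of the randomization idea mentioned above (with hypothetical subject identifiers, purely for illustration), the Python snippet below assigns subjects to treatment and control groups by chance, so that no characteristic of the subjects can systematically influence which group they end up in.

import random

# Hypothetical subject identifiers; in a real trial these would come from
# the study roster. Shuffling and splitting gives a simple random assignment.
subjects = [f"subject_{i:02d}" for i in range(10)]
random.shuffle(subjects)
treatment, control = subjects[:5], subjects[5:]
print("treatment:", treatment)
print("control:  ", control)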

The role of the scientific community

Various scientific processes, such as peer reviews, the discussions at scientific conferences, and other meetings where scientific results are presented, are part of a social process whose purpose is to strengthen the objective aspect of the scientific method.

Next to unintentional and systematic error, there is always the possibility of deliberate misrepresentation of scientific results, whether for gain, fame, or ideological motives. When such cases of scientific fraud come to light, they usually give rise to an academic scandal, but it is unknown how much fraud goes undiscovered. For important results, other groups will try to repeat the experiment. If they consistently fail, they will bring these negative results into the scientific debate.

Critiques of scientific objectivity

A critical argument against scientific objectivity and positivism is that all science involves a degree of interpretivism. In the 1920s, Percy Bridgman's The Logic of Modern Physics and the operationalism it presented were centered on such recognition.

Thomas Kuhn's The Structure of Scientific Revolutions

Based on a historical review of the development of certain scientific theories in his book, The Structure of Scientific Revolutions, scientist and historian Thomas Kuhn raised some philosophical objections to claims of the possibility of scientific understanding being truly objective. In Kuhn's analysis, scientists in different disciplines organise themselves into de facto paradigms within which scientific research is done, junior scientists are educated, and scientific problems are determined.

When observational data arises which appears to contradict or falsify a given scientific paradigm, scientists within that paradigm historically have not immediately rejected it, as Karl Popper's philosophical theory of falsificationism would have them do. Instead they have gone to considerable lengths to resolve the apparent conflict without rejecting the paradigm. Through ad hoc variations to the theory and sympathetic interpretation of the data, supporting scientists will resolve the apparent conundrum. In extreme cases, they may ignore the data altogether. Thus, a scientific paradigm will go into crisis when a significant portion of scientists working in the field lose confidence in it. The corollary of this observation is that a paradigm is contingent on the social order amongst scientists at the time it gains ascendancy.

Kuhn's theory has been criticised by scientists such as Richard Dawkins and Alan Sokal as presenting a relativist view of scientific progress. In a postscript to the third edition of his book, Kuhn denied being a relativist.

Donna Haraway's Situated Knowledges

In Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective (1988), Donna Haraway argues that objectivity in science and philosophy is traditionally understood as a kind of disembodied and transcendent "conquering gaze from nowhere." She argues that this kind of objectivity, in which the subject is split apart and distanced from the object, is an impossible "illusion, a god trick." She demands a re-thinking of objectivity in such a way that, while still striving for "faithful accounts of the real world," we must also acknowledge our perspective within the world. She calls this new kind of knowledge-making "situated knowledges." Objectivity, she argues, "turns out to be about particular and specific embodiment and ... not about the false vision promising transcendence of all limits and responsibility". This new objectivity, "allows us to become answerable for what we learn how to see." Thus, Haraway is not only critiquing the idea that objectivity as we have long understood it is possible; she is also arguing that if we continue to approach knowledge-making in this way, then we wash our hands of any responsibility for our truth claims. In contrast, she argues, approaching knowledge-making from an embodied perspective forces us to take responsibility.

Criticism of science

From Wikipedia, the free encyclopedia
 
Personification of "Science" in front of the Boston Public Library

Criticism of science addresses problems within science in order to improve science as a whole and its role in society. Criticisms come from philosophy, from social movements like feminism, and from within science itself.

The emerging field of metascience seeks to increase the quality and efficiency of scientific research by improving the scientific process.

Philosophical critiques

"All methodologies, even the most obvious ones, have their limits." ―Paul Feyerabend in Against Method

Philosopher of science Paul Feyerabend advanced the idea of epistemological anarchism, which holds that there are no useful and exception-free methodological rules governing the progress of science or the growth of knowledge, and that the idea that science can or should operate according to universal and fixed rules is unrealistic, pernicious, and detrimental to science itself. Feyerabend advocates a democratic society where science is treated as an equal to other ideologies or social institutions such as religion, education, magic, and mythology, and considers the dominance of science in society authoritarian and unjustified. He also contended (along with Imre Lakatos) that the demarcation problem of distinguishing science from pseudoscience on objective grounds cannot be solved, and that this is fatal to the notion of science running according to fixed, universal rules.

Feyerabend also criticized science for not having evidence for its own philosophical precepts, particularly the notions of the Uniformity of Law and the Uniformity of Process across time and space, as noted by Stephen Jay Gould. "We have to realize that a unified theory of the physical world simply does not exist", says Feyerabend, "We have theories that work in restricted regions, we have purely formal attempts to condense them into a single formula, we have lots of unfounded claims (such as the claim that all of chemistry can be reduced to physics), phenomena that do not fit into the accepted framework are suppressed; in physics, which many scientists regard as the one really basic science, we have now at least three different points of view...without a promise of conceptual (and not only formal) unification". In other words, science is begging the question when it presupposes that there is a universal truth with no proof thereof.

Historian Jacques Barzun termed science "a faith as fanatical as any in history" and warned against the use of scientific thought to suppress considerations of meaning as integral to human existence.

Sociologist Stanley Aronowitz scrutinizes science for operating with the presumption that the only acceptable criticisms of science are those conducted within the methodological framework that science has set up for itself: science insists that only those who have been inducted into its community, through training and credentials, are qualified to make these criticisms. Aronowitz also alleges that while scientists consider it absurd that Fundamentalist Christians use biblical references to bolster their claim that the Bible is true, scientists employ the same tactic by using the tools of science to settle disputes concerning its own validity.

Philosopher of religion Alan Watts criticized science for operating under a materialist model of the world that he posited is simply a modified version of the Abrahamic worldview, that "the universe is constructed and maintained by a Lawmaker" (commonly identified as God or the Logos). Watts asserts that during the rise of secularism through the 18th to 20th century when scientific philosophers got rid of the notion of a lawmaker they kept the notion of law, and that the idea that the world is a material machine run by law is a presumption just as unscientific as religious doctrines that affirm it is a material machine made and run by a lawmaker.

Epistemology

David Parkin compared the epistemological stance of science to that of divination. He suggested that, to the degree that divination is an epistemologically specific means of gaining insight into a given question, science itself can be considered a form of divination that is framed from a Western view of the nature (and thus possible applications) of knowledge.

Author and Episkopos of Discordianism Robert Anton Wilson stresses that the instruments used in scientific investigation produce meaningful answers relevant only to the instrument, and that there is no objective vantage point from which science could verify its findings since all findings are relative to begin with.

Ethics

Joseph Wright of Derby (1768) An Experiment on a Bird in an Air Pump, National Gallery, London

Several academics have offered critiques concerning ethics in science. In Science and Ethics, for example, the professor of philosophy Bernard Rollin examines the relevance of ethics to science, and argues in favor of making education in ethics part and parcel of scientific training.

Social science scholars, like social anthropologist Tim Ingold, and scholars from philosophy and the humanities, like critical theorist Adorno, have criticized modern science for subservience to economic and technological interests. A related criticism is the debate on positivism. While before the 19th century science was perceived to be in opposition to religion, in contemporary society science is often defined as the antithesis of the humanities and the arts.

Many thinkers, such as Carolyn Merchant, Theodor Adorno, and E. F. Schumacher, considered that the 17th-century scientific revolution shifted science from a focus on understanding nature, or wisdom, to a focus on manipulating nature, i.e. power, and that science's emphasis on manipulating nature leads it inevitably to manipulate people as well. Science's focus on quantitative measures has led to critiques that it is unable to recognize important qualitative aspects of the world.

Critiques from within science

Metascience is the use of scientific methodology to study science itself, with the goal of increasing the quality of research while reducing waste. Meta-research has identified methodological weaknesses in many areas of science. Critics argue that reforms are needed to address these weaknesses.

Reproducibility

The social sciences, such as social psychology, have long suffered from the problem of their studies being largely not reproducible. Now, medicine has come under similar pressures. In a phenomenon known as the replication crisis, journals are less likely to publish straight replication studies, so it may be difficult to disprove results. Another result of publication bias is the Proteus phenomenon: early attempts to replicate results tend to contradict them. However, there are claims that this bias may be beneficial, allowing accurate meta-analysis with fewer publications.

Cognitive and publication biases

Critics argue that the biggest bias within science is motivated reasoning, whereby scientists are more likely to accept evidence that supports their hypothesis and more likely to scrutinize findings that do not. Scientists do not practice pure induction but instead often come into science with preconceived ideas and often will, unconsciously or consciously, interpret observations to support their own hypotheses through confirmation bias. For example, scientists may re-run trials when they do not support a hypothesis but use results from the first trial when they do support their hypothesis. It is often argued that while each individual has cognitive biases, these biases are corrected for when scientific evidence converges. However, systematic issues in the publication system of academic journals can often compound these biases. Issues like publication bias, where studies with non-significant results are less likely to be published, and selective outcome reporting bias, where only the significant outcomes out of a variety of outcomes are likely to be published, are common within academic literature. These biases have widespread implications, such as the distortion of meta-analyses where only studies that include positive results are likely to be included. Statistical outcomes can be manipulated as well: for example, large numbers of participants can be used and trials overpowered so that small differences cause significant effects, or inclusion criteria can be changed to include those who are most likely to respond to a treatment. Whether produced on purpose or not, all of these issues need to be taken into consideration within scientific research, and peer-reviewed published evidence should not be assumed to be outside the realm of bias and error; some critics now claim that many results in scientific journals are false or exaggerated.
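
The point about overpowered trials can be made concrete with a small calculation. The Python sketch below (with illustrative numbers only) uses a two-sample z-test approximation to show that the same negligible difference between groups is far from significant with 100 participants per arm, yet comfortably "significant" with 100,000 per arm.

import math

# Two-sided p-value for a two-sample z-test with equal group sizes and
# unit variance: z = d * sqrt(n / 2), p = erfc(z / sqrt(2)).
def p_value(effect_size, n_per_group):
    z = effect_size * math.sqrt(n_per_group / 2)
    return math.erfc(z / math.sqrt(2))

print(p_value(0.02, 100))      # ~0.89: nowhere near significant
print(p_value(0.02, 100_000))  # ~8e-6: "significant" despite a tiny effect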

Feminist critiques

Feminist scholars and women scientists such as Emily Martin, Evelyn Fox Keller, Ruth Hubbard, Londa Schiebinger and Bonnie Spanier have critiqued science because they believe it presents itself as objective and neutral while ignoring its inherent gender bias. They assert that gender bias exists in the language and practice of science, as well as in the expected appearance and social acceptance of who can be scientists within society.

Sandra Harding says that the "moral and political insights of the women's movement have inspired social scientists and biologists to raise critical questions about the ways traditional researchers have explained gender, sex, and relations within and between the social and natural worlds." Anne Fausto-Sterling is a prominent example of this kind of feminist work within biological science. Some feminists, such as Ruth Hubbard and Evelyn Fox Keller, criticize traditional scientific discourse as being historically biased towards a male perspective. A part of the feminist research agenda is the examination of the ways in which power inequities are created and/or reinforced in scientific and academic institutions.

Other feminist scholars, such as Ann Hibner Koblitz, Lenore Blum, Mary Gray, Mary Beth Ruskai, and Pnina Abir-Am and Dorinda Outram, have criticized some gender and science theories for ignoring the diverse nature of scientific research and the tremendous variation in women's experiences in different cultures and historical periods. For example, the first generation of women to receive advanced university degrees in Europe were almost entirely in the natural sciences and medicine—in part because those fields at the time were much more welcoming of women than were the humanities. Koblitz and others who are interested in increasing the number of women in science have expressed concern that some of the statements by feminist critics of science could undermine those efforts, notably the following assertion by Keller:

Just as surely as inauthenticity is the cost a woman suffers by joining men in misogynist jokes, so it is, equally, the cost suffered by a woman who identifies with an image of the scientist modeled on the patriarchal husband. Only if she undergoes a radical disidentification from self can she share masculine pleasure in mastering a nature cast in the image of woman as passive, inert, and blind.

Language in science

Emily Martin examines the metaphors used in science to support her claim that science reinforces socially constructed ideas about gender rather than objective views of nature. In her study of the fertilization process, Martin describes several cases in which gender-biased perception skewed the descriptions of biological processes during fertilization and possibly even hampered the research. She asserts that classic metaphors of the strong dominant sperm racing to an idle egg are products of gendered stereotyping rather than a faithful portrayal of human fertilization. The notions that women are passive and men are active are socially constructed attributes of gender which, according to Martin, scientists have projected onto the events of fertilization, thereby obscuring the fact that eggs do play an active role. For example, she wrote that "even after having revealed...the egg to be a chemically active sperm catcher, even after discussing the egg's role in tethering the sperm, the research team continued for another three years to describe the sperm's role as actively penetrating the egg." Scott Gilbert, a developmental biologist at Swarthmore College, supports her position: "if you don’t have an interpretation of fertilization that allows you to look at the egg as active, you won’t look for the molecules that can prove it. You simply won’t find activities that you don’t visualize."

Media and politics

The mass media face a number of pressures that can prevent them from accurately depicting competing scientific claims in terms of their credibility within the scientific community as a whole. Determining how much weight to give different sides in a scientific debate requires considerable expertise regarding the matter. Few journalists have real scientific knowledge, and even beat reporters who know a great deal about certain scientific issues may know little about other ones they are suddenly asked to cover.

Many issues damage the relationship of science to the media and the use of science and scientific arguments by politicians. As a very broad generalisation, many politicians seek certainties and facts whilst scientists typically offer probabilities and caveats. However, politicians' ability to be heard in the mass media frequently distorts the scientific understanding by the public. Examples in Britain include the controversy over the MMR inoculation, and the 1988 forced resignation of a government minister, Edwina Currie, for revealing the high probability that battery eggs were contaminated with Salmonella.

Some scientists and philosophers suggest that scientific theories are more or less shaped by the dominant political, economic, or cultural models of the time, even though the scientific community may claim to be exempt from social influences and historical conditions. For example, the zoologist Peter Kropotkin thought that the Darwinian theory of evolution overstressed a painful "we must struggle to survive" way of life, which he said was influenced by capitalism and the struggling lifestyles people lived within it. Karl Marx also thought that science was largely driven by and used as capital.

Robert Anton Wilson, Stanley Aronowitz, and Paul Feyerabend all thought that the military-industrial complex, large corporations, and the grants that came from them had an immense influence over the research and even results of scientific experiments. Aronowitz even went as far as to say "It does not matter that the scientific community ritualistically denies its alliance with economic/industrial and military power. The evidence is overwhelming that such is the case. Thus, every major power has a national science policy; the United States Military appropriates billions each year for 'basic' as well as 'applied' research".

 

Empiricism

In philosophy, empiricism is a theory that states that knowledge comes only or primarily from sensory experience. It is one of several views of epistemology, along with rationalism and skepticism. Empiricism emphasizes the role of empirical evidence in the formation of ideas, rather than innate ideas or traditions. However, empiricists may argue that traditions (or customs) arise due to relations of previous sense experiences.

Historically, empiricism was associated with the "blank slate" concept (tabula rasa), according to which the human mind is "blank" at birth and develops its thoughts only through experience.

Empiricism in the philosophy of science emphasizes evidence, especially as discovered in experiments. It is a fundamental part of the scientific method that all hypotheses and theories must be tested against observations of the natural world rather than resting solely on a priori reasoning, intuition, or revelation.

Empiricism, often used by natural scientists, says that "knowledge is based on experience" and that "knowledge is tentative and probabilistic, subject to continued revision and falsification". Empirical research, including experiments and validated measurement tools, guides the scientific method.

Etymology

The English term empirical derives from the Ancient Greek word ἐμπειρία, empeiria, which is cognate with and translates to the Latin experientia, from which the words experience and experiment are derived.

History

Background

A central concept in science and the scientific method is that conclusions must be empirically based on the evidence of the senses. Both natural and social sciences use working hypotheses that are testable by observation and experiment. The term semi-empirical is sometimes used to describe theoretical methods that make use of basic axioms, established scientific laws, and previous experimental results in order to engage in reasoned model building and theoretical inquiry.

Philosophical empiricists hold no knowledge to be properly inferred or deduced unless it is derived from one's sense-based experience. This view is commonly contrasted with rationalism, which states that knowledge may be derived from reason independently of the senses. For example, John Locke held that some knowledge (e.g. knowledge of God's existence) could be arrived at through intuition and reasoning alone. Similarly Robert Boyle, a prominent advocate of the experimental method, held that we have innate ideas. The main continental rationalists (Descartes, Spinoza, and Leibniz) were also advocates of the empirical "scientific method".

Early empiricism

Between 600 and 200 BCE

Between 600 and 200 BCE, the Vaisheshika school of Hindu philosophy, founded by the ancient Indian philosopher Kanada, accepted perception and inference as the only two reliable sources of knowledge. This is enumerated in his work Vaiśeṣika Sūtra.

c. 330 BCE – 400 CE

The earliest Western proto-empiricists were the Empiric school of ancient Greek medical practitioners, founded in 330 BCE. Its members rejected the three doctrines of the Dogmatic school, preferring to rely on the observation of phantasiai (i.e., phenomena, the appearances). The Empiric school was closely allied with the Pyrrhonist school of philosophy, which made the philosophical case for their proto-empiricism.

The notion of tabula rasa ("clean slate" or "blank tablet") connotes a view of mind as an originally blank or empty recorder (Locke used the words "white paper") on which experience leaves marks. This denies that humans have innate ideas. The notion dates back to Aristotle, c. 350 BC:

What the mind (nous) thinks must be in it in the same sense as letters are on a tablet (grammateion) which bears no actual writing (grammenon); this is just what happens in the case of the mind. (Aristotle, On the Soul, 3.4.430a1).

Aristotle's explanation of how this was possible was not strictly empiricist in a modern sense, but rather based on his theory of potentiality and actuality, and experience of sense perceptions still requires the help of the active nous. These notions contrasted with Platonic notions of the human mind as an entity that pre-existed somewhere in the heavens, before being sent down to join a body on Earth (see Plato's Phaedo and Apology, as well as others). Aristotle was considered to give a more important position to sense perception than Plato, and commentators in the Middle Ages summarized one of his positions as "nihil in intellectu nisi prius fuerit in sensu" (Latin for "nothing in the intellect without first being in the senses").

This idea was later developed in ancient philosophy by the Stoic school, from about 330 BCE. Stoic epistemology generally emphasized that the mind starts blank, but acquires knowledge as the outside world is impressed upon it. The doxographer Aetius summarizes this view as "When a man is born, the Stoics say, he has the commanding part of his soul like a sheet of paper ready for writing upon."

A drawing of Ibn Sina (Avicenna) from 1271

Islamic Golden Age and Pre-Renaissance (5th to 15th centuries CE)

During the Middle Ages (from the 5th to the 15th century CE) Aristotle's theory of tabula rasa was developed by Islamic philosophers starting with Al Farabi (c. 872 – 951 CE), developing into an elaborate theory by Avicenna (c. 980 – 1037) and demonstrated as a thought experiment by Ibn Tufail. For Avicenna (Ibn Sina), for example, the tabula rasa is a pure potentiality that is actualized through education, and knowledge is attained through "empirical familiarity with objects in this world from which one abstracts universal concepts" developed through a "syllogistic method of reasoning in which observations lead to propositional statements which when compounded lead to further abstract concepts". The intellect itself develops from a material intellect (al-'aql al-hayulani), which is a potentiality "that can acquire knowledge to the active intellect (al-'aql al-fa'il), the state of the human intellect in conjunction with the perfect source of knowledge". So the immaterial "active intellect", separate from any individual person, is still essential for understanding to occur.

In the 12th century CE the Andalusian Muslim philosopher and novelist Abu Bakr Ibn Tufail (known as "Abubacer" or "Ebn Tophail" in the West) included the theory of tabula rasa as a thought experiment in his Arabic philosophical novel, Hayy ibn Yaqdhan in which he depicted the development of the mind of a feral child "from a tabula rasa to that of an adult, in complete isolation from society" on a desert island, through experience alone. The Latin translation of his philosophical novel, entitled Philosophus Autodidactus, published by Edward Pococke the Younger in 1671, had an influence on John Locke's formulation of tabula rasa in An Essay Concerning Human Understanding.

A similar Islamic theological novel, Theologus Autodidactus, was written by the Arab theologian and physician Ibn al-Nafis in the 13th century. It also dealt with the theme of empiricism through the story of a feral child on a desert island, but departed from its predecessor by depicting the development of the protagonist's mind through contact with society rather than in isolation from society.

During the 13th century Thomas Aquinas adopted the Aristotelian position that the senses are essential to mind into scholasticism. Bonaventure (1221–1274), one of Aquinas' strongest intellectual opponents, offered some of the strongest arguments in favour of the Platonic idea of the mind.

Renaissance Italy

In the late renaissance various writers began to question the medieval and classical understanding of knowledge acquisition in a more fundamental way. In political and historical writing Niccolò Machiavelli and his friend Francesco Guicciardini initiated a new realistic style of writing. Machiavelli in particular was scornful of writers on politics who judged everything in comparison to mental ideals and demanded that people should study the "effectual truth" instead. Their contemporary, Leonardo da Vinci (1452–1519) said, "If you find from your own experience that something is a fact and it contradicts what some authority has written down, then you must abandon the authority and base your reasoning on your own findings."

Significantly, an empirical metaphysical system was developed by the Italian philosopher Bernardino Telesio which had an enormous impact on the development of later Italian thinkers, including Telesio's students Antonio Persio and Sertorio Quattromani, his contemporaries Thomas Campanella and Giordano Bruno, and later British philosophers such as Francis Bacon, who regarded Telesio as "the first of the moderns." Telesio's influence can also be seen in the French philosophers René Descartes and Pierre Gassendi.

The decidedly anti-Aristotelian and anti-clerical music theorist Vincenzo Galilei (c. 1520 – 1591), father of Galileo and the inventor of monody, made use of the experimental method in successfully solving musical problems: first, problems of tuning, such as the relationship of pitch to string tension and mass in stringed instruments and to the volume of air in wind instruments; and second, problems of composition, through his various suggestions to composers in his Dialogo della musica antica e moderna (Florence, 1581). The Italian word he used for "experiment" was esperienza. It is known that he was the essential pedagogical influence upon the young Galileo, his eldest son (cf. Coelho, ed., Music and Science in the Age of Galileo Galilei), arguably one of the most influential empiricists in history. Vincenzo, through his tuning research, found the underlying truth at the heart of the misunderstood myth of 'Pythagoras' hammers' (the square of the numbers concerned yielded those musical intervals, not the actual numbers, as believed), and through this and other discoveries that demonstrated the fallibility of traditional authorities, a radically empirical attitude developed, passed on to Galileo, which regarded "experience and demonstration" as the sine qua non of valid rational enquiry.
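
The "square" relationship Vincenzo uncovered in his tuning research can be illustrated with the modern form of the string law (later formalized as Mersenne's law), under which frequency grows with the square root of string tension. The Python sketch below uses that modern formulation purely for illustration; it is not a reconstruction of Vincenzo's own calculation.

import math

# Mersenne's law (modern form): for a given string, frequency is proportional
# to the square root of tension, so pitch intervals follow the square of the
# tension ratio rather than the ratio itself.
def frequency_ratio(tension_ratio):
    return math.sqrt(tension_ratio)

print(frequency_ratio(2))   # ~1.41: doubling the tension does not give an octave
print(frequency_ratio(4))   # 2.0: a 4:1 tension ratio yields the 2:1 octave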

British empiricism

Thomas Hobbes

British empiricism, a retrospective characterization, emerged during the 17th century as an approach to early modern philosophy and modern science. Although both were integral to this overarching transition, Francis Bacon, in England, advocated empiricism around 1620, whereas René Descartes, in France, upheld rationalism around 1640, a distinction drawn by Immanuel Kant, in Germany, near 1780. (Bacon's natural philosophy was influenced by the Italian philosopher Bernardino Telesio and by the Swiss physician Paracelsus.) Contributing later in the 17th century, Thomas Hobbes and Baruch Spinoza are retrospectively identified likewise as an empiricist and a rationalist, respectively. In the Enlightenment during the 18th century, both George Berkeley, in England, and David Hume, in Scotland, became leading exponents of empiricism, a lead precedented in the late 17th century by John Locke, also in England, hence the dominance of empiricism in British philosophy.

In response to the early-to-mid-17th-century "continental rationalism," John Locke (1632–1704) proposed in An Essay Concerning Human Understanding (1689) a very influential view wherein the only knowledge humans can have is a posteriori, i.e., based upon experience. Locke is famously credited with holding the proposition that the human mind is a tabula rasa, a "blank tablet", in Locke's words "white paper", on which the experiences derived from sense impressions as a person's life proceeds are written. There are two sources of our ideas: sensation and reflection. In both cases, a distinction is made between simple and complex ideas. The former are unanalysable, and are divided into primary and secondary qualities. Primary qualities are essential for the object in question to be what it is. Without specific primary qualities, an object would not be what it is. For example, an apple is an apple because of the arrangement of its atomic structure. If an apple were structured differently, it would cease to be an apple. Secondary qualities are the sensory information we can perceive from its primary qualities. For example, an apple can be perceived in various colours, sizes, and textures, but it is still identified as an apple. Therefore, its primary qualities dictate what the object essentially is, while its secondary qualities define its attributes. Complex ideas combine simple ones, and divide into substances, modes, and relations. According to Locke, our knowledge of things is a perception of ideas that are in accordance or discordance with each other, which is very different from the quest for certainty of Descartes.

A generation later, the Irish Anglican bishop, George Berkeley (1685–1753), determined that Locke's view immediately opened a door that would lead to eventual atheism. In response to Locke, he put forth in his Treatise Concerning the Principles of Human Knowledge (1710) an important challenge to empiricism in which things only exist either as a result of their being perceived, or by virtue of the fact that they are an entity doing the perceiving. (For Berkeley, God fills in for humans by doing the perceiving whenever humans are not around to do it.) In his text Alciphron, Berkeley maintained that any order humans may see in nature is the language or handwriting of God. Berkeley's approach to empiricism would later come to be called subjective idealism.

The Scottish philosopher David Hume (1711–1776) responded to Berkeley's criticisms of Locke, as well as other differences between early modern philosophers, and moved empiricism to a new level of skepticism. Hume argued in keeping with the empiricist view that all knowledge derives from sense experience, but he accepted that this has implications not normally acceptable to philosophers. He wrote for example, "Locke divides all arguments into demonstrative and probable. On this view, we must say that it is only probable that all men must die or that the sun will rise to-morrow, because neither of these can be demonstrated. But to conform our language more to common use, we ought to divide arguments into demonstrations, proofs, and probabilities—by ‘proofs’ meaning arguments from experience that leave no room for doubt or opposition."

"I believe the most general and most popular explication of this matter, is to say [See Mr. Locke, chapter of power.], that finding from experience, that there are several new productions in matter, such as the motions and variations of body, and concluding that there must somewhere be a power capable of producing them, we arrive at last by this reasoning at the idea of power and efficacy. But to be convinced that this explication is more popular than philosophical, we need but reflect on two very obvious principles. First, That reason alone can never give rise to any original idea, and secondly, that reason, as distinguished from experience, can never make us conclude, that a cause or productive quality is absolutely requisite to every beginning of existence. Both these considerations have been sufficiently explained: and therefore shall not at present be any farther insisted on."

— Hume, Section XIV, "Of the Idea of Necessary Connexion", A Treatise of Human Nature

Hume divided all of human knowledge into two categories: relations of ideas and matters of fact (see also Kant's analytic-synthetic distinction). Mathematical and logical propositions (e.g. "that the square of the hypotenuse is equal to the sum of the squares of the two sides") are examples of the first, while propositions involving some contingent observation of the world (e.g. "the sun rises in the East") are examples of the second. All of people's "ideas", in turn, are derived from their "impressions". For Hume, an "impression" corresponds roughly with what we call a sensation. To remember or to imagine such impressions is to have an "idea". Ideas are therefore the faint copies of sensations.

David Hume's empiricism led to numerous philosophical schools.

Hume maintained that no knowledge, even the most basic beliefs about the natural world, can be conclusively established by reason. Rather, he maintained, our beliefs are more a result of accumulated habits, developed in response to accumulated sense experiences. Among his many arguments Hume also added another important slant to the debate about scientific method—that of the problem of induction. Hume argued that it requires inductive reasoning to arrive at the premises for the principle of inductive reasoning, and therefore the justification for inductive reasoning is a circular argument. Among Hume's conclusions regarding the problem of induction is that there is no certainty that the future will resemble the past. Thus, as a simple instance posed by Hume, we cannot know with certainty by inductive reasoning that the sun will continue to rise in the East, but instead come to expect it to do so because it has repeatedly done so in the past.

Hume concluded that such things as belief in an external world and belief in the existence of the self were not rationally justifiable. According to Hume these beliefs were to be accepted nonetheless because of their profound basis in instinct and custom. Hume's lasting legacy, however, was the doubt that his skeptical arguments cast on the legitimacy of inductive reasoning, allowing many skeptics who followed to cast similar doubt.

Phenomenalism

Most of Hume's followers have disagreed with his conclusion that belief in an external world is rationally unjustifiable, contending that Hume's own principles implicitly contained the rational justification for such a belief, that is, beyond being content to let the issue rest on human instinct, custom and habit. According to an extreme empiricist theory known as phenomenalism, anticipated by the arguments of both Hume and George Berkeley, a physical object is a kind of construction out of our experiences. Phenomenalism is the view that physical objects, properties, events (whatever is physical) are reducible to mental objects, properties, events. Ultimately, only mental objects, properties, events, exist—hence the closely related term subjective idealism. By the phenomenalistic line of thinking, to have a visual experience of a real physical thing is to have an experience of a certain kind of group of experiences. This type of set of experiences possesses a constancy and coherence that is lacking in the set of experiences of which hallucinations, for example, are a part. As John Stuart Mill put it in the mid-19th century, matter is the "permanent possibility of sensation". Mill's empiricism went a significant step beyond Hume in still another respect: in maintaining that induction is necessary for all meaningful knowledge, including mathematics. As summarized by D. W. Hamlyn:

[Mill] claimed that mathematical truths were merely very highly confirmed generalizations from experience; mathematical inference, generally conceived as deductive [and a priori] in nature, Mill set down as founded on induction. Thus, in Mill's philosophy there was no real place for knowledge based on relations of ideas. In his view logical and mathematical necessity is psychological; we are merely unable to conceive any other possibilities than those that logical and mathematical propositions assert. This is perhaps the most extreme version of empiricism known, but it has not found many defenders.

Mill's empiricism thus held that knowledge of any kind is not from direct experience but an inductive inference from direct experience. The problems other philosophers have had with Mill's position center around the following issues: Firstly, Mill's formulation encounters difficulty when it describes what direct experience is by differentiating only between actual and possible sensations. This misses some key discussion concerning conditions under which such "groups of permanent possibilities of sensation" might exist in the first place. Berkeley put God in that gap; the phenomenalists, including Mill, essentially left the question unanswered. In the end, lacking an acknowledgement of an aspect of "reality" that goes beyond mere "possibilities of sensation", such a position leads to a version of subjective idealism. Questions of how floor beams continue to support a floor while unobserved, how trees continue to grow while unobserved and untouched by human hands, etc., remain unanswered, and perhaps unanswerable in these terms. Secondly, Mill's formulation leaves open the unsettling possibility that the "gap-filling entities are purely possibilities and not actualities at all". Thirdly, Mill's position, by calling mathematics merely another species of inductive inference, misapprehends mathematics. It fails to fully consider the structure and method of mathematical science, the products of which are arrived at through an internally consistent deductive set of procedures which do not, either today or at the time Mill wrote, fall under the agreed meaning of induction.

The phenomenalist phase of post-Humean empiricism ended by the 1940s, for by that time it had become obvious that statements about physical things could not be translated into statements about actual and possible sense data. If a physical object statement is to be translatable into a sense-data statement, the former must be at least deducible from the latter. But it came to be realized that there is no finite set of statements about actual and possible sense-data from which we can deduce even a single physical-object statement. The translating or paraphrasing statement must be couched in terms of normal observers in normal conditions of observation. There is, however, no finite set of statements that are couched in purely sensory terms and can express the satisfaction of the condition of the presence of a normal observer. According to phenomenalism, to say that a normal observer is present is to make the hypothetical statement that were a doctor to inspect the observer, the observer would appear to the doctor to be normal. But, of course, the doctor himself must be a normal observer. If we are to specify this doctor's normality in sensory terms, we must make reference to a second doctor who, when inspecting the sense organs of the first doctor, would himself have to have the sense data a normal observer has when inspecting the sense organs of a subject who is a normal observer. And if we are to specify in sensory terms that the second doctor is a normal observer, we must refer to a third doctor, and so on (also see the third man).

Logical empiricism

Logical empiricism (also logical positivism or neopositivism) was an early 20th-century attempt to synthesize the essential ideas of British empiricism (e.g. a strong emphasis on sensory experience as the basis for knowledge) with certain insights from mathematical logic that had been developed by Gottlob Frege and Ludwig Wittgenstein. Some of the key figures in this movement were Otto Neurath, Moritz Schlick and the rest of the Vienna Circle, along with A.J. Ayer, Rudolf Carnap and Hans Reichenbach.

The neopositivists subscribed to a notion of philosophy as the conceptual clarification of the methods, insights and discoveries of the sciences. They saw in the logical symbolism elaborated by Frege (1848–1925) and Bertrand Russell (1872–1970) a powerful instrument that could rationally reconstruct all scientific discourse into an ideal, logically perfect language that would be free of the ambiguities and deformations of natural language, which in their view gave rise to metaphysical pseudoproblems and other conceptual confusions. By combining Frege's thesis that all mathematical truths are logical with the early Wittgenstein's idea that all logical truths are mere linguistic tautologies, they arrived at a twofold classification of all propositions: the analytic (a priori) and the synthetic (a posteriori). On this basis, they formulated a strong principle of demarcation between sentences that have sense and those that do not: the so-called verification principle. Any sentence that is not purely logical, or is unverifiable, is devoid of meaning. As a result, most metaphysical, ethical, aesthetic and other traditional philosophical problems came to be considered pseudoproblems.

In the extreme empiricism of the neopositivists—at least before the 1930s—any genuinely synthetic assertion must be reducible to an ultimate assertion (or set of ultimate assertions) that expresses direct observations or perceptions. In later years, Carnap and Neurath abandoned this sort of phenomenalism in favor of a rational reconstruction of knowledge into the language of an objective spatio-temporal physics. That is, instead of translating sentences about physical objects into sense-data, such sentences were to be translated into so-called protocol sentences, for example, "X at location Y and at time T observes such and such." The central theses of logical positivism (verificationism, the analytic–synthetic distinction, reductionism, etc.) came under sharp attack after World War II by thinkers such as Nelson Goodman, W.V. Quine, Hilary Putnam, Karl Popper, and Richard Rorty. By the late 1960s, it had become evident to most philosophers that the movement had largely run its course, though its influence is still significant among contemporary analytic philosophers such as Michael Dummett and other anti-realists.

Pragmatism

In the late 19th and early 20th century several forms of pragmatic philosophy arose. The ideas of pragmatism, in its various forms, developed mainly from discussions between Charles Sanders Peirce and William James when both men were at Harvard in the 1870s. James popularized the term "pragmatism", giving Peirce full credit for its patrimony, but Peirce later demurred from the tangents that the movement was taking, and redubbed what he regarded as the original idea with the name of "pragmaticism". Along with its pragmatic theory of truth, this perspective integrates the basic insights of empirical (experience-based) and rational (concept-based) thinking.

Charles Peirce (1839–1914) was highly influential in laying the groundwork for today's empirical scientific method. Although Peirce severely criticized many elements of Descartes' peculiar brand of rationalism, he did not reject rationalism outright. Indeed, he concurred with the main ideas of rationalism, most importantly the idea that rational concepts can be meaningful and the idea that rational concepts necessarily go beyond the data given by empirical observation. In later years he even emphasized the concept-driven side of the then ongoing debate between strict empiricism and strict rationalism, in part to counterbalance the excesses to which some of his cohorts had taken pragmatism under the "data-driven" strict-empiricist view.

Among Peirce's major contributions was to place inductive reasoning and deductive reasoning in a complementary rather than competitive mode, the latter of which had been the primary trend among the educated since David Hume wrote a century before. To this, Peirce added the concept of abductive reasoning. The combined three forms of reasoning serve as a primary conceptual foundation for the empirically based scientific method today. Peirce's approach "presupposes that (1) the objects of knowledge are real things, (2) the characters (properties) of real things do not depend on our perceptions of them, and (3) everyone who has sufficient experience of real things will agree on the truth about them. According to Peirce's doctrine of fallibilism, the conclusions of science are always tentative. The rationality of the scientific method does not depend on the certainty of its conclusions, but on its self-corrective character: by continued application of the method science can detect and correct its own mistakes, and thus eventually lead to the discovery of truth".

In his Harvard "Lectures on Pragmatism" (1903), Peirce enumerated what he called the "three cotary propositions of pragmatism" (from Latin cos, cotis, "whetstone"), saying that they "put the edge on the maxim of pragmatism". First among these he listed the peripatetic-thomist observation mentioned above, but he further observed that this link between sensory perception and intellectual conception is a two-way street. That is, it can be taken to say that whatever we find in the intellect is also incipiently in the senses. Hence, if theories are theory-laden then so are the senses, and perception itself can be seen as a species of abductive inference, its difference being that it is beyond control and hence beyond critique—in a word, incorrigible. This in no way conflicts with the fallibility and revisability of scientific concepts, since it is only the immediate percept in its unique individuality or "thisness"—what the Scholastics called its haecceity—that stands beyond control and correction. Scientific concepts, on the other hand, are general in nature, and transient sensations do in another sense find correction within them. This notion of perception as abduction has received periodic revivals in artificial intelligence and cognitive science research, most recently, for instance, with the work of Irvin Rock on indirect perception.

Around the beginning of the 20th century, William James (1842–1910) coined the term "radical empiricism" to describe an offshoot of his form of pragmatism, which he argued could be dealt with separately from his pragmatism—though in fact the two concepts are intertwined in James's published lectures. James maintained that the empirically observed "directly apprehended universe needs ... no extraneous trans-empirical connective support", by which he meant to rule out the perception that there can be any value added by seeking supernatural explanations for natural phenomena. James' "radical empiricism" is thus not radical in the context of the term "empiricism", but is instead fairly consistent with the modern use of the term "empirical". His method of argument in arriving at this view, however, still readily encounters debate within philosophy even today.

John Dewey (1859–1952) modified James' pragmatism to form a theory known as instrumentalism. The role of sense experience in Dewey's theory is crucial, in that he saw experience as a unified totality of things through which everything else is interrelated. Dewey's basic thought, in accordance with empiricism, was that reality is determined by past experience. Therefore, humans adapt their past experiences of things to perform experiments upon and test the pragmatic values of such experience. The value of such experience is measured experientially and scientifically, and the results of such tests generate ideas that serve as instruments for future experimentation, in physical sciences as in ethics. Thus, ideas in Dewey's system retain their empiricist flavour in that they are only known a posteriori.

Buddhist cosmology

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Bud...