Thursday, March 25, 2021

Naïve realism

From Wikipedia, the free encyclopedia
Naïve realism argues we perceive the world directly

In philosophy of perception and philosophy of mind, naïve realism (also known as direct realism, perceptual realism, or common sense realism) is the idea that the senses provide us with direct awareness of objects as they really are. When referred to as direct realism, naïve realism is often contrasted with indirect realism.

According to the naïve realist, the objects of perception are not merely representations of external objects, but are in fact those external objects themselves. The naïve realist is typically also a metaphysical realist, holding that these objects continue to obey the laws of physics and retain all of their properties regardless of whether or not there is anyone to observe them. They are composed of matter, occupy space, and have properties, such as size, shape, texture, smell, taste and colour, that are usually perceived correctly. The indirect realist, by contrast, holds that the objects of perception are simply representations of reality based on sensory inputs, and thus adheres to the primary/secondary quality distinction in ascribing properties to external objects.

In addition to indirect realism, naïve realism can also be contrasted with some forms of idealism, which claim that no world exists apart from mind-dependent ideas, and some forms of philosophical skepticism, which say that we cannot trust our senses or prove that we are not radically deceived in our beliefs; that our conscious experience is not of the real world but of an internal representation of the world.

Overview

The naïve realist is generally committed to the following views:

  • Metaphysical realism: There exists a world of material objects, which exist independently of being perceived, and which have properties such as shape, size, color, mass, and so on independently of being perceived
  • Empiricism: Some statements about these objects can be known to be true through sensory experience
  • Naïve realism: By means of our senses, we perceive the world directly, and pretty much as it is, meaning that our claims to have knowledge of it are justified

Contemporary analytic philosophers who have defended direct realism include, for example, Hilary Putnam, John McDowell, Galen Strawson, John R. Searle, and John L. Pollock.

Searle, for instance, disputes the popular assumption that "we can only directly perceive our own subjective experiences, but never objects and states of affairs in the world themselves". According to Searle, this assumption has led many thinkers to reject direct realism. But Searle contends that the rejection of direct realism rests on a bad argument: the argument from illusion, which in turn relies on vague assumptions about the nature or existence of "sense data". Various sense data theories were deconstructed in 1962 by the British philosopher J. L. Austin in a book titled Sense and Sensibilia.

Talk of sense data has largely been replaced today by talk of representational perception in a broader sense, and scientific realists typically take perception to be representational and therefore assume that indirect realism is true. But the assumption is philosophical, and arguably little prevents scientific realists from assuming direct realism to be true. In a blog post on "Naive realism and color realism", Hilary Putnam sums up with the following words: "Being an apple is not a natural kind in physics, but it is in biology, recall. Being complex and of no interest to fundamental physics isn't a failure to be "real". I think green is as real as applehood."

The direct realist claims that the experience of a sunset, for instance, is the real sunset that we directly experience. The indirect realist claims that our relation to reality is indirect, so that the experience of a sunset is a subjective representation of what is really radiation as described by physics. But the direct realist does not deny that the sunset is radiation; the experience has a hierarchical structure, and the radiation is part of what amounts to the direct experience.

Simon Blackburn has argued that whatever positions they may take in books, articles or lectures, naïve realism is the view of "philosophers when they are off-duty."

Scientific realism and naïve perceptual realism

Many philosophers claim that naïve realism in the philosophy of perception is incompatible with scientific realism in the philosophy of science. Scientific realism states that the universe contains just those properties that feature in a scientific description of it, which would mean that secondary qualities like color are not real per se, and that all that exists are certain wavelengths which are reflected by physical objects because of their microscopic surface texture.

John Locke notably held that the world only contains the primary qualities that feature in a corpuscularian scientific account of the world, and that secondary qualities are in some sense subjective and depend for their existence upon the presence of some perceiver who can observe the objects.

One should add, however, that naïve realism does not necessarily claim that reality is only what we see, hear, etc. Likewise, scientific realism does not claim that reality is only what can be described by fundamental physics. It follows that the relevant distinction to make is not between naïve and scientific realism but between direct and indirect realism.

Influence in psychology

Naïve realism in philosophy has also inspired work on visual perception in psychology. The leading direct realist theorist in psychology was J. J. Gibson. Other psychologists were heavily influenced by this approach, including William Mace, Claire Michaels, Edward Reed, Robert Shaw, and Michael Turvey. More recently, Carol Fowler has promoted a direct realist approach to speech perception.

Objectivity (science)

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Objectivity_(science)

Objectivity in science is an attempt to uncover truths about the natural world by eliminating personal biases, emotions, and false beliefs. It is often linked to observation as part of the scientific method. It is thus intimately related to the aim of testability and reproducibility. To be considered objective, the results of measurement must be communicated from person to person, and then demonstrated for third parties, as an advance in a collective understanding of the world. Such demonstrable knowledge has ordinarily conferred demonstrable powers of prediction or technology.

The problem of philosophical objectivity, as contrasted with personal subjectivity, is sometimes exacerbated by the overgeneralization of a hypothesis to the whole. For example, Newton's law of universal gravitation long appeared to govern the attraction between celestial bodies, but it was later superseded by the more general theory of relativity.

History

The scientific method was argued for by Enlightenment philosopher Francis Bacon, rose to popularity with the discoveries of Isaac Newton and his followers, and continued into later eras. In the early eighteenth century, there existed an epistemic virtue in science which has been called truth-to-nature. This ideal was practiced by Enlightenment naturalists and scientific atlas-makers, and involved active attempts to eliminate any idiosyncrasies in their representations of nature in order to create images thought best to represent "what truly is." Judgment and skill were deemed necessary in order to determine the "typical", "characteristic", "ideal", or "average." In practicing truth-to-nature, naturalists did not seek to depict exactly what was seen; rather, they sought a reasoned image.

In the latter half of the nineteenth century, objectivity in science was born when a new practice of mechanical objectivity appeared. "'Let nature speak for itself' became the watchword of a new brand of scientific objectivity." It was at this time that idealized representations of nature, which had previously been seen as a virtue, came to be seen as a vice. Scientists began to see it as their duty to actively restrain themselves from imposing their own projections onto nature. The aim was to liberate representations of nature from subjective, human interference; to achieve this, scientists began using self-registering instruments, cameras, wax molds, and other technological devices.

In the twentieth century trained judgment supplemented mechanical objectivity as scientists began to recognize that, in order for images or data to be of any use, scientists needed to be able to see scientifically; that is, to interpret images or data and identify and group them according to particular professional training, rather than simply depicting them mechanically. Since the latter half of the twentieth century, objectivity has come to involve a combination of trained judgment and mechanical objectivity.

Objectivity in measurement

Another methodological aspect is the avoidance of bias, which can involve cognitive bias, cultural bias, or sampling bias. Methods for avoiding or overcoming such biases include random sampling and double-blind trials. However, objectivity in measurement can be unobtainable in certain circumstances. Even the most quantitative social sciences such as economics employ measures that are constructs (conventions, to employ the term coined by Pierre Duhem).
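As a rough sketch of how random assignment and blinding work in practice (an illustration only; the participant labels, arm codes, and group sizes below are invented for the example):

    import random

    # Illustrative sketch only: names, codes, and sizes are invented here.
    participants = [f"P{i:02d}" for i in range(20)]

    # Random assignment: shuffle, then split into two equal arms.
    random.shuffle(participants)
    arms = {"treatment": participants[:10], "control": participants[10:]}

    # Double-blinding: both arms are relabeled with opaque codes, so neither
    # subjects nor evaluators know which arm received the intervention.
    codes = {"treatment": "ARM-A", "control": "ARM-B"}
    blinded = {codes[arm]: members for arm, members in arms.items()}

    print(blinded)  # evaluators work only with ARM-A / ARM-B
    # The code key is held by a third party and revealed only after analysis.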

The role of the scientific community

Various scientific processes, such as peer reviews, the discussions at scientific conferences, and other meetings where scientific results are presented, are part of a social process whose purpose is to strengthen the objective aspect of the scientific method.

Next to unintentional and systematic error, there is always the possibility of deliberate misrepresentation of scientific results, whether for gain, fame, or ideological motives. When such cases of scientific fraud come to light, they usually give rise to an academic scandal, but it is unknown how much fraud goes undiscovered. For important results, other groups will try to repeat the experiment. If they consistently fail, they will bring these negative results into the scientific debate.

Critiques of scientific objectivity

One critical argument regarding scientific objectivity and positivism is that all science involves a degree of interpretation. In the 1920s, Percy Bridgman's The Logic of Modern Physics and the operationalism it presented were centered on such a recognition.

Thomas Kuhn's The Structure of Scientific Revolutions

Based on a historical review of the development of certain scientific theories in his book The Structure of Scientific Revolutions, the physicist and historian of science Thomas Kuhn raised philosophical objections to the claim that scientific understanding can ever be truly objective. In Kuhn's analysis, scientists in different disciplines organise themselves into de facto paradigms, within which scientific research is done, junior scientists are educated, and scientific problems are determined.

When observational data arises which appears to contradict or falsify a given scientific paradigm, scientists within that paradigm historically have not immediately rejected it, as Karl Popper's philosophical theory of falsificationism would have them do. Instead they have gone to considerable lengths to resolve the apparent conflict without rejecting the paradigm. Through ad hoc variations to the theory and sympathetic interpretation of the data, supporting scientists will resolve the apparent conundrum. In extreme cases, they may ignore the data altogether. Thus, a scientific paradigm goes into crisis only when a significant portion of scientists working in the field lose confidence in it. The corollary of this observation is that a paradigm is contingent on the social order amongst scientists at the time it gains ascendancy.

Kuhn's theory has been criticised by scientists such as Richard Dawkins and Alan Sokal as presenting a relativist view of scientific progress. In a postscript to the third edition of his book, Kuhn denied being a relativist.

Donna Haraway's Situated Knowledges

In Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective (1988), Donna Haraway argues that objectivity in science and philosophy is traditionally understood as a kind of disembodied and transcendent "conquering gaze from nowhere." She argues that this kind of objectivity, in which the subject is split apart and distanced from the object, is an impossible "illusion, a god trick." She demands a re-thinking of objectivity in such a way that, while still striving for "faithful accounts of the real world," we must also acknowledge our perspective within the world. She calls this new kind of knowledge-making "situated knowledges." Objectivity, she argues, "turns out to be about particular and specific embodiment and ... not about the false vision promising transcendence of all limits and responsibility". This new objectivity, "allows us to become answerable for what we learn how to see." Thus, Haraway is not only critiquing the idea that objectivity as we have long understood it is possible; she is also arguing that if we continue to approach knowledge-making in this way, then we wash our hands of any responsibility for our truth claims. In contrast, she argues, approaching knowledge-making from an embodied perspective forces us to take responsibility.

Criticism of science

From Wikipedia, the free encyclopedia
 
Personification of "Science" in front of the Boston Public Library

Criticism of science addresses problems within science in order to improve science as a whole and its role in society. Criticisms come from philosophy, from social movements like feminism, and from within science itself.

The emerging field of metascience seeks to increase the quality and efficiency of scientific research by improving the scientific process.

Philosophical critiques

"All methodologies, even the most obvious ones, have their limits." ―Paul Feyerabend in Against Method

Philosopher of science Paul Feyerabend advanced the idea of epistemological anarchism, which holds that there are no useful and exception-free methodological rules governing the progress of science or the growth of knowledge, and that the idea that science can or should operate according to universal and fixed rules is unrealistic, pernicious and detrimental to science itself. Feyerabend advocates a democratic society where science is treated as an equal to other ideologies or social institutions such as religion, education, magic and mythology, and considers the dominance of science in society authoritarian and unjustified. He also contended (along with Imre Lakatos) that the demarcation problem of distinguishing science from pseudoscience on objective grounds is irresolvable, and that this is fatal to the notion of science running according to fixed, universal rules.

Feyerabend also criticized science for not having evidence for its own philosophical precepts, particularly the notions of the uniformity of law and the uniformity of process across time and space, as noted by Stephen Jay Gould. "We have to realize that a unified theory of the physical world simply does not exist" says Feyerabend, "We have theories that work in restricted regions, we have purely formal attempts to condense them into a single formula, we have lots of unfounded claims (such as the claim that all of chemistry can be reduced to physics), phenomena that do not fit into the accepted framework are suppressed; in physics, which many scientists regard as the one really basic science, we have now at least three different points of view...without a promise of conceptual (and not only formal) unification". In other words, science is begging the question when it presupposes that there is a universal truth with no proof thereof.

Historian Jacques Barzun termed science "a faith as fanatical as any in history" and warned against the use of scientific thought to suppress considerations of meaning as integral to human existence.

Sociologist Stanley Aronowitz scrutinizes science for operating with the presumption that the only acceptable criticisms of science are those conducted within the methodological framework that science has set up for itself, and for insisting that only those who have been inducted into its community, through means of training and credentials, are qualified to make these criticisms. Aronowitz also alleges that while scientists consider it absurd that Fundamentalist Christianity uses biblical references to bolster its claim that the Bible is true, scientists employ the same tactic by using the tools of science to settle disputes concerning its own validity.

Philosopher of religion Alan Watts criticized science for operating under a materialist model of the world that he posited is simply a modified version of the Abrahamic worldview, that "the universe is constructed and maintained by a Lawmaker" (commonly identified as God or the Logos). Watts asserts that during the rise of secularism from the 18th through the 20th centuries, when scientific philosophers got rid of the notion of a lawmaker, they kept the notion of law, and that the idea that the world is a material machine run by law is a presumption just as unscientific as religious doctrines that affirm it is a material machine made and run by a lawmaker.

Epistemology

David Parkin compared the epistemological stance of science to that of divination. He suggested that, to the degree that divination is an epistemologically specific means of gaining insight into a given question, science itself can be considered a form of divination that is framed from a Western view of the nature (and thus possible applications) of knowledge.

Author and Episkopos of Discordianism Robert Anton Wilson stresses that the instruments used in scientific investigation produce meaningful answers relevant only to the instrument, and that there is no objective vantage point from which science could verify its findings since all findings are relative to begin with.

Ethics

Joseph Wright of Derby, An Experiment on a Bird in an Air Pump (1768), National Gallery, London

Several academics have offered critiques concerning ethics in science. In Science and Ethics, for example, the professor of philosophy Bernard Rollin examines the relevance of ethics to science, and argues in favor of making education in ethics part and parcel of scientific training.

Social science scholars, like social anthropologist Tim Ingold, and scholars from philosophy and the humanities, like critical theorist Theodor Adorno, have criticized modern science for subservience to economic and technological interests. A related criticism is the debate on positivism. While before the 19th century science was perceived to be in opposition to religion, in contemporary society science is often defined as the antithesis of the humanities and the arts.

Many thinkers, such as Carolyn Merchant, Theodor Adorno and E. F. Schumacher, considered that the 17th-century scientific revolution shifted science from a focus on understanding nature, or wisdom, to a focus on manipulating nature, i.e. power, and that science's emphasis on manipulating nature leads it inevitably to manipulate people as well. Science's focus on quantitative measures has led to critiques that it is unable to recognize important qualitative aspects of the world.

Critiques from within science

Metascience is the use of scientific methodology to study science itself, with the goal of increasing the quality of research while reducing waste. Meta-research has identified methodological weaknesses in many areas of science. Critics argue that reforms are needed to address these weaknesses.

Reproducibility

The social sciences, such as social psychology, have long suffered from the problem of their studies being largely not reproducible. Now, medicine has come under similar pressures. In a phenomenon known as the replication crisis, journals are less likely to publish straight replication studies, so it may be difficult to disprove results. Another result of publication bias is the Proteus phenomenon: early attempts to replicate results tend to contradict them. However, there are claims that this bias may be beneficial, allowing accurate meta-analysis with fewer publications.

Cognitive and publication biases

Critics argue that the biggest bias within science is motivated reasoning, whereby scientists are more likely to accept evidence that supports their hypothesis and more likely to scrutinize findings that do not. Scientists do not practice pure induction but instead often come into science with preconceived ideas and often will, unconsciously or consciously, interpret observations to support their own hypotheses through confirmation bias. For example, scientists may re-run trials when they do not support a hypothesis but use results from the first trial when they do support their hypothesis. It is often argued that while each individual has cognitive biases, these biases are corrected for when scientific evidence converges. However, systematic issues in the publication system of academic journals can often compound these biases. Issues like publication bias, where studies with non-significant results are less likely to be published, and selective outcome reporting bias, where only the significant outcomes out of a variety of outcomes are likely to be published, are common within academic literature. These biases have widespread implications, such as the distortion of meta-analyses in which only studies that include positive results are likely to be included. Statistical outcomes can be manipulated as well: for example, large numbers of participants can be used and trials overpowered so that small differences cause significant effects, or inclusion criteria can be changed to include those who are most likely to respond to a treatment. Whether produced on purpose or not, all of these issues need to be taken into consideration within scientific research, and peer-reviewed published evidence should not be assumed to be outside the realm of bias and error; some critics now claim that many results in scientific journals are false or exaggerated.
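To make the "overpowered trial" point concrete, here is a minimal simulation (a hedged sketch, not from the article; the effect size and sample sizes are invented, and it assumes NumPy and SciPy are available):

    import numpy as np
    from scipy import stats

    # Illustrative sketch only: effect size and sample sizes are invented.
    rng = np.random.default_rng(0)
    effect = 0.03  # a practically negligible difference between groups

    for n in (50, 500, 50_000):
        control = rng.normal(0.0, 1.0, n)
        treated = rng.normal(effect, 1.0, n)
        t_stat, p_value = stats.ttest_ind(treated, control)
        print(f"n per group = {n:>6}, p = {p_value:.4f}")

    # With n = 50 the tiny effect is undetectable; with n = 50,000 the same
    # negligible effect is almost always "statistically significant".

The simulated difference between groups never changes; only the sample size does, which is exactly why statistical significance alone says nothing about practical importance.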

Feminist critiques

Feminist scholars and women scientists such as Emily Martin, Evelyn Fox Keller, Ruth Hubbard, Londa Schiebinger and Bonnie Spanier have critiqued science because they believe it presents itself as objective and neutral while ignoring its inherent gender bias. They assert that gender bias exists in the language and practice of science, as well as in the expected appearance and social acceptance of who can be scientists within society.

Sandra Harding says that the "moral and political insights of the women's movement have inspired social scientists and biologists to raise critical questions about the ways traditional researchers have explained gender, sex, and relations within and between the social and natural worlds." Anne Fausto-Sterling is a prominent example of this kind of feminist work within biological science. Some feminists, such as Ruth Hubbard and Evelyn Fox Keller, criticize traditional scientific discourse as being historically biased towards a male perspective. A part of the feminist research agenda is the examination of the ways in which power inequities are created and/or reinforced in scientific and academic institutions.

Other feminist scholars, such as Ann Hibner Koblitz, Lenore Blum, Mary Gray, Mary Beth Ruskai, and Pnina Abir-Am and Dorinda Outram, have criticized some gender and science theories for ignoring the diverse nature of scientific research and the tremendous variation in women's experiences in different cultures and historical periods. For example, the first generation of women to receive advanced university degrees in Europe were almost entirely in the natural sciences and medicine—in part because those fields at the time were much more welcoming of women than were the humanities. Koblitz and others who are interested in increasing the number of women in science have expressed concern that some of the statements by feminist critics of science could undermine those efforts, notably the following assertion by Keller:

Just as surely as inauthenticity is the cost a woman suffers by joining men in misogynist jokes, so it is, equally, the cost suffered by a woman who identifies with an image of the scientist modeled on the patriarchal husband. Only if she undergoes a radical disidentification from self can she share masculine pleasure in mastering a nature cast in the image of woman as passive, inert, and blind.

Language in science

Emily Martin examines the metaphors used in science to support her claim that science reinforces socially constructed ideas about gender rather than objective views of nature. In her study of the fertilization process, Martin describes several cases in which gender-biased perception skewed the descriptions of biological processes during fertilization and even possibly hampered the research. She asserts that classic metaphors of the strong dominant sperm racing to an idle egg are products of gendered stereotyping rather than a faithful portrayal of human fertilization. The notions that women are passive and men are active are socially constructed attributes of gender which, according to Martin, scientists have projected onto the events of fertilization, thereby obscuring the fact that eggs do play an active role. For example, she wrote that "even after having revealed...the egg to be a chemically active sperm catcher, even after discussing the egg's role in tethering the sperm, the research team continued for another three years to describe the sperm's role as actively penetrating the egg." Scott Gilbert, a developmental biologist at Swarthmore College, supports her position: "if you don’t have an interpretation of fertilization that allows you to look at the egg as active, you won’t look for the molecules that can prove it. You simply won’t find activities that you don’t visualize."

Media and politics

The mass media face a number of pressures that can prevent them from accurately depicting competing scientific claims in terms of their credibility within the scientific community as a whole. Determining how much weight to give different sides in a scientific debate requires considerable expertise regarding the matter. Few journalists have real scientific knowledge, and even beat reporters who know a great deal about certain scientific issues may know little about other ones they are suddenly asked to cover.

Many issues damage the relationship of science to the media and the use of science and scientific arguments by politicians. As a very broad generalisation, many politicians seek certainties and facts whilst scientists typically offer probabilities and caveats. However, politicians' ability to be heard in the mass media frequently distorts the scientific understanding by the public. Examples in Britain include the controversy over the MMR inoculation, and the 1988 forced resignation of a government minister, Edwina Currie, for revealing the high probability that battery eggs were contaminated with Salmonella.

Some scientists and philosophers suggest that scientific theories are more or less shaped by the dominant political, economic, or cultural models of the time, even though the scientific community may claim to be exempt from social influences and historical conditions. For example, the zoologist Peter Kropotkin thought that the Darwinian theory of evolution overstressed a painful "we must struggle to survive" way of life, which he said was influenced by capitalism and the struggling lifestyles people lived within it. Karl Marx also thought that science was largely driven by, and used as, capital.

Robert Anton Wilson, Stanley Aronowitz, and Paul Feyerabend all thought that the military-industrial complex, large corporations, and the grants that came from them had an immense influence over the research and even results of scientific experiments. Aronowitz even went as far as to say "It does not matter that the scientific community ritualistically denies its alliance with economic/industrial and military power. The evidence is overwhelming that such is the case. Thus, every major power has a national science policy; the United States Military appropriates billions each year for 'basic' as well as 'applied' research".

 

Empiricism

In philosophy, empiricism is a theory that states that knowledge comes only or primarily from sensory experience. It is one of several views of epistemology, along with rationalism and skepticism. Empiricism emphasizes the role of empirical evidence in the formation of ideas, rather than innate ideas or traditions. However, empiricists may argue that traditions (or customs) arise due to relations of previous sense experiences.

Historically, empiricism was associated with the "blank slate" concept (tabula rasa), according to which the human mind is "blank" at birth and develops its thoughts only through experience.

Empiricism in the philosophy of science emphasizes evidence, especially as discovered in experiments. It is a fundamental part of the scientific method that all hypotheses and theories must be tested against observations of the natural world rather than resting solely on a priori reasoning, intuition, or revelation.

Empiricism, often used by natural scientists, says that "knowledge is based on experience" and that "knowledge is tentative and probabilistic, subject to continued revision and falsification". Empirical research, including experiments and validated measurement tools, guides the scientific method.

Etymology

The English term empirical derives from the Ancient Greek word ἐμπειρία, empeiria, which is cognate with and translates to the Latin experientia, from which the words experience and experiment are derived.

History

Background

A central concept in science and the scientific method is that conclusions must be empirically based on the evidence of the senses. Both natural and social sciences use working hypotheses that are testable by observation and experiment. The term semi-empirical is sometimes used to describe theoretical methods that make use of basic axioms, established scientific laws, and previous experimental results in order to engage in reasoned model building and theoretical inquiry.

Philosophical empiricists hold no knowledge to be properly inferred or deduced unless it is derived from one's sense-based experience. This view is commonly contrasted with rationalism, which states that knowledge may be derived from reason independently of the senses. For example, John Locke held that some knowledge (e.g. knowledge of God's existence) could be arrived at through intuition and reasoning alone. Similarly Robert Boyle, a prominent advocate of the experimental method, held that we have innate ideas. The main continental rationalists (Descartes, Spinoza, and Leibniz) were also advocates of the empirical "scientific method".

Early empiricism

Between 600 and 200 BCE

Between 600 and 200 BCE, the Vaisheshika school of Hindu philosophy, founded by the ancient Indian philosopher Kanada, accepted perception and inference as the only two reliable sources of knowledge. This is enumerated in his work Vaiśeṣika Sūtra.

c. 330 BCE – 400 CE

The earliest Western proto-empiricists were the Empiric school of ancient Greek medical practitioners, founded in 330 BCE. Its members rejected the three doctrines of the Dogmatic school, preferring to rely on the observation of phantasiai (i.e., phenomena, the appearances). The Empiric school was closely allied with the Pyrrhonist school of philosophy, which made the philosophical case for their proto-empiricism.

The notion of tabula rasa ("clean slate" or "blank tablet") connotes a view of mind as an originally blank or empty recorder (Locke used the words "white paper") on which experience leaves marks. This denies that humans have innate ideas. The notion dates back to Aristotle, c. 350 BCE:

What the mind (nous) thinks must be in it in the same sense as letters are on a tablet (grammateion) which bears no actual writing (grammenon); this is just what happens in the case of the mind. (Aristotle, On the Soul, 3.4.430a1).

Aristotle's explanation of how this was possible was not strictly empiricist in a modern sense, but rather based on his theory of potentiality and actuality, and experience of sense perceptions still requires the help of the active nous. These notions contrasted with Platonic notions of the human mind as an entity that pre-existed somewhere in the heavens, before being sent down to join a body on Earth (see Plato's Phaedo and Apology, as well as others). Aristotle was considered to give a more important position to sense perception than Plato, and commentators in the Middle Ages summarized one of his positions as "nihil in intellectu nisi prius fuerit in sensu" (Latin for "nothing in the intellect without first being in the senses").

This idea was later developed in ancient philosophy by the Stoic school, from about 330 BCE. Stoic epistemology generally emphasized that the mind starts blank, but acquires knowledge as the outside world is impressed upon it. The doxographer Aetius summarizes this view as "When a man is born, the Stoics say, he has the commanding part of his soul like a sheet of paper ready for writing upon."

A drawing of Ibn Sina (Avicenna) from 1271

Islamic Golden Age and Pre-Renaissance (5th to 15th centuries CE)

During the Middle Ages (from the 5th to the 15th century CE) Aristotle's theory of tabula rasa was developed by Islamic philosophers starting with Al Farabi (c. 872 – 951 CE), developing into an elaborate theory by Avicenna (c. 980 – 1037) and demonstrated as a thought experiment by Ibn Tufail. For Avicenna (Ibn Sina), for example, the tabula rasa is a pure potentiality that is actualized through education, and knowledge is attained through "empirical familiarity with objects in this world from which one abstracts universal concepts" developed through a "syllogistic method of reasoning in which observations lead to propositional statements which when compounded lead to further abstract concepts". The intellect itself develops from a material intellect (al-'aql al-hayulani), which is a potentiality "that can acquire knowledge to the active intellect (al-'aql al-fa'il), the state of the human intellect in conjunction with the perfect source of knowledge". So the immaterial "active intellect", separate from any individual person, is still essential for understanding to occur.

In the 12th century CE the Andalusian Muslim philosopher and novelist Abu Bakr Ibn Tufail (known as "Abubacer" or "Ebn Tophail" in the West) included the theory of tabula rasa as a thought experiment in his Arabic philosophical novel, Hayy ibn Yaqdhan, in which he depicted the development of the mind of a feral child "from a tabula rasa to that of an adult, in complete isolation from society" on a desert island, through experience alone. The Latin translation of his philosophical novel, entitled Philosophus Autodidactus, published by Edward Pococke the Younger in 1671, had an influence on John Locke's formulation of tabula rasa in An Essay Concerning Human Understanding.

A similar Islamic theological novel, Theologus Autodidactus, was written by the Arab theologian and physician Ibn al-Nafis in the 13th century. It also dealt with the theme of empiricism through the story of a feral child on a desert island, but departed from its predecessor by depicting the development of the protagonist's mind through contact with society rather than in isolation from society.

During the 13th century Thomas Aquinas adopted the Aristotelian position that the senses are essential to mind into scholasticism. Bonaventure (1221–1274), one of Aquinas' strongest intellectual opponents, offered some of the strongest arguments in favour of the Platonic idea of the mind.

Renaissance Italy

In the late Renaissance various writers began to question the medieval and classical understanding of knowledge acquisition in a more fundamental way. In political and historical writing Niccolò Machiavelli and his friend Francesco Guicciardini initiated a new realistic style of writing. Machiavelli in particular was scornful of writers on politics who judged everything in comparison to mental ideals and demanded that people should study the "effectual truth" instead. Their contemporary, Leonardo da Vinci (1452–1519), said, "If you find from your own experience that something is a fact and it contradicts what some authority has written down, then you must abandon the authority and base your reasoning on your own findings."

Significantly, an empirical metaphysical system was developed by the Italian philosopher Bernardino Telesio, which had an enormous impact on the development of later Italian thinkers, including Telesio's students Antonio Persio and Sertorio Quattromani, his contemporaries Thomas Campanella and Giordano Bruno, and later British philosophers such as Francis Bacon, who regarded Telesio as "the first of the moderns." Telesio's influence can also be seen in the French philosophers René Descartes and Pierre Gassendi.

The decidedly anti-Aristotelian and anti-clerical music theorist Vincenzo Galilei (c. 1520 – 1591), father of Galileo and the inventor of monody, made use of the empirical method in successfully solving musical problems: firstly, of tuning, such as the relationship of pitch to string tension and mass in stringed instruments, and to volume of air in wind instruments; and secondly, of composition, through his various suggestions to composers in his Dialogo della musica antica e moderna (Florence, 1581). The Italian word he used for "experiment" was esperienza. It is known that he was the essential pedagogical influence upon the young Galileo, his eldest son (cf. Coelho, ed., Music and Science in the Age of Galileo Galilei), arguably one of the most influential empiricists in history. Vincenzo, through his tuning research, found the underlying truth at the heart of the misunderstood myth of 'Pythagoras' hammers' (the square of the numbers concerned yielded those musical intervals, not the actual numbers, as believed), and through this and other discoveries that demonstrated the fallibility of traditional authorities, a radically empirical attitude developed, passed on to Galileo, which regarded "experience and demonstration" as the sine qua non of valid rational enquiry.
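In modern notation (a reconstruction for illustration, not Galilei's own formulation), the tuning result behind the "Pythagoras' hammers" correction can be stated as follows:

    % Pitch varies with the square root of string tension,
    f \propto \sqrt{T}
        \quad\Longrightarrow\quad
        \frac{f_1}{f_2} = \sqrt{\frac{T_1}{T_2}} ,
    % so an octave (frequency ratio 2:1) requires tensions in the ratio
    % 4:1, and a perfect fifth (3:2) requires 9:4: the squares of the
    % interval numbers rather than the numbers themselves.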

British empiricism

Thomas Hobbes

British empiricism, a retrospective characterization, emerged during the 17th century as an approach to early modern philosophy and modern science. Although both integral to this overarching transition, Francis Bacon, in England, advised empiricism in 1620, whereas René Descartes, in France, upheld rationalism around 1640, a distinction drawn by Immanuel Kant, in Germany, near 1780. (Bacon's natural philosophy was influenced by the Italian philosopher Bernardino Telesio and by the Swiss physician Paracelsus.) Contributing later in the 17th century, Thomas Hobbes and Baruch Spinoza are retrospectively identified likewise as an empiricist and a rationalist, respectively. In the Enlightenment of the 18th century, both George Berkeley, in Ireland, and David Hume, in Scotland, became leading exponents of empiricism, a lead precedented in the late 17th century by John Locke, in England, hence the dominance of empiricism in British philosophy.

In response to the early-to-mid-17th century "continental rationalism," John Locke (1632–1704) proposed in An Essay Concerning Human Understanding (1689) a very influential view wherein the only knowledge humans can have is a posteriori, i.e., based upon experience. Locke is famously credited with holding the proposition that the human mind is a tabula rasa, a "blank tablet", in Locke's words "white paper", on which the experiences derived from sense impressions as a person's life proceeds are written. There are two sources of our ideas: sensation and reflection. In both cases, a distinction is made between simple and complex ideas. The former are unanalysable, and are broken down into primary and secondary qualities. Primary qualities are essential for the object in question to be what it is. Without specific primary qualities, an object would not be what it is. For example, an apple is an apple because of the arrangement of its atomic structure. If an apple were structured differently, it would cease to be an apple. Secondary qualities are the sensory information we can perceive from its primary qualities. For example, an apple can be perceived in various colours, sizes, and textures, but it is still identified as an apple. Therefore, its primary qualities dictate what the object essentially is, while its secondary qualities define its attributes. Complex ideas combine simple ones, and divide into substances, modes, and relations. According to Locke, our knowledge of things is a perception of ideas that are in accordance or discordance with each other, which is very different from the quest for certainty of Descartes.

A generation later, the Irish Anglican bishop, George Berkeley (1685–1753), determined that Locke's view immediately opened a door that would lead to eventual atheism. In response to Locke, he put forth in his Treatise Concerning the Principles of Human Knowledge (1710) an important challenge to empiricism in which things only exist either as a result of their being perceived, or by virtue of the fact that they are an entity doing the perceiving. (For Berkeley, God fills in for humans by doing the perceiving whenever humans are not around to do it.) In his text Alciphron, Berkeley maintained that any order humans may see in nature is the language or handwriting of God. Berkeley's approach to empiricism would later come to be called subjective idealism.

The Scottish philosopher David Hume (1711–1776) responded to Berkeley's criticisms of Locke, as well as other differences between early modern philosophers, and moved empiricism to a new level of skepticism. Hume argued in keeping with the empiricist view that all knowledge derives from sense experience, but he accepted that this has implications not normally acceptable to philosophers. He wrote for example, "Locke divides all arguments into demonstrative and probable. On this view, we must say that it is only probable that all men must die or that the sun will rise to-morrow, because neither of these can be demonstrated. But to conform our language more to common use, we ought to divide arguments into demonstrations, proofs, and probabilities—by ‘proofs’ meaning arguments from experience that leave no room for doubt or opposition."

"I believe the most general and most popular explication of this matter, is to say [See Mr. Locke, chapter of power.], that finding from experience, that there are several new productions in matter, such as the motions and variations of body, and concluding that there must somewhere be a power capable of producing them, we arrive at last by this reasoning at the idea of power and efficacy. But to be convinced that this explication is more popular than philosophical, we need but reflect on two very obvious principles. First, That reason alone can never give rise to any original idea, and secondly, that reason, as distinguished from experience, can never make us conclude, that a cause or productive quality is absolutely requisite to every beginning of existence. Both these considerations have been sufficiently explained: and therefore shall not at present be any farther insisted on."

— Hume, Section XIV, "Of the idea of necessary connexion", A Treatise of Human Nature

Hume divided all of human knowledge into two categories: relations of ideas and matters of fact (see also Kant's analytic-synthetic distinction). Mathematical and logical propositions (e.g. "that the square of the hypotenuse is equal to the sum of the squares of the two sides") are examples of the first, while propositions involving some contingent observation of the world (e.g. "the sun rises in the East") are examples of the second. All of people's "ideas", in turn, are derived from their "impressions". For Hume, an "impression" corresponds roughly with what we call a sensation. To remember or to imagine such impressions is to have an "idea". Ideas are therefore the faint copies of sensations.

David Hume's empiricism led to numerous philosophical schools.

Hume maintained that no knowledge, even the most basic beliefs about the natural world, can be conclusively established by reason. Rather, he maintained, our beliefs are more a result of accumulated habits, developed in response to accumulated sense experiences. Among his many arguments Hume also added another important slant to the debate about scientific method—that of the problem of induction. Hume argued that it requires inductive reasoning to arrive at the premises for the principle of inductive reasoning, and therefore the justification for inductive reasoning is a circular argument. Among Hume's conclusions regarding the problem of induction is that there is no certainty that the future will resemble the past. Thus, as a simple instance posed by Hume, we cannot know with certainty by inductive reasoning that the sun will continue to rise in the East, but instead come to expect it to do so because it has repeatedly done so in the past.

Hume concluded that such things as belief in an external world and belief in the existence of the self were not rationally justifiable. According to Hume these beliefs were to be accepted nonetheless because of their profound basis in instinct and custom. Hume's lasting legacy, however, was the doubt that his skeptical arguments cast on the legitimacy of inductive reasoning, allowing many skeptics who followed to cast similar doubt.

Phenomenalism

Most of Hume's followers have disagreed with his conclusion that belief in an external world is rationally unjustifiable, contending that Hume's own principles implicitly contained the rational justification for such a belief, that is, beyond being content to let the issue rest on human instinct, custom and habit. According to an extreme empiricist theory known as phenomenalism, anticipated by the arguments of both Hume and George Berkeley, a physical object is a kind of construction out of our experiences. Phenomenalism is the view that physical objects, properties, events (whatever is physical) are reducible to mental objects, properties, events. Ultimately, only mental objects, properties, events, exist—hence the closely related term subjective idealism. By the phenomenalistic line of thinking, to have a visual experience of a real physical thing is to have an experience of a certain kind of group of experiences. This type of set of experiences possesses a constancy and coherence that is lacking in the set of experiences of which hallucinations, for example, are a part. As John Stuart Mill put it in the mid-19th century, matter is the "permanent possibility of sensation". Mill's empiricism went a significant step beyond Hume in still another respect: in maintaining that induction is necessary for all meaningful knowledge including mathematics. As summarized by D. W. Hamlyn:

[Mill] claimed that mathematical truths were merely very highly confirmed generalizations from experience; mathematical inference, generally conceived as deductive [and a priori] in nature, Mill set down as founded on induction. Thus, in Mill's philosophy there was no real place for knowledge based on relations of ideas. In his view logical and mathematical necessity is psychological; we are merely unable to conceive any other possibilities than those that logical and mathematical propositions assert. This is perhaps the most extreme version of empiricism known, but it has not found many defenders.

Mill's empiricism thus held that knowledge of any kind is not from direct experience but an inductive inference from direct experience. The problems other philosophers have had with Mill's position center around the following issues: Firstly, Mill's formulation encounters difficulty when it describes what direct experience is by differentiating only between actual and possible sensations. This misses some key discussion concerning conditions under which such "groups of permanent possibilities of sensation" might exist in the first place. Berkeley put God in that gap; the phenomenalists, including Mill, essentially left the question unanswered. In the end, lacking an acknowledgement of an aspect of "reality" that goes beyond mere "possibilities of sensation", such a position leads to a version of subjective idealism. Questions of how floor beams continue to support a floor while unobserved, how trees continue to grow while unobserved and untouched by human hands, etc., remain unanswered, and perhaps unanswerable in these terms. Secondly, Mill's formulation leaves open the unsettling possibility that the "gap-filling entities are purely possibilities and not actualities at all". Thirdly, Mill's position, by calling mathematics merely another species of inductive inference, misapprehends mathematics. It fails to fully consider the structure and method of mathematical science, the products of which are arrived at through an internally consistent deductive set of procedures which do not, either today or at the time Mill wrote, fall under the agreed meaning of induction.

The phenomenalist phase of post-Humean empiricism ended by the 1940s, for by that time it had become obvious that statements about physical things could not be translated into statements about actual and possible sense data. If a physical object statement is to be translatable into a sense-data statement, the former must be at least deducible from the latter. But it came to be realized that there is no finite set of statements about actual and possible sense-data from which we can deduce even a single physical-object statement. The translating or paraphrasing statement must be couched in terms of normal observers in normal conditions of observation. There is, however, no finite set of statements that are couched in purely sensory terms and can express the satisfaction of the condition of the presence of a normal observer. According to phenomenalism, to say that a normal observer is present is to make the hypothetical statement that were a doctor to inspect the observer, the observer would appear to the doctor to be normal. But, of course, the doctor himself must be a normal observer. If we are to specify this doctor's normality in sensory terms, we must make reference to a second doctor who, when inspecting the sense organs of the first doctor, would himself have to have the sense data a normal observer has when inspecting the sense organs of a subject who is a normal observer. And if we are to specify in sensory terms that the second doctor is a normal observer, we must refer to a third doctor, and so on (also see the third man).

Logical empiricism

Logical empiricism (also logical positivism or neopositivism) was an early 20th-century attempt to synthesize the essential ideas of British empiricism (e.g. a strong emphasis on sensory experience as the basis for knowledge) with certain insights from mathematical logic that had been developed by Gottlob Frege and Ludwig Wittgenstein. Some of the key figures in this movement were Otto Neurath, Moritz Schlick and the rest of the Vienna Circle, along with A.J. Ayer, Rudolf Carnap and Hans Reichenbach.

The neopositivists subscribed to a notion of philosophy as the conceptual clarification of the methods, insights and discoveries of the sciences. They saw in the logical symbolism elaborated by Frege (1848–1925) and Bertrand Russell (1872–1970) a powerful instrument that could rationally reconstruct all scientific discourse into an ideal, logically perfect, language that would be free of the ambiguities and deformations of natural language, which in their view gave rise to metaphysical pseudoproblems and other conceptual confusions. By combining Frege's thesis that all mathematical truths are logical with the early Wittgenstein's idea that all logical truths are mere linguistic tautologies, they arrived at a twofold classification of all propositions: the analytic (a priori) and the synthetic (a posteriori). On this basis, they formulated a strong principle of demarcation between sentences that have sense and those that do not: the so-called verification principle. Any sentence that is not purely logical, or is unverifiable, is devoid of meaning. As a result, most metaphysical, ethical, aesthetic and other traditional philosophical problems came to be considered pseudoproblems.

In the extreme empiricism of the neopositivists—at least before the 1930s—any genuinely synthetic assertion must be reducible to an ultimate assertion (or set of ultimate assertions) that expresses direct observations or perceptions. In later years, Carnap and Neurath abandoned this sort of phenomenalism in favor of a rational reconstruction of knowledge into the language of an objective spatio-temporal physics. That is, instead of translating sentences about physical objects into sense-data, such sentences were to be translated into so-called protocol sentences, for example, "X at location Y and at time T observes such and such." The central theses of logical positivism (verificationism, the analytic–synthetic distinction, reductionism, etc.) came under sharp attack after World War II by thinkers such as Nelson Goodman, W.V. Quine, Hilary Putnam, Karl Popper, and Richard Rorty. By the late 1960s, it had become evident to most philosophers that the movement had pretty much run its course, though its influence is still significant among contemporary analytic philosophers such as Michael Dummett and other anti-realists.

Pragmatism

In the late 19th and early 20th century several forms of pragmatic philosophy arose. The ideas of pragmatism, in its various forms, developed mainly from discussions between Charles Sanders Peirce and William James when both men were at Harvard in the 1870s. James popularized the term "pragmatism", giving Peirce full credit for its patrimony, but Peirce later demurred from the tangents that the movement was taking, and redubbed what he regarded as the original idea with the name of "pragmaticism". Along with its pragmatic theory of truth, this perspective integrates the basic insights of empirical (experience-based) and rational (concept-based) thinking.

Charles Peirce (1839–1914) was highly influential in laying the groundwork for today's empirical scientific method. Although Peirce severely criticized many elements of Descartes' peculiar brand of rationalism, he did not reject rationalism outright. Indeed, he concurred with the main ideas of rationalism, most importantly the idea that rational concepts can be meaningful and the idea that rational concepts necessarily go beyond the data given by empirical observation. In later years he even emphasized the concept-driven side of the then ongoing debate between strict empiricism and strict rationalism, in part to counterbalance the excesses to which some of his cohorts had taken pragmatism under the "data-driven" strict-empiricist view.

Among Peirce's major contributions was to place inductive reasoning and deductive reasoning in a complementary rather than competitive mode, the latter of which had been the primary trend among the educated since David Hume wrote a century before. To this, Peirce added the concept of abductive reasoning. The combined three forms of reasoning serve as a primary conceptual foundation for the empirically based scientific method today. Peirce's approach "presupposes that (1) the objects of knowledge are real things, (2) the characters (properties) of real things do not depend on our perceptions of them, and (3) everyone who has sufficient experience of real things will agree on the truth about them. According to Peirce's doctrine of fallibilism, the conclusions of science are always tentative. The rationality of the scientific method does not depend on the certainty of its conclusions, but on its self-corrective character: by continued application of the method science can detect and correct its own mistakes, and thus eventually lead to the discovery of truth".

In his Harvard "Lectures on Pragmatism" (1903), Peirce enumerated what he called the "three cotary propositions of pragmatism" (from the Latin cos, cotis, "whetstone"), saying that they "put the edge on the maxim of pragmatism". First among these he listed the peripatetic-thomist observation mentioned above, but he further observed that this link between sensory perception and intellectual conception is a two-way street. That is, it can be taken to say that whatever we find in the intellect is also incipiently in the senses. Hence, if theories are theory-laden then so are the senses, and perception itself can be seen as a species of abductive inference, its difference being that it is beyond control and hence beyond critique—in a word, incorrigible. This in no way conflicts with the fallibility and revisability of scientific concepts, since it is only the immediate percept in its unique individuality or "thisness"—what the Scholastics called its haecceity—that stands beyond control and correction. Scientific concepts, on the other hand, are general in nature, and transient sensations do in another sense find correction within them. This notion of perception as abduction has received periodic revivals in artificial intelligence and cognitive science research, most recently for instance with the work of Irvin Rock on indirect perception.

Around the beginning of the 20th century, William James (1842–1910) coined the term "radical empiricism" to describe an offshoot of his form of pragmatism, which he argued could be dealt with separately from his pragmatism—though in fact the two concepts are intertwined in James's published lectures. James maintained that the empirically observed "directly apprehended universe needs ... no extraneous trans-empirical connective support", by which he meant to rule out the perception that there can be any value added by seeking supernatural explanations for natural phenomena. James' "radical empiricism" is thus not radical in the context of the term "empiricism", but is instead fairly consistent with the modern use of the term "empirical". His method of argument in arriving at this view, however, still readily encounters debate within philosophy even today.

John Dewey (1859–1952) modified James' pragmatism to form a theory known as instrumentalism. The role of sense experience in Dewey's theory is crucial, in that he saw experience as a unified totality of things through which everything else is interrelated. Dewey's basic thought, in accordance with empiricism, was that reality is determined by past experience. Therefore, humans adapt their past experiences of things to perform experiments upon and test the pragmatic values of such experience. The value of such experience is measured experientially and scientifically, and the results of such tests generate ideas that serve as instruments for future experimentation, in physical sciences as in ethics. Thus, ideas in Dewey's system retain their empiricist flavour in that they are only known a posteriori.

Operator (computer programming)

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Operator_(computer_programmin...