
Sunday, May 14, 2023

Epistemology

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Epistemology

Epistemology (/ɪˌpɪstəˈmɒlədʒi/; from Ancient Greek ἐπιστήμη (epistḗmē) 'knowledge', and -logy), or the theory of knowledge, is the branch of philosophy concerned with knowledge. Epistemology is considered a major subfield of philosophy, along with other major subfields such as ethics, logic, and metaphysics.

Epistemologists study the nature, origin, and scope of knowledge, epistemic justification, the rationality of belief, and various related issues. Debates in epistemology are generally clustered around four core areas:

  1. The philosophical analysis of the nature of knowledge and the conditions required for a belief to constitute knowledge, such as truth and justification
  2. Potential sources of knowledge and justified belief, such as perception, reason, memory, and testimony
  3. The structure of a body of knowledge or justified belief, including whether all justified beliefs must be derived from justified foundational beliefs or whether justification requires only a coherent set of beliefs
  4. Philosophical skepticism, which questions the possibility of knowledge, and related problems, such as whether skepticism poses a threat to our ordinary knowledge claims and whether it is possible to refute skeptical arguments

In these debates and others, epistemology aims to answer questions such as "What do people know?", "What does it mean to say that people know something?", "What makes justified beliefs justified?", and "How do people know that they know?" Specialties in epistemology ask questions such as "How can people create formal models about issues related to knowledge?" (in formal epistemology), "What are the historical conditions of changes in different kinds of knowledge?" (in historical epistemology), "What are the methods, aims, and subject matter of epistemological inquiry?" (in metaepistemology), and "How do people know together?" (in social epistemology).

Background

Etymology

The word epistemology is derived from the ancient Greek epistēmē, meaning "knowledge, understanding, skill, scientific knowledge", and the English suffix -ology, meaning "the science or discipline of (what is indicated by the first element)". The word "epistemology" first appeared in 1847, in a review in New York's Eclectic Magazine:

The title of one of the principal works of Fichte is 'Wissenschaftslehre,' which, after the analogy of technology ... we render epistemology.

The word was first used to present a philosophy in English by Scottish philosopher James Frederick Ferrier in 1854. It was the title of the first section of his Institutes of Metaphysics:

This section of the science is properly termed the Epistemology—the doctrine or theory of knowing, just as ontology is the science of being... It answers the general question, 'What is knowing and the known?'—or more shortly, 'What is knowledge?'

History of epistemology

Epistemology, as a distinct field of inquiry, predates the introduction of the term into the lexicon of philosophy. John Locke, for instance, described his efforts in Essay Concerning Human Understanding (1689) as an inquiry "into the original, certainty, and extent of human knowledge, together with the grounds and degrees of belief, opinion, and assent".

René Descartes, who is often credited as the father of modern philosophy, was often preoccupied with epistemological questions in his work.

Almost every major historical philosopher has considered questions about what people know and how they know it. Among the Ancient Greek philosophers, Plato distinguished between inquiry regarding what people know and inquiry regarding what exists, particularly in the Republic, the Theaetetus, and the Meno. In the Meno, the definition of knowledge as justified true belief appears for the first time. In other words, a belief must be accompanied by an account or justification to count as knowledge, beyond merely happening to be true. A number of important epistemological concerns also appeared in the works of Aristotle.

During the subsequent Hellenistic period, philosophical schools began to appear which had a greater focus on epistemological questions, often in the form of philosophical skepticism. For instance, the Pyrrhonian skepticism of Pyrrho and Sextus Empiricus held that eudaimonia (flourishing, happiness, or "the good life") could be attained through the application of epoché (suspension of judgment) regarding all non-evident matters. Pyrrhonism was particularly concerned with undermining the epistemological dogmas of Stoicism and Epicureanism. The other major school of Hellenistic skepticism was Academic skepticism, most notably defended by Carneades and Arcesilaus, which predominated in the Platonic Academy for almost two centuries.

In ancient India the Ajñana school of ancient Indian philosophy promoted skepticism. Ajñana was a Śramaṇa movement and a major rival of early Buddhism, Jainism and the Ājīvika school. They held that it was impossible to obtain knowledge of metaphysical nature or to ascertain the truth value of philosophical propositions; and even if knowledge was possible, it was useless and disadvantageous for final salvation. They specialized in refutation without propagating any positive doctrine of their own.

After the ancient philosophical era but before the modern philosophical era, a number of Medieval philosophers also engaged with epistemological questions at length. Most notable among the Medievals for their contributions to epistemology were Thomas Aquinas, John Duns Scotus, and William of Ockham.

During the Islamic Golden Age, one of the most prominent and influential philosophers, theologians, jurists, logicians and mystics in Islamic epistemology was Al-Ghazali. During his life, he wrote over 70 books on science, Islamic reasoning and Sufism. Al-Ghazali's The Incoherence of the Philosophers is regarded as a turning point in Islamic epistemology. He held the conviction that all events and interactions are not the product of material conjunctions but the immediate and present will of God.

Epistemology largely came to the fore in philosophy during the early modern period, which historians of philosophy traditionally divide up into a dispute between empiricists (including Francis Bacon, John Locke, David Hume, and George Berkeley) and rationalists (including René Descartes, Baruch Spinoza, and Gottfried Leibniz). The debate between them has often been framed using the question of whether knowledge comes primarily from sensory experience (empiricism), or whether a significant portion of our knowledge is derived entirely from our faculty of reason (rationalism). According to some scholars, this dispute was resolved in the late 18th century by Immanuel Kant, whose transcendental idealism famously made room for the view that "though all our knowledge begins with experience, it by no means follows that all [knowledge] arises out of experience".

Contemporary historiography

There are a number of different methods that contemporary scholars use when trying to understand the relationship between past epistemology and contemporary epistemology. One of the most contentious questions is this: "Should we assume that the problems of epistemology are perennial, and that trying to reconstruct and evaluate Plato's or Hume's or Kant's arguments is meaningful for current debates, too?" Similarly, there is also a question of whether contemporary philosophers should aim to rationally reconstruct and evaluate historical views in epistemology, or to merely describe them. Barry Stroud claims that doing epistemology competently requires the historical study of past attempts to find philosophical understanding of the nature and scope of human knowledge. He argues that since inquiry may progress over time, we may not realize how different the questions that contemporary epistemologists ask are from questions asked at various different points in the history of philosophy.

Central concepts in epistemology

Knowledge

Bertrand Russell famously brought attention to the distinction between propositional knowledge and knowledge by acquaintance.
 

Nearly all debates in epistemology are in some way related to knowledge. Most generally, "knowledge" is a familiarity, awareness, or understanding of someone or something, which might include facts (propositional knowledge), skills (procedural knowledge), or objects (acquaintance knowledge). Philosophers tend to draw an important distinction between three different senses of "knowing" something: "knowing that" (knowing the truth of propositions), "knowing how" (understanding how to perform certain actions), and "knowing by acquaintance" (directly perceiving an object, being familiar with it, or otherwise coming into contact with it). Epistemology is primarily concerned with the first of these forms of knowledge, propositional knowledge. All three senses of "knowing" can be seen in our ordinary use of the word. In mathematics, you can know that 2 + 2 = 4, but there is also knowing how to add two numbers, and knowing a person (e.g., knowing other persons, or knowing oneself), place (e.g., one's hometown), thing (e.g., cars), or activity (e.g., addition). While these distinctions are not explicit in English, they are explicitly made in other languages, including French, Portuguese, Spanish, Romanian, German and Dutch (although some languages closely related to English have been said to retain these verbs, such as Scots). The theoretical interpretation and significance of these linguistic issues remains controversial.

In his paper On Denoting and his later book Problems of Philosophy, Bertrand Russell brought a great deal of attention to the distinction between "knowledge by description" and "knowledge by acquaintance". Gilbert Ryle is similarly credited with bringing more attention to the distinction between knowing how and knowing that in The Concept of Mind. In Personal Knowledge, Michael Polanyi argues for the epistemological relevance of knowledge how and knowledge that; using the example of the act of balance involved in riding a bicycle, he suggests that the theoretical knowledge of the physics involved in maintaining a state of balance cannot substitute for the practical knowledge of how to ride, and that it is important to understand how both are established and grounded. This position is essentially Ryle's, who argued that a failure to acknowledge the distinction between "knowledge that" and "knowledge how" leads to infinite regress.

A priori and a posteriori knowledge

One of the most important distinctions in epistemology is between what can be known a priori (independently of experience) and what can be known a posteriori (through experience). The terms originate from the Analytic methods of Aristotle's Organon, and may be roughly defined as follows:

  • A priori knowledge is knowledge that is known independently of experience (that is, it is non-empirical, or arrived at before experience, usually by reason).
  • A posteriori knowledge is knowledge that is known by experience (that is, it is empirical, or arrived at through experience).

Views that emphasize the importance of a priori knowledge are generally classified as rationalist. Views that emphasize the importance of a posteriori knowledge are generally classified as empiricist.

Belief

One of the core concepts in epistemology is belief. A belief is an attitude that a person holds regarding anything that they take to be true. For instance, to believe that snow is white is comparable to accepting the truth of the proposition "snow is white". Beliefs can be occurrent (e.g. a person actively thinking "snow is white"), or they can be dispositional (e.g. a person who if asked about the color of snow would assert "snow is white"). While there is not universal agreement about the nature of belief, most contemporary philosophers hold the view that a disposition to express belief B qualifies as holding the belief B. There are various different ways that contemporary philosophers have tried to describe beliefs, including as representations of ways that the world could be (Jerry Fodor), as dispositions to act as if certain things are true (Roderick Chisholm), as interpretive schemes for making sense of someone's actions (Daniel Dennett and Donald Davidson), or as mental states that fill a particular function (Hilary Putnam). Some have also attempted to offer significant revisions to our notion of belief, including eliminativists about belief who argue that there is no phenomenon in the natural world which corresponds to our folk psychological concept of belief (Paul Churchland) and formal epistemologists who aim to replace our bivalent notion of belief ("either I have a belief or I don't have a belief") with the more permissive, probabilistic notion of credence ("there is an entire spectrum of degrees of belief, not a simple dichotomy between belief and non-belief").
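The contrast between the bivalent notion of belief and the probabilistic notion of credence can be made concrete. Formal epistemologists commonly model credences as probabilities revised by conditionalization (Bayes' rule); the following is a minimal sketch, with all numbers chosen purely for illustration:

```python
def conditionalize(prior, likelihood, likelihood_if_false):
    """Return the posterior credence in a hypothesis H after observing
    evidence E, via Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    evidence = likelihood * prior + likelihood_if_false * (1 - prior)
    return likelihood * prior / evidence

# Credence that it snowed overnight, before looking outside.
prior = 0.2
# The ground looks white: very likely if it snowed, unlikely otherwise.
posterior = conditionalize(prior, likelihood=0.95, likelihood_if_false=0.05)
print(round(posterior, 3))  # → 0.826: credence rises well above the prior
```

On the credence picture there is no single threshold at which the agent "acquires the belief" that it snowed; there is only a degree of confidence that evidence moves up or down.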

While belief plays a significant role in epistemological debates surrounding knowledge and justification, it is also the subject of many other philosophical debates in its own right. Notable debates include: "What is the rational way to revise one's beliefs when presented with various sorts of evidence?"; "Is the content of our beliefs entirely determined by our mental states, or do the relevant facts have any bearing on our beliefs (e.g. if I believe that I'm holding a glass of water, is the non-mental fact that water is H2O part of the content of that belief)?"; "How fine-grained or coarse-grained are our beliefs?"; and "Must it be possible for a belief to be expressible in language, or are there non-linguistic beliefs?"

Truth

Truth is the property or state of being in accordance with facts or reality. On most views, truth is the correspondence of language or thought to a mind-independent world. This is called the correspondence theory of truth. Among philosophers who think that it is possible to analyze the conditions necessary for knowledge, virtually all of them accept that truth is such a condition. There is much less agreement about the extent to which a knower must know why something is true in order to know. On such views, something being known implies that it is true. However, this should not be confused for the more contentious view that one must know that one knows in order to know (the KK principle).

Epistemologists disagree about whether belief is the only truth-bearer. Other common suggestions for things that can bear the property of being true include propositions, sentences, thoughts, utterances, and judgments. Plato, in his Gorgias, argues that belief is the most commonly invoked truth-bearer.

Many of the debates regarding truth are at the crossroads of epistemology and logic. Some contemporary debates regarding truth include: How do we define truth? Is it even possible to give an informative definition of truth? What things are truth-bearers and are therefore capable of being true or false? Are truth and falsity bivalent, or are there other truth values? What are the criteria of truth that allow us to identify it and to distinguish it from falsity? What role does truth play in constituting knowledge? And is truth absolute, or is it merely relative to one's perspective?

Justification

As the term "justification" is used in epistemology, a belief is justified if one has good reason for holding it. Loosely speaking, justification is the reason that someone holds a rationally admissible belief, on the assumption that it is a good reason for holding it. Sources of justification might include perceptual experience (the evidence of the senses), reason, and authoritative testimony, among others. Importantly however, a belief being justified does not guarantee that the belief is true, since a person could be justified in forming beliefs based on very convincing evidence that was nonetheless deceiving.

Internalism and externalism

A central debate about the nature of justification is a debate between epistemological externalists on the one hand and epistemological internalists on the other. While epistemic externalism first arose in attempts to overcome the Gettier problem, it has flourished in the time since as an alternative way of conceiving of epistemic justification. The initial development of epistemic externalism is often attributed to Alvin Goldman, although numerous other philosophers have worked on the topic in the time since.

Externalists hold that factors deemed "external", meaning outside of the psychological states of those who gain knowledge, can be conditions of justification. For example, an externalist response to the Gettier problem is to say that for a justified true belief to count as knowledge, there must be a link or dependency between the belief and the state of the external world. Usually, this is understood to be a causal link. Such causation, to the extent that it is "outside" the mind, would count as an external, knowledge-yielding condition. Internalists, on the other hand, assert that all knowledge-yielding conditions are within the psychological states of those who gain knowledge.

Though Descartes himself was unfamiliar with the internalist/externalist debate, many point to him as an early example of the internalist path to justification. He wrote that because the only method by which we perceive the external world is through our senses, and because the senses are not infallible, we should not consider our concept of knowledge infallible. The only way to find anything that could be described as "indubitably true", he advocates, would be to see things "clearly and distinctly". He argued that if there is an omnipotent, good being who made the world, then it's reasonable to believe that people are made with the ability to know. However, this does not mean that man's ability to know is perfect. God gave man the ability to know but not with omniscience. Descartes said that man must use his capacities for knowledge correctly and carefully through methodological doubt.

The dictum "Cogito ergo sum" (I think, therefore I am) is also commonly associated with Descartes's theory. In his own methodological doubt—doubting everything he previously knew so he could start from a blank slate—the first thing that he could not logically bring himself to doubt was his own existence: "I do not exist" would be a contradiction in terms. The act of saying that one does not exist assumes that someone must be making the statement in the first place. Descartes could doubt his senses, his body, and the world around him—but he could not deny his own existence, because he was able to doubt and must exist to manifest that doubt. Even if some "evil genius" were deceiving him, he would have to exist to be deceived. This one sure point provided him with what he called his Archimedean point, in order to further develop his foundation for knowledge. Simply put, Descartes's epistemological justification depended on his indubitable belief in his own existence and his clear and distinct knowledge of God.

Defining knowledge

A central issue in epistemology is the question of what the nature of knowledge is or how to define it. Sometimes the expressions "theory of knowledge" and "analysis of knowledge" are used specifically for this form of inquiry. The term "knowledge" has various meanings in natural language. It can refer to an awareness of facts, as in knowing that Mars is a planet, to a possession of skills, as in knowing how to swim, or to an experiential acquaintance, as in knowing Daniel Craig personally. Factual knowledge, also referred to as propositional knowledge or descriptive knowledge, plays a special role in epistemology. On the linguistic level, it is distinguished from the other forms of knowledge since it can be expressed through a that-clause, i.e. using a formulation like "They know that..." followed by the known proposition.

Some features of factual knowledge are widely accepted: it is a form of cognitive success that establishes epistemic contact with reality. However, there are still various disagreements about its exact nature even though it has been studied intensely. Different factors are responsible for these disagreements. Some theorists try to furnish a practically useful definition by describing its most noteworthy and easily identifiable features. Others engage in an analysis of knowledge, which aims to provide a theoretically precise definition that identifies the set of essential features characteristic for all instances of knowledge and only for them. Differences in the methodology may also cause disagreements. In this regard, some epistemologists use abstract and general intuitions in order to arrive at their definitions. A different approach is to start from concrete individual cases of knowledge to determine what all of them have in common. Yet another method is to focus on linguistic evidence by studying how the term "knowledge" is commonly used. Different standards of knowledge are further sources of disagreement. A few theorists set these standards very high by demanding that absolute certainty or infallibility is necessary. On such a view, knowledge is a very rare thing. Theorists more in tune with ordinary language usually demand lower standards and see knowledge as something commonly found in everyday life.

As justified true belief

The historically most influential definition, discussed since ancient Greek philosophy, characterizes knowledge in relation to three essential features: as (1) a belief that is (2) true and (3) justified. There is still wide acceptance that the first two features are correct, i.e. that knowledge is a mental state that affirms a true proposition. However, there is a lot of dispute about the third feature: justification. This feature is usually included to distinguish knowledge from true beliefs that rest on superstition, lucky guesses, or faulty reasoning. This expresses the idea that knowledge is not the same as being right about something. Traditionally, justification is understood as the possession of evidence: a belief is justified if the believer has good evidence supporting it. Such evidence could be a perceptual experience, a memory, or a second belief.
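The three-part structure of the traditional definition can be shown as a toy model (an illustration only, not a piece of philosophical machinery; the `Belief` record and its fields are invented for this sketch):

```python
from dataclasses import dataclass

@dataclass
class Belief:
    proposition: str
    held: bool        # (1) the agent actually believes the proposition
    true: bool        # (2) the proposition is in fact true
    justified: bool   # (3) the agent has good evidence for it

def is_knowledge_jtb(b: Belief) -> bool:
    """Knowledge on the traditional account: justified true belief."""
    return b.held and b.true and b.justified

# A lucky guess is a true belief, but not knowledge on this account.
lucky_guess = Belief("the die shows six", held=True, true=True, justified=False)
print(is_knowledge_jtb(lucky_guess))  # → False
```

The Gettier cases discussed below are precisely cases where all three conditions of this predicate are met and yet, intuitively, the agent does not know.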

Gettier problem and alternative definitions

An Euler diagram representing a version of the traditional definition of knowledge that is adapted to the Gettier problem. This problem gives us reason to think that not all justified true beliefs constitute knowledge.

The justified-true-belief account of knowledge came under severe criticism in the second half of the 20th century, when Edmund Gettier proposed various counterexamples. In a famous so-called Gettier-case, a person is driving on a country road. There are many barn façades along this road and only one real barn. But it is not possible to tell the difference between them from the road. The person then stops by a fortuitous coincidence in front of the only real barn and forms the belief that it is a barn. The idea behind this thought experiment is that this is not knowledge even though the belief is both justified and true. The reason is that it is just a lucky accident since the person cannot tell the difference: they would have formed exactly the same justified belief if they had stopped at another site, in which case the belief would have been false.

Various additional examples were proposed along similar lines. Most of them involve a justified true belief that apparently fails to amount to knowledge because the belief's justification is in some sense not relevant to its truth. These counterexamples have provoked very diverse responses. Some theorists think that one only needs to modify one's conception of justification to avoid them. But the more common approach is to search for an additional criterion. On this view, all cases of knowledge involve a justified true belief but some justified true beliefs do not amount to knowledge since they lack this additional feature. There are diverse suggestions for this fourth criterion. Some epistemologists require that no false belief is involved in the justification or that no defeater of the belief is present. A different approach is to require that the belief tracks truth, i.e. that the person would not have the belief if it was false. Some even require that the justification has to be infallible, i.e. that it necessitates the belief's truth.

A quite different approach is to affirm that the justified-true-belief account of knowledge is deeply flawed and to seek a complete reconceptualization of knowledge. These reconceptualizations often do not require justification at all. One such approach is to require that the true belief was produced by a reliable process. Naturalized epistemologists often hold that the believed fact has to cause the belief. Virtue theorists are also interested in how the belief is produced. For them, the belief must be a manifestation of a cognitive virtue.

The value problem

We generally assume that knowledge is more valuable than mere true belief. If so, what is the explanation? A formulation of the value problem in epistemology first occurs in Plato's Meno. Socrates points out to Meno that a man who knew the way to Larissa could lead others there correctly. But so, too, could a man who had true beliefs about how to get there, even if he had not gone there or had any knowledge of Larissa. Socrates says that it seems that both knowledge and true opinion can guide action. Meno then wonders why knowledge is valued more than true belief and why knowledge and true belief are different. Socrates responds that knowledge is more valuable than mere true belief because it is tethered or justified. Justification, or working out the reason for a true belief, locks down true belief.

The problem is to identify what (if anything) makes knowledge more valuable than mere true belief, or that makes knowledge more valuable than a mere minimal conjunction of its components, such as justification, safety, sensitivity, statistical likelihood, and anti-Gettier conditions, on a particular analysis of knowledge that conceives of knowledge as divided into components (to which knowledge-first epistemological theories, which posit knowledge as fundamental, are notable exceptions). The value problem re-emerged in the philosophical literature on epistemology in the twenty-first century following the rise of virtue epistemology in the 1980s, partly because of the obvious link to the concept of value in ethics.

Virtue epistemology

In contemporary philosophy, epistemologists including Ernest Sosa, John Greco, Jonathan Kvanvig, Linda Zagzebski, and Duncan Pritchard have defended virtue epistemology as a solution to the value problem. They argue that epistemology should also evaluate the "properties" of people as epistemic agents (i.e. intellectual virtues), rather than merely the properties of propositions and propositional mental attitudes.

The value problem has been presented as an argument against epistemic reliabilism by Linda Zagzebski, Wayne Riggs, and Richard Swinburne, among others. Zagzebski analogizes the value of knowledge to the value of espresso produced by an espresso maker: "The liquid in this cup is not improved by the fact that it comes from a reliable espresso maker. If the espresso tastes good, it makes no difference if it comes from an unreliable machine." For Zagzebski, the value of knowledge deflates to the value of mere true belief. She assumes that reliability in itself has no value or disvalue, but Goldman and Olsson disagree. They point out that Zagzebski's conclusion rests on the assumption of veritism: all that matters is the acquisition of true belief. To the contrary, they argue that a reliable process for acquiring a true belief adds value to the mere true belief by making it more likely that future beliefs of a similar kind will be true. By analogy, having a reliable espresso maker that produced a good cup of espresso would be more valuable than having an unreliable one that luckily produced a good cup because the reliable one would more likely produce good future cups compared to the unreliable one.

The value problem is important to assessing the adequacy of theories of knowledge that conceive of knowledge as consisting of true belief and other components. According to Kvanvig, an adequate account of knowledge should resist counterexamples and allow an explanation of the value of knowledge over mere true belief. Should a theory of knowledge fail to do so, it would prove inadequate.

One of the more influential responses to the problem is that knowledge is not particularly valuable and is not what ought to be the main focus of epistemology. Instead, epistemologists ought to focus on other mental states, such as understanding. Advocates of virtue epistemology have argued that the value of knowledge comes from an internal relationship between the knower and the mental state of believing.

Acquiring knowledge

Sources of knowledge

There are many proposed sources of knowledge and justified belief which we take to be actual sources of knowledge in our everyday lives. Some of the most commonly discussed include perception, reason, memory, and testimony.

Important distinctions

A priori–a posteriori distinction

As mentioned above, epistemologists draw a distinction between what can be known a priori (independently of experience) and what can only be known a posteriori (through experience). Much of what we call a priori knowledge is thought to be attained through reason alone, as featured prominently in rationalism. This might also include a non-rational faculty of intuition, as defended by proponents of innatism. In contrast, a posteriori knowledge is derived entirely through experience or as a result of experience, as emphasized in empiricism. This also includes cases where knowledge can be traced back to an earlier experience, as in memory or testimony.

A way to look at the difference between the two is through an example. Bruce Russell gives two propositions and asks the reader which one he finds more believable. Option A: All crows are birds. Option B: All crows are black. If you believe option A, then you are a priori justified in believing it, because you don't have to see a crow to know it's a bird. If you believe option B, then you are a posteriori justified in believing it, because you have seen many crows and they have all been black. He goes on to say that what matters is not whether either statement is true, but how one's belief in it is justified.

The idea of a priori knowledge is that it is based on intuition or rational insight. Laurence BonJour says in "The Structure of Empirical Knowledge" that a "rational insight is an immediate, non-inferential grasp, apprehension or 'seeing' that some proposition is necessarily true." Going back to the crow example, by BonJour's definition the reason you would believe option A is that you have an immediate grasp that a crow is a bird, without ever experiencing one.

Evolutionary psychology takes a novel approach to the problem. It says that there is an innate predisposition for certain types of learning. "Only small parts of the brain resemble a tabula rasa; this is true even for human beings. The remainder is more like an exposed negative waiting to be dipped into a developer fluid".

Analytic–synthetic distinction

The analytic–synthetic distinction was first proposed by Immanuel Kant.
 

Immanuel Kant, in his Critique of Pure Reason, drew a distinction between "analytic" and "synthetic" propositions. He contended that some propositions are such that we can know they are true just by understanding their meaning. For example, consider, "My father's brother is my uncle." We can know it is true solely by virtue of understanding what its terms mean. Philosophers call such propositions "analytic". Synthetic propositions, on the other hand, have distinct subjects and predicates. An example would be, "My father's brother has black hair." Kant held that mathematical and scientific statements are synthetic a priori propositions: they are necessarily true, yet our knowledge of the attributes of their mathematical or physical subjects cannot be obtained from the meanings of the terms alone, but only by inference.

While this distinction is first and foremost about meaning and is therefore most relevant to the philosophy of language, the distinction has significant epistemological consequences, seen most prominently in the works of the logical positivists. In particular, if the set of propositions which can only be known a posteriori is coextensive with the set of propositions which are synthetically true, and if the set of propositions which can be known a priori is coextensive with the set of propositions which are analytically true (or in other words, which are true by definition), then there can only be two kinds of successful inquiry: Logico-mathematical inquiry, which investigates what is true by definition, and empirical inquiry, which investigates what is true in the world. Most notably, this would exclude the possibility that branches of philosophy like metaphysics could ever provide informative accounts of what actually exists.

The American philosopher Willard Van Orman Quine, in his paper "Two Dogmas of Empiricism", famously challenged the analytic–synthetic distinction, arguing that the boundary between the two is too blurry to provide a clear division between propositions that are true by definition and propositions that are not. While some contemporary philosophers take themselves to have offered more sustainable accounts of the distinction that are not vulnerable to Quine's objections, there is no consensus about whether or not these succeed.

Science as knowledge acquisition

Science is often considered to be a refined, formalized, systematic, institutionalized form of the pursuit and acquisition of empirical knowledge. As such, the philosophy of science may be viewed as an application of the principles of epistemology and as a foundation for epistemological inquiry.

The regress problem

The regress problem (also known as Agrippa's trilemma) is the problem of providing a complete logical foundation for human knowledge. The traditional way of supporting a rational argument is to appeal to other rational arguments, typically using chains of reason and rules of logic. A classic example that goes back to Aristotle is deducing that Socrates is mortal. We have a logical rule that says All humans are mortal and an assertion that Socrates is human, and we deduce that Socrates is mortal. But in this example, how do we know that Socrates is human? Presumably we apply other rules, such as: All born from human females are human. This in turn leaves open the question of how we know that all born from human females are human. This is the regress problem: how can we eventually terminate a logical argument with some statements that do not require further justification but can still be considered rational and justified? As John Pollock stated:

... to justify a belief one must appeal to a further justified belief. This means that one of two things can be the case. Either there are some beliefs that we can be justified for holding, without being able to justify them on the basis of any other belief, or else for each justified belief there is an infinite regress of (potential) justification [the nebula theory]. On this theory there is no rock bottom of justification. Justification just meanders in and out through our network of beliefs, stopping nowhere.

The apparent impossibility of completing an infinite chain of reasoning is thought by some to support skepticism. It is also the impetus for Descartes's famous dictum: I think, therefore I am. Descartes was looking for some logical statement that could be true without appeal to other statements.
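The structure of the regress can be made concrete with a toy sketch (my own illustration, not drawn from the literature): a naive backward-chaining justifier in which every claim is supported by a rule whose premises themselves demand support, so the search either bottoms out in unsupported statements or never terminates.

```python
# Toy illustration of the regress problem: every conclusion rests on a
# rule whose premises need their own justification. The rule base and
# claim names are hypothetical examples, not a real inference system.

rules = {
    "Socrates is mortal": ["Socrates is human", "all humans are mortal"],
    "Socrates is human":  ["Socrates was born of a human female",
                           "all born of human females are human"],
    # ...each remaining premise would in turn need its own supporting rule
}

def justify(claim, depth=0, max_depth=10):
    """Trace a claim's support; return the premises left unsupported."""
    premises = rules.get(claim)
    if premises is None:
        # No further rule: either a "foundational" belief or simply unjustified.
        return [claim]
    if depth >= max_depth:
        raise RecursionError("regress: justification never terminates")
    support = []
    for p in premises:
        support += justify(p, depth + 1, max_depth)
    return support

# The chain terminates only in statements with no justification of their own.
ungrounded = justify("Socrates is mortal")
```

Foundationalism, coherentism, and infinitism (discussed below) can be read as different answers to what should happen when `justify` runs out of rules.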

Responses to the regress problem

Many epistemologists studying justification have attempted to argue for various types of chains of reasoning that can escape the regress problem.

Foundationalism

Foundationalists respond to the regress problem by asserting that certain "foundations" or "basic beliefs" support other beliefs but do not themselves require justification from other beliefs. These beliefs might be justified because they are self-evident, infallible, or derive from reliable cognitive mechanisms. Perception, memory, and a priori intuition are often considered possible examples of basic beliefs.

The chief criticism of foundationalism is that if a belief is not supported by other beliefs, accepting it may be arbitrary or unjustified.

Coherentism

Another response to the regress problem is coherentism, which is the rejection of the assumption that the regress proceeds according to a pattern of linear justification. To avoid the charge of circularity, coherentists hold that an individual belief is justified circularly by the way it fits together (coheres) with the rest of the belief system of which it is a part. This theory has the advantage of avoiding the infinite regress without claiming special, possibly arbitrary status for some particular class of beliefs. Yet, since a system can be coherent while also being wrong, coherentists face the difficulty of ensuring that the whole system corresponds to reality. Additionally, most logicians agree that any argument that is circular is, at best, only trivially valid. That is, to be illuminating, arguments must operate with information from multiple premises, not simply conclude by reiterating a premise.

Nigel Warburton writes in Thinking from A to Z that "[c]ircular arguments are not invalid; in other words, from a logical point of view there is nothing intrinsically wrong with them. However, they are, when viciously circular, spectacularly uninformative."

Infinitism

An alternative resolution to the regress problem is known as "infinitism". Infinitists take the infinite series to be merely potential, in the sense that an individual may have indefinitely many reasons available to them, without having consciously thought through all of these reasons when the need arises. This position is motivated in part by the desire to avoid what is seen as the arbitrariness and circularity of its chief competitors, foundationalism and coherentism. The most prominent defense of infinitism has been given by Peter Klein.

Foundherentism

An intermediate position, known as "foundherentism", is advanced by Susan Haack. Foundherentism is meant to unify foundationalism and coherentism. Haack explains the view by using a crossword puzzle as an analogy. Whereas, for example, infinitists regard the regress of reasons as taking the form of a single line that continues indefinitely, Haack has argued that chains of properly justified beliefs look more like a crossword puzzle, with various different lines mutually supporting each other. Thus, Haack's view leaves room for both chains of beliefs that are "vertical" (terminating in foundational beliefs) and chains that are "horizontal" (deriving their justification from coherence with beliefs that are also members of foundationalist chains of belief).

Schools of thought in epistemology

Empiricism

David Hume, one of the staunchest defenders of empiricism
 

Empiricism is a view in the theory of knowledge which focuses on the role of experience, especially experience based on perceptual observations by the senses, in the generation of knowledge. Certain forms exempt disciplines such as mathematics and logic from these requirements.

There are many variants of empiricism, including British empiricism, logical empiricism, phenomenalism, and some versions of common sense philosophy. Most forms of empiricism give epistemologically privileged status to sensory impressions or sense data, although this plays out very differently in different cases. Some of the most famous historical empiricists include John Locke, David Hume, George Berkeley, Francis Bacon, John Stuart Mill, Rudolf Carnap, and Bertrand Russell.

Rationalism

Rationalism is the epistemological view that reason is the chief source of knowledge and the main determinant of what constitutes knowledge. More broadly, it can also refer to any view which appeals to reason as a source of knowledge or justification. Rationalism is one of the two classical views in epistemology, the other being empiricism. Rationalists claim that the mind, through the use of reason, can directly grasp certain truths in various domains, including logic, mathematics, ethics, and metaphysics. Rationalist views can range from modest views in mathematics and logic (such as that of Gottlob Frege) to ambitious metaphysical systems (such as that of Baruch Spinoza).

Some of the most famous rationalists include Plato, René Descartes, Baruch Spinoza, and Gottfried Leibniz.

Skepticism

Skepticism is a position that questions the possibility of human knowledge, either in particular domains or on a general level. Skepticism does not refer to any one specific school of philosophy, but is rather a thread that runs through many epistemological debates. Ancient Greek skepticism began during the Hellenistic period in philosophy, which featured both Pyrrhonism (notably defended by Pyrrho, Sextus Empiricus, and Aenesidemus) and Academic skepticism (notably defended by Arcesilaus and Carneades). Among ancient Indian philosophers, skepticism was notably defended by the Ajñana school and in the Buddhist Madhyamika tradition. In modern philosophy, René Descartes' famous inquiry into mind and body began as an exercise in skepticism, in which he started by trying to doubt all purported cases of knowledge in order to search for something that was known with absolute certainty.

Epistemic skepticism questions whether knowledge is possible at all. Generally speaking, skeptics argue that knowledge requires certainty, and that most or all of our beliefs are fallible (meaning that our grounds for holding them always, or almost always, fall short of certainty), which would together entail that knowledge is always or almost always impossible for us. Whether a knowledge claim counts as strong or weak depends on a person's viewpoint and on how they characterize knowledge. Much of modern epistemology is derived from attempts to better understand and address philosophical skepticism.

Pyrrhonism

One of the oldest forms of epistemic skepticism can be found in Agrippa's trilemma (named after the Pyrrhonist philosopher Agrippa the Skeptic), which demonstrates that certainty cannot be achieved with regard to beliefs. Pyrrhonism dates back to Pyrrho of Elis from the 4th century BCE, although most of what we know about Pyrrhonism today is from the surviving works of Sextus Empiricus. Pyrrhonists claim that for any argument for a non-evident proposition, an equally convincing argument for a contradictory proposition can be produced. Pyrrhonists do not dogmatically deny the possibility of knowledge, but instead point out that beliefs about non-evident matters cannot be substantiated.

Cartesian skepticism

The Cartesian evil demon problem, first raised by René Descartes, supposes that our sensory impressions may be controlled by some external power rather than the result of ordinary veridical perception. In such a scenario, nothing we sense would actually exist, but would instead be mere illusion. As a result, we would never be able to know anything about the world, since we would be systematically deceived about everything. The conclusion often drawn from evil demon skepticism is that even if we are not completely deceived, all of the information provided by our senses is still compatible with skeptical scenarios in which we are completely deceived, and that we must therefore either be able to exclude the possibility of deception or else must deny the possibility of infallible knowledge (that is, knowledge which is completely certain) beyond our immediate sensory impressions. While the view that no beliefs are beyond doubt other than our immediate sensory impressions is often ascribed to Descartes, he in fact thought that we can exclude the possibility that we are systematically deceived, although his reasons for thinking this are based on a highly contentious ontological argument for the existence of a benevolent God who would not allow such deception to occur.

Responses to philosophical skepticism

Epistemological skepticism can be classified as either "mitigated" or "unmitigated" skepticism. Mitigated skepticism rejects "strong" or "strict" knowledge claims but approves weaker ones, which can be considered "virtual knowledge", though only with regard to justified beliefs. Unmitigated skepticism rejects claims of both virtual and strong knowledge. Whether knowledge is characterized as strong, weak, virtual, or genuine depends on a person's viewpoint as well as on how they characterize knowledge. Some of the most notable attempts to respond to unmitigated skepticism include direct realism, disjunctivism, common sense philosophy, pragmatism, fideism, and fictionalism.

Pragmatism

Pragmatism is a fallibilist epistemology that emphasizes the role of action in knowing. Different interpretations of pragmatism variously emphasize: truth as the final outcome of ideal scientific inquiry and experimentation, truth as closely related to usefulness, experience as transacting with (instead of representing) nature, and human practices as the foundation of language. Pragmatism's origins are often attributed to Charles Sanders Peirce, William James, and John Dewey. In 1878, Peirce formulated the maxim: "Consider what effects, that might conceivably have practical bearings, we conceive the object of our conception to have. Then, our conception of these effects is the whole of our conception of the object."

William James suggested that through a pragmatist epistemology, theories "become instruments, not answers to enigmas in which we can rest". In James's pragmatic method, which he adapted from Peirce, metaphysical disputes can be settled by tracing the practical consequences of the different sides of the argument. If this process does not resolve the dispute, then "the dispute is idle".

Contemporary versions of pragmatism have been developed by thinkers such as Richard Rorty and Hilary Putnam. Rorty proposed that values were historically contingent and dependent upon their utility within a given historical period. Contemporary philosophers working in pragmatism are called neopragmatists, and also include Nicholas Rescher, Robert Brandom, Susan Haack, and Cornel West.

Naturalized epistemology

In certain respects an intellectual descendant of pragmatism, naturalized epistemology considers the evolutionary role of knowledge for agents living and evolving in the world. It de-emphasizes the questions around justification and truth, and instead asks, empirically, how reliable beliefs are formed and the role that evolution played in the development of such processes. It suggests a more empirical approach to the subject as a whole, leaving behind philosophical definitions and consistency arguments, and instead using psychological methods to study and understand how "knowledge" is actually formed and is used in the natural world. As such, it does not attempt to answer the analytic questions of traditional epistemology, but rather replace them with new empirical ones.

Naturalized epistemology was first proposed in "Epistemology Naturalized", a seminal paper by W.V.O. Quine. A less radical view has been defended by Hilary Kornblith in Knowledge and its Place in Nature, in which he seeks to turn epistemology towards empirical investigation without completely abandoning traditional epistemic concepts.

Epistemic relativism

Epistemic relativism is the view that what is true, rational, or justified for one person need not be true, rational, or justified for another person. Epistemic relativists therefore assert that while there are relative facts about truth, rationality, justification, and so on, there is no perspective-independent fact of the matter. Note that this is distinct from epistemic contextualism, which holds that the meaning of epistemic terms varies across contexts (e.g. "I know" might mean something different in everyday contexts and skeptical contexts). In contrast, epistemic relativism holds that the relevant facts vary, not just linguistic meaning. Relativism about truth may also be a form of ontological relativism, insofar as relativists about truth hold that facts about what exists vary based on perspective.

Epistemic constructivism

Constructivism is a view in philosophy according to which all "knowledge is a compilation of human-made constructions", "not the neutral discovery of an objective truth". Whereas objectivism is concerned with the "object of our knowledge", constructivism emphasizes "how we construct knowledge". Constructivism proposes new definitions for knowledge and truth, which emphasize intersubjectivity rather than objectivity, and viability rather than truth. The constructivist point of view is in many ways comparable to certain forms of pragmatism.

Epistemic idealism

Idealism is a broad term referring to both an ontological view about the world being in some sense mind-dependent and a corresponding epistemological view that everything people know can be reduced to mental phenomena. First and foremost, "idealism" is a metaphysical doctrine. As an epistemological doctrine, idealism shares a great deal with both empiricism and rationalism. Some of the most famous empiricists have been classified as idealists (particularly Berkeley), and yet the subjectivism inherent to idealism also resembles that of Descartes in many respects. Many idealists believe that knowledge is primarily (at least in some areas) acquired by a priori processes, or that it is innate—for example, in the form of concepts not derived from experience. The relevant theoretical concepts may purportedly be part of the structure of the human mind (as in Kant's theory of transcendental idealism), or they may be said to exist independently of the mind (as in Plato's theory of Forms).

Some of the most famous forms of idealism include transcendental idealism (developed by Immanuel Kant), subjective idealism (developed by George Berkeley), and absolute idealism (developed by Georg Wilhelm Friedrich Hegel and Friedrich Schelling).

Bayesian epistemology

Bayesian epistemology is a formal approach to various topics in epistemology that has its roots in Thomas Bayes' work in the field of probability theory. One advantage of its formal method in contrast to traditional epistemology is that its concepts and theorems can be defined with a high degree of precision. It is based on the idea that beliefs can be interpreted as subjective probabilities. As such, they are subject to the laws of probability theory, which act as the norms of rationality. These norms can be divided into static constraints, governing the rationality of beliefs at any moment, and dynamic constraints, governing how rational agents should change their beliefs upon receiving new evidence. The most characteristic Bayesian expression of these principles is found in the form of Dutch books, which illustrate irrationality in agents through a series of bets that lead to a loss for the agent no matter which of the probabilistic events occurs. Bayesians have applied these fundamental principles to various epistemological topics but Bayesianism does not cover all topics of traditional epistemology.
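These two kinds of norms can be illustrated with a small sketch (the numbers and function names below are my own, chosen for illustration): Bayesian conditionalization as a dynamic constraint, and a simple Dutch book showing how credences that violate the static constraint guarantee a sure loss.

```python
# Illustrative sketch of two Bayesian norms; values are hypothetical.

def bayes_update(prior, likelihood, likelihood_given_not):
    """Dynamic norm: return P(H | E) from P(H), P(E | H), and P(E | ~H)."""
    evidence = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / evidence

# A rational agent revises belief in H after observing evidence E.
posterior = bayes_update(prior=0.5, likelihood=0.9, likelihood_given_not=0.2)

# Static norm: credences must obey the probability axioms. An agent whose
# credences in A and not-A sum to more than 1 accepts both bets below at
# prices that guarantee a net loss however events turn out -- a Dutch book.
cred_A, cred_not_A = 0.7, 0.5        # incoherent: 0.7 + 0.5 > 1
stake = 1.0
cost = (cred_A + cred_not_A) * stake  # price the agent pays for both bets
payout = stake                        # exactly one of A, not-A pays off
sure_loss = cost - payout             # positive no matter which occurs
```

The Dutch book argument is that only credences satisfying the probability axioms avoid such guaranteed losses, which is offered as a justification for treating those axioms as norms of rationality.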

Feminist epistemology

Feminist epistemology is a subfield of epistemology which applies feminist theory to epistemological questions. It began to emerge as a distinct subfield in the 20th century. Prominent feminist epistemologists include Miranda Fricker (who developed the concept of epistemic injustice), Donna Haraway (who first proposed the concept of situated knowledge), Sandra Harding, and Elizabeth Anderson. Harding proposes that feminist epistemology can be broken into three distinct categories: Feminist empiricism, standpoint epistemology, and postmodern epistemology.

Feminist epistemology has also played a significant role in the development of many debates in social epistemology.

Decolonial epistemology

Epistemicide is a term used in decolonisation studies that describes the killing of knowledge systems under systemic oppression such as colonisation and slavery. The term was coined by Boaventura de Sousa Santos, who argued that such destruction, bound up with physical violence, contributed to the centering of Western knowledge in the current world. The term challenges what is counted as knowledge in academia today.

Indian pramana

Indian schools of philosophy, such as the Hindu Nyaya and Carvaka schools, and the Jain and Buddhist philosophical schools, developed an epistemological tradition called "pramana" independently of the Western philosophical tradition. Pramana can be translated as "instrument of knowledge" and refers to various means or sources of knowledge that Indian philosophers held to be reliable. Each school of Indian philosophy had its own theories about which pramanas were valid means to knowledge and which were unreliable (and why). A Vedic text, Taittirīya Āraṇyaka (c. 9th–6th centuries BCE), lists "four means of attaining correct knowledge": smṛti ("tradition" or "scripture"), pratyakṣa ("perception"), aitihya ("communication by one who is expert", or "tradition"), and anumāna ("reasoning" or "inference").

In the Indian traditions, the most widely discussed pramanas are: Pratyakṣa (perception), Anumāṇa (inference), Upamāṇa (comparison and analogy), Arthāpatti (postulation, derivation from circumstances), Anupalabdi (non-perception, negative/cognitive proof) and Śabda (word, testimony of past or present reliable experts). While the Nyaya school (beginning with the Nyāya Sūtras of Gotama, between the 6th century BCE and the 2nd century CE) was a proponent of realism and supported four pramanas (perception, inference, comparison/analogy and testimony), the Buddhist epistemologists (Dignaga and Dharmakirti) generally accepted only perception and inference. The Carvaka school of materialists accepted only the pramana of perception, and hence were among the first empiricists in the Indian traditions. Another school, the Ajñana, included notable proponents of philosophical skepticism.

The theory of knowledge of the Buddha in the early Buddhist texts has been interpreted as a form of pragmatism as well as a form of correspondence theory. Likewise, the Buddhist philosopher Dharmakirti has been interpreted both as holding a form of pragmatism or correspondence theory for his view that what is true is what has effective power (arthakriya). The Buddhist Madhyamika school's theory of emptiness (shunyata) meanwhile has been interpreted as a form of philosophical skepticism.

The main contribution to epistemology by the Jains has been their theory of "many-sidedness" or "multi-perspectivism" (Anekantavada), which says that since the world is multifaceted, any single viewpoint is limited (naya – a partial standpoint). This has been interpreted as a kind of pluralism or perspectivism. According to Jain epistemology, none of the pramanas gives absolute or perfect knowledge, since they are each limited points of view.

Domains of inquiry in epistemology

Formal epistemology

Formal epistemology uses formal tools and methods from decision theory, logic, probability theory and computability theory to model and reason about issues of epistemological interest. Work in this area spans several academic fields, including philosophy, computer science, economics, and statistics. The focus of formal epistemology has tended to differ somewhat from that of traditional epistemology, with topics like uncertainty, induction, and belief revision garnering more attention than the analysis of knowledge, skepticism, and issues with justification.

Historical epistemology

Historical epistemology is the study of the historical conditions of, and changes in, different kinds of knowledge. There are many versions of or approaches to historical epistemology, which is different from history of epistemology. Twentieth-century French historical epistemologists like Abel Rey, Gaston Bachelard, Jean Cavaillès, and Georges Canguilhem focused specifically on changes in scientific discourse.

Metaepistemology

Metaepistemology is the metaphilosophical study of the methods, aims, and subject matter of epistemology. In general, metaepistemology aims to better understand our first-order epistemological inquiry. Some goals of metaepistemology are identifying inaccurate assumptions made in epistemological debates and determining whether the questions asked in mainline epistemology are the right epistemological questions to be asking.

Social epistemology

Social epistemology deals with questions about knowledge in contexts where our knowledge attributions cannot be explained by simply examining individuals in isolation from one another, meaning that the scope of our knowledge attributions must be widened to include broader social contexts. It also explores the ways in which interpersonal beliefs can be justified in social contexts. The most common topics discussed in contemporary social epistemology are testimony, which deals with the conditions under which a belief "x is true" which resulted from being told "x is true" constitutes knowledge; peer disagreement, which deals with when and how one should revise one's beliefs in light of other people holding beliefs that contradict one's own; and group epistemology, which deals with what it means to attribute knowledge to groups rather than individuals, and when group knowledge attributions are appropriate.

Communication theory

From Wikipedia, the free encyclopedia

Communication theory is a proposed description of communication phenomena, the relationships among them, a storyline describing these relationships, and an argument for these three elements. Communication theory provides a way of talking about and analyzing key events, processes, and commitments that together form communication. Theory can be seen as a way to map the world and make it navigable; communication theory gives us tools to answer empirical, conceptual, or practical communication questions. Communication is defined in both commonsense and specialized ways. Communication theory emphasizes its symbolic and social process aspects as seen from two perspectives—as exchange of information (the transmission perspective), and as work done to connect and thus enable that exchange (the ritual perspective).

Sociolinguistic research in the 1950s and 1960s demonstrated that the degree to which people change the formality of their language depends on the social context they are in. This was explained in terms of social norms that dictate language use. The way we use language differs from person to person.

Communication theories have emerged from multiple historical points of origin, including classical traditions of oratory and rhetoric, Enlightenment-era conceptions of society and the mind, and post-World War II efforts to understand propaganda and relationships between media and society. Prominent historical and modern foundational communication theorists include Kurt Lewin, Harold Lasswell, Paul Lazarsfeld, Carl Hovland, James Carey, Elihu Katz, Kenneth Burke, John Dewey, Jurgen Habermas, Marshall McLuhan, Theodor Adorno, Antonio Gramsci, Robert E. Park, George Herbert Mead, Joseph Walther, Claude Shannon and Stuart Hall—although some of these theorists may not explicitly associate themselves with communication as a discipline or field of study.

Models and Elements of Communication Theory

One key activity in communication theory is the development of models and concepts used to describe communication. In the Linear Model, communication works in one direction: a sender encodes some message and sends it through a channel for a receiver to decode. In comparison, the Interactional Model of communication is bidirectional. People send and receive messages in a cooperative fashion as they continuously encode and decode information. The Transactional Model assumes that information is sent and received simultaneously through a noisy channel, and further considers a frame of reference or experience each person brings to the interaction.

Some of the basic elements of communication studied in communication theory are:

  • Source: Shannon calls this element the "information source", which "produces a message or sequence of messages to be communicated to the receiving terminal."
  • Sender: Shannon calls this element the "transmitter", which "operates on the message in some way to produce a signal suitable for transmission over the channel." In Aristotle, this element is the "speaker" (orator).
  • Channel: For Shannon, the channel is "merely the medium used to transmit the signal from transmitter to receiver."
  • Receiver: For Shannon, the receiver "performs the inverse operation of that done by the transmitter, reconstructing the message from the signal."
  • Destination: For Shannon, the destination is "the person (or thing) for whom the message is intended".
  • Message: from Latin mittere, "to send". The message is a concept, information, communication, or statement that is sent in a verbal, written, recorded, or visual form to the recipient.
  • Feedback
  • Entropic elements, positive and negative
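The elements above can be sketched as a pipeline (an illustrative toy with my own function names, not Shannon's notation; a binary channel is assumed): the transmitter encodes a message into a signal, the channel carries it with optional noise, and the receiver performs the inverse operation to reconstruct the message for the destination.

```python
import random

# Toy pipeline modeling Shannon's elements: source -> transmitter ->
# channel -> receiver -> destination. Function names are illustrative.

def transmitter(message):
    """Encode text into a signal: a list of bits."""
    return [int(b) for ch in message.encode() for b in format(ch, "08b")]

def channel(signal, noise=0.0, rng=None):
    """Carry the signal, flipping each bit with probability `noise`."""
    rng = rng or random.Random(0)
    return [b ^ (rng.random() < noise) for b in signal]

def receiver(signal):
    """Inverse of the transmitter: reconstruct text from bits."""
    chars = [int("".join(map(str, signal[i:i + 8])), 2)
             for i in range(0, len(signal), 8)]
    return bytes(chars).decode(errors="replace")

sent = "hello"                                            # information source
received = receiver(channel(transmitter(sent), noise=0.0))
# Over a noiseless channel the destination recovers the message exactly;
# raising `noise` corrupts bits, which is where feedback and error
# correction enter the picture.
```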

Epistemology in Communication Theory

Communication theories vary substantially in their epistemology, and articulating this philosophical commitment is part of the theorizing process. Although the various epistemic positions used in communication theories can vary, one categorization scheme distinguishes among interpretive empirical, metric empirical or post-positivist, rhetorical, and critical epistemologies. Communication theories may also fall within or vary by distinct domains of interest, including information theory, rhetoric and speech, interpersonal communication, organizational communication, sociocultural communication, political communication, computer-mediated communication, and critical perspectives on media and communication.

Interpretive Empirical Epistemology

Interpretive empirical epistemology or interpretivism seeks to develop subjective insight and understanding of communication phenomena through the grounded study of local interactions. When developing or applying an interpretivist theory, the researcher themself is a vital instrument. Theories characteristic of this epistemology include structuration and symbolic interactionism, and frequently associated methods include discourse analysis and ethnography.

Metric Empirical or Post-Positivist Epistemology

A metric empirical or post-positivist epistemology takes an axiomatic and sometimes causal view of phenomena, developing evidence about association or making predictions, and using methods oriented to measurement of communication phenomena. Post-positivist theories are generally evaluated by their accuracy, consistency, fruitfulness, and parsimoniousness. Theories characteristic of a post-positivist epistemology may originate from a wide range of perspectives, including pragmatist, behaviorist, cognitivist, structuralist, or functionalist. Although post-positivist work may be qualitative or quantitative, statistical analysis is a common form of evidence and scholars taking this approach often seek to develop results that can be reproduced by others.

Rhetorical Epistemology

A rhetorical epistemology lays out a formal, logical, and global view of phenomena with particular concern for persuasion through speech. A rhetorical epistemology often draws from Greco-Roman foundations such as the works of Aristotle and Cicero, although recent work also draws from Michel Foucault, Kenneth Burke, Marxism, second-wave feminism, and cultural studies. Rhetoric has changed over time; the fields of rhetoric and composition have grown more interested in alternative types of rhetoric.

Critical Epistemology

A critical epistemology is explicitly political and intentional with respect to its standpoint, articulating an ideology and criticizing phenomena with respect to this ideology. A critical epistemology is driven by its values and oriented to social and political change. Communication theories associated with this epistemology include deconstructionism, cultural Marxism, third-wave feminism, and resistance studies.

New Modes of Communication

During the mid-1970s, the presiding paradigm in the study of communication and development shifted, most notably through the rise of a participatory approach that challenged diffusionist studies, which had dominated since the 1950s. Critics of the older paradigm argued that there is no valid reason to study people as an aggregation of individuals whose social experience is unified and cancelled out, represented only by the attributes of socio-economic status, age, and sex, except by assuming that the audience is a mass.

Communication Theory by Perspective/Subdiscipline

Approaches to theory also vary by perspective or subdiscipline. The communication theory as a field model proposed by Robert Craig has been an influential approach to breaking down the field of communication theory into perspectives, each with its own strengths, weaknesses, and trade-offs.

Information Theory

In information theory, communication theories examine the technical process of information exchange, typically using mathematics. This perspective on communication theory originated from the development of information theory in the early 1920s. Limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability. The history of information theory as a form of communication theory can be traced through a series of key papers during this time. Harry Nyquist's 1924 paper, Certain Factors Affecting Telegraph Speed, contains a theoretical section quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system. Ralph Hartley's 1928 paper, Transmission of Information, uses the word "information" as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other. The natural unit of information was therefore the decimal digit, much later renamed the hartley in his honour as a unit of information. Alan Turing in 1940 used similar ideas as part of the statistical analysis of the breaking of the German Second World War Enigma ciphers. The main landmark event that opened the way to the development of the information theory form of communication theory was the publication of an article by Claude Shannon (1916–2001) in the Bell System Technical Journal in July and October 1948 under the title "A Mathematical Theory of Communication". Shannon focused on the problem of how best to encode the information that a sender wants to transmit. He also used tools from probability theory developed by Norbert Wiener.

These papers marked the nascent stages of applied communication theory. Shannon developed information entropy as a measure of the uncertainty in a message while essentially inventing the field of information theory: "The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point." In 1949, in a declassified version of his wartime work on the mathematical theory of cryptography ("Communication Theory of Secrecy Systems"), he proved that all theoretically unbreakable ciphers must have the same requirements as the one-time pad. He is also credited with the introduction of sampling theory, which is concerned with representing a continuous-time signal from a (uniform) discrete set of samples. This theory was essential in enabling telecommunications to move from analog to digital transmission systems in the 1960s and later. In 1951, Shannon made his fundamental contribution to natural language processing and computational linguistics with his article "Prediction and Entropy of Printed English", providing a clear quantifiable link between cultural practice and probabilistic cognition.
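Shannon's entropy measure described above can be illustrated with a short sketch. It computes H = −Σ p·log₂(p) over the empirical symbol frequencies of a message; the function name and sample messages are illustrative, not from the source.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy of a message, in bits per symbol:
    H = -sum(p * log2(p)) over the empirical symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Four equally likely symbols carry log2(4) = 2 bits per symbol.
print(shannon_entropy("abcdabcd"))  # 2.0

# A message using only one symbol carries no information.
print(shannon_entropy("aaaa"))  # 0.0

# Hartley's base-10 unit: divide bits by log2(10) to get hartleys.
print(shannon_entropy("abcdabcd") / math.log2(10))
```

The skewed frequencies of real text are what make printed English compressible, the observation behind "Prediction and Entropy of Printed English": its per-letter entropy is far below the log₂(26) ≈ 4.7 bits a uniform alphabet would allow.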

Interpersonal Communication

Theories in interpersonal communication are concerned with the ways in which very small groups of people communicate with one another. It also provides the framework in which we view the world around us. Although interpersonal communication theories have their origin in mass communication studies of attitude and response to messages, since the 1970s, interpersonal communication theories have taken on a distinctly personal focus. Interpersonal theories examine relationships and their development, non-verbal communication, how we adapt to one another during conversation, how we develop the messages we seek to convey, and how deception works.

Organizational Communication

Organizational communication theories address not only the ways in which people use communication in organizations, but also how they use communication to constitute that organization, developing structures, relationships, and practices to achieve their goals. Although early organization communication theories were characterized by a so-called container model (the idea that an organization is a clearly bounded object inside which communication happens in a straightforward manner following hierarchical lines), more recent theories have viewed the organization as a more fluid entity with fuzzy boundaries. Studies within the field of organizational communication mention communication as a facilitating act and a precursor to organizational activity as cooperative systems.

Given that its object of study is the organization, it is perhaps not surprising that organization communication scholarship has important connections to theories of management, with Management Communication Quarterly serving as a key venue for disseminating scholarly work. However, theories in organizational communication retain a distinct identity through their critical perspective toward power and attention to the needs and interests of workers, rather than privileging the will of management.

Organizational communication can be distinguished by its orientation to four key problematics: voice (who can speak within an organization), rationality (how decisions are made and whose ends are served), organization (how is the organization itself structured and how does it function), and the organization-society relationship (how the organization may alternately serve, exploit, and reflect society as a whole).

Sociocultural Communication

This line of theory examines how social order is both produced and reproduced through communication. Communication problems in the sociocultural tradition may be theorized in terms of misalignment, conflict, or coordination failure. Theories in this domain explore dynamics such as micro and macro level phenomena, structure versus agency, the local versus the global, and communication problems which emerge due to gaps of space and time, sharing some kinship with sociological and anthropological perspectives but distinguished by keen attention to communication as constructed and constitutive.

Political Communication

Political communication theories are concerned with the public exchange of messages among political actors of all kinds. This scope is in contrast to theories of political science which look inside political institutions to understand decision-making processes. Early political communication theories examined the roles of mass communication (i.e. television and newspapers) and political parties on political discourse. However, as the conduct of political discourse has expanded, theories of political communication have likewise developed, to now include models of deliberation and sensemaking, and discourses about a wide range of political topics: the role of the media (e.g. as a gatekeeper, framer, and agenda-setter); forms of government (e.g. democracy, populism, and autocracy); social change (e.g. activism and protests); economic order (e.g. capitalism, neoliberalism and socialism); human values (e.g. rights, norms, freedom, and authority.); and propaganda, disinformation, and trust. Two of the important emerging areas for theorizing about political communication are the examination of civic engagement and international comparative work (given that much of political communication has been done in the United States).

Computer-Mediated Communication

Theories of computer-mediated communication or CMC emerged as a direct response to the rapid emergence of novel mediating communication technologies in the form of computers. CMC scholars inquire as to what may be lost and what may be gained when we shift many of our formerly unmediated and entrained practices (that is, activities that were necessarily conducted in a synchronized, ordered, dependent fashion) into mediated and disentrained modes. For example, a discussion that once required a meeting can now be an e-mail thread, an appointment confirmation that once involved a live phone call can now be a click on a text message, a collaborative writing project that once required an elaborate plan for drafting, circulating, and annotating can now take place in a shared document.

CMC theories fall into three categories: cues-filtered-out theories, experiential/perceptual theories, and adaptation to/exploitation of media. Cues-filtered-out theories have often treated face-to-face interaction as the gold standard against which mediated communication should be compared, and include such theories as social presence theory, media richness theory, and the Social Identity model of Deindividuation Effects (SIDE). Experiential/perceptual theories are concerned with how individuals perceive the capacity of technologies, such as whether the technology creates psychological closeness (electronic propinquity theory). Adaptation/exploitation theories consider how people may creatively expand or make use of the limitations in CMC systems, including social information processing theory (SIP) and the idea of the hyperpersonal (when people make use of the limitations of the mediated channel to create a selective view of themselves with their communication partner, developing an impression that exceeds reality). Theoretical work from Joseph Walther has been highly influential in the development of CMC. Theories in this area often examine the limitations and capabilities of new technologies, taking up an 'affordances' perspective inquiring what the technology may "request, demand, encourage, discourage, refuse, and allow." Recently the theoretical and empirical focus of CMC has shifted more explicitly away from the 'C' (i.e. Computer) and toward the 'M' (i.e. Mediation).

Rhetoric and Speech

Theories in rhetoric and speech are often concerned with discourse as an art, including practical consideration of the power of words and our ability to improve our skills through practice. Rhetorical theories provide a way of analyzing speeches when read in an exegetical manner (close, repeated reading to extract themes, metaphors, techniques, argument, meaning, etc.); for example with respect to their relationship to power or justice, or their persuasion, emotional appeal, or logic.

Critical Perspectives on Media and Communication

Critical social theory in communication, while sharing some traditions with rhetoric, is explicitly oriented toward "articulating, questioning, and transcending presuppositions that are judged to be untrue, dishonest, or unjust."(p. 147) Some work bridges this distinction to form critical rhetoric. Critical theories have their roots in the Frankfurt School, which brought together anti-establishment thinkers alarmed by the rise of Nazism and propaganda, including the work of Max Horkheimer and Theodor Adorno. Modern critical perspectives often engage with emergent social movements such as post-colonialism and queer theory, seeking to be reflective and emancipatory. One of the influential bodies of theory in this area comes from the work of Stuart Hall, who questioned traditional assumptions about the monolithic functioning of mass communication with his Encoding/Decoding Model of Communication and offered significant expansions of theories of discourse, semiotics, and power through media criticism and explorations of linguistic codes and cultural identity.

Axiology

Axiology is concerned with how values inform research and theory development. Most communication theory is guided by one of three axiological approaches. The first approach recognizes that values will influence theorists' interests but suggests that those values must be set aside once actual research begins. Outside replication of research findings is particularly important in this approach to prevent individual researchers' values from contaminating their findings and interpretations. The second approach rejects the idea that values can be eliminated from any stage of theory development. Within this approach, theorists do not try to divorce their values from inquiry. Instead, they remain mindful of their values so that they understand how those values contextualize, influence or skew their findings. The third approach not only rejects the idea that values can be separated from research and theory, but rejects the idea that they should be separated. This approach is often adopted by critical theorists who believe that the role of communication theory is to identify oppression and produce social change. In this axiological approach, theorists embrace their values and work to reproduce those values in their research and theory development.

Science wars

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Science_wars

The science wars were a series of scholarly and public discussions in the 1990s over the social place of science in making authoritative claims about the world. HighBeam Encyclopedia, citing the Encyclopedia of Science and Religion, defines the science wars as the discussions about the "way the sciences are related to or incarnated in culture, history, and practice[...] [which] came to be called a 'war' in the mid 1990s because of a strong polarization over questions of legitimacy and authority. One side [...] is concerned with defending the authority of science as rooted in objective evidence and rational procedures. The other side argues that it is legitimate and fruitful to study the sciences as institutions and social-technical networks whose development is influenced by linguistics, economics, politics, and other factors surrounding formally rational procedures and isolated established facts."

The science wars took place principally in the United States in the 1990s in the academic and mainstream press. Scientific realists (such as Norman Levitt, Paul R. Gross, Jean Bricmont and Alan Sokal) accused many writers, whom they described as 'postmodernist', of having effectively rejected scientific objectivity, the scientific method, empiricism, and scientific knowledge.

Though much of the theory associated with 'postmodernism' (see post-structuralism) did not make any interventions into the natural sciences, the scientific realists took aim at its general influence. The scientific realists argued that large swathes of scholarship, amounting to a rejection of objectivity and realism, had been influenced by major 20th-century post-structuralist philosophers (such as Jacques Derrida, Gilles Deleuze, Jean-François Lyotard and others), whose work they declared to be incomprehensible or meaningless. They implicated a broad range of fields in this trend, including cultural studies, feminist studies, comparative literature, media studies, and especially science and technology studies, which does apply such methods to the study of science.

Solid-state physicist N. David Mermin understands the science wars as a series of exchanges between scientists and "sociologists, historians and literary critics" whom the scientists "thought ...were ludicrously ignorant of science, making all kinds of nonsensical pronouncements. The other side dismissed these charges as naive, ill informed and self-serving." Sociologist Harry Collins wrote that the "science wars" began "in the early 1990s with attacks by natural scientists or ex-natural scientists who had assumed the role of spokespersons for science. The subject of the attacks was the analysis of science coming out of literary studies and the social sciences."

Historical background

Until the mid-20th century, the philosophy of science had concentrated on the viability of scientific method and knowledge, proposing justifications for the truth of scientific theories and observations and attempting to discover at a philosophical level why science worked. Karl Popper, an early opponent of logical positivism in the 20th century, repudiated the classical observationalist/inductivist form of scientific method in favour of empirical falsification. He is also known for his opposition to the classical justificationist/verificationist account of knowledge which he replaced with critical rationalism, "the first non justificational philosophy of criticism in the history of philosophy". His criticisms of scientific method were adopted by several postmodernist critiques.

A number of 20th-century philosophers maintained that logical models of pure science do not apply to actual scientific practice. It was the publication of Thomas Kuhn's The Structure of Scientific Revolutions in 1962, however, which fully opened the study of science to new disciplines by suggesting that the evolution of science was in part socially determined and that it did not operate under the simple logical laws put forward by the logical positivist school of philosophy.

Kuhn described the development of scientific knowledge not as a linear increase in truth and understanding, but as a series of periodic revolutions which overturned the old scientific order and replaced it with new orders (what he called "paradigms"). Kuhn attributed much of this process to the interactions and strategies of the human participants in science rather than its own innate logical structure. (See sociology of scientific knowledge).

Some interpreted Kuhn's ideas to mean that scientific theories were, either wholly or in part, social constructs, which many interpreted as diminishing the claim of science to representing objective reality, and that reality had a lesser or potentially irrelevant role in the formation of scientific theories. In 1971, Jerome Ravetz published Scientific knowledge and its social problems, a book describing the role that the scientific community, as a social construct, plays in accepting or rejecting objective scientific knowledge.

Postmodernism

A number of different philosophical and historical schools, often grouped together as "postmodernism", began reinterpreting scientific achievements of the past through the lens of the practitioners, often positing the influence of politics and economics in the development of scientific theories in addition to scientific observations. Rather than being presented as working entirely from positivistic observations, many scientists of the past were scrutinized for their connection to issues of gender, sexual orientation, race, and class. Some more radical philosophers, such as Paul Feyerabend, argued that scientific theories were themselves incoherent and that other forms of knowledge production (such as those used in religion) served the material and spiritual needs of their practitioners with equal validity as did scientific explanations.

Imre Lakatos advanced a midway view between the "postmodernist" and "realist" camps. For Lakatos, scientific knowledge is progressive; however, it progresses not by a strict linear path where every new element builds upon and incorporates every other, but by an approach where a "core" of a "research program" is established by auxiliary theories which can themselves be falsified or replaced without compromising the core. Social conditions and attitudes affect how strongly one attempts to resist falsification for the core of a program, but the program has an objective status based on its relative explanatory power. Resisting falsification only becomes ad-hoc and damaging to knowledge when an alternate program with greater explanatory power is rejected in favor of another with less. But because it is changing a theoretical core, which has broad ramifications for other areas of study, accepting a new program is also revolutionary as well as progressive. Thus, for Lakatos the character of science is that of being both revolutionary and progressive; both socially informed and objectively justified.

The science wars

In Higher Superstition: The Academic Left and Its Quarrels With Science (1994), the scientists Paul R. Gross and Norman Levitt accused postmodernists of anti-intellectualism, presented the shortcomings of relativism, and suggested that postmodernists knew little about the scientific theories they criticized and practiced poor scholarship for political reasons. The authors insist that the "science critics" misunderstood the theoretical approaches they criticized, given their "caricature, misreading, and condescension, [rather] than argument". The book sparked the so-called science wars. Higher Superstition inspired a New York Academy of Sciences conference titled The Flight from Science and Reason, organized by Gross, Levitt, and Gerald Holton. Attendees of the conference were critical of the polemical approach of Gross and Levitt, yet agreed upon the intellectual inconsistency of how laymen, non-scientists, and social studies intellectuals dealt with science.

Social Text

In 1996, Social Text, a Duke University publication of postmodern critical theory, compiled a "Science Wars" issue containing brief articles by postmodernist academics in the social sciences and the humanities, that emphasized the roles of society and politics in science. In the introduction to the issue, the Social Text editor, Andrew Ross, said that the attack upon science studies was a conservative reaction to reduced funding for scientific research, characterizing the Flight from Science and Reason conference as an attempted "linking together a host of dangerous threats: scientific creationism, New Age alternatives and cults, astrology, UFO-ism, the radical science movement, postmodernism, and critical science studies, alongside the ready-made historical specters of Aryan-Nazi science and the Soviet error of Lysenkoism" that "degenerated into name-calling".

The historian Dorothy Nelkin characterised Gross and Levitt's vigorous response as a "call to arms in response to the failed marriage of Science and the State"—in contrast to the scientists' historical tendency to avoid participating in perceived political threats, such as creation science, the animal rights movement, and anti-abortionists' attempts to curb fetal research. At the end of the Soviet–American Cold War (1945–91), military funding of science declined, while funding agencies demanded accountability, and research became directed by private interests. Nelkin suggested that postmodernist critics were "convenient scapegoats" who diverted attention from problems in science.

Also in 1996, physicist Alan Sokal submitted an article to Social Text titled "Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity", which proposed that quantum gravity is a linguistic and social construct and that quantum physics supports postmodernist criticisms of scientific objectivity. After holding the article back from earlier issues due to Sokal's refusal to consider revisions, the staff published it in the "Science Wars" issue as a relevant contribution. Later, in the May 1996 issue of Lingua Franca, in the article "A Physicist Experiments With Cultural Studies", Sokal exposed his parody article, "Transgressing the Boundaries", as an experiment testing the intellectual rigor of an academic journal that would "publish an article liberally salted with nonsense if (a) it sounded good and (b) it flattered the editors' ideological preconceptions". The matter became known as the "Sokal Affair" and brought greater public attention to the wider conflict.

Jacques Derrida, a frequent target of "anti-relativist" criticism in the wake of Sokal's article, responded to the hoax in "Sokal and Bricmont Aren't Serious", first published in Le Monde. He called Sokal's action sad (triste) for having overshadowed Sokal's mathematical work and ruined the chance to sort out controversies of scientific objectivity in a careful way. Derrida went on to fault him and co-author Jean Bricmont for what he considered an act of intellectual bad faith: they had accused him of scientific incompetence in the English edition of a follow-up book (an accusation several English reviewers noted), but deleted the accusation from the French edition and denied that it had ever existed. He concluded, as the title indicates, that Sokal was not serious in his approach, but had used the spectacle of a "quick practical joke" to displace the scholarship Derrida believed the public deserved.

Continued conflict

In the first few years after the 'Science Wars' edition of Social Text, the seriousness and volume of discussion increased significantly, much of it focused on reconciling the 'warring' camps of postmodernists and scientists. One significant event was the 'Science and Its Critics' conference in early 1997; it brought together scientists and scholars who study science, and featured Alan Sokal and Steve Fuller as keynote speakers. The conference generated the final wave of substantial press coverage (in both news media and scientific journals), though it by no means resolved the fundamental issues of social construction and objectivity in science.

Other attempts have been made to reconcile the two camps. Mike Nauenberg, a physicist at the University of California, Santa Cruz, organized a small conference in May 1997 that was attended by scientists and sociologists of science alike, among them Alan Sokal, N. David Mermin and Harry Collins. In the same year, Collins organized the Southampton Peace Workshop, which again brought together a broad range of scientists and sociologists. The Peace Workshop gave rise to the idea of a book that intended to map out some of the arguments between the disputing parties. The One Culture?: A Conversation about Science, edited by chemist Jay A. Labinger and sociologist Harry Collins, was eventually published in 2001. The book, the title of which is a reference to C.P. Snow's The Two Cultures, contains contributions from authors such as Alan Sokal, Jean Bricmont, Steven Weinberg and Steven Shapin.

Other important publications related to the science wars include Fashionable Nonsense by Sokal and Jean Bricmont (1998), The Social Construction of What? by Ian Hacking (1999) and Who Rules in Science by James Robert Brown.

To John C. Baez, the Bogdanov Affair in 2002 served as the bookend to the Sokal controversy: the review, acceptance, and publication of papers, later alleged to be nonsense, in peer-reviewed physics journals. Cornell physics professor Paul Ginsparg argued that the cases are not at all similar, and that the fact that some journals and scientific institutions have low standards is "hardly a revelation". The new editor-in-chief of the journal Annals of Physics, appointed after the controversy along with a new editorial staff, said that the journal's standards had been poor in the period leading up to the publication, since the previous editor had become ill and died.

Interest in the science wars has waned considerably in recent years. Though the events of the science wars are still occasionally mentioned in mainstream press, they have had little effect on either the scientific community or the community of critical theorists. Both sides continue to maintain that the other does not understand their theories, or mistakes constructive criticisms and scholarly investigations for attacks. In 1999 Bruno Latour said "Scientists always stomp around meetings talking about 'bridging the two-culture gap', but when scores of people from outside the sciences begin to build just that bridge, they recoil in horror and want to impose the strangest of all gags on free speech since Socrates: only scientists should speak about science!" Subsequently, Latour has suggested a re-evaluation of sociology's epistemology based on lessons learnt from the Science Wars: "... scientists made us realize that there was not the slightest chance that the type of social forces we use as a cause could have objective facts as their effects".

Reviewing Sokal's Beyond the Hoax, Mermin stated that "As a sign that the science wars are over, I cite the 2008 election of Bruno Latour [...] to Foreign Honorary Membership in that bastion of the establishment, the American Academy of Arts and Sciences" and opined that "we are not only beyond Sokal's hoax, but beyond the science wars themselves".

However, more recently some of the leading critical theorists have recognized that their critiques have at times been counter-productive, and are providing intellectual ammunition for reactionary interests.

Writing about these developments in the context of global warming, Latour noted that "dangerous extremists are using the very same argument of social construction to destroy hard-won evidence that could save our lives. Was I wrong to participate in the invention of this field known as science studies? Is it enough to say that we did not really mean what we said?"

Kendrick Frazier notes that Latour is interested in helping to rebuild trust in science and that Latour has said that some of the authority of science needs to be regained.

In 2016, Shawn Lawrence Otto, in his book The War on Science: Who's Waging It, Why It Matters, and What We Can Do About It, argued that the winners of the war on science "will chart the future of power, democracy, and freedom itself."

Quantum cryptography

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Quantum_crypto...