Saturday, June 26, 2021

Philosophy of language

From Wikipedia, the free encyclopedia

In analytic philosophy, philosophy of language investigates the nature of language, the relations between language, language users, and the world. Investigations may include inquiry into the nature of meaning, intentionality, reference, the constitution of sentences, concepts, learning, and thought.

Gottlob Frege and Bertrand Russell were pivotal figures in analytic philosophy's "linguistic turn". These writers were followed by Ludwig Wittgenstein (Tractatus Logico-Philosophicus), the Vienna Circle as well as the logical positivists, and Willard Van Orman Quine.

In continental philosophy, language is not studied as a separate discipline. Rather, it is an inextricable part of many other areas of thought, such as phenomenology, structural semiotics, language of mathematics, hermeneutics, existentialism, deconstruction and critical theory.

History

Ancient philosophy

In the West, inquiry into language stretches back to the 5th century BC with Socrates, Plato, Aristotle, and the Stoics. Both in India and in Greece, linguistic speculation predates the emergence of grammatical traditions of systematic description of language, which emerged around the 5th century BC in India (see Yāska), and around the 3rd century BC in Greece (see Rhianus).

In the dialogue Cratylus, Plato considered the question of whether the names of things were determined by convention or by nature. He criticized conventionalism because it led to the bizarre consequence that anything can be conventionally denominated by any name; hence, it cannot account for the correct or incorrect application of a name. He claimed that there was a natural correctness to names. To support this claim, he pointed out that compound words and phrases have a range of correctness. He also argued that primitive names had a natural correctness, because each phoneme represented basic ideas or sentiments. For example, for Plato the letter l and its sound represented the idea of softness. However, by the end of the Cratylus, he had admitted that some social conventions were also involved, and that there were faults in the idea that phonemes had individual meanings. Plato is often considered a proponent of extreme realism.

Aristotle concerned himself with issues of logic, categories, and the creation of meaning. He separated all things into categories of species and genus. He thought that the meaning of a predicate was established through an abstraction of the similarities between various individual things. This theory later came to be called nominalism. However, since Aristotle took these similarities to be constituted by a real commonality of form, he is more often considered a proponent of "moderate realism".

The Stoic philosophers made important contributions to the analysis of grammar, distinguishing five parts of speech: nouns, verbs, appellatives (names or epithets), conjunctions and articles. They also developed a sophisticated doctrine of the lektón associated with each sign of a language, but distinct from both the sign itself and the thing to which it refers. This lektón was the meaning (or sense) of every term. The complete lektón of a sentence is what we would now call its proposition. Only propositions were considered "truth-bearers" or "truth-vehicles" (i.e., they could be called true or false) while sentences were simply their vehicles of expression. Different lektá could also express things besides propositions, such as commands, questions and exclamations.

Medieval philosophy

Medieval philosophers were greatly interested in the subtleties of language and its usage. For many scholastics, this interest was provoked by the necessity of translating Greek texts into Latin. There were several noteworthy philosophers of language in the medieval period. According to Peter J. King (although this has been disputed), Peter Abelard anticipated modern theories of reference. William of Ockham's Summa Logicae also brought forward one of the first serious proposals for codifying a mental language.

The scholastics of the high medieval period, such as Ockham and John Duns Scotus, considered logic to be a scientia sermocinalis (science of language). The result of their studies was the elaboration of linguistic-philosophical notions whose complexity and subtlety have only recently come to be appreciated. Many of the most interesting problems of modern philosophy of language were anticipated by medieval thinkers. The phenomena of vagueness and ambiguity were analyzed intensely, and this led to an increasing interest in problems related to the use of syncategorematic words such as and, or, not, if, and every. The study of categorematic words (or terms) and their properties was also developed greatly. One of the major developments of the scholastics in this area was the doctrine of the suppositio. The supposition of a term is the interpretation that is given of it in a specific context. It can be proper or improper (as when it is used in metaphors, metonyms and other figures of speech). A proper suppositio, in turn, can be either formal or material, according to whether it refers to its usual non-linguistic referent (as in "Charles is a man") or to itself as a linguistic entity (as in "Charles has seven letters"). Such a classification scheme is the precursor of modern distinctions between use and mention, and between language and metalanguage.
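The scholastic contrast between formal and material supposition survives today as the use–mention distinction, which can be sketched in a few lines of code (the `Person` class and the example data are illustrative assumptions, not part of the medieval doctrine):

```python
# Toy illustration of the use/mention (formal/material supposition) distinction.
# A name used "formally" refers to its non-linguistic referent (the person);
# "mentioned" materially, it refers to itself as a linguistic entity (the string).

class Person:
    def __init__(self, name, is_man):
        self.name = name
        self.is_man = is_man

charles = Person("Charles", is_man=True)

# Formal supposition: "Charles is a man" is a claim about the person.
assert charles.is_man

# Material supposition: "Charles has seven letters" is a claim about the name.
assert len("Charles") == 7
```

The distinction is the same one programmers observe between an identifier and a quoted string naming that identifier.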

There is a tradition called speculative grammar which existed from the 11th to the 13th century. Leading scholars included, among others, Martin of Dacia and Thomas of Erfurt.

Modern philosophy

Linguists of the Renaissance and Baroque periods such as Johannes Goropius Becanus, Athanasius Kircher and John Wilkins were infatuated with the idea of a philosophical language reversing the confusion of tongues, influenced by the gradual discovery of Chinese characters and Egyptian hieroglyphs (Hieroglyphica). This thought parallels the idea that there might be a universal language of music.

European scholarship began to absorb the Indian linguistic tradition only from the mid-18th century, pioneered by Jean François Pons and Henry Thomas Colebrooke (the editio princeps of Varadarāja, a 17th-century Sanskrit grammarian, dating to 1849).

In the early 19th century, the Danish philosopher Søren Kierkegaard insisted that language ought to play a larger role in Western philosophy. He argued that philosophy had not sufficiently focused on the role language plays in cognition and that future philosophy ought to proceed with a conscious focus on language:

If the claim of philosophers to be unbiased were all it pretends to be, it would also have to take account of language and its whole significance in relation to speculative philosophy ... Language is partly something originally given, partly that which develops freely. And just as the individual can never reach the point at which he becomes absolutely independent ... so too with language.

Contemporary philosophy

The phrase "linguistic turn" was used to describe the noteworthy emphasis that contemporary philosophers put upon language.

Language began to play a central role in Western philosophy in the early 20th century. One of the central figures involved in this development was the German philosopher Gottlob Frege, whose work on philosophical logic and the philosophy of language in the late 19th century influenced the work of 20th-century analytic philosophers Bertrand Russell and Ludwig Wittgenstein. The philosophy of language became so pervasive that for a time, in analytic philosophy circles, philosophy as a whole was understood to be a matter of philosophy of language.

In continental philosophy, the foundational work in the field was Ferdinand de Saussure's Cours de linguistique générale, published posthumously in 1916.

Major topics and sub-fields

Communication

Firstly, this field of study seeks to better understand what speakers and listeners do with language in communication, and how it is used socially. Specific interests include the topics of language learning, language creation, and speech acts.

Secondly, the question of how language relates to the minds of both the speaker and the interpreter is investigated. Of specific interest is the grounds for successful translation of words and concepts into their equivalents in another language.

Composition and parts

It has long been known that there are different parts of speech. One part of the common sentence is the lexical word, which is composed of nouns, verbs, and adjectives. A major question in the field – perhaps the single most important question for formalist and structuralist thinkers – is, "How does the meaning of a sentence emerge out of its parts?"

[Figure: example of a syntactic tree]

Many aspects of the problem of the composition of sentences are addressed in the subfield of linguistics known as syntax. Philosophical semantics tends to focus on the principle of compositionality to explain the relationship between meaningful parts and whole sentences. The principle of compositionality asserts that a sentence can be understood on the basis of the meaning of its parts (i.e., words, morphemes) along with an understanding of its structure (i.e., syntax, logic). Further, propositions are arranged into 'discourse' or 'narrative' structures, which also encode meanings through pragmatic features such as temporal relations and pronominal reference.

It is possible to use the concept of functions to describe more than just how lexical meanings work: they can also be used to describe the meaning of a sentence. Take, for a moment, the sentence "The horse is red". We may consider the predicate "is red" to express a propositional function. A propositional function is an operation of language that takes an entity (in this case, the horse) as an input and outputs a semantic fact (i.e., the proposition that is represented by "The horse is red"). In other words, a propositional function is like an algorithm. The meaning of "red" in this case is whatever takes the entity "the horse" and turns it into the statement "The horse is red."
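The propositional function described above can be sketched directly as code; this is a toy illustration, assuming (crudely) that propositions can be represented as strings:

```python
# A propositional function: takes an entity as input and returns a proposition.
# Here propositions are modeled, very crudely, as strings naming semantic facts.

def red(entity: str) -> str:
    """The meaning of 'red': maps an entity to the proposition that it is red."""
    return f"{entity} is red"

proposition = red("the horse")
print(proposition)  # -> the horse is red
```

The function is the meaning of the predicate; applying it to an entity yields the proposition, just as the text describes.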

Linguists have developed at least two general methods of understanding the relationship between the parts of a linguistic string and how it is put together: syntactic and semantic trees. Syntactic trees draw upon the words of a sentence with the grammar of the sentence in mind. Semantic trees, on the other hand, focus upon the role of the meaning of the words and how those meanings combine to provide insight onto the genesis of semantic facts.
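A minimal semantic tree can make the compositional picture concrete. In the following sketch, which assumes a toy two-entity model invented for illustration, leaves carry word meanings and interior nodes combine their children by function application:

```python
# A semantic tree: leaves carry word meanings, interior nodes combine their
# children by function application, so the meaning of the sentence emerges
# bottom-up from its parts -- the principle of compositionality.

def leaf(meaning):
    return ("leaf", meaning)

def node(fn_child, arg_child):
    return ("node", fn_child, arg_child)

def evaluate(tree):
    if tree[0] == "leaf":
        return tree[1]
    _, fn_child, arg_child = tree
    return evaluate(fn_child)(evaluate(arg_child))

# Tiny model: "the horse" denotes an entity; "is red" denotes a function
# from entities to truth values (here, membership in the set of red things).
horse = leaf("the horse")
is_red = leaf(lambda e: e in {"the horse", "the barn"})

sentence = node(is_red, horse)
print(evaluate(sentence))  # -> True
```

Changing either a word meaning at a leaf or the structure of the tree changes the truth value computed at the root, which is exactly the dependence on parts and structure that compositionality asserts.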

Mind and language

Innateness and learning

Some of the major issues at the intersection of philosophy of language and philosophy of mind are also dealt with in modern psycholinguistics. Some important questions are: How much of language is innate? Is language acquisition a special faculty in the mind? What is the connection between thought and language?

There are three general perspectives on the issue of language learning. The first is the behaviorist perspective, which holds that not only is the bulk of language learned, but that it is learned via conditioning. The second is the hypothesis-testing perspective, which understands the child's learning of syntactic rules and meanings to involve the postulation and testing of hypotheses, through the use of the general faculty of intelligence. The final candidate for explanation is the innatist perspective, which states that at least some of the syntactic settings are innate and hardwired, based on certain modules of the mind.

There are varying notions of the structure of the brain when it comes to language. Connectionist models emphasize the idea that a person's lexicon and their thoughts operate in a kind of distributed, associative network. Nativist models assert that there are specialized devices in the brain that are dedicated to language acquisition. Computational models emphasize the notion of a representational language of thought and the logic-like, computational processing that the mind performs over these representations. Emergentist models focus on the notion that natural faculties are complex systems that emerge from simpler biological parts. Reductionist models attempt to explain higher-level mental processes in terms of the basic low-level neurophysiological activity of the brain.

Language and thought

An important problem which touches both philosophy of language and philosophy of mind is to what extent language influences thought and vice versa. There have been a number of different perspectives on this issue, each offering a number of insights and suggestions.

Linguists Sapir and Whorf suggested that language limited the extent to which members of a "linguistic community" can think about certain subjects (a hypothesis paralleled in George Orwell's novel Nineteen Eighty-Four). In other words, language was analytically prior to thought. Philosopher Michael Dummett is also a proponent of the "language-first" viewpoint.

The stark opposite to the Sapir–Whorf position is the notion that thought (or, more broadly, mental content) has priority over language. The "knowledge-first" position can be found, for instance, in the work of Paul Grice. Further, this view is closely associated with Jerry Fodor and his language of thought hypothesis. According to his argument, spoken and written language derive their intentionality and meaning from an internal language encoded in the mind. The main argument in favor of such a view is that the structure of thoughts and the structure of language seem to share a compositional, systematic character. Another argument is that it is difficult to explain how signs and symbols on paper can represent anything meaningful unless some sort of meaning is infused into them by the contents of the mind. One of the main arguments against is that such levels of language can lead to an infinite regress. In any case, many philosophers of mind and language, such as Ruth Millikan, Fred Dretske and Fodor, have recently turned their attention to explaining the meanings of mental contents and states directly.

Another tradition of philosophers has attempted to show that language and thought are coextensive – that there is no way of explaining one without the other. Donald Davidson, in his essay "Thought and Talk", argued that the notion of belief could only arise as a product of public linguistic interaction. Daniel Dennett holds a similar interpretationist view of propositional attitudes. To an extent, the theoretical underpinnings to cognitive semantics (including the notion of semantic framing) suggest the influence of language upon thought. However, the same tradition views meaning and grammar as a function of conceptualization, making it difficult to assess in any straightforward way.

Some thinkers, like the ancient sophist Gorgias, have questioned whether or not language was capable of capturing thought at all.

...speech can never exactly represent perceptibles, since it is different from them, and perceptibles are apprehended each by the one kind of organ, speech by another. Hence, since the objects of sight cannot be presented to any other organ but sight, and the different sense-organs cannot give their information to one another, similarly speech cannot give any information about perceptibles. Therefore, if anything exists and is comprehended, it is incommunicable.

Studies suggest that languages shape how people understand causality; some of these were performed by Lera Boroditsky. For example, English speakers tend to say things like "John broke the vase" even for accidents, whereas Spanish or Japanese speakers would be more likely to say "the vase broke itself". In studies conducted by Caitlin Fausey at Stanford University, speakers of English, Spanish and Japanese watched videos of two people popping balloons, breaking eggs and spilling drinks either intentionally or accidentally. Later, everyone was asked whether they could remember who did what. Spanish and Japanese speakers did not remember the agents of accidental events as well as English speakers did.

Russian speakers, who make an extra distinction between light and dark blue in their language, are better able to visually discriminate shades of blue. The Piraha, a tribe in Brazil, whose language has only terms like few and many instead of numerals, are not able to keep track of exact quantities.

In one study German and Spanish speakers were asked to describe objects having opposite gender assignment in those two languages. The descriptions they gave differed in a way predicted by grammatical gender. For example, when asked to describe a "key"—a word that is masculine in German and feminine in Spanish—the German speakers were more likely to use words like "hard", "heavy", "jagged", "metal", "serrated" and "useful" whereas Spanish speakers were more likely to say "golden", "intricate", "little", "lovely", "shiny" and "tiny". To describe a "bridge", which is feminine in German and masculine in Spanish, the German speakers said "beautiful", "elegant", "fragile", "peaceful", "pretty" and "slender", and the Spanish speakers said "big", "dangerous", "long", "strong", "sturdy" and "towering". This was the case even though all testing was done in English, a language without grammatical gender.

In a series of studies conducted by Gary Lupyan, people were asked to look at a series of images of imaginary aliens. Whether each alien was friendly or hostile was determined by certain subtle features but participants were not told what these were. They had to guess whether each alien was friendly or hostile, and after each response they were told if they were correct or not, helping them learn the subtle cues that distinguished friend from foe. A quarter of the participants were told in advance that the friendly aliens were called "leebish" and the hostile ones "grecious", while another quarter were told the opposite. For the rest, the aliens remained nameless. It was found that participants who were given names for the aliens learned to categorize the aliens far more quickly, reaching 80 per cent accuracy in less than half the time taken by those not told the names. By the end of the test, those told the names could correctly categorize 88 per cent of aliens, compared to just 80 per cent for the rest. It was concluded that naming objects helps us categorize and memorize them.

In another series of experiments a group of people was asked to view furniture from an IKEA catalog. Half the time they were asked to label the object – whether it was a chair or lamp, for example – while the rest of the time they had to say whether or not they liked it. It was found that when asked to label items, people were later less likely to recall the specific details of products, such as whether a chair had arms or not. It was concluded that labeling objects helps our minds build a prototype of the typical object in the group at the expense of individual features.

Meaning

The topic that has received the most attention in the philosophy of language has been the nature of meaning: what "meaning" is, and what we mean when we talk about meaning. Within this area, issues include: the nature of synonymy, the origins of meaning itself, our apprehension of meaning, and the nature of composition (the question of how meaningful units of language are composed of smaller meaningful parts, and how the meaning of the whole is derived from the meaning of its parts).

There have been several distinctive explanations of what a linguistic "meaning" is. Each has been associated with its own body of literature.

  • The ideational theory of meaning, most commonly associated with the British empiricist John Locke, claims that meanings are mental representations provoked by signs. Although this view of meaning has been beset by a number of problems from the beginning (see the main article for details), interest in it has been renewed by some contemporary theorists under the guise of semantic internalism.
  • The truth-conditional theory of meaning holds meaning to be the conditions under which an expression may be true or false. This tradition goes back at least to Frege and is associated with a rich body of modern work, spearheaded by philosophers like Alfred Tarski and Donald Davidson.
  • The use theory of meaning, most commonly associated with the later Wittgenstein, helped inaugurate the idea of "meaning as use" and a communitarian view of language. Wittgenstein was interested in the ways in which communities use language, and in how far such use can be taken. The view is also associated with P. F. Strawson, John Searle, Robert Brandom, and others.
  • The constructivist theory of meaning claims that speech does not merely describe a given reality passively: through speech acts it can also change the (social) reality it describes. For linguistics, this was as revolutionary a discovery as the discovery, for physics, that the very act of measurement can change the measured reality itself. Speech act theory was developed by J. L. Austin, although other earlier thinkers had similar ideas.
  • The reference theory of meaning, also known collectively as semantic externalism, views meaning to be equivalent to those things in the world that are actually connected to signs. There are two broad subspecies of externalism: social and environmental. The first is most closely associated with Tyler Burge and the second with Hilary Putnam, Saul Kripke and others.
  • The verificationist theory of meaning is generally associated with the early 20th century movement of logical positivism. The traditional formulation of such a theory is that the meaning of a sentence is its method of verification or falsification. In this form, the thesis was abandoned after the acceptance by most philosophers of the Duhem–Quine thesis of confirmation holism after the publication of Quine's "Two Dogmas of Empiricism". However, Michael Dummett has advocated a modified form of verificationism since the 1970s. In this version, the comprehension (and hence meaning) of a sentence consists in the hearer's ability to recognize the demonstration (mathematical, empirical or other) of the truth of the sentence.
  • A pragmatic theory of meaning is any theory in which the meaning (or understanding) of a sentence is determined by the consequences of its application. Dummett attributes such a theory of meaning to Charles Sanders Peirce and other early 20th century American pragmatists.
  • The contrast theory of meaning suggests that knowledge attributions have a ternary structure of the form 'S knows that p rather than q'. This is in contrast to the traditional view whereby knowledge attributions have a binary structure of the form 'S knows that p'.
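Of the theories listed above, the truth-conditional theory lends itself most readily to a formal sketch. Assuming a toy set of "possible worlds" invented for illustration, a sentence's meaning can be modeled as the set of worlds in which it is true:

```python
# Truth-conditional sketch: the meaning of a sentence is identified with its
# truth conditions -- here, the set of possible worlds (circumstances) in
# which the sentence holds.

worlds = [
    {"snow_is_white": True,  "grass_is_green": True},
    {"snow_is_white": True,  "grass_is_green": False},
    {"snow_is_white": False, "grass_is_green": True},
]

def meaning(sentence_key):
    """Truth conditions of an atomic sentence: the worlds that verify it."""
    return [i for i, w in enumerate(worlds) if w[sentence_key]]

print(meaning("snow_is_white"))   # -> [0, 1]
print(meaning("grass_is_green"))  # -> [0, 2]
```

On this picture, two sentences are synonymous exactly when they are true in the same worlds, which is one way the truth-conditional tradition addresses the problem of synonymy mentioned earlier.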

Other theories exist to discuss non-linguistic meaning (i.e., meaning as conveyed by body language, meanings as consequences, etc.).

Reference

Investigations into how language interacts with the world are called theories of reference. Gottlob Frege was an advocate of a mediated reference theory. Frege divided the semantic content of every expression, including sentences, into two components: sense and reference. The sense of a sentence is the thought that it expresses. Such a thought is abstract, universal and objective. The sense of any sub-sentential expression consists in its contribution to the thought that its embedding sentence expresses. Senses determine reference and are also the modes of presentation of the objects to which expressions refer. Referents are the objects in the world that words pick out. The senses of sentences are thoughts, while their referents are truth values (true or false). The referents of sentences embedded in propositional attitude ascriptions and other opaque contexts are their usual senses.

Bertrand Russell, in his later writings and for reasons related to his theory of acquaintance in epistemology, held that the only directly referential expressions are what he called "logically proper names". Logically proper names are such terms as I, now, here and other indexicals. He viewed ordinary proper names as "abbreviated definite descriptions" (see Theory of descriptions). Hence Joseph R. Biden may be an abbreviation for "the current President of the United States and husband of Jill Biden". Definite descriptions are denoting phrases which Russell analyzed into existentially quantified logical constructions. Such phrases denote in the sense that there is an object that satisfies the description. However, such objects are not to be considered meaningful on their own, but have meaning only in the proposition expressed by the sentences of which they are a part. Hence, they are not directly referential in the same way as logically proper names, for Russell.

On Frege's account, any referring expression has a sense as well as a referent. Such a "mediated reference" view has certain theoretical advantages over John Stuart Mill's view, on which a name's meaning is nothing more than its bearer. For example, co-referential names, such as Samuel Clemens and Mark Twain, cause problems for a directly referential view because it is possible for someone to hear "Mark Twain is Samuel Clemens" and be surprised – thus, their cognitive content seems different.

Despite the differences between the views of Frege and Russell, they are generally lumped together as descriptivists about proper names. Such descriptivism was criticized in Saul Kripke's Naming and Necessity.

Kripke put forth what has come to be known as "the modal argument" (or "argument from rigidity"). Consider the name Aristotle and the descriptions "the greatest student of Plato", "the founder of logic" and "the teacher of Alexander". Aristotle obviously satisfies all of the descriptions (and many of the others we commonly associate with him), but it is not necessarily true that if Aristotle existed then Aristotle was any one, or all, of these descriptions. Aristotle may well have existed without doing any single one of the things for which he is known to posterity. He may have existed and not have become known to posterity at all or he may have died in infancy. Suppose that Aristotle is associated by Mary with the description "the last great philosopher of antiquity" and (the actual) Aristotle died in infancy. Then Mary's description would seem to refer to Plato. But this is deeply counterintuitive. Hence, names are rigid designators, according to Kripke. That is, they refer to the same individual in every possible world in which that individual exists. In the same work, Kripke articulated several other arguments against "Frege–Russell" descriptivism.
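Kripke's contrast between rigid names and non-rigid descriptions can be sketched with a toy possible-worlds model (the worlds and the assignment of individuals to them are invented for illustration):

```python
# Rigid vs non-rigid designation, sketched over toy possible worlds.
# A rigid designator ("Aristotle") picks out the same individual in every
# world where he exists; a description ("the teacher of Alexander") may
# pick out different individuals in different worlds.

possible_worlds = {
    "actual":  {"teacher_of_alexander": "Aristotle"},
    "counter": {"teacher_of_alexander": "Plato"},  # Aristotle died in infancy here
}

def refer_name(name, world):
    """A name designates rigidly: the same referent in every world."""
    return name

def refer_description(role, world):
    """A description designates whoever satisfies it in the given world."""
    return possible_worlds[world][role]

for w in possible_worlds:
    print(w, refer_name("Aristotle", w), refer_description("teacher_of_alexander", w))
```

The name's referent is constant across worlds while the description's varies, which is the asymmetry the modal argument turns on.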

The whole philosophical enterprise of studying reference has been critiqued by linguist Noam Chomsky in various works.

Social interaction and language

A common claim is that language is governed by social conventions. Questions inevitably arise on surrounding topics. One question is, "What exactly is a convention, and how do we study it?", and a second is, "To what extent do conventions even matter in the study of language?" David Kellogg Lewis proposed a worthy reply to the first question by expounding the view that a convention is a rationally self-perpetuating regularity in behavior. However, this view seems to compete to some extent with the Gricean view of speaker's meaning, requiring either one (or both) to be weakened if both are to be taken as true.

Some have questioned whether or not conventions are relevant to the study of meaning at all. Noam Chomsky proposed that the study of language could be done in terms of the I-Language, or internal language of persons. If this is so, then it undermines the pursuit of explanations in terms of conventions, and relegates such explanations to the domain of "metasemantics". "Metasemantics" is a term used by philosopher of language Robert Stainton to describe all those fields that attempt to explain how semantic facts arise. One fruitful source of research involves investigation into the social conditions that give rise to, or are associated with, meanings and languages. Etymology (the study of the origins of words) and stylistics (philosophical argumentation over what makes "good grammar", relative to a particular language) are two other examples of fields that are taken to be metasemantic.

Not surprisingly, many separate (but related) fields have investigated the topic of linguistic convention within their own research paradigms. The presumptions that prop up each theoretical view are of interest to the philosopher of language. For instance, one of the major fields of sociology, symbolic interactionism, is based on the insight that human social organization is based almost entirely on the use of meanings. In consequence, any explanation of a social structure (like an institution) would need to account for the shared meanings which create and sustain the structure.

Rhetoric is the study of the particular words that people use to achieve the proper emotional and rational effect in the listener, be it to persuade, provoke, endear, or teach. Some relevant applications of the field include the examination of propaganda and didacticism, the examination of the purposes of swearing and pejoratives (especially how it influences the behavior of others, and defines relationships), or the effects of gendered language. It can also be used to study linguistic transparency (or speaking in an accessible manner), as well as performative utterances and the various tasks that language can perform (called "speech acts"). It also has applications to the study and interpretation of law, and helps give insight to the logical concept of the domain of discourse.

Literary theory is a discipline that some literary theorists claim overlaps with the philosophy of language. It emphasizes the methods that readers and critics use in understanding a text. This field, an outgrowth of the study of how to properly interpret messages, is unsurprisingly closely tied to the ancient discipline of hermeneutics.

Truth

Finally, philosophers of language investigate how language and meaning relate to truth and the reality being referred to. They tend to be less interested in which sentences are actually true than in what kinds of meanings can be true or false. A truth-oriented philosopher of language might ask whether a meaningless sentence can be true or false, or whether sentences can express propositions about things that do not exist, rather than asking how sentences are used.

Language and continental philosophy

In continental philosophy, language is not studied as a separate discipline, as it is in analytic philosophy. Rather, it is an inextricable part of many other areas of thought, such as phenomenology, structural semiotics, hermeneutics, existentialism, structuralism, deconstruction and critical theory. The idea of language is often related to that of logic in its Greek sense as "logos", meaning discourse or dialectic. Language and concepts are also seen as having been formed by history and politics, or even by historical philosophy itself.

The field of hermeneutics, and the theory of interpretation in general, has played a significant role in 20th century continental philosophy of language and ontology, beginning with Martin Heidegger. Heidegger combined phenomenology with the hermeneutics of Wilhelm Dilthey. He believed language was one of the most important concepts for Dasein. Heidegger believed that language today is worn out because of the overuse of important words, and would therefore be inadequate for an in-depth study of Being (Sein). For example, Sein (being), the word itself, is saturated with multiple meanings. Thus, he invented new vocabulary and linguistic styles, based on Ancient Greek and Germanic etymological word relations, to disambiguate commonly used words. He avoided words like consciousness, ego, human, and nature, and instead talked holistically of Being-in-the-world, Dasein.

With such new concepts as Being-in-the-world, Heidegger constructs his theory of language, centered on speech. He believed speech (talking, listening, silence) was the most essential and pure form of language. Heidegger claims writing is only a supplement to speech, because even readers construct or contribute their own "talk" while reading. The most important feature of language is its projectivity, the idea that language is prior to human speech. This means that when one is "thrown" into the world, his existence is characterized from the beginning by a certain pre-comprehension of the world. However, only after naming, or "articulation of intelligibility", can one have primary access to Dasein and Being-in-the-World.

Hans-Georg Gadamer expanded on these ideas of Heidegger and proposed a complete hermeneutic ontology. In Truth and Method, Gadamer describes language as "the medium in which substantive understanding and agreement take place between two people." In addition, Gadamer claims that the world is linguistically constituted, and cannot exist apart from language. For example, monuments and statues cannot communicate without the aid of language. Gadamer also claims that every language constitutes a world-view, because the linguistic nature of the world frees each individual from an objective environment: "... the fact that we have a world at all depends upon [language] and presents itself in it. The world as world exists for man as for no other creature in the world."

Paul Ricœur, on the other hand, proposed a hermeneutics which, reconnecting with the original Greek sense of the term, emphasized the discovery of hidden meanings in the equivocal terms (or "symbols") of ordinary language. Other philosophers who have worked in this tradition include Luigi Pareyson and Jacques Derrida.

Semiotics is the study of the transmission, reception and meaning of signs and symbols in general. In this field, human language (both natural and artificial) is just one among many ways that humans (and other conscious beings) are able to communicate. It allows them to take advantage of and effectively manipulate the external world in order to create meaning for themselves and transmit this meaning to others. Every object, every person, every event, and every force communicates (or signifies) continuously. The ringing of a telephone, for example, signifies the telephone. The smoke that I see on the horizon is the sign that there is a fire. The smoke signifies. The things of the world, in this vision, seem to be labeled precisely for intelligent beings who only need to interpret them in the way that humans do. Everything has meaning. True communication, including the use of human language, however, requires someone (a sender) who sends a message, or text, in some code to someone else (a receiver). Language is studied only insofar as it is one of these forms (the most sophisticated form) of communication. Some important figures in the history of semiotics are Charles Sanders Peirce, Roland Barthes, and Roman Jakobson. In modern times, its best-known figures include Umberto Eco, A. J. Greimas, Louis Hjelmslev, and Tullio De Mauro. Investigations of signs in non-human communication are the subject of biosemiotics, a field founded in the late 20th century by Thomas Sebeok and Thure von Uexküll.

Problems in the philosophy of language

Formal versus informal approaches

Another of the questions that has divided philosophers of language is the extent to which formal logic can be used as an effective tool in the analysis and understanding of natural languages. While most philosophers, including Gottlob Frege, Alfred Tarski and Rudolf Carnap, have been more or less skeptical about formalizing natural languages, many of them developed formal languages for use in the sciences or formalized parts of natural language for investigation. Some of the most prominent members of this tradition of formal semantics include Tarski, Carnap, Richard Montague and Donald Davidson.

On the other side of the divide, and especially prominent in the 1950s and '60s, were the so-called "ordinary language philosophers". Philosophers such as P. F. Strawson, John Langshaw Austin and Gilbert Ryle stressed the importance of studying natural language without regard to the truth-conditions of sentences and the references of terms. They did not believe that the social and practical dimensions of linguistic meaning could be captured by any attempts at formalization using the tools of logic. Logic is one thing and language is something entirely different. What is important is not expressions themselves but what people use them to do in communication.

Hence, Austin developed a theory of speech acts, which described the kinds of things which can be done with a sentence (assertion, command, inquiry, exclamation) in different contexts of use on different occasions. Strawson argued that the truth-table semantics of the logical connectives (e.g., ∧, ∨ and →) do not capture the meanings of their natural language counterparts ("and", "or" and "if-then"). While the "ordinary language" movement basically died out in the 1970s, its influence was crucial to the development of the fields of speech-act theory and the study of pragmatics. Many of its ideas have been absorbed by theorists such as Kent Bach, Robert Brandom, Paul Horwich and Stephen Neale. In recent work, the division between semantics and pragmatics has become a lively topic of discussion at the interface of philosophy and linguistics, for instance in work by Sperber and Wilson, Carston and Levinson.
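Strawson's complaint can be made concrete. In classical logic the material conditional counts "if p then q" as true whenever p is false, which jars with ordinary intuitions about "if-then". A minimal sketch (purely illustrative; the function name and examples are not drawn from Strawson):

```python
# Material conditional: p -> q is false only when p is true and q is false.
def material_conditional(p: bool, q: bool) -> bool:
    return (not p) or q

# Print the full truth table.
for p in (True, False):
    for q in (True, False):
        print(f"p={p!s:5} q={q!s:5} p->q={material_conditional(p, q)}")

# With a false antecedent the conditional is automatically true, so
# "if the moon is made of cheese, then 2 + 2 = 5" comes out true --
# exactly the divergence from natural-language "if-then" that
# ordinary language philosophers pointed to.
```

The table shows the conditional true in three of four rows; only the "true antecedent, false consequent" row makes it false, a pattern that natural-language conditionals do not straightforwardly share.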

While keeping these traditions in mind, the question of whether there are any grounds for conflict between the formal and informal approaches is far from decided. Some theorists, like Paul Grice, have been skeptical of any claims that there is a substantial conflict between logic and natural language.

Problem of universals and composition

One debate that has captured the interest of many philosophers is the debate over the meaning of universals. One might ask, for example, "When people say the word rocks, what is it that the word represents?" Two different answers have emerged to this question. Some have said that the expression stands for some real, abstract universal out in the world called "rocks". Others have said that the word stands for some collection of particular, individual rocks that merely happen to be grouped under a common name. The former position has been called philosophical realism, and the latter nominalism.

The issue here can be explicated if we examine the proposition "Socrates is a Man".

From the realist's perspective, the connection between S and M is a connection between two abstract entities. There is an entity, "man", and an entity, "Socrates". These two things connect in some way or overlap.

From a nominalist's perspective, the connection between S and M is the connection between a particular entity (Socrates) and a vast collection of particular things (men). To say that Socrates is a man is to say that Socrates is a part of the class of "men". Another perspective is to consider "man" to be a property of the entity, "Socrates".

There is a third way, between nominalism and (extreme) realism, usually called "moderate realism" and attributed to Aristotle and Thomas Aquinas. Moderate realists hold that "man" refers to a real essence or form that is really present and identical in Socrates and all other men, but "man" does not exist as a separate and distinct entity. This is a realist position, because "Man" is real, insofar as it really exists in all men; but it is a moderate realism, because "Man" is not an entity separate from the men it informs.

Nature of language

Languages are thought of as sign systems in a semiotic tradition dating from John Locke and culminating in Saussure's notion of language as semiology: an interactive system of a semantic and a symbolic level. Building on Saussurian structuralism, Louis Hjelmslev saw the organisation of the levels as fully computational.

Age of Enlightenment philosopher Antoine Arnauld argued that people had created language rationally in a step-by-step process to fulfill a psychological need to communicate with others. 19th century romanticism emphasised human agency and free will in meaning construction. More recently, Eugenio Coșeriu underlined the role of intention in the processes, while others including Esa Itkonen believe that the social construction of language takes place unconsciously. In Saussure's notion, language is a social fact which arises from social interaction, but it can be reduced neither to individual acts nor to human psychology; this supports the autonomy of the study of language from other sciences.

Humanistic views are challenged by biological theories of language which consider languages as natural phenomena. Charles Darwin considered languages as species. 19th century evolutionary linguistics was furthest developed by August Schleicher who compared languages to plants, animals and crystals. In Neo-Darwinism, Richard Dawkins and other proponents of cultural replicator theories consider languages as populations of mind viruses. Noam Chomsky, on the other hand, holds the view that language is not an organism but an organ, and that linguistic structures are crystallised. This is hypothesised as having been caused by a single mutation in humans, but Steven Pinker argues it is the result of human and cultural co-evolution.

Translation and interpretation

Translation and interpretation are two other problems that philosophers of language have attempted to confront. In the 1950s, W.V. Quine argued for the indeterminacy of meaning and reference based on the principle of radical translation. In Word and Object, Quine asks readers to imagine a situation in which they are confronted with a previously undocumented group of indigenous people, where they must attempt to make sense of the utterances and gestures that its members make. This is the situation of radical translation.

He claimed that, in such a situation, it is impossible in principle to be absolutely certain of the meaning or reference that a speaker of the indigenous people's language attaches to an utterance. For example, if a speaker sees a rabbit and says "gavagai", is she referring to the whole rabbit, to the rabbit's tail, or to a temporal part of the rabbit? All that can be done is to examine the utterance as a part of the overall linguistic behaviour of the individual, and then use these observations to interpret the meaning of all other utterances. From this basis, one can form a manual of translation. But, since reference is indeterminate, there will be many such manuals, no one of which is more correct than the others. For Quine, as for Wittgenstein and Austin, meaning is not something that is associated with a single word or sentence, but is rather something that, if it can be attributed at all, can only be attributed to a whole language. The resulting view is called semantic holism.

Inspired by Quine's discussion, Donald Davidson extended the idea of radical translation to the interpretation of utterances and behavior within a single linguistic community. He dubbed this notion radical interpretation. He suggested that the meaning that any individual ascribed to a sentence could only be determined by attributing meanings to many, perhaps all, of the individual's assertions, as well as their mental states and attitudes.

Vagueness

One issue that has troubled philosophers of language and logic is the problem of the vagueness of words. The specific instances of vagueness that most interest philosophers of language are those where the existence of "borderline cases" makes it seemingly impossible to say whether a predicate is true or false. Classic examples are "is tall" or "is bald", where it cannot be said that some borderline case (some given person) is tall or not-tall. In consequence, vagueness gives rise to the paradox of the heap. Many theorists have attempted to solve the paradox by way of n-valued logics, such as fuzzy logic, which have radically departed from classical two-valued logics.
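The appeal of n-valued approaches can be sketched briefly: instead of forcing "is tall" to be true or false, fuzzy logic assigns degrees of truth between 0 and 1, so borderline cases receive intermediate values rather than generating a paradox. A toy illustration (the linear shape and the 160 cm / 190 cm thresholds are arbitrary assumptions, not a standard model):

```python
def tall_degree(height_cm: float) -> float:
    """Degree to which 'is tall' holds, rising linearly from 0 at
    160 cm to 1 at 190 cm (thresholds chosen for illustration only)."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30

# Clear cases receive the classical values 0 and 1;
# borderline cases fall strictly in between.
print(tall_degree(150))  # 0.0 -- clearly not tall
print(tall_degree(175))  # 0.5 -- borderline
print(tall_degree(195))  # 1.0 -- clearly tall
```

On this picture there is no single height at which one hair's breadth more suddenly makes a person "tall", which is how such logics attempt to dissolve the paradox of the heap.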

 

From each according to his ability, to each according to his needs

"From each according to his ability, to each according to his needs" (German: Jeder nach seinen Fähigkeiten, jedem nach seinen Bedürfnissen) is a slogan popularised by Karl Marx in his 1875 Critique of the Gotha Program. The principle refers to free access to and distribution of goods, capital and services. In the Marxist view, such an arrangement will be made possible by the abundance of goods and services that a developed communist system will be capable to produce; the idea is that, with the full development of socialism and unfettered productive forces, there will be enough to satisfy everyone's needs.

Origin of the phrase

The complete paragraph containing Marx's statement of the creed in the Critique of the Gotha Program is as follows:

In a higher phase of communist society, after the enslaving subordination of the individual to the division of labor, and therewith also the antithesis between mental and physical labor, has vanished; after labor has become not only a means of life but life's prime want; after the productive forces have also increased with the all-around development of the individual, and all the springs of co-operative wealth flow more abundantly—only then can the narrow horizon of bourgeois right be crossed in its entirety and society inscribe on its banners: From each according to his ability, to each according to his needs!

Although Marx is popularly thought of as the originator of the phrase, the slogan was common within the socialist movement. For example, August Becker in 1844 described it as the basic principle of communism and Louis Blanc used it in 1851. The French socialist Saint-Simonians of the 1820s and 1830s used slightly different slogans, such as "from each according to his ability, to each ability according to its work" or "from each according to his capacity, to each according to his works". The origin of this phrasing has also been attributed to the French utopian Étienne-Gabriel Morelly, who proposed in his 1755 Code of Nature "Sacred and Fundamental Laws that would tear out the roots of vice and of all the evils of a society", including:

I. Nothing in society will belong to anyone, either as a personal possession or as capital goods, except the things for which the person has immediate use, for either his needs, his pleasures, or his daily work.
II. Every citizen will be a public man, sustained by, supported by, and occupied at the public expense.
III. Every citizen will make his particular contribution to the activities of the community according to his capacity, his talent and his age; it is on this basis that his duties will be determined, in conformity with the distributive laws.

A similar phrase can be found in the Guilford Covenant in 1639:

We whose names are here underwritten, intending by God's gracious permission to plant ourselves in New England, and if it may be, in the southerly part about Quinnipiack, do faithfully promise each, for ourselves and our families and those that belong to us, that we will, the Lord assisting us, sit down and join ourselves together in one entire plantation, and be helpful each to the other in any common work, according to every man's ability, and as need shall require, and we promise not to desert or leave each other or the plantation, but with the consent of the rest, or the greater part of the company who have entered into this engagement.

Some scholars trace the phrase to the New Testament. In Acts of the Apostles the lifestyle of the community of believers in Jerusalem is described as communal (without individual possession), and uses the phrase "distribution was made unto every man according as he had need" (διεδίδετο δὲ ἑκάστῳ καθότι ἄν τις χρείαν εἶχεν):

Acts 4:32–35: ³² And the multitude of them that believed were of one heart and of one soul: neither said any of them that ought of the things which he possessed was his own; but they had all things common. ³³ And with great power gave the apostles witness of the resurrection of the Lord Jesus: and great grace was upon them all. ³⁴ Neither was there any among them that lacked: for as many as were possessors of lands or houses sold them, and brought the prices of the things that were sold, ³⁵ And laid them down at the apostles' feet: and distribution was made unto every man according as he had need.

Other scholars find its origins in "the Roman legal concept of obligation in solidum", in which "everyone assumes responsibility for anyone who cannot pay his debt, and he is conversely responsible for everyone else". James Furner argues:

If x = a disadvantage, and y = action to redress that disadvantage, the principle of solidarity is: if any member of a group acquires x, each member has a duty to perform y (if they can assist). All we then need to add, to get to the fundamental principle of developed communism, is to assume that non-satisfaction of a need is a disadvantage. The corresponding principle of solidarity in respect of need says: if any member of society has an unsatisfied need, each member has a duty to produce its object (if they can). But that is precisely what the principle 'from each according to their abilities, to each according to their needs!' dictates. In Marx's vision, the basic principle of developed communism is a principle of solidarity in respect of need.

Debates on the idea

Marx delineated the specific conditions under which such a creed would be applicable—a society where technology and social organization had substantially eliminated the need for physical labor in the production of things, where "labor has become not only a means of life but life's prime want". Marx explained his belief that, in such a society, each person would be motivated to work for the good of society despite the absence of a social mechanism compelling them to work, because work would have become a pleasurable and creative activity. Marx intended the initial part of his slogan, "from each according to his ability" to suggest not merely that each person should work as hard as they can, but that each person should best develop their particular talents.

Claiming to be at a "lower stage of communism" (i.e. "socialism", in line with Vladimir Lenin's terminology), the Soviet Union adapted the formula as: "From each according to his ability, to each according to his work (labour investment)". This was incorporated in Article 12 of the 1936 Constitution of the Soviet Union, but was described by Leon Trotsky as "this inwardly contradictory, not to say nonsensical, formula".

While liberation theology has sought to interpret the Christian call for justice in a way that is in harmony with this Marxist dictum, many have noted that Jesus' teaching in the Parable of the Talents (Matthew 25:14–30) affirms only "TO each according to his ability" (Matt. 25:15), and not "FROM each according to his ability".

In popular culture

In Ayn Rand's 1957 novel Atlas Shrugged, a large and profitable motor company adopted this slogan as its method for determining employee compensation. The system quickly fell prey to corruption and greed, forcing the most capable employees to work overtime in order to satisfy the needs of the least competent and to funnel money to the owners. As a result, the company went bankrupt within four years.

In Margaret Atwood's 1985 novel The Handmaid's Tale, members of a dystopian society recited the phrase thrice daily. Notably the phrase is altered to read "From each according to her ability; to each according to his need", demonstrating a perversion of the phrase's original intention by Atwood's fictional society.

In Vladimir Voinovich's 1986 novel Moscow 2042, the slogan was parodied in the context of "communism in one city". Every morning the radio announced: "Comrades, your needs for today are as follows: ...".

"According to Need" is a 5-part documentary from the podcast 99% Invisible. It focuses on the homeless and what America is doing (and Oakland, California, in particular is doing) to find them housing.

 

Anthropological science fiction


The anthropologist Leon E. Stover says of science fiction's relationship to anthropology: "Anthropological science fiction enjoys the philosophical luxury of providing answers to the question 'What is man?' while anthropology the science is still learning how to frame it". The editors of a collection of anthropological SF stories observed:

Anthropology is the science of man. It tells the story from ape-man to spaceman, attempting to describe in detail all the epochs of this continuing history. Writers of fiction, and in particular science fiction, peer over the anthropologists' shoulders as the discoveries are made, then utilize the material in fictional works. Where the scientist must speculate reservedly from known fact and make a small leap into the unknown, the writer is free to soar high on the wings of fancy.

Charles F. Urbanowicz, Professor of Anthropology, California State University, Chico has said of anthropology and SF:

Anthropology and science fiction often present data and ideas so bizarre and unusual that readers, in their first confrontation with both, often fail to appreciate either science fiction or anthropology. Intelligence does not merely consist of fact, but in the integration of ideas -- and ideas can come from anywhere, especially good science fiction!

The difficulty in describing category boundaries for 'anthropological SF' is illustrated by a reviewer of an anthology of anthropological SF, written for the journal American Anthropologist, which warned against too broad a definition of the subgenre, saying: "Just because a story has anthropologists as protagonists or makes vague references to 'culture' does not qualify it as anthropological science fiction, although it may be 'pop' anthropology." The writer concluded the book review with the opinion that only "twelve of the twenty-six selections can be considered as examples of anthropological science fiction."

This difficulty of categorization explains the exclusions necessary when seeking the origins of the subgenre. Thus:

Nineteenth-century utopian writings and lost-race sagas notwithstanding, anthropological science fiction is generally considered a late-twentieth-century phenomenon, best exemplified by the work of writers such as Ursula K. Le Guin, Michael Bishop, Joanna Russ, Ian Watson, and Chad Oliver.

Again, questions of description are not simple, as Gary Westfahl observes:

... others present hard science fiction as the most rigorous and intellectually demanding form of science fiction, implying that those who do not produce it are somehow failing to realize the true potential of science fiction. This is objectionable ...; writers like Chad Oliver and Ursula K. Le Guin, for example, bring to their writing a background in anthropology that makes their extrapolated aliens and future societies every bit as fascinating and intellectually involving as the technological marvels and strange planets of hard science fiction. Because anthropology is a social science, not a natural science, it is hard to classify their works as hard science fiction, but one cannot justly construe this observation as a criticism.

Despite being described as a "late-twentieth-century phenomenon" (above) anthropological SF's roots can be traced further back in history. H. G. Wells (1866–1946) has been called "the Shakespeare of SF" and his first anthropological story has been identified by anthropologist Leon E. Stover as "The Grisly Folk". Stover notes that this story is about Neanderthal Man, and writing in 1973, continues: "[the story] opens with the line 'Can these bones live?' Writers are still trying to make them live, the latest being Golding. Some others in between have been de Camp, Del Rey, Farmer, and Klass."

A more contemporary example of the Neanderthal as subject is Robert J. Sawyer's trilogy "The Neanderthal Parallax" – here "scientists from an alternative Earth in which Neanderthals superseded Homo sapiens cross over to our world. The series as a whole allows Sawyer to explore questions of evolution and humanity's relationship to the environment."

Authors and works

Chad Oliver

Anthropological science fiction is best exemplified by the work of writers such as Ursula K. Le Guin, Michael Bishop, Joanna Russ, Ian Watson, and Chad Oliver. Of this pantheon, Oliver is alone in being also a professional anthropologist, author of academic tomes such as Ecology and Cultural Continuity as Contributing Factors in the Social Organization of the Plains Indians (1962) and The Discovery of Anthropology (1981) in addition to his anthropologically-inflected science fiction. Although he tried, in a superficial way, to separate these two aspects of his career, signing his anthropology texts with his given name "Symmes C. Oliver", he nonetheless saw them as productively interrelated. "I like to think," he commented in a 1984 interview, "that there's a kind of feedback ... that the kind of open-minded perspective in science fiction conceivably has made me a better anthropologist. And on the other side of the coin, the kind of rigor that anthropology has, conceivably has made me a better science fiction writer."

Thus "Oliver's Unearthly Neighbors (1960) highlights the methods of ethnographic fieldwork by imagining their application to a nonhuman race on another world. His Blood's a Rover (1955 [1952]) spells out the problems of applied anthropology by sending a technical-assistance team to an underdeveloped planet. His Rite of Passage (1966 [1954]) is a lesson in the patterning of culture, how humans everywhere unconsciously work out a blueprint for living. Anthropological wisdom is applied to the conscious design of a new blueprint for American society in his Mother of Necessity (1972 [1955])". Oliver's The Winds of Time is a "science fiction novel giving an excellent introduction to the field methods of descriptive linguistics".

In 1993 a journal of SF criticism requested from writers and critics of SF a list of their 'most neglected' writers, and Chad Oliver was listed in three replies. Among the works chosen were: Shadows in the Sun, Unearthly Neighbors, and The Shores of Another Sea. One respondent declared that "Oliver's anthropological SF is the precursor of more recent novels by Ursula K. Le Guin, Michael Bishop, and others"; another that "Chad Oliver was developing quiet, superbly crafted anthropological fictions long before anyone had heard of Le Guin; maybe his slight output and unassuming plots (and being out of print) have caused people to overlook the carefully thought-out ideas behind his fiction".

In the novel Shadows in the Sun the protagonist, Paul Ellery, is an anthropologist doing field work in the town of Jefferson Springs, Texas—a place where he discovers extraterrestrial aliens. It has been remarked that:

Not only are these aliens comprehensible in anthropological terms, but it is anthropology, rather than the physical sciences, that promises a solution to the problem of alien colonization. According to the science of anthropology, every society, regardless of its level of development, has to functionally meet certain human needs. The aliens of Jefferson Springs "had learned, long ago, that it was the cultural core that counted -- the deep and underlying spirit and belief and knowledge, the tone and essence of living. Once you had that, the rest was window dressing. Not only that, but the rest, the cultural superstructure, was relatively equal in all societies" (115; emphasis in original). For Ellery, the aliens are not "supermen" (a favorite Campbellian conceit): despite their fantastic technologies, they are ultimately ordinary people with the expected array of weaknesses – laziness, factionalism, arrogance – whose cultural life is as predictable as any Earth society's. Since they are not superior, they are susceptible to defeat, but the key lies not in the procurement of advanced technologies, but in the creative cultural work of Earth people themselves.

A reviewer of The Shores of Another Sea finds the book "curiously flat despite its exploration of an almost mythical, and often horrific, theme". The reviewer's reaction is not surprising because, as Samuel Gerald Collins points out in the 'New Wave Anthropology' section of his comprehensive review of Chad Oliver's work: "In many ways, the novel is very much unlike Oliver's previous work; there is little moral resolution, nor is anthropology of much help in determining what motivates the aliens. In striking contrast to the familiar chumminess of the aliens in Shadows in the Sun and The Winds of Time, humans and aliens in Shores of Another Sea systematically misunderstand one another." Collins continues:

In fact, the intervening decade between Oliver's field research and the publication of Shores [1971] had been one of critical self-reflection in the field of anthropology. In the United States, qualms about the Vietnam war, together with evidence that anthropologists had been employed as spies and propagandists by the US government, prompted critiques of anthropology's role in systems of national and global power. Various strains of what came to be known as dependency theory disrupted the self-congratulatory evolutionism of modernization models, evoking and critiquing a world system whose political economy structurally mandated unequal development. Less narrowly academic works such as Vine Deloria, Jr.'s, Custer Died for Your Sins (1969), combined with the efforts of civil-rights groups like the American Indian Movement, skewered anthropology's paternalist pretensions. Two major collections of essays -- Dell Hymes's Reinventing Anthropology (1972) and Talal Asad's Anthropology and the Colonial Encounter (1973) -- explored anthropology's colonial legacy and precipitated a critical engagement with the ethics and politics of ethnographic representation (p. 253).

At the conclusion of his essay, discussing Chad Oliver's legacy Collins says:

The lesson of Chad Oliver for sf is that his Campbell-era commitments to the power of technology, rational thinking, and the evolutionary destiny of "humanity" came to seem an enshrinement of a Western imperialist vision that needed to be transcended, through a rethinking of otherness driven by anthropological theory and practice. Above all, Oliver's career speaks to many of the shared impulses and assumptions of anthropology and sf, connections that have only grown more multifarious and complex since his death in 1993 (p. 257).

Ursula K. Le Guin

It has often been observed that Ursula K. Le Guin's interest in anthropology and its influence on her fiction derives from the influence of both her mother Theodora Kroeber, and of her father, Alfred L. Kroeber.

Warren G. Rochelle in his essay on Le Guin notes that from her parents she:

acquired the "anthropological attitude" necessary for the observation of another culture – or for her, the invention of another culture: the recognition and appreciation of cultural diversity, the necessity to be a "close and impartial observer", who is objective, yet recognizes the inescapable subjectivity that comes with participation in an alien culture.

Another critic has observed that Le Guin's "concern with cultural biases is evident throughout her literary career", and continues,

In The Word for World is Forest (1972), for example, she explicitly demonstrates the failure of colonialists to comprehend other cultures, and shows how the desire to dominate and control interferes with the ability to perceive the other. Always Coming Home (1985) is an attempt to allow another culture to speak for itself through songs and music (available in cassette form), writings, and various unclassifiable fragments. Like a documentary, the text presents the audience with pieces of information that they can sift through and examine. But unlike a traditional anthropological documentary, there is no "voice-over" to interpret that information and frame it for them. The absence of "voice-over" commentary in the novel forces the reader to draw conclusions rather than rely on a scientific analysis which would be tainted with cultural blind spots. The novel, consequently, preserves the difference of the alien culture and removes the observing neutral eye from the scene until the very end.

Le Guin's novel The Left Hand of Darkness has been called "the most sophisticated and technically plausible work of anthropological science fiction, insofar as the relationship of culture and biology is concerned", and also rated as "perhaps her most notable book". This novel forms part of Le Guin's Hainish Cycle (so termed because it develops as a whole "a vast story about diverse planets seeded with life by the ancient inhabitants of Hain"). The series is "a densely textured anthropology, unfolding through a cycle of novels and stories and actually populated by several anthropologists and ethnologists". Le Guin employs the SF trope of interstellar travel, which allows for fictional human colonies on other worlds developing widely differing social systems. For example, in The Left Hand of Darkness "a human envoy to the snowbound planet of Gethen struggles to understand its sexually ambivalent inhabitants". Published in 1969, this Le Guin novel:

is only one of many novels that have dealt with androgyny and multiple gender/sex identities through a variety of approaches, from Samuel R. Delany's Triton (1976), Joanna Russ's The Female Man (1975), Marge Piercy's Woman on the Edge of Time (1976), and Marion Zimmer Bradley's Darkover series (1962–1996) to Octavia Butler's Xenogenesis Trilogy (1987–89). Though innovative in its time, it is not its construction of androgyny itself that is remarkable about Le Guin's text. Rather, it is her focus on the way that the androgynes are perceived and how they are constructed within a particular discourse, that of scientific observation. This discourse is manifested specifically in the language of anthropology, the social sciences as a whole, and diplomacy. This focus, in turn, places Le Guin's novel within a body of later works – such as Mary Gentle's Golden Witchbreed novels (1984–87) and C. J. Cherryh's Foreigner series (1994–96) – that deal with an outside observer's arrival on an alien planet, all of which indicate the difficulty of translating the life-style of an alien species into a language and cultural experience that is comprehensible. As such, these texts provide critiques of anthropological discourse that are similar to Trinh Minh-ha's attempts to problematize the colonialist beginnings and imperialistic undertones of anthropology as a science.

Geoffrey Samuel has pointed out some specific anthropological aspects of Le Guin's fiction, noting that:

the culture of the people of Gethen in The Left Hand of Darkness clearly owes a lot to North-West Coast Indian and Eskimo culture; the role of dreams on Athshe (in The Word for World is Forest) is very reminiscent of that described for the Temiar people of Malaysia; and the idea of a special vocabulary of terms of address correlated with a hierarchy of knowledge, in City of Illusions, recalls the honorific terminologies of many Far Eastern cultures (such as Java or Tibet).

However, Fredric Jameson says of The Left Hand of Darkness that the novel is "constructed from a heterogeneous group of narrative modes ...", and that:

... we find here intermingled: the travel narrative (with anthropological data), the pastiche myth, the political novel (in the restricted sense of the drama of court intrigue), straight SF (the Hainish colonization, the spaceship in orbit around Gethen's sun), Orwellian dystopia ..., adventure story ..., and finally even, something like a multiracial love story (the drama of communication between the two cultures and species).

Similarly, Adam Roberts warns against too narrow an interpretation of Le Guin's fiction, pointing out that her writing is always balanced and that "balance as such forms one of her major concerns. Both Left Hand and The Dispossessed (1974) balance form to theme, symbol to narration, flawlessly". Nevertheless, there is no doubt that the novel The Left Hand of Darkness is steeped in anthropological thought, with one academic critic noting that "the theories of [French anthropologist] Claude Lévi-Strauss provide an access to understanding the workings of the myths" in the novel. Later in the essay the author explains:

Unlike the open-ended corpus of actual myths that anthropologists examine, the corpus of myths in The Left Hand of Darkness is closed and complete. Therefore, it is possible to analyze the entire set of Gethenian myths and establish the ways in which they are connected. Kinship exchange, in the Lévi-Straussian sense, comprises their dominant theme. In them, Le Guin articulates the theme of exchange by employing contrary images – heat and cold, dark and light, home and exile, name and namelessness, life and death, murder and sex – so as finally to reconcile their contrariety. The myths present wholeness, or unity, as an ideal; but that wholeness is never merely the integrity of an individual who stands apart from society. Instead, it consists of the tenuous and temporary integration of individuals into social units.

 

Biocultural anthropology

From Wikipedia, the free encyclopedia

Biocultural anthropology can be defined in numerous ways. It is the scientific exploration of the relationships between human biology and culture. "Instead of looking for the underlying biological roots of human behavior, biocultural anthropology attempts to understand how culture affects our biological capacities and limitations."

History

Physical anthropologists throughout the first half of the 20th century viewed this relationship from a racial perspective; that is, from the assumption that typological human biological differences lead to cultural differences. After World War II, the emphasis began to shift toward exploring the role culture plays in shaping human biology. This shift led to the development of dual inheritance theory in the 1960s. In relation to, and following, the development of dual inheritance theory, the concept of biocultural evolution was introduced and first used in the 1970s.

Key research

  • Biocultural approaches to human biology have been utilized since at least 1958, when American biological anthropologist Frank B. Livingstone contributed early research explaining the linkages among population growth, subsistence strategy, and the distribution of the sickle cell gene in Liberia.
  • Human adaptability research in the 1960s focused on two biocultural approaches to fatigue: functional differentiation of skeletal muscles associated with various movements, and human adaptability to modern living involving different work types.
  • "What's Cultural about Biocultural Research", written by William W. Dressler, connects the cultural perspective of biocultural anthropology to "cultural consonance", defined as "a model to assess the approximation of an individual's behavior compared to the guiding awareness of his or her culture". This research has been used to examine outcomes in blood pressure, depressive symptoms, body composition, and dietary habits.
  • Dr. Romendro Khongsdier's approach to the study of human variation and evolution.
  • "Building a New Biocultural Synthesis" by Alan H. Goodman and Thomas L. Leatherman.
  • "New Directions in Biocultural Anthropology", edited by Molly Zuckerman and Debra Martin, draws on case studies from around the world to show how biocultural anthropology can illuminate the relationship between biology and culture in both past and present populations.
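
Livingstone's sickle-cell findings are the textbook case of biocultural balancing selection: a cultural shift (slash-and-burn agriculture) intensified malaria, which changed the fitness landscape of a gene. As a rough sketch of the underlying population-genetics logic (a standard overdominance model with invented selection coefficients, not Livingstone's actual Liberian data), the sickle allele settles at an equilibrium frequency set by the fitness costs of the two homozygotes:

```python
# Heterozygote advantage (overdominance) with hypothetical fitnesses:
#   AA: 1 - s  (susceptible to malaria)
#   AS: 1      (protected, no sickle-cell disease)
#   SS: 1 - t  (sickle-cell disease)
# Standard result: the S allele equilibrates at q* = s / (s + t).

def sickle_equilibrium(s, t):
    """Closed-form equilibrium frequency of the S allele under overdominance."""
    return s / (s + t)

def next_generation(q, s, t):
    """One generation of viability selection on the frequency q of allele S."""
    p = 1 - q
    w_bar = p * p * (1 - s) + 2 * p * q + q * q * (1 - t)  # mean fitness
    return (p * q + q * q * (1 - t)) / w_bar  # post-selection frequency of S

# Invented selection coefficients for illustration only
s, t = 0.15, 0.80
q = 0.01  # rare sickle allele before malaria pressure
for _ in range(500):
    q = next_generation(q, s, t)

print(round(q, 3), round(sickle_equilibrium(s, t), 3))
```

The iterated model converges on the closed-form value s/(s + t), which is why a cultural change that raises the malaria cost s can shift a gene's geographic distribution.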

Contemporary biocultural anthropology

Biocultural methods focus on the interactions between humans and their environment to understand human biological adaptation and variation. Contemporary biocultural anthropologists view culture as having several key roles in human biological variation:

  • Culture is a major human adaptation, permitting individuals and populations to adapt to widely varying local ecologies.
  • Characteristic human biological or biobehavioral features, such as a large frontal cortex and intensive parenting compared to other primates, are viewed in part as an adaptation to the complex social relations created by culture.
  • Culture shapes the political economy, thereby influencing what resources are available to individuals to feed and shelter themselves, protect themselves from disease, and otherwise maintain their health.
  • Culture shapes the way people think about the world, altering their biology by influencing their behavior (e.g., food choice) or more directly through psychosomatic effects (e.g., the biological effects of psychological stress).

While biocultural anthropologists are found in many academic anthropology departments, usually as a minority of the faculty, certain departments have placed considerable emphasis on the "biocultural synthesis". Historically, this has included Emory University, the University of Alabama, UMass Amherst (especially in biocultural bioarchaeology), and the University of Washington, each of which built Ph.D. programs around biocultural anthropology; Binghamton University, which has an M.S. program in biomedical anthropology; Oregon State University, the University of Kentucky, and others. Paul Baker, an anthropologist at Penn State whose work focused upon human adaptation to environmental variations, is credited with having popularized the concept of "biocultural" anthropology as a distinct subcategory of anthropology in general. Khongsdier argues that biocultural anthropology is the future of anthropology because it serves as a guiding force towards greater integration of the subdisciplines.

Controversy

Other anthropologists, both biological and cultural, have criticized the biocultural synthesis, generally as part of a broader critique of "four-field holism" in U.S. anthropology. Typically such criticisms rest on the belief that biocultural anthropology imposes holism upon the biological and cultural subfields without adding value, or even destructively. For instance, contributors in the edited volume Unwrapping the Sacred Bundle: Reflections on the Disciplining of Anthropology argued that the biocultural synthesis, and anthropological holism more generally, are artifacts from 19th century social evolutionary thought that inappropriately impose scientific positivism upon cultural anthropology.

Some departments of anthropology have fully split, usually dividing scientific from humanistic anthropologists, such as Stanford's highly publicized 1998 division into departments of "Cultural and Social Anthropology" and "Anthropological Sciences". Underscoring the continuing controversy, this split is now being reversed over the objections of some faculty. Other departments, such as at Harvard, have distinct biological and sociocultural anthropology "wings" not designed to foster cross-subdisciplinary interchange.

Biocultural research presents several challenges to the researcher. "In general we are much more experienced in measuring the biological than the cultural." It is also difficult to precisely define constructs such as socioeconomic status, poverty, rural, and urban; key variables must be operationalized so that they can be measured in ways that are ethnographically valid as well as replicable; and multiple causal pathways must be defined and measured.

Philosophy of biology

From Wikipedia, the free encyclopedia

The philosophy of biology is a subfield of philosophy of science, which deals with epistemological, metaphysical, and ethical issues in the biological and biomedical sciences. Although philosophers of science and philosophers generally have long been interested in biology (e.g., Aristotle, Descartes, and even Kant), philosophy of biology only emerged as an independent field of philosophy in the 1960s and 1970s. Philosophers of science then began paying increasing attention to biology, from the rise of neo-Darwinism in the 1930s and 1940s to the discovery of the structure of DNA in 1953 to more recent advances in genetic engineering. Other key ideas include the reduction of all life processes to biochemical reactions, and the incorporation of psychology into a broader neuroscience.

Overview

Philosophers of biology examine the practices, theories, and concepts of biologists with a view toward better understanding biology as a scientific discipline (or group of scientific fields). Scientific ideas are philosophically analyzed and their consequences are explored. Philosophers of biology have also explored how our understanding of biology relates to epistemology, ethics, aesthetics, and metaphysics and whether progress in biology should compel modern societies to rethink traditional values concerning all aspects of human life. It is sometimes difficult to separate the philosophy of biology from theoretical biology. Questions addressed by philosophers of biology include:

  • "What is a biological species?"
  • "What is natural selection, and how does it operate in nature?"
  • "How should we distinguish disease states from non-disease states?"
  • "What is life?"
  • "What makes humans uniquely human?"
  • "What is the basis of moral thinking?"
  • "How is rationality possible, given our biological origins?"
  • "Is evolution compatible with Christianity or other religious systems?"

Increasingly, ideas drawn from philosophical ontology and logic are being used by biologists in the domain of bioinformatics. Ontologies such as the Gene Ontology are being used to annotate the results of biological experiments in a variety of model organisms in order to create logically tractable bodies of data available for reasoning and search. The Gene Ontology itself is a species-neutral graph-theoretical representation of biological types joined together by formally defined relations.
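The paragraph above describes the Gene Ontology as a graph of biological types joined by formally defined relations. A minimal sketch of how such a structure makes annotations "logically tractable" (using a toy `is_a` hierarchy with invented term names, not the real GO identifiers or API):

```python
# Toy ontology: each term maps to its is_a parents, forming a directed
# acyclic graph, as in the Gene Ontology. Terms here are invented examples.
IS_A = {
    "glucose metabolic process": ["carbohydrate metabolic process"],
    "carbohydrate metabolic process": ["metabolic process"],
    "metabolic process": ["biological process"],
    "biological process": [],
}

def ancestors(term):
    """All terms reachable via is_a edges (the transitive closure)."""
    seen = set()
    stack = [term]
    while stack:
        for parent in IS_A[stack.pop()]:
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

# A gene annotated to a specific term is implicitly annotated to every
# ancestor, so a search for "metabolic process" also retrieves genes
# annotated to "glucose metabolic process".
annotation = "glucose metabolic process"
print(sorted({annotation} | ancestors(annotation)))
```

This propagation over the graph is what allows experimental results annotated at different levels of specificity, and in different model organisms, to be pooled for reasoning and search.
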

Philosophy of biology today has become a visible, well-organized discipline - with its own journals, conferences, and professional organizations. The largest of the latter is the International Society for the History, Philosophy, and Social Studies of Biology (ISHPSSB).

Biological Laws and Autonomy of Biology

A prominent question in the philosophy of biology is whether or not there can be distinct biological laws in the way there are distinct physical laws.

Scientific reductionism is the view that higher-level biological processes reduce to physical and chemical processes. For example, the biological process of respiration is explained as a biochemical process involving oxygen and carbon dioxide. Some philosophers of biology have attempted to answer the question of whether all biological processes reduce to physical or chemical ones. On the reductionist view, there would be no distinctly biological laws.

Holism is the view that emphasizes higher-level processes, phenomena at a larger level that occur due to the pattern of interactions between the elements of a system over time. For example, to explain why one species of finch survives a drought while others die out, the holistic method looks at the entire ecosystem. Reducing an ecosystem to its parts in this case would be less effective at explaining overall behavior (in this case, the decrease in biodiversity). As individual organisms must be understood in the context of their ecosystems, holists argue, so must lower-level biological processes be understood in the broader context of the living organism in which they take part. Proponents of this view cite our growing understanding of the multidirectional and multilayered nature of gene modulation (including epigenetic changes) as an area where a reductionist view is inadequate for full explanatory power.

All processes in organisms obey physical laws, but some argue that the difference between inanimate and biological processes is that the organisation of biological properties is subject to control by coded information. This has led some biologists and philosophers (for example, Ernst Mayr and David Hull) to return to the strictly philosophical reflections of Charles Darwin to resolve some of the problems which confronted them when they tried to employ a philosophy of science derived from classical physics. The positivist approach used in physics emphasised a strict determinism (as opposed to high probability) and led to the discovery of universally applicable laws, testable in the course of experiment. It was difficult for biology, beyond a basic microbiological level, to use this approach. Standard philosophy of science seemed to leave out a lot of what characterised living organisms - namely, a historical component in the form of an inherited genotype.

Philosophers of biology have also examined the notion of “teleology.” Some have argued that scientists have had no need for a notion of cosmic teleology that can explain and predict evolution, since one was provided by Darwin. But teleological explanations relating to purpose or function have remained useful in biology, for example, in explaining the structural configuration of macromolecules and the study of co-operation in social systems. By clarifying and restricting the use of the term “teleology” to describe and explain systems controlled strictly by genetic programmes or other physical systems, teleological questions can be framed and investigated while remaining committed to the physical nature of all underlying organic processes. While some philosophers claim that the ideas of Charles Darwin ended the last remainders of teleology in biology, the matter continues to be debated. Debates in these areas of philosophy of biology turn on how one views reductionism more generally.

Ethical Implications of Biology

Sharon Street claims that contemporary evolutionary biological theory creates what she calls a "Darwinian Dilemma" for realists. She argues that this is because it is unlikely that our evaluative judgements about morality are tracking anything true about the world. Rather, she says, it is likely that moral judgements and intuitions that promote our reproductive fitness were selected for, and there is no reason to think "true" moral intuitions would be selected for as well. She notes that a moral intuition most people share, that someone being a close family member is a prima facie good reason to help them, happens to be an intuition likely to increase reproductive fitness, while a moral intuition almost no one has, that someone being a close family member is a reason not to help them, is likely to decrease reproductive fitness.

David Copp responded to Street by arguing that realists can avoid this so-called dilemma by accepting what he calls a "quasi-tracking" position. Copp explains that what he means by quasi-tracking is that it is likely that moral positions in a given society would have evolved to be at least somewhat close to the truth. He justifies this by appealing to the claim that the purpose of morality is to allow a society to meet certain basic needs, such as social stability, and a society with a successful moral code would be better at doing this.

Other perspectives

While the overwhelming majority of English-speaking scholars operating under the banner of "philosophy of biology" work within the Anglo-American tradition of analytical philosophy, there is a stream of philosophic work in continental philosophy which seeks to deal with issues deriving from biological science. The communication difficulties involved between these two traditions are well known, not helped by differences in language. Gerhard Vollmer is often thought of as a bridge but, despite his education and residence in Germany, he largely works in the Anglo-American tradition, particularly pragmatism, and is famous for his development of Konrad Lorenz's and Willard Van Orman Quine's idea of evolutionary epistemology. On the other hand, one scholar who has attempted to give a more continental account of the philosophy of biology is Hans Jonas. His "The Phenomenon of Life" (New York, 1966) sets out boldly to offer an "existential interpretation of biological facts", starting with the organism's response to stimulus and ending with man confronting the Universe, and drawing upon a detailed reading of phenomenology. This is unlikely to have much influence on mainstream philosophy of biology, but indicates, as does Vollmer's work, the current powerful influence of biological thought on philosophy. Another account is given by the late Virginia Tech philosopher Marjorie Grene.

Another perspective on the philosophy of biology is how developments in modern biological research and biotechnologies have influenced traditional philosophical ideas about the distinction between biology and technology, as well as implications for ethics, society, and culture. An example is the work of philosopher Eugene Thacker in his book Biomedia. Building on current research in fields such as bioinformatics and biocomputing, as well as on work in the history of science (particularly the work of Georges Canguilhem, Lily E. Kay, and Hans-Jörg Rheinberger), Thacker defines biomedia in the following way: "Biomedia entail the informatic recontextualization of biological components and processes, for ends that may be medical or non-medical...biomedia continuously make the dual demand that information materialize itself as gene or protein compounds. This point cannot be overstated: biomedia depend upon an understanding of biological as informational but not immaterial."

Some approaches to the philosophy of biology incorporate perspectives from science studies and/or science and technology studies, anthropology, sociology of science, and political economy. This includes work by scholars such as Melinda Cooper, Luciana Parisi, Paul Rabinow, Kaushik Sundar Rajan, Nikolas Rose, and Catherine Waldby.

Philosophy of biology was historically associated very closely with theoretical evolutionary biology; more recently, however, the field has diversified, with movements to examine, for instance, molecular biology.

Scientific discovery process

Research in biology continues to be less guided by theory than research in other sciences. This is especially the case given the availability of high-throughput screening techniques for the various "-omics" fields such as genomics, whose complexity makes them predominantly data-driven. Such data-intensive scientific discovery is considered by some to be the fourth paradigm, after empiricism, theory, and computer simulation. Others reject the idea that data-driven research is about to replace theory. As Krakauer et al. put it: "machine learning is a powerful means of preprocessing data in preparation for mechanistic theory building, but should not be considered the final goal of a scientific inquiry." In regard to cancer biology, Raspe et al. state: "A better understanding of tumor biology is fundamental for extracting the relevant information from any high throughput data." The journal Science chose cancer immunotherapy as the breakthrough of 2013. According to its explanation, a lesson to be learned from the successes of cancer immunotherapy is that they emerged from the decoding of basic biology.

Theory in biology is to some extent less strictly formalized than in physics. Besides 1) classic mathematical-analytical theory, as in physics, there is 2) statistics-based theory, 3) computer simulation, and 4) conceptual/verbal analysis. Dougherty and Bittner argue that for biology to progress as a science, it has to move to more rigorous mathematical modeling, or otherwise risk being "empty talk".
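
As a toy contrast between styles 1) and 3) above (using the standard logistic growth model, chosen for illustration rather than drawn from the cited authors), a closed-form analytical solution can be checked against a step-by-step simulation of the same process:

```python
import math

# Logistic growth dN/dt = r*N*(1 - N/K):
# style 1 (mathematical-analytical) gives a closed-form solution,
# style 3 (computer simulation) integrates the dynamics numerically.
r, K, N0 = 0.5, 1000.0, 10.0  # illustrative growth rate, capacity, start size

def analytic(t):
    """Exact solution of the logistic equation at time t."""
    return K / (1 + (K / N0 - 1) * math.exp(-r * t))

def simulate(t_end, dt=1e-3):
    """Forward-Euler integration of the same dynamics."""
    N, t = N0, 0.0
    while t < t_end:
        N += dt * r * N * (1 - N / K)
        t += dt
    return N

print(round(analytic(10.0)), round(simulate(10.0)))
```

The two approaches agree here because the model is simple enough to solve exactly; for most biological systems no closed form exists, which is why simulation and statistical styles dominate in practice.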

In tumor biology research, the characterization of cellular signaling processes has largely focused on identifying the function of individual genes and proteins. Janes, however, showed the context-dependent nature of the signaling that drives cell decisions, demonstrating the need for a more systems-based approach. The lack of attention to context dependency in preclinical research is also illustrated by the observation that preclinical testing rarely includes predictive biomarkers that, when advanced to clinical trials, will help to distinguish those patients who are likely to benefit from a drug.


 

Lie point symmetry

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Lie_point_symmetry     ...