
Wednesday, January 16, 2019

Hermeneutics

From Wikipedia, the free encyclopedia

Hermeneutics (/ˌhɜːrməˈnjuːtɪks/) is the theory and methodology of interpretation, especially the interpretation of biblical texts, wisdom literature, and philosophical texts.

Modern hermeneutics includes both verbal and non-verbal communication as well as semiotics, presuppositions, and pre-understandings. Hermeneutics has been broadly applied in the humanities, especially in law, history and theology.

Hermeneutics was initially applied to the interpretation, or exegesis, of scripture, and has been later broadened to questions of general interpretation. The terms hermeneutics and exegesis are sometimes used interchangeably. Hermeneutics is a wider discipline which includes written, verbal, and non-verbal communication. Exegesis focuses primarily upon the word and grammar of texts.

Hermeneutic, as a count noun in the singular, refers to some particular method of interpretation.

Etymology

Hermeneutics is derived from the Greek word ἑρμηνεύω (hermēneuō, "translate, interpret"), from ἑρμηνεύς (hermeneus, "translator, interpreter"), of uncertain etymology (R. S. P. Beekes (2009) suggests a Pre-Greek origin). The technical term ἑρμηνεία (hermeneia, "interpretation, explanation") was introduced into philosophy mainly through the title of Aristotle's work Περὶ Ἑρμηνείας ("Peri Hermeneias"), commonly referred to by its Latin title De Interpretatione and translated into English as On Interpretation. It is one of the earliest (c. 360 B.C.) extant philosophical works in the Western tradition to deal with the relationship between language and logic in a comprehensive, explicit and formal way.

The early usage of "hermeneutics" places it within the boundaries of the sacred. A divine message must be received with implicit uncertainty regarding its truth. This ambiguity is an irrationality; it is a sort of madness that is inflicted upon the receiver of the message. Only one who possesses a rational method of interpretation (i.e., a hermeneutic) could determine the truth or falsity of the message.

Folk etymology

Hermes, messenger of the gods.
 
Folk etymology places its origin with Hermes, the mythological Greek deity who was the 'messenger of the gods'. Besides being a mediator between the gods and between the gods and men, he led souls to the underworld upon death. 

Hermes was also considered to be the inventor of language and speech, an interpreter, a liar, a thief and a trickster. These multiple roles made Hermes an ideal representative figure for hermeneutics. As Socrates noted, words have the power to reveal or conceal and can deliver messages in an ambiguous way. The Greek view of language as consisting of signs that could lead to truth or to falsehood was the essence of Hermes, who was said to relish the uneasiness of those who received the messages he delivered.

In religious traditions

Talmudic hermeneutics

Summaries of the principles by which the Torah can be interpreted date back at least to Hillel the Elder, although the thirteen principles set forth in the Baraita of Rabbi Ishmael are perhaps the best known. These principles ranged from standard rules of logic (e.g., the a fortiori argument [known in Hebrew as קל וחומר — kal v'chomer]) to more expansive ones, such as the rule that a passage could be interpreted by reference to another passage in which the same word appears (Gezerah Shavah). The rabbis did not ascribe equal persuasive power to the various principles.

Traditional Jewish hermeneutics differed from the Greek method in that the rabbis considered the Tanakh (the Jewish biblical canon) to be without error. Any apparent inconsistencies had to be understood by means of careful examination of a given text within the context of other texts. There were different levels of interpretation: some were used to arrive at the plain meaning of the text, some expounded the law given in the text, and others found secret or mystical levels of understanding.

Vedic hermeneutics

Vedic hermeneutics involves the exegesis of the Vedas, the earliest holy texts of Hinduism. The Mimamsa was the leading hermeneutic school, and its primary purpose was understanding what Dharma (righteous living) involved through a detailed hermeneutic study of the Vedas. It also derived the rules for the various rituals that had to be performed precisely.

The foundational text is the Mimamsa Sutra of Jaimini (ca. 3rd to 1st century BCE) with a major commentary by Śabara (ca. the 5th or 6th century CE). The Mimamsa sutra summed up the basic rules for Vedic interpretation.

Buddhist hermeneutics

Buddhist hermeneutics deals with the interpretation of the vast Buddhist literature, particularly those texts which are said to be spoken by the Buddha (Buddhavacana) and other enlightened beings. Buddhist hermeneutics is deeply tied to Buddhist spiritual practice and its ultimate aim is to extract skillful means of reaching spiritual enlightenment or nirvana. A central question in Buddhist hermeneutics is which Buddhist teachings are explicit, representing ultimate truth, and which teachings are merely conventional or relative.

Biblical hermeneutics

Biblical hermeneutics is the study of the principles of interpretation of the Bible. While Jewish and Christian biblical hermeneutics have some overlap, they have distinctly different interpretive traditions. 

The early patristic traditions of biblical exegesis had few unifying characteristics in the beginning but tended toward unification in later schools of biblical hermeneutics. 

Augustine offers hermeneutics and homiletics in his De doctrina christiana. He stresses the importance of humility in the study of Scripture. He also regards the duplex commandment of love in Matthew 22 as the heart of Christian faith. In Augustine’s hermeneutics, signs have an important role. God can communicate with the believer through the signs of the Scriptures. Thus, humility, love, and the knowledge of signs are an essential hermeneutical presupposition for a sound interpretation of the Scriptures. Although Augustine endorses some teaching of the Platonism of his time, he corrects and recasts it according to a theocentric doctrine of the Bible. Similarly, in a practical discipline, he modifies the classical theory of oratory in a Christian way. He underscores the meaning of diligent study of the Bible and prayer as more than mere human knowledge and oratory skills. As a concluding remark, Augustine encourages the interpreter and preacher of the Bible to seek a good manner of life and, most of all, to love God and neighbor.

There is a traditional fourfold sense of biblical hermeneutics: literal, moral, allegorical (spiritual), and anagogical.

Literal

Encyclopædia Britannica states that literal analysis means “a biblical text is to be deciphered according to the ‘plain meaning’ expressed by its linguistic construction and historical context.” The intention of the authors is believed to correspond to the literal meaning. Literal hermeneutics is often associated with the verbal inspiration of the Bible.

Moral

Moral interpretation searches for moral lessons which can be understood from writings within the Bible. Allegories are often placed in this category.

Allegorical

Allegorical interpretation states that biblical narratives have a second level of reference that is more than the people, events and things that are explicitly mentioned. One type of allegorical interpretation is known as typological, where the key figures, events, and establishments of the Old Testament are viewed as “types” (patterns). In the New Testament this can also include foreshadowing of people, objects, and events. According to this theory, readings like Noah’s Ark could be understood by using the Ark as a “type” of the Christian church that God designed from the start.

Anagogical

This type of interpretation is more often known as mystical interpretation. It purports to explain the events of the Bible and how they relate to or predict what the future holds. This is evident in the Jewish Kabbalah, which attempts to reveal the mystical significance of the numerical values of Hebrew words and letters. 

In Judaism, anagogical interpretation is also evident in the medieval Zohar. In Christianity, it can be seen in Mariology.

Philosophical hermeneutics

Modern hermeneutics

The discipline of hermeneutics emerged with the new humanist education of the 15th century as a historical and critical methodology for analyzing texts. In a triumph of early modern hermeneutics, the Italian humanist Lorenzo Valla proved in 1440 that the Donation of Constantine was a forgery. This was done through intrinsic evidence of the text itself. Thus hermeneutics expanded from its medieval role of explaining the true meaning of the Bible.

However, biblical hermeneutics did not die off. For example, the Protestant Reformation brought about a renewed interest in the interpretation of the Bible, which took a step away from the interpretive tradition developed during the Middle Ages back to the texts themselves. Martin Luther and John Calvin emphasized scriptura sui ipsius interpres (scripture interprets itself). Calvin used brevitas et facilitas as an aspect of theological hermeneutics.

The rationalist Enlightenment led hermeneutists, especially Protestant exegetes, to view Scriptural texts as secular classical texts. They interpreted Scripture as responses to historical or social forces so that, for example, apparent contradictions and difficult passages in the New Testament might be clarified by comparing their possible meanings with contemporary Christian practices.

Friedrich Schleiermacher (1768–1834) explored the nature of understanding in relation not just to the problem of deciphering sacred texts but to all human texts and modes of communication.

The interpretation of a text must proceed by framing its content in terms of the overall organization of the work. Schleiermacher distinguished between grammatical interpretation and psychological interpretation. The former studies how a work is composed from general ideas; the latter studies the peculiar combinations that characterize the work as a whole. He said that every problem of interpretation is a problem of understanding and even defined hermeneutics as the art of avoiding misunderstanding. Misunderstanding was to be avoided by means of knowledge of grammatical and psychological laws.

During Schleiermacher's time, a fundamental shift occurred from understanding merely the exact words and their objective meaning to an understanding of the writer's distinctive character and point of view.

19th- and 20th-century hermeneutics emerged as a theory of understanding (Verstehen) through the work of Friedrich Schleiermacher (Romantic hermeneutics and methodological hermeneutics), August Böckh (methodological hermeneutics), Wilhelm Dilthey (epistemological hermeneutics), Martin Heidegger (ontological hermeneutics, hermeneutic phenomenology, and transcendental hermeneutic phenomenology), Hans-Georg Gadamer (ontological hermeneutics), Leo Strauss (Straussian hermeneutics), Paul Ricœur (hermeneutic phenomenology), Walter Benjamin (Marxist hermeneutics), Ernst Bloch (Marxist hermeneutics), Jacques Derrida (radical hermeneutics, namely deconstruction), Richard Kearney (diacritical hermeneutics), Fredric Jameson (Marxist hermeneutics), and John Thompson (critical hermeneutics). 

Regarding the relation of hermeneutics with problems of analytic philosophy, there has been, particularly among analytic Heideggerians and those working on Heidegger’s philosophy of science, an attempt to try and situate Heidegger's hermeneutic project in debates concerning realism and anti-realism: arguments have been presented both for Heidegger's hermeneutic idealism (the thesis that meaning determines reference or, equivalently, that our understanding of the being of entities is what determines entities as entities) and for Heidegger's hermeneutic realism (the thesis that (a) there is a nature in itself and science can give us an explanation of how that nature works, and (b) that (a) is compatible with the ontological implications of our everyday practices).

Philosophers who have worked to combine analytic philosophy with hermeneutics include Georg Henrik von Wright and Peter Winch. Roy J. Howard termed this approach analytic hermeneutics.

Other contemporary philosophers influenced by the hermeneutic tradition include Charles Taylor (engaged hermeneutics) and Dagfinn Føllesdal.

Dilthey (1833–1911)

Wilhelm Dilthey broadened hermeneutics even more by relating interpretation to historical objectification. Understanding moves from the outer manifestations of human action and productivity to the exploration of their inner meaning. In his last important essay, "The Understanding of Other Persons and Their Manifestations of Life" (1910), Dilthey made clear that this move from outer to inner, from expression to what is expressed, is not based on empathy. Empathy involves a direct identification with the Other. Interpretation involves an indirect or mediated understanding that can only be attained by placing human expressions in their historical context. Thus, understanding is not a process of reconstructing the state of mind of the author, but one of articulating what is expressed in his work. 

Dilthey divided sciences of the mind (human sciences) into three structural levels: experience, expression, and comprehension.
  • Experience means to feel a situation or thing personally. Dilthey suggested that we can always grasp the meaning of unknown thought when we try to experience it. His understanding of experience is very similar to that of phenomenologist Edmund Husserl.
  • Expression converts experience into meaning because the discourse has an appeal to someone outside of oneself. Every saying is an expression. Dilthey suggested that one can always return to an expression, especially to its written form, and this practice has the same objective value as an experiment in science. The possibility of returning makes scientific analysis possible, and therefore the humanities may be labeled as science. Moreover, he assumed that an expression may be "saying" more than the speaker intends because the expression brings forward meanings which the individual consciousness may not fully understand.
  • The last structural level of the science of the mind, according to Dilthey, is comprehension, which is a level that contains both comprehension and incomprehension. Incomprehension means, more or less, wrong understanding. He assumed that comprehension produces coexistence: "he who understands, understands others; he who does not understand stays alone."

Heidegger (1889–1976)

In the 20th century, Martin Heidegger's philosophical hermeneutics shifted the focus from interpretation to existential understanding as rooted in fundamental ontology, which was treated more as a direct — and thus more authentic — way of being-in-the-world (In-der-Welt-sein) than merely as "a way of knowing." For example, he called for a "special hermeneutic of empathy" to dissolve the classic philosophic issue of "other minds" by putting the issue in the context of the being-with of human relatedness. (Heidegger himself did not complete this inquiry.)

Advocates of this approach claim that some texts, and the people who produce them, cannot be studied using the same scientific methods as the natural sciences, thus drawing upon arguments similar to those of antipositivism. Moreover, they claim that such texts are conventionalized expressions of the experience of the author. Thus, the interpretation of such texts will reveal something about the social context in which they were formed and, more significantly, will provide the reader with a means of sharing the experiences of the author.

The reciprocity between text and context is part of what Heidegger called the hermeneutic circle. Among the key thinkers who elaborated this idea was the sociologist Max Weber.

Gadamer (1900–2002) et al.

Hans-Georg Gadamer's hermeneutics is a development of the hermeneutics of his teacher, Heidegger. Gadamer asserted that methodical contemplation is opposite to experience and reflection. We can reach the truth only by understanding or mastering our experience. According to Gadamer, our understanding is not fixed but rather is changing and always indicating new perspectives. The most important thing is to unfold the nature of individual understanding.

Gadamer pointed out that prejudice is an element of our understanding and is not per se without value. Indeed, prejudices, in the sense of prejudgments of the thing we want to understand, are unavoidable. Being alien to a particular tradition is a condition of our understanding. He said that we can never step outside of our tradition; all we can do is try to understand it. This further elaborates the idea of the hermeneutic circle.

Bernard Lonergan's (1904–1984) hermeneutics is less well known, but a case for considering his work as the culmination of the postmodern hermeneutical revolution that began with Heidegger was made in several articles by Lonergan specialist Frederick G. Lawrence.

Paul Ricœur (1913–2005) developed a hermeneutics that is based upon Heidegger's concepts. His work differs in many ways from that of Gadamer.

Karl-Otto Apel (b. 1922) elaborated a hermeneutics based on American semiotics. He applied his model to discourse ethics with political motivations akin to those of critical theory.

Jürgen Habermas (b. 1929) criticized the conservatism of previous hermeneutists, especially Gadamer, because their focus on tradition seemed to undermine possibilities for social criticism and transformation. He also criticized Marxism and previous members of the Frankfurt School for missing the hermeneutical dimension of critical theory.

Habermas incorporated the notion of the lifeworld and emphasized the importance for social theory of interaction, communication, labor, and production. He viewed hermeneutics as a dimension of critical social theory.

Andrés Ortiz-Osés (b. 1943) has developed his symbolic hermeneutics as the Mediterranean response to Northern European hermeneutics. His main statement regarding symbolic understanding of the world is that meaning is a symbolic healing of injury.

Two other important hermeneutic scholars are Jean Grondin (b. 1955) and Maurizio Ferraris (b. 1956). 

Mauricio Beuchot coined the term and discipline of analogic hermeneutics, which is a type of hermeneutics that is based upon interpretation and takes into account the plurality of aspects of meaning. He drew categories both from analytic and continental philosophy, as well as from the history of thought.

Two scholars who have published criticism of Gadamer's hermeneutics are the Italian jurist Emilio Betti and the American literary theorist E. D. Hirsch.

New hermeneutic

New hermeneutic is the theory and methodology of interpretation that seeks to understand Biblical texts through existentialism. The essence of new hermeneutic emphasizes not only the existence of language but also the fact that language becomes an event in the history of an individual life. This is called the event of language. Ernst Fuchs, Gerhard Ebeling, and James M. Robinson are the scholars who represent the new hermeneutic.

Marxist hermeneutics

The method of Marxist hermeneutics has been developed primarily in the work of Walter Benjamin and Fredric Jameson. Benjamin outlines his theory of allegory in his study Ursprung des deutschen Trauerspiels ("Trauerspiel" literally means "mourning play" but is often translated as "tragic drama"). Fredric Jameson draws on Biblical hermeneutics, Ernst Bloch, and the work of Northrop Frye to advance his theory of Marxist hermeneutics in his influential The Political Unconscious. Jameson's Marxist hermeneutics is outlined in the first chapter of the book, titled "On Interpretation". Jameson re-interprets (and secularizes) the fourfold system (or four levels) of Biblical exegesis (literal; moral; allegorical; anagogical) to relate interpretation to the mode of production and, eventually, to history.

Objective hermeneutics

Karl Popper first used the term "objective hermeneutics" in his Objective Knowledge (1972).

In 1992, the Association for Objective Hermeneutics (AGOH) was founded in Frankfurt am Main by scholars of various disciplines in the humanities and social sciences. Its goal is to provide all scholars who use the methodology of objective hermeneutics with a means of exchanging information.

In one of the few translated texts of this German school of hermeneutics, its founders declared:
Our approach has grown out of the empirical study of family interactions as well as reflection upon the procedures of interpretation employed in our research. For the time being we shall refer to it as objective hermeneutics in order to distinguish it clearly from traditional hermeneutic techniques and orientations. The general significance for sociological analysis of objective hermeneutics issues from the fact that, in the social sciences, interpretive methods constitute the fundamental procedures of measurement and of the generation of research data relevant to theory. From our perspective, the standard, nonhermeneutic methods of quantitative social research can only be justified because they permit a shortcut in generating data (and research "economy" comes about under specific conditions). Whereas the conventional methodological attitude in the social sciences justifies qualitative approaches as exploratory or preparatory activities, to be succeeded by standardized approaches and techniques as the actual scientific procedures (assuring precision, validity, and objectivity), we regard hermeneutic procedures as the basic method for gaining precise and valid knowledge in the social sciences. However, we do not simply reject alternative approaches dogmatically. They are in fact useful wherever the loss in precision and objectivity necessitated by the requirement of research economy can be condoned and tolerated in the light of prior hermeneutically elucidated research experiences.

Applications

Archaeology

In archaeology, hermeneutics means the interpretation and understanding of material through analysis of possible meanings and social uses. 

Proponents argue that the interpretation of artifacts is unavoidably hermeneutic because we cannot know for certain the meaning behind them; we can only apply modern values when interpreting. This is most commonly seen in stone tools, where descriptions such as "scraper" are highly subjective and were actually unproven until the development of microwear analysis some thirty years ago.

Opponents argue that a hermeneutic approach is too relativist and that their own interpretations are based on common-sense evaluation.

Architecture

There are several traditions of architectural scholarship that draw upon the hermeneutics of Heidegger and Gadamer, such as Christian Norberg-Schulz, and Nader El-Bizri in the circles of phenomenology. Lindsay Jones examines the way architecture is received and how that reception changes with time and context (e.g., how a building is interpreted by critics, users, and historians). Dalibor Vesely situates hermeneutics within a critique of the application of overly scientific thinking to architecture. This tradition fits within a critique of the Enlightenment and has also informed design-studio teaching. Adrian Snodgrass sees the study of history and Asian cultures by architects as a hermeneutical encounter with otherness. He also deploys arguments from hermeneutics to explain design as a process of interpretation. Along with Richard Coyne, he extends the argument to the nature of architectural education and design.

Environment

Environmental hermeneutics applies hermeneutics to environmental issues conceived broadly to subjects including "nature" and "wilderness" (both terms are matters of hermeneutical contention), landscapes, ecosystems, built environments (where it overlaps architectural hermeneutics), inter-species relationships, the relationship of the body to the world, and more.

International relations

Insofar as hermeneutics is a basis of both critical theory and constitutive theory (both of which have made important inroads into the postpositivist branch of international relations theory and political science), it has been applied to international relations.

Steve Smith refers to hermeneutics as the principal way of grounding a foundationalist yet postpositivist theory of international relations.

Radical postmodernism is an example of a postpositivist yet anti-foundationalist paradigm of international relations.

Law

Some scholars argue that law and theology are particular forms of hermeneutics because of their need to interpret legal tradition or scriptural texts. Moreover, the problem of interpretation has been central to legal theory since at least the 11th century.

In the Middle Ages and the Italian Renaissance, the schools of glossatores, commentatores, and usus modernus distinguished themselves by their approach to the interpretation of "laws" (mainly Justinian's Corpus Juris Civilis). The University of Bologna gave birth to a "legal Renaissance" in the 11th century, when the Corpus Juris Civilis was rediscovered and systematically studied by men such as Irnerius and Johannes Gratian. It was an interpretative Renaissance. These interpretive approaches were subsequently developed fully by Thomas Aquinas and Alberico Gentili.

Since then, interpretation has always been at the center of legal thought. Friedrich Carl von Savigny and Emilio Betti, among others, made significant contributions to general hermeneutics. Legal interpretivism, most famously Ronald Dworkin's, may be seen as a branch of philosophical hermeneutics.

Political philosophy

Italian philosopher Gianni Vattimo and Spanish philosopher Santiago Zabala in their book Hermeneutic Communism, when discussing contemporary capitalist regimes, stated that, "A politics of descriptions does not impose power in order to dominate as a philosophy; rather, it is functional for the continued existence of a society of dominion, which pursues truth in the form of imposition (violence), conservation (realism), and triumph (history)."

Vattimo and Zabala also stated that they view interpretation as anarchy and affirmed that "existence is interpretation" and that "hermeneutics is weak thought."

Psychoanalysis

Psychoanalysts have made ample use of hermeneutics since Sigmund Freud first gave birth to their discipline. In 1900 Freud wrote that the title he chose for The Interpretation of Dreams 'makes plain which of the traditional approaches to the problem of dreams I am inclined to follow...[i.e.] "interpreting" a dream implies assigning a "meaning" to it.'

The French psychoanalyst Jacques Lacan later extended Freudian hermeneutics into other psychical realms. His early work from the 1930s to the 1950s is particularly influenced by Heidegger and by Maurice Merleau-Ponty's hermeneutical phenomenology.

Psychology

Psychologists and computer scientists have recently become interested in hermeneutics, especially as an alternative to cognitivism.

Hubert Dreyfus's critique of conventional artificial intelligence has been influential among psychologists who are interested in hermeneutic approaches to meaning and interpretation, as discussed by philosophers such as Martin Heidegger and Ludwig Wittgenstein.

Hermeneutics is also influential in humanistic psychology.

Religion and theology

The understanding of a theological text depends upon the reader's particular hermeneutical viewpoint. Some theorists, such as Paul Ricœur, have applied modern philosophical hermeneutics to theological texts (in Ricœur's case, the Bible). 

Mircea Eliade, as a hermeneutist, understands religion as 'experience of the sacred', and interprets the sacred in relation to the profane. The Romanian scholar underlines that the relation between the sacred and the profane is not of opposition, but of complementarity, having interpreted the profane as a hierophany. The hermeneutics of the myth is a part of the hermeneutics of religion. Myth should not be interpreted as an illusion or a lie, because there is truth in myth to be rediscovered. Myth is interpreted by Mircea Eliade as 'sacred history'. He introduces the concept of 'total hermeneutics'.

Safety science

In the field of safety science, and especially in the study of human reliability, scientists have become increasingly interested in hermeneutic approaches.

It has been proposed by ergonomist Donald Taylor that mechanistic models of human behavior will only take us so far in terms of accident reduction, and that safety science must look at the meaning of accidents for human beings.

Other scholars in the field have attempted to create safety taxonomies that make use of hermeneutic concepts in terms of their categorisation of qualitative data.

Sociology

In sociology, hermeneutics is the interpretation and understanding of social events through analysis of their meanings for the human participants in the events. It enjoyed prominence during the 1960s and 1970s, and differs from other interpretive schools of sociology in that it emphasizes the importance of both context and form within any given social behavior. 

The central principle of sociological hermeneutics is that it is only possible to know the meaning of an act or statement within the context of the discourse or world view from which it originates. Context is critical to comprehension; an action or event that carries substantial weight to one person or culture may be viewed as meaningless or entirely different to another. For example, giving the "thumbs-up" gesture is widely accepted as a sign of a job well done in the United States, while other cultures view it as an insult. Similarly, putting a piece of paper into a box might be considered a meaningless act unless it is put into the context of democratic elections (the act of putting a ballot paper into a box). 

Friedrich Schleiermacher, widely regarded as the father of sociological hermeneutics, believed that, in order for an interpreter to understand the work of another author, they must familiarize themselves with the historical context in which the author published their thoughts. His work inspired Heidegger's "hermeneutic circle", a frequently referenced model that claims one's understanding of individual parts of a text is based on one's understanding of the whole text, while the understanding of the whole text is dependent on the understanding of each individual part. Hermeneutics in sociology was also heavily influenced by the German philosopher Hans-Georg Gadamer.

Criticism

Jürgen Habermas criticizes Gadamer's hermeneutics as being unsuitable for understanding society because it is unable to account for questions of social reality, like labor and domination.

Murray Rothbard and Hans-Hermann Hoppe, both economists of the Austrian school, have criticized the hermeneutical approach to economics.

Concept

From Wikipedia, the free encyclopedia

Concepts are mental representations, abstract objects or abilities that make up the fundamental building blocks of thoughts and beliefs. They play an important role in all aspects of cognition.

In contemporary philosophy, there are at least three prevailing ways to understand what a concept is: as a mental representation, as an ability peculiar to cognitive agents, or as an abstract object such as a Fregean sense.

Concepts can be organized into a hierarchy, higher levels of which are termed "superordinate" and lower levels termed "subordinate". Additionally, there is the "basic" or "middle" level at which people will most readily categorize a concept. For example, a basic-level concept would be "chair", with its superordinate, "furniture", and its subordinate, "easy chair".
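The superordinate/basic/subordinate relation described above can be illustrated with a small data structure. The following Python sketch is only an illustration using the chair example from the text; the class and field names (ConceptNode, add_child, and so on) are invented for this example and do not come from any standard library.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class ConceptNode:
        """A node in a simple concept hierarchy (illustrative only)."""
        name: str
        level: str                      # "superordinate", "basic", or "subordinate"
        parent: Optional["ConceptNode"] = None
        children: List["ConceptNode"] = field(default_factory=list)

        def add_child(self, name: str, level: str) -> "ConceptNode":
            child = ConceptNode(name=name, level=level, parent=self)
            self.children.append(child)
            return child

    # Build the example from the text: furniture > chair > easy chair.
    furniture = ConceptNode("furniture", "superordinate")
    chair = furniture.add_child("chair", "basic")
    easy_chair = chair.add_child("easy chair", "subordinate")

    print(easy_chair.name, "is a kind of", easy_chair.parent.name,
          "which is a kind of", easy_chair.parent.parent.name)
    # -> easy chair is a kind of chair which is a kind of furniture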

When the mind makes a generalization such as the concept of tree, it extracts similarities from numerous examples; the simplification enables higher-level thinking.
A concept is instantiated (reified) by all of its actual or potential instances, whether these are things in the real world or other ideas.

Concepts are studied as components of human cognition in the cognitive science disciplines of linguistics, psychology and philosophy, where an ongoing debate asks whether all cognition must occur through concepts. Concepts are used as formal tools or models in mathematics, computer science, databases and artificial intelligence where they are sometimes called classes, schema or categories. In informal use the word concept often just means any idea.

Concepts in the representational theory of mind

Within the framework of the representational theory of mind, the structural position of concepts can be understood as follows: Concepts serve as the building blocks of what are called mental representations (colloquially understood as ideas in the mind). Mental representations, in turn, are the building blocks of what are called propositional attitudes (colloquially understood as the stances or perspectives we take towards ideas, be it "believing", "doubting", "wondering", "accepting", etc.). And these propositional attitudes, in turn, are the building blocks of our understanding of thoughts that populate everyday life, as well as folk psychology. In this way, we have an analysis that ties our common everyday understanding of thoughts down to the scientific and philosophical understanding of concepts.
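The layered structure just described, in which concepts build mental representations and mental representations figure in propositional attitudes, can be sketched as nested data types. This is only a toy illustration of that building-block relation; the type names below are invented for the sketch and do not come from the philosophical literature.

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass(frozen=True)
    class Concept:
        """A basic building block, e.g. the concept DOG or BARKS."""
        label: str

    @dataclass(frozen=True)
    class MentalRepresentation:
        """A structured combination of concepts, roughly an 'idea in the mind'."""
        constituents: Tuple[Concept, ...]

    @dataclass(frozen=True)
    class PropositionalAttitude:
        """A stance taken toward a mental representation."""
        attitude: str                   # e.g. "believes", "doubts", "wonders"
        content: MentalRepresentation

    dog_barks = MentalRepresentation((Concept("DOG"), Concept("BARKS")))
    belief = PropositionalAttitude("believes", dog_barks)
    print(belief.attitude, [c.label for c in belief.content.constituents])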

Nature of concepts

A central question in the study of concepts is the question of what concepts are. Philosophers construe this question as one about the ontology of concepts – what they are really like. The ontology of concepts determines the answer to other questions, such as how to integrate concepts into a wider theory of the mind, what functions are allowed or disallowed by a concept's ontology, etc. There are two main views of the ontology of concepts: (1) Concepts are abstract objects, and (2) concepts are mental representations.

Platonist views of the mind construe concepts as abstract objects.

There is debate as to the relationship between concepts and natural language. However, it is necessary at least to begin by understanding that the concept "dog" is philosophically distinct from the things in the world grouped by this concept – or the reference class or extension. Concepts that can be equated to a single word are called "lexical concepts".

Study of concepts and conceptual structure falls into the disciplines of linguistics, philosophy, psychology, and cognitive science.

In the simplest terms, a concept is a name or label that regards or treats an abstraction as if it had concrete or material existence, such as a person, a place, or a thing. It may represent a natural object that exists in the real world like a tree, an animal, a stone, etc. It may also name an artificial (man-made) object like a chair, computer, house, etc. Abstract ideas and knowledge domains such as freedom, equality, science, happiness, etc., are also symbolized by concepts. It is important to realize that a concept is merely a symbol, a representation of the abstraction. The word is not to be mistaken for the thing. For example, the word "moon" (a concept) is not the large, bright, shape-changing object up in the sky, but only represents that celestial object. Concepts are created (named) to describe, explain and capture reality as it is known and understood.

A priori concepts

Kant maintained the view that human minds possess pure or a priori concepts. Instead of being abstracted from individual perceptions, like empirical concepts, they originate in the mind itself. He called these concepts categories, in the sense of the word that means predicate, attribute, characteristic, or quality. But these pure categories are predicates of things in general, not of a particular thing. According to Kant, there are twelve categories that constitute the understanding of phenomenal objects. Each category is that one predicate which is common to multiple empirical concepts. In order to explain how an a priori concept can relate to individual phenomena, in a manner analogous to an a posteriori concept, Kant employed the technical concept of the schema. He held that the account of the concept as an abstraction of experience is only partly correct. He called those concepts that result from abstraction "a posteriori concepts" (meaning concepts that arise out of experience). An empirical or an a posteriori concept is a general representation (Vorstellung) or non-specific thought of that which is common to several specific perceived objects (Logic, I, 1., §1, Note 1).

A concept is a common feature or characteristic. Kant investigated the way that empirical a posteriori concepts are created.
The logical acts of the understanding by which concepts are generated as to their form are:
  1. comparison, i.e., the likening of mental images to one another in relation to the unity of consciousness;
  2. reflection, i.e., the going back over different mental images, how they can be comprehended in one consciousness; and finally
  3. abstraction or the segregation of everything else by which the mental images differ ...
In order to make our mental images into concepts, one must thus be able to compare, reflect, and abstract, for these three logical operations of the understanding are essential and general conditions of generating any concept whatever. For example, I see a fir, a willow, and a linden. In firstly comparing these objects, I notice that they are different from one another in respect of trunk, branches, leaves, and the like; further, however, I reflect only on what they have in common, the trunk, the branches, the leaves themselves, and abstract from their size, shape, and so forth; thus I gain a concept of a tree.
— Logic, §6
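Kant's three operations of comparison, reflection, and abstraction can be mimicked mechanically on sets of features. The sketch below is only a loose illustration of that procedure using the fir/willow/linden example from the passage; the feature lists are invented for the example and are not meant as a faithful model of Kant's account.

    # Each observed tree is described by a set of features (invented for illustration).
    fir    = {"trunk", "branches", "leaves", "tall", "needle-shaped leaves"}
    willow = {"trunk", "branches", "leaves", "drooping branches"}
    linden = {"trunk", "branches", "leaves", "heart-shaped leaves"}

    def abstract_concept(*examples: set) -> set:
        """Keep only what all examples have in common (comparison and reflection),
        discarding the features in which they differ (abstraction)."""
        return set.intersection(*examples)

    tree_concept = abstract_concept(fir, willow, linden)
    print(tree_concept)   # -> {'trunk', 'branches', 'leaves'}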

Embodied content

In cognitive linguistics, abstract concepts are transformations of concrete concepts derived from embodied experience. The mechanism of transformation is structural mapping, in which properties of two or more source domains are selectively mapped onto a blended space. A common class of blends are metaphors. This theory contrasts with the rationalist view that concepts are perceptions (or recollections, in Plato's term) of an independently existing world of ideas, in that it denies the existence of any such realm. It also contrasts with the empiricist view that concepts are abstract generalizations of individual experiences, because the contingent and bodily experience is preserved in a concept, and not abstracted away. While the perspective is compatible with Jamesian pragmatism, the notion of the transformation of embodied concepts through structural mapping makes a distinct contribution to the problem of concept formation.

Ontology

Plato was the starkest proponent of the realist thesis of universal concepts. By his view, concepts (and ideas in general) are innate ideas that were instantiations of a transcendental world of pure forms that lay behind the veil of the physical world. In this way, universals were explained as transcendent objects. Needless to say this form of realism was tied deeply with Plato's ontological projects. This remark on Plato is not of merely historical interest. For example, the view that numbers are Platonic objects was revived by Kurt Gödel as a result of certain puzzles that he took to arise from the phenomenological accounts.

Gottlob Frege, founder of the analytic tradition in philosophy, famously argued for the analysis of language in terms of sense and reference. For him, the sense of an expression in language describes a certain state of affairs in the world, namely, the way that some object is presented. Since many commentators view the notion of sense as identical to the notion of concept, and Frege regards senses as the linguistic representations of states of affairs in the world, it seems to follow that we may understand concepts as the manner in which we grasp the world. Accordingly, concepts (as senses) have an ontological status (Margolis:7).

According to Carl Benjamin Boyer, in the introduction to his The History of the Calculus and its Conceptual Development, concepts in calculus do not refer to perceptions. As long as the concepts are useful and mutually compatible, they are accepted on their own. For example, the concepts of the derivative and the integral are not considered to refer to spatial or temporal perceptions of the external world of experience. Neither are they related in any way to mysterious limits in which quantities are on the verge of nascence or evanescence, that is, coming into or going out of existence. The abstract concepts are now considered to be totally autonomous, even though they originated from the process of abstracting or taking away qualities from perceptions until only the common, essential attributes remained.

Mental representations

In a physicalist theory of mind, a concept is a mental representation, which the brain uses to denote a class of things in the world. This is to say that it is, literally, a symbol or group of symbols made from the physical material of the brain. Concepts are mental representations that allow us to draw appropriate inferences about the type of entities we encounter in our everyday lives. Concepts do not encompass all mental representations, but are merely a subset of them. The use of concepts is necessary to cognitive processes such as categorization, memory, decision making, learning, and inference.

Concepts are thought to be stored in long-term cortical memory, in contrast to the episodic memory of the particular objects and events which they abstract, which is stored in the hippocampus. Evidence for this separation comes from patients with hippocampal damage, such as patient HM. The abstraction of the day's hippocampal events and objects into cortical concepts is often considered to be the computation underlying (some stages of) sleep and dreaming. Many people (beginning with Aristotle) report memories of dreams which appear to mix the day's events with analogous or related historical concepts and memories, and suggest that they were being sorted or organized into more abstract concepts. ("Sort" is itself another word for concept, and "sorting" thus means to organize into concepts.)

Notable theories on the structure of concepts

Classical theory

The classical theory of concepts, also referred to as the empiricist theory of concepts, is the oldest theory about the structure of concepts (it can be traced back to Aristotle), and was prominently held until the 1970s. The classical theory of concepts says that concepts have a definitional structure. Adequate definitions of the kind required by this theory usually take the form of a list of features. These features must have two important qualities to provide a comprehensive definition. Features entailed by the definition of a concept must be individually necessary and jointly sufficient for membership in the class of things covered by the concept. A feature is necessary if every member of the denoted class has that feature; a set of features is jointly sufficient if anything that has all of them belongs to the class. For example, the classic example bachelor is said to be defined by unmarried and man. An entity is a bachelor (by this definition) if and only if it is both unmarried and a man. To check whether something is a member of the class, you compare its qualities to the features in the definition. Another key part of this theory is that it obeys the law of the excluded middle, which means that there are no partial members of a class: you are either in or out.
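Under the classical theory, checking membership amounts to verifying that a candidate has every feature in the definition, with no partial membership. Below is a minimal sketch of that all-or-nothing check, using the bachelor example from the text; the feature strings and function name are invented for illustration only.

    # Classical (definitional) view: a concept is a set of individually necessary,
    # jointly sufficient features.
    BACHELOR = {"unmarried", "man"}

    def is_member(candidate_features: set, definition: set) -> bool:
        """True iff the candidate has every feature in the definition
        (law of the excluded middle: membership is all or nothing)."""
        return definition.issubset(candidate_features)

    print(is_member({"unmarried", "man", "tall"}, BACHELOR))   # True
    print(is_member({"married", "man"}, BACHELOR))             # False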

The classical theory persisted for so long unquestioned because it seemed intuitively correct and has great explanatory power. It can explain how concepts would be acquired, how we use them to categorize and how we use the structure of a concept to determine its referent class. In fact, for many years it was one of the major activities in philosophy: concept analysis. Concept analysis is the act of trying to articulate the necessary and sufficient conditions for membership in the referent class of a concept. For example, Shoemaker's classic "Time Without Change" explored whether the concept of the flow of time can include flows where no changes take place, though change is usually taken as a definition of time.

Arguments against the classical theory

Given that most later theories of concepts were born out of the rejection of some or all of the classical theory, it seems appropriate to give an account of what might be wrong with this theory. In the 20th century, philosophers such as Wittgenstein and Rosch argued against the classical theory. There are six primary arguments summarized as follows:
  • It seems that there simply are no definitions – especially those based in sensory primitive concepts.
  • It seems as though there can be cases where our ignorance or error about a class means that we either don't know the definition of a concept, or have incorrect notions about what a definition of a particular concept might entail.
  • Quine's argument against analyticity in Two Dogmas of Empiricism also holds as an argument against definitions.
  • Some concepts have fuzzy membership. There are items for which it is vague whether or not they fall into (or out of) a particular referent class. This is not possible in the classical theory as everything has equal and full membership.
  • Rosch found typicality effects which cannot be explained by the classical theory of concepts; these sparked prototype theory.
  • Psychological experiments show no evidence for our using concepts as strict definitions.

Prototype theory

Prototype theory came out of problems with the classical view of conceptual structure. Prototype theory says that concepts specify properties that members of a class tend to possess, rather than must possess. Wittgenstein, Rosch, Mervis, Berlin, Anglin, and Posner are a few of the key proponents and creators of this theory. Wittgenstein describes the relationship between members of a class as family resemblances. There are not necessarily any necessary conditions for membership; a dog can still be a dog with only three legs. This view is particularly supported by psychological experimental evidence for prototypicality effects. Participants willingly and consistently rate objects in categories like 'vegetable' or 'furniture' as more or less typical of that class. It seems that our categories are psychologically fuzzy, and so this structure has explanatory power. We can judge an item's membership in the referent class of a concept by comparing it to the typical member, the most central member of the concept. If it is similar enough in the relevant ways, it will be cognitively admitted as a member of the relevant class of entities. Rosch suggests that every category is represented by a central exemplar which embodies all, or the maximum possible number, of the features of a given category. Lech, Gunturkun, and Suchan explain that categorization involves many areas of the brain, among them the visual association areas, prefrontal cortex, basal ganglia, and temporal lobe.
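In contrast with the definitional check sketched under the classical theory, prototype theory grades membership by similarity to a typical member. The following sketch computes a crude typicality score as the fraction of a prototype's features that a candidate shares; the feature lists and the function name are invented for illustration, and real psychological models of typicality are considerably richer.

    # Prototype view: membership is graded by similarity to a typical member.
    BIRD_PROTOTYPE = {"feathers", "wings", "flies", "lays eggs", "small"}

    def typicality(candidate: set, prototype: set) -> float:
        """Fraction of the prototype's features the candidate shares (0.0 to 1.0)."""
        return len(candidate & prototype) / len(prototype)

    robin   = {"feathers", "wings", "flies", "lays eggs", "small"}
    penguin = {"feathers", "wings", "lays eggs", "swims", "large"}

    print(typicality(robin, BIRD_PROTOTYPE))    # 1.0 -> highly typical
    print(typicality(penguin, BIRD_PROTOTYPE))  # 0.6 -> less typical, still a bird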

Theory-theory

Theory-theory is a reaction to the previous two theories and develops them further. This theory postulates that categorization by concepts is something like scientific theorizing. Concepts are not learned in isolation, but rather are learned as a part of our experiences with the world around us. In this sense, concepts' structure relies on their relationships to other concepts as mandated by a particular mental theory about the state of the world. How this is supposed to work is a little less clear than in the previous two theories, but it is still a prominent and notable theory. This is supposed to explain some of the issues of ignorance and error that come up in prototype and classical theories, since concepts that are structured around one another seem to account for errors such as classifying a whale as a fish (a misconception that came from an incorrect theory about what a whale is like combining with our theory of what a fish is). When we learn that a whale is not a fish, we are recognizing that whales do not in fact fit the theory we had about what makes something a fish. In this sense, the theory-theory of concepts is responding to some of the issues of prototype theory and classical theory.

Ideasthesia

According to the theory of ideasthesia (or "sensing concepts"), activation of a concept may be the main mechanism responsible for the creation of phenomenal experiences. Therefore, understanding how the brain processes concepts may be central to solving the mystery of how conscious experiences (or qualia) emerge within a physical system, e.g., the sourness of the taste of a lemon. This question is also known as the hard problem of consciousness. Research on ideasthesia emerged from research on synesthesia, where it was noted that a synesthetic experience requires first an activation of a concept of the inducer. Later research expanded these results into everyday perception.

There is ongoing discussion about which theory of concepts is most effective. Another theory is that of semantic pointers, which use perceptual and motor representations, and these representations act like symbols.

Etymology

The term "concept" is traced back to 1554–60 (Latin conceptum – "something conceived").

Innatism

From Wikipedia, the free encyclopedia

Innatism is a philosophical and epistemological doctrine that holds that the mind is born with ideas/knowledge, and that therefore the mind is not a "blank slate" at birth, as early empiricists such as John Locke claimed. It asserts that not all knowledge is gained from experience and the senses. Plato and Descartes are prominent philosophers in the development of innatism and the notion that the mind is already born with ideas, knowledge and beliefs. Both philosophers emphasize that experiences are the key to unlocking this knowledge but are not the source of the knowledge itself. Essentially, knowledge is not derived exclusively from one's experiences, as empiricists like John Locke suggested it was.

Difference from nativism

In general usage, the terms innatism and nativism are synonymous as they both refer to notions of preexisting ideas present in the mind. However, more correctly, innatism refers to the philosophy of Plato and Descartes, who assumed that a God or a similar being or process placed innate ideas and principles in the human mind.

Nativism represents an adaptation of this, grounded in the fields of genetics, cognitive psychology, and psycholinguistics. Nativists hold that innate beliefs are in some way genetically programmed to arise in our mind—that innate beliefs are the phenotypes of certain genotypes that all humans share in common.

Nativism

Nativism is a modern view rooted in innatism. The advocates of nativism are mainly philosophers who also work in the field of cognitive psychology or psycholinguistics: most notably Noam Chomsky and Jerry Fodor (although the latter has adopted a more critical attitude towards nativism in his later writings). The nativist's general objection against empiricism is still the same as was raised by the rationalists; the human mind of a newborn child is not a tabula rasa, but equipped with an inborn structure.

Innate idea

In philosophy and psychology, an innate idea is a concept or item of knowledge which is said to be universal to all humanity—that is, something people are born with rather than something people have learned through experience.

The issue is controversial, and can be said to be an aspect of a long-running nature versus nurture debate, albeit one localized to the question of understanding human cognition.

Philosophical debate

Although individual human beings obviously vary due to cultural, racial, linguistic and era-specific influences, innate ideas are said to belong to a more fundamental level of human cognition. For example, the philosopher René Descartes theorized that knowledge of God is innate in everybody as a product of the faculty of faith.

Other philosophers, most notably the empiricists, were critical of the theory and denied the existence of any innate ideas, saying all human knowledge was founded on experience, rather than a priori reasoning.

Philosophically, the debate over innate ideas is central to the conflict between rationalist and empiricist epistemologies. While rationalists believe that certain ideas exist independently of experience, empiricism claims that all knowledge is derived from experience.

Immanuel Kant was a German philosopher who is regarded as having ended the impasse in modern philosophy between rationalists and empiricists, and is widely held to have synthesized these two early modern traditions in his thought.

Plato

Plato argues that if there are certain concepts that we know to be true but did not learn from experience, then it must be because we have innate knowledge of them, and this knowledge must have been gained before birth. In Plato's Meno, he recalls a situation in which Socrates, his mentor, questioned a slave boy about a geometry theorem. Though the slave boy had no previous experience with geometry, he was able to generate the right responses to the questions he was asked. Plato reasoned that this was possible because Socrates' questions sparked the innate knowledge of mathematics the boy had had from birth.

Descartes

Descartes conveys the idea that innate knowledge or ideas are something inborn, in the way one might say that a certain disease is 'innate' to signify that a person is at risk of contracting it. He suggests that something 'innate' is effectively present from birth, and while it may not reveal itself then, it is more than likely to present itself later in life. Descartes' comparison of innate knowledge to an innate disease, whose symptoms may show up only later in life unless prevented by some factor such as age or puberty, suggests that if something prevents a person from exhibiting an innate behaviour or item of knowledge, this does not mean the knowledge did not exist at all, but rather that it was not expressed. In other words, innate beliefs, ideas and knowledge require experiences to be triggered, or they may never be expressed. Experiences are not the source of knowledge, as proposed by John Locke, but catalysts to the uncovering of knowledge.

John Locke

The main antagonist to the concept of innate ideas is John Locke, a contemporary of Leibniz. Locke argued that the mind is in fact devoid of all knowledge or ideas at birth; it is a blank sheet or tabula rasa. He argued that all our ideas are constructed in the mind via a process of constant composition and decomposition of the input that we receive through our senses. 

Locke, in An Essay Concerning Human Understanding, suggests that the concept of universal assent in fact proves nothing, except perhaps that everyone is in agreement; in short, universal assent proves that there is universal assent and nothing else. Moreover, Locke goes on to suggest that in fact there is no universal assent. Even a phrase such as "What is, is" is not universally assented to; infants and severely handicapped adults do not generally acknowledge this truism. Locke also attacks the idea that an innate idea can be imprinted on the mind without the owner realizing it. For Locke, such reasoning would allow one to conclude the absurd: "all the Truths a Man ever comes to know, will, by this account, be, every one of them, innate." To use the musical analogy that Leibniz develops below: we may not be able to recall an entire melody until we hear the first few notes, but we were aware of the fact that we knew the melody and that upon hearing the first few notes we would be able to recall the rest.

Locke ends his attack upon innate ideas by suggesting that the mind is a tabula rasa or "blank slate", and that all ideas come from experience; all our knowledge is founded in sensory experience.

Essentially, the same knowledge thought to be a priori by Leibniz is, in fact, according to Locke, the result of empirical knowledge whose origin has been forgotten by the inquirer. However, the inquirer is not cognizant of this fact; thus, he experiences what he believes to be a priori knowledge.
Criticisms of the doctrine of innate ideas along these lines include the following:
  • The theory of innate knowledge is excessive. Even innatists accept that most of our knowledge is learned through experience; if that account can be extended to cover all knowledge (we learn color through seeing it, for example), then there is no need for a theory of an innate understanding of color.
  • No ideas are universally held. Do we all possess the idea of God? Do we all believe in justice and beauty? Do we all understand the law of identity? If not, these ideas may not be innate but may instead have been acquired through impressions, experience, and social interaction (this is the "children and idiots" criticism).
  • Even if there are some universally agreed statements, it is just the ability of the human brain to organize learned ideas/words, that is, innate. An "ability to organize" is not the same as "possessing propositional knowledge" (e.g., a computer with no saved files has all the operations programmed in but has an empty memory).

Gottfried Wilhelm Leibniz

Gottfried Wilhelm Leibniz suggested that we are born with certain innate ideas, the most identifiable of these being mathematical truisms. The idea that 1 + 1 = 2 is evident to us without the need for empirical evidence. Leibniz argues that empiricism can only show us that concepts are true in the present: the observation of one apple and then another in one instance, and in that instance only, leads to the conclusion that one and another equals two. The suggestion that one and another will always equal two, however, requires an innate idea, since it is a claim about things unwitnessed.

Leibniz called such concepts as mathematical truisms "necessary truths". Another example might be the phrase "what is, is" or "it is impossible for the same thing to be and not to be". Leibniz argues that such truisms are universally assented to (acknowledged by all to be true); this being the case, it must be due to their status as innate ideas. Often there are ideas acknowledged as necessarily true but not universally assented to; Leibniz would suggest that this is simply because the person in question has not become aware of the innate idea, not because they do not possess it. Leibniz argues that empirical evidence can serve to bring to the surface principles that are already innately embedded in our minds, much as one may need to hear only the first few notes of a melody in order to recall the rest.

Scientific ideas

In his Meno, Plato raises an important epistemological quandary: How is it that we have certain ideas which are not conclusively derivable from our environments? Noam Chomsky has taken this problem as a philosophical framework for the scientific enquiry into innatism. His linguistic theory, which derives from 18th-century classical-liberal thinkers such as Wilhelm von Humboldt, attempts to explain in cognitive terms how we can develop knowledge of systems which are said, by supporters of innatism, to be too rich and complex to be derived from our environment. One such example is our linguistic faculty. Our linguistic systems contain a systemic complexity which supposedly could not be empirically derived: the environment seems too poor, variable and indeterminate, according to Chomsky, to explain the extraordinary ability of very young children to learn complex concepts. Essentially, their accurate grammatical knowledge cannot have originated from their experiences, as their experiences are not adequate. It follows that humans must be born with a universal innate grammar, which is determinate, has a highly organized directive component, and enables the language learner to ascertain and categorize the language heard into a system. Chomsky states that the ability to learn how to construct sentences properly, or to know which sentences are grammatically incorrect, is gained from innate knowledge. He cites as evidence for this theory the apparent invariability, on his view, of human languages at a fundamental level. In this way, linguistics may provide a window into the human mind and establish scientific theories of innateness which would otherwise remain merely speculative.

One implication of Chomsky's innatism, if correct, is that at least a part of human knowledge consists of cognitive predispositions which are triggered and developed by the environment but not determined by it. Chomsky suggests that we can look at how a belief is acquired as an input-output situation. He supports the doctrine of innatism by noting that the beliefs humans form from sensory experience are much richer and more complex than the experience itself. He asserts that the extra information must come from the mind itself, since it cannot come solely from experience: humans derive more information from their environment than the environment alone supplies, so some of that information must be predetermined.

Parallels can then be drawn, on a purely speculative level, between our moral faculties and language, as has been done by sociobiologists such as E. O. Wilson and evolutionary psychologists such as Steven Pinker. The relative consistency of fundamental notions of morality across cultures seems to produce convincing evidence for these theories. In psychology, notions of archetypes such as those developed by Carl Jung suggest determinate identity perceptions.

Scientific evidence for innateness

Evidence for innatism is being found by neuroscientists working on the Blue Brain Project. They found that neurons transmit signals independently of an individual's experience. It had previously been assumed that neuronal circuits are formed when the experience of an individual is imprinted in the brain, making memories. Researchers at Blue Brain discovered networks of about fifty neurons which they believed were building blocks of more complex knowledge: they contain basic innate knowledge that can be combined in different, more complex ways to give rise to acquired knowledge, such as memory.

Scientists ran tests on the neuronal circuits of several rats and ascertained that, if the circuits had been formed only by individual experience, the tests would have revealed very different characteristics for each rat. Instead, the rats all displayed similar characteristics, which suggests that their neuronal circuits must have been established before their experiences: they must be inborn. The Blue Brain research suggests that some of the building blocks of our knowledge are genetic and present at birth.

Learning vs. innate knowledge

There are two ways in which animals can gain knowledge. The first of these two ways is learning. This is when an animal gathers information about its surrounding environment and then proceeds to use this information. For example, if an animal eats something that hurts its stomach, it has learned not to eat this again. The second way that an animal can acquire knowledge is through innate knowledge. This knowledge is genetically inherited. The animal automatically knows it without any prior experience. An example of this is when a horse is born and can immediately walk. The horse has not learned this behavior; it simply knows how to do it. In some scenarios, innate knowledge is more beneficial than learned knowledge. However, in other scenarios the opposite is true.

Costs and benefits of learned and innate knowledge and the evolution of learning

In a changing environment, an animal must constantly gain new information in order to survive. In a stable environment, however, the same individual need only gather the information it needs once and can rely on it for the duration of its life. Therefore, different scenarios favour either learning or innate knowledge. Essentially, the cost of obtaining certain knowledge versus the benefit of having it determines whether an animal evolves to learn in a given situation or to know the information innately. If the cost of gaining the knowledge outweighs the benefit of having it, the animal does not evolve to learn in that scenario; instead, non-learning evolves. If the benefit of having certain information outweighs the cost of obtaining it, however, the animal is far more likely to evolve to learn that information.

Non-learning is more likely to evolve in two scenarios. If an environment is static and change does not occur, or occurs only rarely, learning is simply unnecessary. Because there is no need for learning in this scenario, and because learning could prove disadvantageous due to the time it takes, non-learning evolves. However, if an environment were in a constant state of change, learning would also prove disadvantageous: anything learned would immediately become irrelevant, and the learned information would no longer apply. The animal would be just as successful if it simply guessed as if it learned. In this situation, too, non-learning would evolve.

However, in environments where change occurs but is not constant, learning is more likely to evolve. Learning is beneficial in these scenarios because the animal can adapt to the new situation while still applying what it has learned for a somewhat extended period of time. Learning therefore increases the chance of success compared with guessing, and allows adaptation to environmental change in a way that purely innate knowledge does not, as the toy simulation below illustrates.
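This cost-benefit trade-off can be illustrated with a small toy simulation. The model below is a minimal sketch, not drawn from any source: the three strategies, the payoff of 1 per correct action, and the learning_cost parameter are illustrative assumptions chosen only to show when learning outperforms innate behaviour or guessing.

import random

def average_payoffs(p_change, learning_cost, generations=100000, seed=0):
    """Toy model of when learning pays off.

    Each generation the environment's 'correct' action (0 or 1) flips with
    probability p_change. Three hypothetical strategies are compared:
      - innate:  hard-wired to act 0 (free, but right only when the state is 0)
      - learner: copies last generation's correct action, paying learning_cost
      - guesser: picks 0 or 1 at random (free)
    Returns the mean payoff per generation (1 point for each correct action).
    """
    rng = random.Random(seed)
    state = 0
    innate = learner = guesser = 0.0
    for _ in range(generations):
        last_state = state
        if rng.random() < p_change:
            state = 1 - state
        innate += 1.0 if state == 0 else 0.0
        learner += (1.0 if last_state == state else 0.0) - learning_cost
        guesser += 1.0 if rng.randint(0, 1) == state else 0.0
    return innate / generations, learner / generations, guesser / generations

# Static environment: the innate behaviour wins; the cost of learning is wasted.
print(average_payoffs(p_change=0.0, learning_cost=0.2))
# Constantly changing environment: learning is no better than guessing.
print(average_payoffs(p_change=0.5, learning_cost=0.2))
# Intermediate rate of change: learning outperforms both alternatives.
print(average_payoffs(p_change=0.1, learning_cost=0.2))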

Galaxy rotation curve

From Wikipedia, the free encyclopedia

Rotation curve of spiral galaxy Messier 33 (yellow and blue points with error bars), and a predicted one from distribution of the visible matter (gray line). The discrepancy between the two curves can be accounted for by adding a dark matter halo surrounding the galaxy. 
 
Left: A simulated galaxy without dark matter. Right: Galaxy with a flat rotation curve that would be expected under the presence of dark matter.
 
The rotation curve of a disc galaxy (also called a velocity curve) is a plot of the orbital speeds of visible stars or gas in that galaxy versus their radial distance from that galaxy's centre. The data observed from the two sides of a spiral galaxy are generally asymmetric, so the data from each side are averaged to create the curve. A significant discrepancy exists between the curves observed experimentally and the curve derived from theory; the theory of dark matter is currently postulated to account for the variance.

Description

The rotation curve of a disc galaxy (also called a velocity curve) is a plot of the orbital speeds of visible stars or gas in that galaxy versus their radial distance from that galaxy's centre. The rotation curves of spiral galaxies are asymmetric, so the observational data from each side of a galaxy are generally averaged. Rotation curve asymmetry appears to be normal rather than exceptional.

The rotational/orbital speeds of galaxies/stars do not follow the rules found in other orbital systems such as stars/planets and planets/moons that have most of their mass at the centre. Stars revolve around their galaxy's centre at equal or increasing speed over a large range of distances. In contrast, the orbital velocities of planets in planetary systems and moons orbiting planets decline with distance. In the latter cases, this reflects the mass distributions within those systems. The mass estimations for galaxies based on the light they emit are far too low to explain the velocity observations.
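For comparison, a short sketch (the constants and radii are standard textbook values, chosen purely for illustration) shows the Keplerian fall-off expected when essentially all of the mass sits at the centre, as in the Solar System; galactic rotation curves do not show this decline.

import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # mass of the Sun, kg
AU = 1.496e11        # astronomical unit, m

def keplerian_speed(central_mass_kg, radius_m):
    """Circular orbital speed when essentially all the mass lies inside the orbit."""
    return math.sqrt(G * central_mass_kg / radius_m)

# Planetary orbits: speed falls off as 1/sqrt(r), matching observation.
for r_au in (1, 5, 30):
    v_kms = keplerian_speed(M_SUN, r_au * AU) / 1000.0
    print("r = %2d AU -> v = %.1f km/s" % (r_au, v_kms))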

The galaxy rotation problem is the discrepancy between observed galaxy rotation curves and the theoretical prediction, assuming a centrally dominated mass associated with the observed luminous material. When mass profiles of galaxies are calculated from the distribution of stars in spirals and mass-to-light ratios in the stellar disks, they do not match with the masses derived from the observed rotation curves and the law of gravity. A solution to this conundrum is to hypothesize the existence of dark matter and to assume its distribution from the galaxy's center out to its halo.

Though dark matter is by far the most accepted explanation of the rotation problem, other proposals have been offered with varying degrees of success. Of the possible alternatives, the most notable is Modified Newtonian Dynamics (MOND), which involves modifying the laws of gravity.

History

In 1932, Jan Hendrik Oort became the first to report that measurements of the stars in the Solar neighborhood indicated that they moved faster than expected when a mass distribution based upon visible matter was assumed, but these measurements were later determined to be essentially erroneous. In 1939, Horace Babcock reported in his PhD thesis measurements of the rotation curve for Andromeda which suggested that the mass-to-luminosity ratio increases radially. He attributed that to either the absorption of light within the galaxy or to modified dynamics in the outer portions of the spiral and not to any form of missing matter. Babcock's measurements turned out to disagree substantially with those found later, and the first measurement of an extended rotation curve in good agreement with modern data was published in 1957 by Henk van de Hulst and collaborators, who studied M31 with the newly commissioned Dwingeloo 25 meter telescope. A companion paper by Maarten Schmidt showed that this rotation curve could be fit by a flattened mass distribution more extensive than the light. In 1959, Louise Volders used the same telescope to demonstrate that the spiral galaxy M33 also does not spin as expected according to Keplerian dynamics.

Reporting on NGC 3115, Jan Oort wrote that "the distribution of mass in the system appears to bear almost no relation to that of light... one finds the ratio of mass to light in the outer parts of NGC 3115 to be about 250". On pages 302–303 of his journal article, he wrote that "The strongly condensed luminous system appears imbedded in a large and more or less homogeneous mass of great density" and although he went on to speculate that this mass may be either extremely faint dwarf stars or interstellar gas and dust, he had clearly detected the dark matter halo of this galaxy.

In the late 1960s and early 1970s, Vera Rubin, an astronomer at the Department of Terrestrial Magnetism at the Carnegie Institution of Washington, worked with a new sensitive spectrograph that could measure the velocity curve of edge-on spiral galaxies to a greater degree of accuracy than had ever before been achieved. Together with fellow staff-member Kent Ford, Rubin announced at a 1975 meeting of the American Astronomical Society the discovery that most stars in spiral galaxies orbit at roughly the same speed, and that this implied that galaxy masses grow approximately linearly with radius well beyond the location of most of the stars (the galactic bulge). Rubin presented her results in an influential paper in 1980. These results suggested that either Newtonian gravity does not apply universally or that, conservatively, upwards of 50% of the mass of galaxies was contained in the relatively dark galactic halo. Although initially met with skepticism, Rubin's results have been confirmed over the subsequent decades.

If Newtonian mechanics is assumed to be correct, it would follow that most of the mass of the galaxy had to be in the galactic bulge near the center and that the stars and gas in the disk portion should orbit the center at decreasing velocities with radial distance from the galactic center (the dashed line in Fig. 1). 

Observations of the rotation curve of spirals, however, do not bear this out. Rather, the curves do not decrease in the expected inverse square root relationship but are "flat", i.e. outside of the central bulge the speed is nearly a constant (the solid line in Fig. 1). It is also observed that galaxies with a uniform distribution of luminous matter have a rotation curve that rises from the center to the edge, and most low-surface-brightness galaxies (LSB galaxies) have the same anomalous rotation curve.
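A flat rotation curve implies, via M(<r) = v^2 r / G for circular orbits, that the enclosed mass grows roughly linearly with radius. The sketch below is illustrative only; the 220 km/s circular speed is an assumed, Milky-Way-like value, not a figure from the source.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # kg
KPC = 3.086e19       # metres per kiloparsec

def enclosed_mass(v_kms, r_kpc):
    """Mass enclosed within radius r for a circular speed v: M(<r) = v^2 r / G."""
    v = v_kms * 1e3
    r = r_kpc * KPC
    return v * v * r / G

# A flat curve (constant v) implies M(<r) grows linearly with r,
# far exceeding what the declining starlight alone would suggest.
for r_kpc in (5, 10, 20, 40):
    m_solar = enclosed_mass(220.0, r_kpc) / M_SUN
    print("r = %2d kpc -> M(<r) = %.2e solar masses" % (r_kpc, m_solar))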

The rotation curves might be explained by hypothesizing the existence of a substantial amount of matter permeating the galaxy that is not emitting light in the mass-to-light ratio of the central bulge. The material responsible for the extra mass was dubbed "dark matter", the existence of which was first posited in the 1930s by Jan Oort in his measurements of the Oort constants and Fritz Zwicky in his studies of the masses of galaxy clusters. The existence of non-baryonic cold dark matter (CDM) is today a major feature of the Lambda-CDM model that describes the cosmology of the universe.

Halo density profiles

In order to accommodate a flat rotation curve, the density profile of a galaxy and its environs must be different from one that is centrally concentrated. Newton's version of Kepler's third law implies that the spherically symmetric, radial density profile ρ(r) is:

\rho(r) = \frac{v(r)^2}{4 \pi G r^2} \left( 1 + 2\,\frac{\mathrm{d}\ln v(r)}{\mathrm{d}\ln r} \right)

where v(r) is the radial orbital velocity profile and G is the gravitational constant. This profile closely matches the expectations of a singular isothermal sphere profile: if v(r) is approximately constant, the density falls as ρ(r) ∝ r⁻² outside some inner "core radius", within which the density is assumed constant. Observations do not comport with such a simple profile, as reported by Navarro, Frenk and White in a seminal 1996 paper.
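The profile above is simple to evaluate numerically. The sketch below assumes a flat rotation curve (so the logarithmic derivative of v vanishes) and a 220 km/s circular speed purely for illustration.

import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
KPC = 3.086e19       # metres per kiloparsec

def density_from_rotation(v_ms, r_m, dlnv_dlnr=0.0):
    """rho(r) = v^2 / (4 pi G r^2) * (1 + 2 dln v / dln r).
    For a flat curve (dlnv_dlnr = 0) this reduces to rho proportional to r^-2."""
    return v_ms ** 2 / (4.0 * math.pi * G * r_m ** 2) * (1.0 + 2.0 * dlnv_dlnr)

for r_kpc in (5, 10, 20):
    rho = density_from_rotation(220e3, r_kpc * KPC)
    print("r = %2d kpc -> rho = %.2e kg/m^3" % (r_kpc, rho))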

The authors then remarked that a "gently changing logarithmic slope" for the density profile function could also accommodate approximately flat rotation curves over large scales. They arrived at the now-famous Navarro–Frenk–White (NFW) profile, consistent with both N-body simulations and observations, given by

\rho(r) = \frac{\rho_0}{\frac{r}{R_s}\left(1 + \frac{r}{R_s}\right)^{2}}

where the central density, ρ0, and the scale radius, Rs, are parameters that vary from halo to halo. Because the slope of the density profile diverges at the center, other alternative profiles have been proposed, for example the Einasto profile, which has exhibited better agreement with certain dark matter halo simulations.
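As a sketch of how the NFW profile is used in practice, the functions below evaluate the density and the enclosed mass (the analytic integral of the profile), from which a model rotation curve follows as v(r) = sqrt(G M(<r) / r). The parameter values in the example are generic placeholders, not values from the source; real analyses fit them to each halo.

import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
KPC = 3.086e19  # metres per kiloparsec

def nfw_density(r, rho0, r_s):
    """NFW density: rho(r) = rho0 / ((r/Rs) * (1 + r/Rs)^2)."""
    x = r / r_s
    return rho0 / (x * (1.0 + x) ** 2)

def nfw_enclosed_mass(r, rho0, r_s):
    """Mass inside radius r for the NFW profile:
    M(<r) = 4 pi rho0 Rs^3 * (ln(1 + r/Rs) - (r/Rs) / (1 + r/Rs))."""
    x = r / r_s
    return 4.0 * math.pi * rho0 * r_s ** 3 * (math.log(1.0 + x) - x / (1.0 + x))

def nfw_circular_velocity(r, rho0, r_s):
    """Rotation speed implied by the enclosed NFW mass: v(r) = sqrt(G M(<r) / r)."""
    return math.sqrt(G * nfw_enclosed_mass(r, rho0, r_s) / r)

# Illustrative halo: rho0 = 1e-21 kg/m^3, scale radius Rs = 20 kpc.
for r_kpc in (5, 20, 80):
    v = nfw_circular_velocity(r_kpc * KPC, 1e-21, 20.0 * KPC)
    print("r = %2d kpc -> v = %.0f km/s" % (r_kpc, v / 1000.0))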

Observations of orbital velocities in spiral galaxies suggest a mass structure according to:

v(r) = \sqrt{r\,\frac{\mathrm{d}\Phi}{\mathrm{d}r}}

with Φ the galaxy's gravitational potential.

Since observations of galaxy rotation do not match the distribution expected from applying Kepler's laws, they do not match the distribution of luminous matter. This implies that spiral galaxies contain large amounts of dark matter or, alternatively, that exotic physics is at work on galactic scales. The additional invisible component becomes progressively more conspicuous at outer radii within each galaxy and, among galaxies, in the less luminous ones.

Cosmology tells us that about 26% of the mass of the Universe is composed of dark matter, a hypothetical type of matter which does not emit or interact with electromagnetic radiation. Dark matter dominates the gravitational potential of galaxies and clusters of galaxies. Galaxies are baryonic condensations of stars and gas (namely H and He) that lie at the centers of much larger haloes of dark matter, formed through gravitational instability seeded by primordial density fluctuations.

The main goal has become to understand the nature and the history of these ubiquitous dark haloes by investigating the properties of the galaxies they contain (i.e. their luminosity, kinematics, sizes, and morphology). The measurement of the kinematics (positions, velocities and accelerations) of the observable stars and gas has become a tool with which to investigate the nature of dark matter: its content and its distribution relative to the various baryonic components of those galaxies.

Further investigations

Comparison of rotating disc galaxies in the distant Universe and the present day.
 
The rotational dynamics of galaxies are well characterized by their position on the Tully–Fisher relation, which shows that for spiral galaxies the rotational velocity is uniquely related to the total luminosity. A consistent way to predict the rotational velocity of a spiral galaxy is to measure its bolometric luminosity and then read its rotation rate from its location on the Tully–Fisher diagram. Conversely, knowing the rotational velocity of a spiral galaxy gives its luminosity. Thus the magnitude of the galaxy rotation is related to the galaxy's visible mass.
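As a rough sketch of how such a relation is used, the function below inverts an assumed power law L = L0 * (v / v0)^alpha to read off a rotation speed from a total luminosity. The zero-point (L0, v0) and exponent alpha here are placeholder values for illustration, not fitted Tully–Fisher parameters from the source.

def tully_fisher_velocity(luminosity_lsun, l0=1.0e10, v0=200.0, alpha=4.0):
    """Invert an illustrative Tully-Fisher power law, L = l0 * (v / v0)**alpha,
    to predict the rotation speed (km/s) from a galaxy's total luminosity
    (in solar luminosities). l0, v0 and alpha are placeholders, not fits."""
    return v0 * (luminosity_lsun / l0) ** (1.0 / alpha)

for lum in (1e9, 1e10, 1e11):
    print("L = %.0e L_sun -> v = %.0f km/s" % (lum, tully_fisher_velocity(lum)))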

While precise fitting of the bulge, disk, and halo density profiles is a rather complicated process, it is straightforward to model the observables of rotating galaxies through this relationship. So, while state-of-the-art cosmological and galaxy formation simulations of dark matter with normal baryonic matter included can be matched to galaxy observations, there is not yet any straightforward explanation as to why the observed scaling relationship exists. Additionally, detailed investigations of the rotation curves of low-surface-brightness galaxies (LSB galaxies) in the 1990s and of their position on the Tully–Fisher relation showed that LSB galaxies had to have dark matter halos that are more extended and less dense than those of HSB galaxies and thus surface brightness is related to the halo properties. Such dark-matter-dominated dwarf galaxies may hold the key to solving the dwarf galaxy problem of structure formation.

Very importantly, the analysis of the inner parts of low and high surface brightness galaxies showed that the shape of the rotation curves in the centre of dark-matter dominated systems indicates a profile different from the NFW spatial mass distribution profile. This so-called cuspy halo problem is a persistent problem for the standard cold dark matter theory. Simulations involving the feedback of stellar energy into the interstellar medium in order to alter the predicted dark matter distribution in the innermost regions of galaxies are frequently invoked in this context.

Alternatives to dark matter

There have been a number of attempts to solve the problem of galaxy rotation by modifying gravity without invoking dark matter. One of the most discussed is Modified Newtonian Dynamics (MOND), originally proposed by Mordehai Milgrom in 1983, which modifies the Newtonian force law at low accelerations to enhance the effective gravitational attraction. MOND has had a considerable amount of success in predicting the rotation curves of low-surface-brightness galaxies, matching the baryonic Tully–Fisher relation, and the velocity dispersions of the small satellite galaxies of the Local Group.

Using data from the Spitzer Photometry and Accurate Rotation Curves (SPARC) database, a group has found that the radial acceleration traced by rotation curves can be predicted from the observed baryon distribution alone (that is, including stars and gas but not dark matter). The same relation provided a good fit for 2693 samples in 153 rotating galaxies with diverse shapes, masses, sizes, and gas fractions. Brightness in the near infrared, where the more stable light from red giants dominates, was used to estimate the density contribution due to stars more consistently. The results are consistent with MOND and place limits on alternative explanations involving dark matter alone. However, cosmological simulations within a Lambda-CDM framework that include baryonic feedback effects reproduce the same relation without the need to invoke new dynamics (such as MOND). Thus a contribution due to dark matter itself can be fully predicted from that of the baryons once the feedback effects due to the dissipative collapse of baryons are taken into account.
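The SPARC result described above is commonly summarised by a single fitting function relating the observed radial acceleration to the acceleration expected from the baryons alone, g_obs = g_bar / (1 - exp(-sqrt(g_bar / g_dagger))), with a characteristic acceleration scale g_dagger of order 1.2e-10 m/s^2. The sketch below evaluates this function; the exact form and scale are quoted from memory and should be treated as approximate rather than authoritative.

import math

G_DAGGER = 1.2e-10   # characteristic acceleration scale, m/s^2 (approximate)

def observed_acceleration(g_bar):
    """Radial-acceleration-relation fitting function:
    g_obs = g_bar / (1 - exp(-sqrt(g_bar / G_DAGGER))).
    At high g_bar this tends to g_bar (the Newtonian regime); at low g_bar it
    tends to sqrt(g_bar * G_DAGGER), the MOND-like regime."""
    return g_bar / (1.0 - math.exp(-math.sqrt(g_bar / G_DAGGER)))

for g_bar in (1e-8, 1e-10, 1e-12):
    print("g_bar = %.0e -> g_obs = %.2e m/s^2" % (g_bar, observed_acceleration(g_bar)))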

MOND is not a relativistic theory, although relativistic theories which reduce to MOND have been proposed, such as tensor–vector–scalar gravity, scalar–tensor–vector gravity (STVG), and the f(R) theory of Capozziello and De Laurentis.

Inequality (mathematics)

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Inequality...