
Thursday, January 23, 2020

Cognitive semantics

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Cognitive_semantics

Cognitive semantics is part of the cognitive linguistics movement. Semantics is the study of linguistic meaning. Cognitive semantics holds that language is part of a more general human cognitive ability, and can therefore only describe the world as people conceive of it. The implicit claim is that different linguistic communities conceive of simple things and processes in the world differently (a matter of differing cultures), not that there is necessarily some difference between a person's conceptual world and the real world (wrong beliefs).

The main tenets of cognitive semantics are:
  • That grammar manifests a conception of the world held in a culture;
  • That knowledge of language is acquired and contextual;
  • That the ability to use language draws upon general cognitive resources and not a special language module.
As part of the field of cognitive linguistics, the cognitive semantics approach rejects the traditional separation of linguistics into phonology, morphology, syntax, pragmatics, etc. Instead, it divides semantics into meaning-construction and knowledge representation. Therefore, cognitive semantics studies much of the area traditionally devoted to pragmatics as well as semantics.

The techniques native to cognitive semantics are typically used in lexical studies such as those put forth by Leonard Talmy, George Lakoff, Dirk Geeraerts, and Bruce Wayne Hawkins. Some cognitive semantic frameworks, such as that developed by Talmy, take into account syntactic structures as well.

Points of contrast

As a field, semantics is interested in three big questions: what does it mean for units of language, called lexemes, to have "meaning"? What does it mean for sentences to have meaning? Finally, how is it that meaningful units fit together to compose complete sentences? These are the main points of inquiry behind studies into lexical semantics, structural semantics, and theories of compositionality (respectively). In each category, traditional theories seem to be at odds with those accounts provided by cognitive semanticists.

Classic theories in semantics (in the tradition of Alfred Tarski and Donald Davidson) have tended to explain the meaning of parts in terms of necessary and sufficient conditions, sentences in terms of truth-conditions, and composition in terms of propositional functions. Each of these positions is tightly related to the others. According to these traditional theories, the meaning of a particular sentence may be understood as the conditions under which the proposition conveyed by the sentence holds true. For instance, the expression "snow is white" is true if and only if snow is, in fact, white. Lexical units can be understood as holding meaning either by virtue of the set of things they may apply to (called the "extension" of the word), or in terms of the common properties that hold between these things (called its "intension"). The intension provides an interlocutor with the necessary and sufficient conditions that let a thing qualify as a member of some lexical unit's extension. Roughly, propositional functions are those abstract instructions that guide the interpreter in taking the free variables in an open sentence and filling them in, resulting in a correct understanding of the sentence as a whole.
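
To make the extension/intension distinction concrete, here is a minimal sketch (mine, not the article's; the toy domain, facts, and names are invented purely for illustration):

    domain = {"Alice", "Bob", "Carol"}          # a tiny universe of discourse
    facts = {
        "Alice": {"male": False, "married": False},
        "Bob":   {"male": True,  "married": False},
        "Carol": {"male": False, "married": True},
    }

    def bachelor(x):
        # Intension: necessary and sufficient conditions for membership.
        return facts[x]["male"] and not facts[x]["married"]

    # Extension: the set of things the intension applies to.
    extension = {x for x in domain if bachelor(x)}

    def satisfies(open_sentence, value):
        # A propositional function: fill in the free variable, get a truth value.
        return open_sentence(value)

    print(extension, satisfies(bachelor, "Carol"))   # {'Bob'} False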

Meanwhile, cognitive semantic theories are typically built on the argument that lexical meaning is conceptual. That is, meaning is not necessarily reference to the entity or relation in some real or possible world. Instead, meaning corresponds with a concept held in the mind based on personal understanding. As a result, semantic facts like "All bachelors are unmarried males" are not treated as special facts about our language practices; rather, these facts are not distinct from encyclopaedic knowledge. In treating linguistic knowledge as being of a piece with everyday knowledge, the question is raised: how can cognitive semantics explain paradigmatically semantic phenomena, like category structure? Set to the challenge, researchers have drawn upon theories from related fields, like cognitive psychology and cognitive anthropology. One proposal is to explain category structure in terms of nodes in a knowledge network. One example of a theory from cognitive science that has made its way into the cognitive semantic mainstream is the theory of prototypes, which cognitive semanticists generally argue is the cause of polysemy.

Cognitive semanticists argue that truth-conditional semantics is unduly limited in its account of full sentence meaning. While they are not on the whole hostile to truth-conditional semantics, they point out that it has limited explanatory power. That is to say, it is limited to indicative sentences, and does not seem to offer any straightforward or intuitive way of treating (say) commands or expressions. By contrast, cognitive semantics seeks to capture the full range of grammatical moods by also making use of the notions of framing and mental spaces.

Another trait of cognitive semantics is the recognition that meaning is not fixed but a matter of construal and conventionalization. The processes of linguistic construal, it is argued, are the same psychological processes involved in the processing of encyclopaedic knowledge and in perception. This view has implications for the problem of compositionality. An account in cognitive semantics called the dynamic construal theory makes the claim that words themselves are without meaning: they have, at best, "default construals," which are really just ways of using words. Along these lines, cognitive semantics argues that compositionality can only be intelligible if pragmatic elements like context and intention are taken into consideration.

The structure of concepts

Cognitive semantics has sought to challenge traditional theories in two ways: first, by providing an account of the meaning of sentences by going beyond truth-conditional accounts; and second, by attempting to go beyond accounts of word meaning that appeal to necessary and sufficient conditions. It accomplishes both by examining the structure of concepts. 

Frame semantics

Frame semantics, developed by Charles J. Fillmore, attempts to explain meaning in terms of its relation to general understanding, not just in the terms laid out by truth-conditional semantics. Fillmore explains meaning in general (including the meaning of lexemes) in terms of "frames". By "frame" is meant any concept that can only be understood if a larger system of concepts is also understood.

Fillmore: framing

Many pieces of linguistic evidence motivate the frame-semantic project. First, it has been noted that word meaning is an extension of our bodily and cultural experiences. For example, the notion of restaurant is associated with a series of concepts, like food, service, waiters, tables, and eating. These rich-but-contingent associations cannot be captured by an analysis in terms of necessary and sufficient conditions, yet they still seem to be intimately related to our understanding of "restaurant".
Second, and more seriously, these conditions are not enough to account for asymmetries in the ways that words are used. According to a semantic feature analysis, there is nothing more to the meanings of "boy" and "girl" than:
  1. BOY [+MALE], [+YOUNG]
  2. GIRL [+FEMALE], [+YOUNG]
And there is surely some truth to this proposal. Indeed, cognitive semanticists hold that the instances of the concept associated with a given word may be said to stand in a schematic relation with the concept itself. And this is regarded as a legitimate approach to semantic analysis, so far as it goes.

However, linguists have found that language users regularly apply the terms "boy" and "girl" in ways that go beyond mere semantic features. That is, for instance, people tend to be more likely to consider a young female a "girl" (as opposed to "woman") than they are to consider a borderline-young male a "boy" (as opposed to "man"). This fact suggests that there is a latent frame, made up of cultural attitudes, expectations, and background assumptions, which is part of word meaning. These background assumptions go above and beyond those necessary and sufficient conditions that correspond to a semantic feature account. Frame semantics, then, seeks to account for these puzzling features of lexical items in some systematic way.

Third, cognitive semanticists argue that truth-conditional semantics is incapable of dealing adequately with some aspects of the meanings at the level of the sentence. Take the following:
  1. You didn't spare me a day at the seaside; you deprived me of one.
In this case, the truth-conditions of the claim expressed by the antecedent are not being denied by the proposition expressed in the second clause. Instead, what is being denied is the way that the antecedent is framed.

Finally, with the frame-semantic paradigm's analytical tools, the linguist is able to explain a wider range of semantic phenomena than they would be able to with only necessary and sufficient conditions. Some words have the same definitions or intensions, and the same extensions, but have subtly different domains. For example, the lexemes land and ground are synonyms, yet they naturally contrast with different things: land with sea, and ground with air.

As we have seen, the frame semantic account is by no means limited to the study of lexemes—with it, researchers may examine expressions at more complex levels, including the level of the sentence (or, more precisely, the utterance). The notion of framing is regarded as being of the same cast as the pragmatic notion of background assumptions. Philosopher of language John Searle explains the latter by asking readers to consider sentences like "The cat is on the mat". For such a sentence to make any sense, the interpreter makes a series of assumptions: i.e., that there is gravity, the cat is parallel to the mat, and the two touch. For the sentence to be intelligible, the speaker supposes that the interpreter has an idealized or default frame in mind. 

Langacker: profile and base

An alternate strain of Fillmore's analysis can be found in the work of Ronald Langacker, who makes a distinction between the notions of profile and base. The profile is the concept symbolized by the word itself, while the base is the encyclopedic knowledge that the concept presupposes. For example, let the definition of "radius" be "a line segment that joins the center of a circle with any point on its circumference". If all we know of the concept radius is its profile, then we simply know that it is a line segment that is attached to something called the "circumference" in some greater whole called the "circle". That is to say, our understanding is fragmentary until the base concept of circle is firmly grasped. 

When a single base supports a number of different profiles, then it can be called a "domain". For instance, the concept profiles of arc, center, and circumference are all in the domain of circle, because each uses the concept of circle as a base. We are then in a position to characterize the notion of a frame as being either the base of the concept profile, or (more generally) the domain that the profile is a part of.
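
As a rough illustration of the profile/base/domain relation (my sketch, not Langacker's own notation), the idea can be modelled as a base concept that supports several profiles:

    # A "domain" is a base concept supporting a number of concept profiles.
    circle_domain = {
        "base": "circle",
        "profiles": ["arc", "center", "radius", "circumference"],
    }

    def frame_of(profile, domain):
        # The frame of a profiled concept is the base (or, more generally,
        # the domain) that the profile presupposes.
        return domain["base"] if profile in domain["profiles"] else None

    print(frame_of("radius", circle_domain))   # circle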

Categorization and cognition

Membership of a graded class

A major divide in the approaches to cognitive semantics lies in the puzzle surrounding the nature of category structure. As mentioned in the previous section, semantic feature analyses fall short of accounting for the frames that categories may have. An alternative proposal would have to go beyond the minimalistic models given by classical accounts, and explain the richness of detail in meaning that language speakers attribute to categories.

Prototype theories, investigated by Eleanor Rosch, have given some reason to suppose that many natural lexical category structures are graded, i.e., they have prototypical members that are considered to fit the category better than other examples. For instance, robins are generally viewed as better examples of the category "bird" than, say, penguins. If this view of category structure is the case, then categories can be understood to have central and peripheral members, and not just be evaluated in terms of members and non-members.
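
A toy sketch of such graded membership (the features and weights below are invented purely for illustration):

    # Typicality as weighted similarity to a prototype: the more prototypical
    # features an exemplar shares, the more central a member it is.
    BIRD_PROTOTYPE = {"flies": 1.0, "sings": 0.8, "small": 0.6}

    def typicality(exemplar):
        return sum(weight for feature, weight in BIRD_PROTOTYPE.items()
                   if exemplar.get(feature))

    robin = {"flies": True, "sings": True, "small": True}
    penguin = {"flies": False, "sings": False, "small": False}
    print(typicality(robin) > typicality(penguin))   # True: robin is more central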

In a related vein, George Lakoff, following the later Ludwig Wittgenstein, noted that some categories are only connected to one another by way of family resemblances. While some classical categories may exist, i.e., which are structured by necessary and sufficient conditions, there are at least two other kinds: generative and radial.

Generative categories can be formed by taking central cases and applying certain principles to designate category membership. The principle of similarity is one example of a rule that might generate a broader category from given prototypes. 

Radial categories are categories motivated by conventions, but not predictable from rules. The concept of "mother", for example, may be explained in terms of a variety of conditions that may or may not be sufficient. Those conditions may include: being married, having always been female, having given birth to the child, having supplied half the child's genes, being a caregiver, being married to the genetic father, being one generation older than the child, and being the legal guardian. Any one of the above conditions might not be met: for instance, a "single mother" does not need to be married, and a "surrogate mother" does not necessarily provide nurturance. When these aspects collectively cluster together, they form a prototypical case of what it means to be a mother, but nevertheless they fail to outline the category crisply. Variations upon the central meaning are established by convention by the community of language users.

For Lakoff, prototype effects can be explained in large part as effects of idealized cognitive models. That is, domains are organized with an ideal notion of the world that may or may not fit reality. For example, the word "bachelor" is commonly defined as "unmarried adult male". However, this concept has been created with a particular ideal of what a bachelor is like: an adult, uncelibate, independent, socialized, and promiscuous. Reality might either strain the expectations of the concept, or create false positives. That is, people typically want to widen the meaning of "bachelor" to include exceptions like "a sexually active seventeen-year-old who lives alone and owns his own firm" (not technically an adult but seemingly still a bachelor), and this can be considered a kind of straining of the definition. Moreover, speakers would tend to want to exclude from the concept of bachelor certain false positives, such as those adult unmarried males that don't bear much resemblance to the ideal: i.e., the Pope, or Tarzan. Prototype effects may also be explained as a function of basic-level categorization and typicality, closeness to an ideal, or stereotyping.

So viewed, prototype theory seems to give an account of category structure. However, there are a number of criticisms of this interpretation of the data. Indeed, Rosch and Lakoff, themselves chief advocates of prototype theory, have emphasized in their later works that the findings of prototype theory do not necessarily tell us anything about category structure. Some theorists in the cognitive semantics tradition have challenged both classical and prototype accounts of category structure by proposing the dynamic construal account, where category structure is always created "on-line"—and so, that categories have no structure outside of the context of use.

Mental spaces

[Figure: Propositional attitudes in Fodor's presentation of truth-conditional semantics]
In traditional semantics, the meaning of a sentence is the situation it represents, and the situation can be described in terms of the possible world that it would be true of. Moreover, sentence meanings may be dependent upon propositional attitudes: those features that are relative to someone's beliefs, desires, and mental states. The role of propositional attitudes in truth-conditional semantics is controversial. However, by at least one line of argument, truth-conditional semantics seems to be able to capture the meaning of belief-sentences like "Frank believes that the Red Sox will win the next game" by appealing to propositional attitudes. The meaning of the overall proposition is described as a set of abstract conditions, wherein Frank holds a certain propositional attitude, and the attitude is itself a relationship between Frank and a particular proposition; and this proposition picks out the set of possible worlds in which the Red Sox win the next game.
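
On one toy reading of this picture (my illustration; the worlds and labels are invented), a proposition is a set of possible worlds and belief is a relation between an agent and such a set:

    # Possible-worlds sketch: a proposition is the set of worlds where it holds.
    worlds = {"w1", "w2", "w3"}
    red_sox_win = {"w1", "w3"}            # worlds where the Red Sox win the next game

    # A propositional attitude: a relation between an agent and a proposition.
    beliefs = {"Frank": [red_sox_win]}

    def believes(agent, proposition):
        return proposition in beliefs.get(agent, [])

    print(believes("Frank", red_sox_win))  # True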

Still, many theorists have grown dissatisfied with the inelegance and dubious ontology behind possible-worlds semantics. An alternative can be found in the work of Gilles Fauconnier. For Fauconnier, the meaning of a sentence can be derived from "mental spaces". Mental spaces are cognitive structures entirely in the minds of interlocutors. In his account, there are two kinds of mental space. The base space is used to describe reality (as it is understood by both interlocutors). Space builders (or built spaces) are those mental spaces that go beyond reality by addressing possible worlds, along with temporal expressions, fictional constructs, games, and so on. Additionally, Fauconnier's semantics distinguishes between roles and values. A semantic role is understood to be a description of a category, while values are the instances that make up the category. (In this sense, the role-value distinction is a special case of the type-token distinction.)

Fauconnier argues that curious semantic constructions can be explained handily by the above apparatus. Take the following sentence:
  1. In 1929, the lady with white hair was blonde.
The semanticist must construct an explanation for the obvious fact that the above sentence is not contradictory. Fauconnier constructs his analysis by observing that there are two mental spaces (the present-space and the 1929-space). His access principle supposes that "a value in one space can be described by the role its counterpart in another space has, even if that role is invalid for the value in the first space". So, to use the example above, the value in 1929-space is the blonde, while she is being described with the role of the lady with white hair in present-day space.
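
Fauconnier's example can be mocked up with two role-to-value mappings (a sketch of the access principle in my own terms, not Fauconnier's formalism; "Ms. X" is an invented stand-in for the individual):

    # Two mental spaces: roles (descriptions) map to values (individuals).
    present_space = {"the lady with white hair": "Ms. X"}
    space_1929 = {"the blonde": "Ms. X"}

    def access(role, description_space, target_space):
        # Access principle: a value in one space may be picked out by the
        # role its counterpart bears in another space.
        value = description_space.get(role)
        return value if value in target_space.values() else None

    # The present-day role accesses the same individual in the 1929 space.
    print(access("the lady with white hair", present_space, space_1929))  # Ms. X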

Conceptualization and construal

As we have seen, cognitive semantics gives a treatment of issues in the construction of meaning both at the level of the sentence and the level of the lexeme in terms of the structure of concepts. However, it is not entirely clear what cognitive processes are at work in these accounts. Moreover, it is not clear how we might go about explaining the ways that concepts are actively employed in conversation. It appears to be the case that, if our project is to look at how linguistic strings convey different semantic content, we must first catalogue what cognitive processes are being used to do it. Researchers can satisfy both requirements by attending to the construal operations involved in language processing—that is to say, by investigating the ways that people structure their experiences through language.

Language is full of conventions that allow for subtle and nuanced conveyances of experience. To use an example that is readily at hand, framing is all-pervasive, and it may extend across the full breadth of linguistic data, extending from the most complex utterances, to tone, to word choice, to expressions derived from the composition of morphemes. Another example is image-schemata, which are ways that we structure and understand the elements of our experience driven by any given sense.

According to linguists William Croft and D. Alan Cruse, there are four broad cognitive abilities that play an active part in the construction of construals. They are: attention/salience, judgment/comparison, situatedness, and constitution/gestalt. Each general category contains a number of subprocesses, each of which helps to explain the ways we encode experience into language in some unique way.

Semantics

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Semantics
Semantics (from Ancient Greek: σημαντικός sēmantikós, "significant") is the linguistic and philosophical study of meaning in language, programming languages, formal logics, and semiotics. It is concerned with the relationship between signifiers—like words, phrases, signs, and symbols—and what they stand for in reality, their denotation.

In International Scientific Vocabulary, semantics is also called semasiology. The word semantics was first used by Michel Bréal, a French philologist. It denotes a range of ideas—from the popular to the highly technical. It is often used in ordinary language for denoting a problem of understanding that comes down to word selection or connotation. This problem of understanding has been the subject of many formal enquiries, over a long period of time, especially in the field of formal semantics. In linguistics, it is the study of the interpretation of signs or symbols used by agents or communities within particular circumstances and contexts. Within this view, sounds, facial expressions, body language, and proxemics have semantic (meaningful) content, and each comprises several branches of study. In written language, things like paragraph structure and punctuation bear semantic content; other forms of language bear other semantic content.

The formal study of semantics intersects with many other fields of inquiry, including lexicology, syntax, pragmatics, etymology and others. Independently, semantics is also a well-defined field in its own right, often with synthetic properties. In the philosophy of language, semantics and reference are closely connected. Further related fields include philology, communication, and semiotics. The formal study of semantics can therefore be manifold and complex.

Semantics contrasts with syntax, the study of the combinatorics of units of a language (without reference to their meaning), and pragmatics, the study of the relationships between the symbols of a language, their meaning, and the users of the language. Semantics as a field of study also has significant ties to various representational theories of meaning including truth theories of meaning, coherence theories of meaning, and correspondence theories of meaning. Each of these is related to the general philosophical study of reality and the representation of meaning. In the 1960s, psychosemantic studies became popular after Osgood's massive cross-cultural studies using his semantic differential (SD) method, which used thousands of nouns and adjective bipolar scales. A specific form of the SD, the Projective Semantics method, uses only the most common and neutral nouns that correspond to the 7 groups (factors) of adjective-scales most consistently found in cross-cultural studies (Evaluation, Potency, Activity as found by Osgood, and Reality, Organization, Complexity, Limitation as found in other studies). In this method, seven groups of bipolar adjective scales corresponded to seven types of nouns, so the method was thought to have object-scale symmetry (OSS) between the scales and the nouns evaluated using those scales. For example, the nouns corresponding to the listed 7 factors would be: Beauty, Power, Motion, Life, Work, Chaos, Law. Beauty was expected to be assessed unequivocally as "very good" on adjectives of Evaluation-related scales, Life as "very real" on Reality-related scales, etc. However, deviations in this symmetric and very basic matrix might show underlying biases of two types: scales-related bias and objects-related bias. This OSS design was meant to increase the sensitivity of the SD method to any semantic biases in responses of people within the same culture and educational background.
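
A schematic sketch of the object-scale symmetry idea (the ratings below are an invented, idealized matrix; real SD studies aggregate many subjects' responses):

    # Rows: nouns; columns: adjective-scale factors. Under object-scale
    # symmetry each noun should score highest on its "own" factor
    # (Beauty on Evaluation, Life on Reality, etc.); off-diagonal
    # deviations hint at scale- or object-related biases.
    factors = ["Evaluation", "Potency", "Activity", "Reality",
               "Organization", "Complexity", "Limitation"]
    nouns = ["Beauty", "Power", "Motion", "Life", "Work", "Chaos", "Law"]

    ratings = [[3 if i == j else 0 for j in range(7)] for i in range(7)]

    def object_bias(ratings, i):
        # Mean off-diagonal rating for noun i; nonzero values suggest bias.
        off_diagonal = [r for j, r in enumerate(ratings[i]) if j != i]
        return sum(off_diagonal) / len(off_diagonal)

    print(object_bias(ratings, nouns.index("Beauty")))   # 0.0: bias-free ideal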

Linguistics

In linguistics, semantics is the subfield that is devoted to the study of meaning, as inherent at the levels of words, phrases, sentences, and larger units of discourse (termed texts, or narratives). The study of semantics is also closely linked to the subjects of representation, reference and denotation. The basic study of semantics is oriented to the examination of the meaning of signs, and the study of relations between different linguistic units and compounds: homonymy, synonymy, antonymy, hypernymy, hyponymy, meronymy, metonymy, holonymy, paronyms. A key concern is how meaning attaches to larger chunks of text, possibly as a result of the composition from smaller units of meaning. Traditionally, semantics has included the study of sense and denotative reference, truth conditions, argument structure, thematic roles, discourse analysis, and the linkage of all of these to syntax. 

Montague grammar

In the late 1960s, Richard Montague proposed a system for defining semantic entries in the lexicon in terms of the lambda calculus. In these terms, the syntactic parse of the sentence John ate every bagel would consist of a subject (John) and a predicate (ate every bagel); Montague demonstrated that the meaning of the sentence altogether could be decomposed into the meanings of its parts, combined by relatively few rules of combination. The logical predicate thus obtained would be elaborated further, e.g. using truth theory models, which ultimately relate meanings to a set of Tarskian universals, which may lie outside the logic. The notion of such meaning atoms or primitives is basic to the language of thought hypothesis from the 1970s.
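
In the spirit of that treatment, here is a drastically simplified, purely extensional sketch (my toy model; Montague's actual system is intensional and far richer):

    # Toy model for "John ate every bagel".
    entities = {"john", "bagel1", "bagel2", "pizza"}
    bagels = {"bagel1", "bagel2"}
    ate = {("john", "bagel1"), ("john", "bagel2")}    # who ate what

    # Lexical entries as lambda terms.
    bagel = lambda x: x in bagels
    every = lambda restrictor: lambda scope: all(
        scope(x) for x in entities if restrictor(x))
    ate_v = lambda object_quantifier: lambda subject: object_quantifier(
        lambda y: (subject, y) in ate)

    # Composition mirrors the parse: John (ate (every bagel)).
    print(ate_v(every(bagel))("john"))                # True in this model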

Despite its elegance, Montague grammar was limited by the context-dependent variability in word sense, and led to several attempts at incorporating context, such as prototype theory (below).

Prototype theory

Another set of concepts related to fuzziness in semantics is based on prototypes. The work of Eleanor Rosch in the 1970s led to a view that natural categories are not characterizable in terms of necessary and sufficient conditions, but are graded (fuzzy at their boundaries) and inconsistent as to the status of their constituent members. One may compare it with Jung's archetype, though the concept of the archetype remains a static one. Some post-structuralists are against the fixed or static meaning of words. Derrida, following Nietzsche, talked about slippages in fixed meanings.

Systems of categories are not objectively out there in the world but are rooted in people's experience. These categories evolve as learned concepts of the world – meaning is not an objective truth, but a subjective construct, learned from experience, and language arises out of the "grounding of our conceptual systems in shared embodiment and bodily experience". A corollary of this is that the conceptual categories (i.e. the lexicon) will not be identical for different cultures, or indeed, for every individual in the same culture. This leads to another debate.

Theories in semantics


Formal semantics

Originates from Montague's work (see above). A highly formalized theory of natural language semantics in which expressions are assigned denotations (meanings) such as individuals, truth values, or functions from one of these to another. The truth of a sentence, and its logical relation to other sentences, is then evaluated relative to a model. 

Truth-conditional semantics

Pioneered by the philosopher Donald Davidson, another formalized theory, which aims to associate each natural language sentence with a meta-language description of the conditions under which it is true, for example: 'Snow is white' is true if and only if snow is white. The challenge is to arrive at the truth conditions for any sentences from fixed meanings assigned to the individual words and fixed rules for how to combine them. In practice, truth-conditional semantics is similar to model-theoretic semantics; conceptually, however, they differ in that truth-conditional semantics seeks to connect language with statements about the real world (in the form of meta-language statements), rather than with abstract models.

Conceptual semantics

This theory is an effort to explain properties of argument structure. The assumption behind this theory is that syntactic properties of phrases reflect the meanings of the words that head them. With this theory, linguists can better deal with the fact that subtle differences in word meaning correlate with other differences in the syntactic structure that the word appears in. It does this by looking at the internal structure of words. The small parts that make up the internal structure of words are termed semantic primitives.

Cognitive semantics

Cognitive semantics approaches meaning from the perspective of cognitive linguistics. In this framework, language is explained via general human cognitive abilities rather than a domain-specific language module. The techniques native to cognitive semantics are typically used in lexical studies such as those put forth by Leonard Talmy, George Lakoff, Dirk Geeraerts, and Bruce Wayne Hawkins. Some cognitive semantic frameworks, such as that developed by Talmy, take into account syntactic structures as well. Through modern research, semantics can be linked to Wernicke's area of the brain and can be measured using the event-related potential (ERP). The ERP is a rapid electrical response recorded with small disc electrodes placed on a person's scalp.

Lexical semantics

A linguistic theory that investigates word meaning. This theory understands that the meaning of a word is fully reflected by its context. Here, the meaning of a word is constituted by its contextual relations. Therefore, a distinction between degrees of participation as well as modes of participation is made. In order to accomplish this distinction, any part of a sentence that bears a meaning and combines with the meanings of other constituents is labeled as a semantic constituent. Semantic constituents that cannot be broken down into more elementary constituents are labeled minimal semantic constituents.

Cross-cultural semantics

Various fields or disciplines have long been contributing to cross-cultural semantics. Are words like love, truth, and hate universals? Is even the word sense – so central to semantics – a universal, or a concept entrenched in a long-standing but culture-specific tradition? These are the kinds of crucial questions that are discussed in cross-cultural semantics. Translation theory, ethnolinguistics, linguistic anthropology and cultural linguistics specialize in the field of comparing, contrasting, and translating words, terms and meanings from one language to another (see Herder, W. von Humboldt, Boas, Sapir, and Whorf). But philosophy, sociology, and anthropology have long-established traditions in contrasting the different nuances of the terms and concepts we use. And online encyclopaedias such as the Stanford Encyclopedia of Philosophy, https://plato.stanford.edu, and more and more Wikipedia itself have greatly facilitated the possibilities of comparing the background and usages of key cultural terms. In recent years the question of whether key terms are translatable or untranslatable has increasingly come to the fore of global discussions, especially since the publication of Barbara Cassin's Dictionary of Untranslatables: A Philosophical Lexicon, in 2014.

Computational semantics

Computational semantics is focused on the processing of linguistic meaning. To do this, concrete algorithms and architectures are described. Within this framework the algorithms and architectures are also analyzed in terms of decidability, time/space complexity, the data structures they require, and communication protocols.

Computer science

In computer science, the term semantics refers to the meaning of language constructs, as opposed to their form (syntax). According to Euzenat, semantics "provides the rules for interpreting the syntax which do not provide the meaning directly but constrains the possible interpretations of what is declared."

Programming languages

The semantics of programming languages and other languages is an important issue and area of study in computer science. Like the syntax of a language, its semantics can be defined exactly. 

For instance, the following statements use different syntaxes, but cause the same instructions to be executed, namely, perform an arithmetical addition of 'y' to 'x' and store the result in a variable called 'x':

Statement                     Programming languages
x += y                        C, C++, C#, Java, JavaScript, Python, Ruby, etc.
$x += $y                      Perl, PHP
x := x + y                    Ada, ALGOL, ALGOL 68, BCPL, Dylan, Eiffel, Modula-2, Oberon, OCaml, Object Pascal (Delphi), Pascal, SETL, Simula, Smalltalk, Standard ML, VHDL, etc.
MOV EAX,[y]
ADD [x],EAX                   Assembly languages: Intel 8086
ldr r2, [y]
ldr r3, [x]
add r3, r3, r2
str r3, [x]                   Assembly languages: ARM
LET X = X + Y                 BASIC: early
x = x + y                     BASIC: most dialects; Fortran, MATLAB, Lua
Set x = x + y                 Caché ObjectScript
ADD Y TO X.                   ABAP
ADD Y TO X GIVING X           COBOL
set /a x=%x%+%y%              Batch
(incf x y)                    Common Lisp
/x y x add def                PostScript
y @ x +!                      Forth
x =: x + y                    J
Various ways have been developed to describe the semantics of programming languages formally, building on mathematical logic; each is illustrated for the assignment example after this list:
  • Operational semantics: The meaning of a construct is specified by the computation it induces when it is executed on a machine. In particular, it is of interest how the effect of a computation is produced.
  • Denotational semantics: Meanings are modelled by mathematical objects that represent the effect of executing the constructs. Thus only the effect is of interest, not how it is obtained.
  • Axiomatic semantics: Specific properties of the effect of executing the constructs are expressed as assertions. Thus there may be aspects of the executions that are ignored.
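
As a rough illustration (mine, not the article's), the three styles can each be applied to the assignment x := x + y from the table above, where σ is a store mapping variables to values:

\[ \textup{Operational: } \frac{\langle x + y,\ \sigma \rangle \Downarrow n}{\langle x := x + y,\ \sigma \rangle \Downarrow \sigma[x \mapsto n]} \qquad \textup{Denotational: } [\![\, x := x + y \,]\!](\sigma) = \sigma[x \mapsto \sigma(x) + \sigma(y)] \]
\[ \textup{Axiomatic (Hoare triple): } \{\, x = X \wedge y = Y \,\}\ \; x := x + y \;\ \{\, x = X + Y \wedge y = Y \,\} \]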

Semantic models

The Semantic Web refers to the extension of the World Wide Web via embedding added semantic metadata, using semantic data modeling techniques such as Resource Description Framework (RDF) and Web Ontology Language (OWL). On the Semantic Web, terms such as semantic network and semantic data model are used to describe particular types of data model characterized by the use of directed graphs in which the vertices denote concepts or entities in the world and their properties, and the arcs denote relationships between them. These can formally be described as description logic concepts and roles, which correspond to OWL classes and properties.
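
A minimal sketch of the directed-graph idea (plain Python tuples standing in for RDF-style triples; real Semantic Web systems would use RDF/OWL tooling):

    # A semantic network as subject-predicate-object triples: vertices denote
    # concepts or entities, arcs denote the relationships between them.
    triples = {
        ("Dog", "is_a", "Mammal"),
        ("Mammal", "is_a", "Animal"),
        ("Dog", "has_part", "Tail"),
    }

    def related(subject, predicate):
        return {o for s, p, o in triples if s == subject and p == predicate}

    print(related("Dog", "is_a"))   # {'Mammal'}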

Psychology

In psychology, semantic memory is memory for meaning – in other words, the aspect of memory that preserves only the gist, the general significance, of remembered experience – while episodic memory is memory for the ephemeral details – the individual features, or the unique particulars of experience. The term 'episodic memory' was introduced by Tulving and Schacter in the context of 'declarative memory', which involved simple association of factual or objective information concerning its object. Word meaning is measured by the company a word keeps, i.e., the relationships among words themselves in a semantic network. Memories may be transferred intergenerationally or isolated in one generation due to a cultural disruption. Different generations may have different experiences at similar points in their own time-lines. This may then create a vertically heterogeneous semantic net for certain words in an otherwise homogeneous culture. In a network created by people analyzing their understanding of words (such as WordNet) the links and decomposition structures of the network are few in number and kind, and include "part of", "kind of", and similar links. In automated ontologies the links are computed vectors without explicit meaning. Various automated technologies are being developed to compute the meaning of words: latent semantic indexing and support vector machines, as well as natural language processing, artificial neural networks and predicate calculus techniques.
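
For the "computed vectors" just mentioned, similarity between word meanings is typically measured geometrically; a toy example (the two-dimensional vectors are invented, whereas real systems derive high-dimensional vectors from corpora, e.g. via latent semantic indexing):

    import math

    vectors = {"cat": [0.9, 0.1], "dog": [0.8, 0.2], "law": [0.1, 0.9]}

    def cosine(a, b):
        # Cosine similarity: dot product normalized by the vectors' lengths.
        dot = sum(x * y for x, y in zip(a, b))
        norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
        return dot / norms

    # "cat" should be closer to "dog" than to "law" in this toy space.
    print(cosine(vectors["cat"], vectors["dog"]) >
          cosine(vectors["cat"], vectors["law"]))    # True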

Ideasthesia is a psychological phenomenon in which activation of concepts evokes sensory experiences. For example, in synesthesia, activation of a concept of a letter (e.g., that of the letter A) evokes sensory-like experiences (e.g., of red color).

Slang

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Slang

Slang is language (words, phrases, and usages) of an informal register that members of particular in-groups favor (over the common vocabulary of a standard language) in order to establish group identity, exclude outsiders, or both.

Etymology of the word slang

In its earliest attested use (1756), the word slang referred to the vocabulary of "low" or "disreputable" people. By the early nineteenth century, it was no longer exclusively associated with disreputable people, but continued to be applied to usages below the level of standard educated speech. The origin of the word is uncertain, although it appears to be connected with thieves' cant. A Scandinavian origin has been proposed (compare, for example, Norwegian slengenavn, which means "nickname"), but based on "date and early associations" is discounted by the Oxford English Dictionary. Jonathon Green, however, agrees with the possibility of a Scandinavian origin, suggesting the same root as that of sling, which means "to throw", and noting that slang is thrown language – a quick, honest way to make your point.

Defining slang

Linguists have no simple and clear definition of slang, but agree that it is a constantly changing linguistic phenomenon present in every subculture worldwide. Some argue that slang exists because we must come up with ways to define new experiences that have surfaced with time and modernity. Attempting to remedy the lack of a clear definition, however, Bethany K. Dumas and Jonathan Lighter argue that an expression should be considered "true slang" if it meets at least two of the following criteria:
  • It lowers, if temporarily, "the dignity of formal or serious speech or writing"; in other words, it is likely to be considered in those contexts a "glaring misuse of register".
  • Its use implies that the user is familiar with whatever is referred to, or with a group of people who are familiar with it and use the term.
  • "It's a taboo term in ordinary discourse with people of a higher social status or greater responsibility."
  • It replaces "a well-known conventional synonym." This is done primarily to avoid discomfort caused by the conventional synonym or discomfort or annoyance caused by having to elaborate further.
Michael Adams remarks that, "[Slang] is liminal language... it is often impossible to tell, even in context, which interests and motives it serves... slang is on the edge." Slang dictionaries, collecting thousands of slang entries, offer "a broad, empirical window into the motivating forces behind slang".

While many forms of lexicon may be considered low-register or "sub-standard", slang remains distinct from colloquial and jargon terms because of its specific social contexts. While viewed as inappropriate in formal usage, colloquial terms are typically considered acceptable in speech across a wide range of contexts, while slang tends to be perceived as infelicitous in many common communicative situations. Jargon refers to language used by personnel in a particular field, or language used to represent specific terms within a field to those with a particular interest. Although jargon and slang can both be used to exclude non-group members from the conversation, the purpose of jargon is said to be optimizing conversation using terms that imply technical understanding. On the other hand, slang tends to emphasize social and contextual understanding. 

While colloquialisms and jargon may seem like slang because they reference a particular group, they do not necessarily fit the same definition, because they do not represent a particular effort to replace the general lexicon of a standard language. Colloquialisms are considered more acceptable and more expected in standard usage than slang is, and jargon is often created to talk about aspects of a particular field that are not accounted for in the general lexicon. However, this differentiation is not consistently applied by linguists; the terms "slang" and "jargon" are sometimes treated as synonymous and the scope of "jargon" is at times extended to mean all forms of socially-restricted language.

It is often difficult to differentiate slang from colloquialisms and even high-register lexicon, because slang generally becomes accepted into common vocabulary over time. Words such as "spurious" and "strenuous" were once perceived as slang, though they are now considered general, even high-register words. The literature on slang even discusses mainstream acknowledgment of a slang term as changing its status as true slang, because it has been accepted by the media and is thus no longer the special insider speech of a particular group. Nevertheless, a general test for whether a word is a slang word or not is whether it would be acceptable in an academic or legal setting (both arenas in which standard lexicon is considered necessary), and/or whether the term has been entered in the Oxford English Dictionary, which some scholars claim changes its status as slang.


Formation of slang

It is often difficult to collect etymologies for slang terms, largely because slang is a phenomenon of speech rather than of written language, and etymologies are typically traced via a written corpus.

Eric Partridge, cited as the first to report on the phenomenon of slang in a systematic and linguistic way, postulated that a term would likely be in circulation for a decade before it would be written down. Nevertheless, it seems that slang generally forms via deviation from a standard form. This "spawning" of slang occurs in much the same way that any general semantic change might occur. The difference here is that the slang term's new meaning takes on a specific social significance having to do with the group the term indexes.

Coleman also suggests that slang is differentiated within more general semantic change in that it typically has to do with a certain degree of “playfulness”. The development of slang is considered to be a largely “spontaneous, lively, and creative” speech process.

Still, while a great deal of slang takes off, even becoming accepted into the standard lexicon, much slang dies out, sometimes only ever referencing a group. An example of this is the term "groovy", a relic of 1960s and 1970s American "hippy" slang. Nevertheless, for a slang term to become a slang term, people must use it, at some point in time, as a way to flout standard language. Additionally, slang terms may be borrowed between groups, such as the term "gig", which was originally coined by jazz musicians in the 1930s and then borrowed into the same hippy slang of the 1960s. The word "groovy" has remained a part of subculture lexicon since its popularization and is still in common use by a significant population. The word "gig" to refer to a performance very likely originated well before the 1930s, and remained a common term throughout the 1940s and 1950s before becoming vaguely associated with the hippy slang of the 1960s; it is now a widely accepted synonym for a concert, recital, or performance of any type. "Hippy" is more commonly spelled "hippie".

Generally, slang terms undergo the same processes of semantic change that words in the regular lexicon do.

Slang will often form from words with previously differing meanings; one example is the popular slang word "lit", created by the generation labeled "Generation Z". The word used to be associated with something being on fire or being "lit" up, until 1988, when it was first used in writing to indicate a person who was drunk, in the book Warbirds: Diary of an Unknown Aviator. Since this time "lit" has gained popularity through rap songs such as ASAP Rocky's "Get Lit" in 2011. As the popularity of the word has increased, so too has the number of different meanings associated with it. Now "lit" describes a person who is drunk and/or high, as well as an event that is especially awesome and "hype".

Words and phrases from popular Hollywood films and television series frequently become slang.

Social implications


Indexicality

Slang is usually associated with a particular group and plays a role in constructing our identities. While slang outlines social space, attitudes about slang partly construct group identity and identify individuals as members of groups. Therefore, using the slang of a particular group will associate an individual with that group. Using Silverstein's notion of different orders of indexicality, it can be said that a slang term can be a second-order index to this particular group. Employing a slang term, however, can also give an individual the qualities associated with the term's group of origin, whether or not the individual is actually trying to identify as a member of the group. This allocation of qualities based on abstract group association is known as third-order indexicality.

As outlined by Elisa Mattiello in her book, a slang term can take on various levels of identification. Giving the examples of the terms "foxy" and "shagadelic", Mattiello explains that neither term makes sense given a standard interpretation of English:
  • "foxy", although clearly a "denominal adjective" from its -y suffix, does not make sense semantically, as it is a synonym with sexy and has nothing to do with foxes;
  • "shagadelic" is a combination of a slang term with a slang suffix and therefore is considered an "extra-grammatical" creation.
Nevertheless, Mattiello concludes that those agents who identify themselves as "young men" have "genuinely coined" these terms and choose to use them over "canonical" terms, like beautiful or sexy, because of the indexicalized social identifications the former convey.

First and second order indexicality

In terms of first and second order indexicality, the usage of speaker-oriented terms by male adolescents indicates membership in their age group, reinforces connection to their peer group, and excludes outsiders.

Higher-order indexicality

In terms of higher order indexicality, anyone using these terms may desire to appear fresher, more playful, faddish, and colourful than someone who employs the standard English term "beautiful". This appearance relies heavily on the hearer's third-order understanding of the term's associated social nuances and presupposed use-cases.

Subculture associations

Often, distinct subcultures will create slang that members will use in order to associate themselves with the group, or to delineate outsiders.

Slang terms are often known only within a clique or ingroup. For example, Leet ("Leetspeak" or "1337") was originally popular only among certain Internet subcultures, such as software crackers and online video gamers. During the 1990s, and into the early 21st century, however, Leet became increasingly more commonplace on the Internet, and it has spread outside Internet-based communication and into spoken languages. Other types of slang include SMS language used on mobile phones, and "chatspeak" (e.g., "LOL", an acronym meaning "laughing out loud" or "laugh out loud", or ROFL, "rolling on the floor laughing"), which are widely used in instant messaging on the Internet.

As subcultures are also often forms of counterculture and counterculture itself can be defined as going against a standard, it follows that slang has come to be associated with counterculture.

Social media and Internet slang

Slang is often taken from social media as a sign of social awareness and shared knowledge of popular culture. This particular branch of slang has become more prevalent since the early 2000s as a result of the rise in popularity of social networking services, including Facebook, Twitter, and Instagram. This has created new vocabularies associated with each new social media venue, such as the use of the term “friending” on Facebook, which is a verbification of “friend” used to describe the process of adding a new person to one's list of friends on the website, despite the existence of the analogous term “befriend”. The term “friending” is much older than Facebook, but has only recently entered the popular lexicon. Other examples of the slang found in social media include a general trend toward shortened words or acronyms. These are especially associated with services such as Twitter, which now has a 280-character limit for each message and therefore requires a briefer, more condensed manner of communication. This includes the use of hashtags which explicitly state the main content of a message or image, such as #food or #photography.

Debates about slang

Some critics believe that when slang becomes more commonplace it effectively eradicates the "proper" use of a certain language. However, academic (descriptive) linguists believe that language is not static but ever-changing and that slang terms are valid words within a language's lexicon. While prescriptivists study and promote the socially preferable or "correct" ways to speak, according to a language's normative grammar and syntax, descriptivists focus on studying language to further understand the subconscious rules of how individuals speak, which makes slang important in understanding such rules. Noam Chomsky, a founder of modern linguistic thought, challenged structural and prescriptive grammar and began to study sounds and morphemes functionally, as well as their changes within a language over time.

Transgenerational design

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Transgenerational_design

Transgenerational design is the practice of making products and environments compatible with those physical and sensory impairments associated with human aging and which limit major activities of daily living. The term transgenerational design was coined in 1986 by Syracuse University industrial design professor James J. Pirkl to describe and identify products and environments that accommodate, and appeal to, the widest spectrum of those who would use them—the young, the old, the able, the disabled—without penalty to any group. The transgenerational design concept emerged from his federally funded design-for-aging research project, Industrial Design Accommodations: A Transgenerational Perspective. The project's two seminal 1988 publications provided detailed information about the aging process; informed and sensitized industrial design professionals and design students about the realities of human aging; and offered a useful set of guidelines and strategies for designing products that accommodate the changing needs of people of all ages and abilities.

Overview

The transgenerational design concept establishes a common ground for those who are committed to integrating age and ability within the consumer population. Its underlying principle is that people, including those who are aged or impaired, have an equal right to live in a unified society.

Transgenerational design practice recognizes that human aging is a continuous, dynamic process that starts at birth and ends with death, and that throughout the aging process, people normally experience occurrences of illness, accidents and declines in physical and sensory abilities that impair one's independence and lifestyle. But most injuries, impairments and disabilities typically occur more frequently as one grows older and experiences the effects of senescence (biological aging). Four facts clarify the interrelationship of age with physical and sensory vulnerability:
  1. young people become old
  2. young people can become disabled
  3. old people can become disabled
  4. disabled people become old
Within each situation, consumers expect products and services to fulfill and enhance their lifestyle, both physically and symbolically. Transgenerational design focuses on serving their needs through what Cagan and Vogel call "a value oriented product development process". They note that a product is "deemed of value to a customer if it offers a strong effect on lifestyle, enabling features, and meaningful ergonomics" resulting in products that are "useful, usable, and desirable" during both short and long term use by people of all ages and abilities.

Transgenerational design is "framed as a market-aware response to population aging that fulfills the need for products and environments that can be used by both young and old people living and working in the same environment".

Benefits

Transgenerational design benefits all ages and abilities by creating a harmonious bond between products and the people that use them. It satisfies the psychological, physiological, and sociological factors desired—and anticipated—by users of all ages and abilities.
Transgenerational design addresses each element and accommodates the user—regardless of age or ability—by providing a sympathetic fit and unencumbered ease of use. Such designs provide greater accessibility by offering wider options and more choices, thereby preserving and extending one's independence, and enhancing the quality of life for all ages and abilities—at no group's expense.

Transgenerational designs accommodate rather than discriminate and sympathize rather than stigmatize. They do this by:
  • bridging the transitions across life's stages
  • responding to the widest range of individual differences
  • helping people remain active and independent
  • adapting to changing sensory and physical needs
  • maintaining one's dignity and self-respect
  • enabling one to choose the appropriate means to accomplish activities of daily living

History

Transgenerational design emerged during the mid-1980s coincident with the conception of universal design, an outgrowth of the disability rights movement and earlier barrier-free concepts. In contrast, transgenerational design grew out of the Age Discrimination Act of 1975 (ADA), which prohibited "discrimination on the basis of age in programs and activities receiving Federal financial assistance", or excluding, denying or providing different or lesser services on the basis of age. The ensuing political interest and debate over the Act's 1978 amendments, which abolished mandatory retirement at age 65, made the issues of aging a major public policy concern by injecting it into the mainstream of societal awareness.

Background

At the start of the 1980s, the oldest members of the population, having matured during the Great Depression, were being replaced by a generation of Baby Boomers, steadily reaching middle age and approaching the threshold of retirement. Their swelling numbers signaled profound demographic changes ahead that would steadily expand the aging population throughout the world.

Advancements in medical research were also changing the image of old age—from a social problem of the sick, poor, and senile, whose solutions depend on public policy—to the emerging reality of an active aging population having vigor, resources, and time to apply both.

Responding to the public's growing awareness, the media, public policy, and some institutions began to recognize the impending implications. Time and Newsweek devoted cover stories to the "Greying of America". Local radio stations began replacing their rock-and-roll formats with music targeted to more mature tastes. The Collegiate Forum (Dow Jones & Co., Inc.) devoted its Fall 1982 issue entirely to articles on the aging work force. A National Research Conference on Technology and Aging, and the Office of Technological Assessment of the House of Representatives, initiated “a major examination of the impact of science and technology on older Americans”.

In 1985, the National Endowment for the Arts, the Administration on Aging, the Farmer's Home Administration, and the Department of Housing and Urban Development signed an agreement to improve building, landscape, product and graphic design for older Americans, which included new research applications for old age that recognized the potential for making products easier to use by the elderly, and therefore more appealing and profitable.

Development

In 1987, recognizing the implications of population aging, Syracuse University's Department of Design, All-University Gerontology Center, and Center for Instructional Development initiated and collaborated on an interdisciplinary project, Industrial Design Accommodations: A Transgenerational Perspective. The year-long project, supported by a Federal grant, joined the knowledge base of gerontology with the professional practice of industrial design.

The project defined "the three aspects of aging as physiological, sociological, and psychological; and divided the designer’s responsibility into aesthetic, technological, and humanistic concerns". The strong interrelationship between the physiological aspects of aging and industrial design's humanistic aspects established the project's instructional focus and categorized the physiological aspects of aging as the sensory and physical factors of vision, hearing, touch, and movement. This interrelationship was translated into a series of reference tables relating the specific physical and sensory factors of aging, which were included in the resulting set of design guidelines to:
  • sensitize designers and design students to the aging process
  • provide them with appropriate knowledge about this process
  • accommodate the changing needs of our transgenerational population
The project produced and published two instructional manuals—one for instructors and one for design professionals—each containing a detailed set of "design guidelines and strategies for designing transgenerational products". Under the terms of the grant, the instructional manuals were distributed to all academic programs of industrial design recognized by the National Association of Schools of Art and Design (NASAD).

Chronology

  • 1988: The term “transgenerational design” appears to have first been publicly acknowledged by the Bristol-Myers Company in its Annual Report, which stated, “The trend towards transgenerational design seems to be catching on in some fields”, noting that “transgenerational design has the added advantage of circumventing the stigmatizing label of being ‘old’”.
  • 1989: The results of the 1987 Federal grant project were first presented at the national conference, Exploration: Technological Innovations for an Aging Population, supported in part by the American Association of Retired Persons (AARP) and the National Institute on Aging. The proceedings focused “on current efforts to address the impact of technology and an aging population, identification of high impact issues and problems, innovative ideas, and potential solutions”.
  • Also in 1989, Design News, the Japanese design magazine, introduced “the new concept of transgenerational design (for) coping with the needs of an aging population and its strategy”, stating that “the impact will soon be felt by all global institutions” and would “alter the present course of industrial design practice and education”.
  • 1990: The OXO company introduced the first group of 15 Good Grips kitchen tools to the U.S. market. “These ergonomically-designed, transgenerational tools set a new standard for the industry and raised the bar to consumer expectation for comfort and performance”. Sam Farber, OXO’s founder, stated that “population trends demand transgenerational products, products that will be useful to you throughout the course of your life” because “it extends the life of a product and its materials by anticipating the whole experience of the user”.
  • 1991: The Fall issue of the Design Management Journal addressed the issue of “Responsible Design” and introduced the transgenerational design concept in the article, “Transgenerational Design: A Strategy Whose Time Has Arrived”. The article presented a description, the rationale, and examples of early transgenerational products, and offered “insights on the rationale and benefits of such a transgenerational approach”.
  • 1993: The September–October issue of AARP The Magazine introduced the transgenerational design concept to its readers in the feature article “This Bold House”, describing the concept, details, and benefits of a transgenerational house. The article noted that “easy-grip handles, flat thresholds, and adjustable-height vanities are just the beginning in the world’s most accessible house”, providing families of all ages and abilities with “what they will want and need their whole lives”.
  • Also in November 1993, the transgenerational design concept was introduced to the European design community in presentations at the international symposiums “Designing for Our Future Selves”, held at the Royal College of Art in London and the Netherlands Design Institute in Rotterdam.
  • 1994: The book Transgenerational Design: Products for an Aging Population (Pirkl 1994) may be regarded as the prime mover behind the widespread acceptance and practice of the transgenerational design concept. It presented the first specialized content and photographic examples of transgenerational products and environments, offering “practical strategies in response to population aging, along with case study examples based on applying a better understanding of age-related capabilities”. It introduced the concept to the international design and gerontology communities, broadening the conventional idea of “environmental support” to include the product environment and sparking scholarly discussions and comparisons with other emerging concepts (universal design, design for all, inclusive design, and gerontechnology).
  • 1995: The transgenerational design concept was presented at the first of the International Guest Lecture Series by World Experts, sponsored by the European Design for Aging Network (DAN) and held consecutively at five international symposiums, “Designing for Our Future Selves”: Royal College of Art, London, November 15; Eindhoven University of Technology, Eindhoven, November 16–19; The Netherlands Design Institute, Amsterdam, November 21; University of Art and Design, Helsinki, November 23–25; and National College of Art and Design, Dublin, November 26–29.
  • 2000: “The Transgenerational House: A Case Study in Accessible Design and Construction” was presented in June at Designing for the 21st Century: An International Conference on Universal Design, held at the Rhode Island College of Art and Design, Providence, RI.
  • 2007: Architectural Graphic Standards, published by the American Institute of Architects and commonly referred to as the “architect’s bible”, presented a “Transgenerational House” case study in its “Inclusive Design” section. Described as an “intricate exploration in how the execution of detailed thought can create a living environment that serves the young and old alike, across generations”, the study includes plans for the room layout, kitchen, laundry, master bath, adjustable-height vanity, and roll-in shower.
  • 2012: The proliferation of transgenerational design has diminished the tendency to associate age and disability with deficit, decline, and incompetence by providing a market-aware response to population aging and to the need for living and work environments shared by young and old alike.
Continuing to emerge as a growing strategy for developing products, services, and environments that accommodate people of all ages and abilities, “transgenerational design has been adopted by major corporations, like Intel, Microsoft and Kodak” who are “looking at product development the same way as designing products for people with visual, hearing and physical impairments,” so that people of any age can use them.

Discussions between designers and marketers indicate that successful transgenerational design “requires the right balance of upfront research work, solid human factors analysis, extensive design exploration, testing and a lot of thought to get it right”, and that “transgenerational design is applicable to any consumer products company—from appliance manufacturers to electronics companies, furniture makers, kitchen and bath and mainstream consumer products companies”.

Operator (computer programming)

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Operator_(computer_programmin...