In linguistics, universal grammar (UG) is the theory of the genetic component of the language faculty, usually credited to Noam Chomsky. The basic postulate of UG is that a certain set of structural rules is innate to humans, independent of sensory experience. As children receive more linguistic stimuli in the course of psychological development, they adopt specific syntactic rules that conform to UG. UG is sometimes known as "mental grammar", and stands in contrast to other "grammars", e.g. prescriptive, descriptive, and pedagogical. Advocates of the theory emphasize and partially rely on the poverty of the stimulus (POS) argument and on the existence of some universal properties of natural human languages. However, the latter has not been firmly established, as some linguists argue that languages are so diverse that such universality is rare. It remains a matter of empirical investigation to determine precisely which properties are universal and which linguistic capacities are innate.
Argument
As Chomsky puts it, "Evidently, development of language in the individual must involve three factors: (1) genetic endowment, which sets limits on the attainable languages, thereby making language acquisition possible; (2) external data, converted to the experience that selects one or another language within a narrow range; (3) principles not specific to the Faculty of Language."
Occasionally, aspects of universal grammar seem describable in terms of general facts about cognition. For example, if a predisposition to categorize events and objects as different classes of things is part of human cognition and directly results in nouns and verbs showing up in all languages, then this aspect of universal grammar could be seen not as specific to language but as part of human cognition more generally. To distinguish properties of languages that can be traced to other facts about cognition from properties that cannot, the abbreviation UG* can be used. Chomsky often uses the term UG for those aspects of the human brain which cause language to be the way that it is (i.e. universal grammar in the sense used here); in this discussion, however, UG* denotes those aspects which are furthermore specific to language. Thus UG, as Chomsky uses it, is simply an abbreviation for universal grammar, while UG* as used here is a subset of universal grammar.
In the same article, Chomsky casts the theme of a larger research program in terms of the following question: "How little can be attributed to UG while still accounting for the variety of 'I-languages' attained, relying on third factor principles?" (I-languages meaning internal languages, the brain states that correspond to knowing how to speak and understand a particular language, and third factor principles meaning (3) in the previous quote).
Chomsky has speculated that UG might be extremely simple and abstract, for example only a mechanism for combining symbols in a particular way, which he calls "merge". The following quote shows that Chomsky does not use the term "UG" in the narrow sense UG* suggested above:
"The conclusion that merge falls within UG holds whether such recursive generation is unique to FL (faculty of language) or is appropriated from other systems."
In other words, merge is seen as part of UG because it causes language to be the way it is, is universal, and is not part of the environment or of general properties independent of genetics and environment. Merge is part of universal grammar whether it is specific to language, or whether, as Chomsky suggests, it is also used, for example, in mathematical thinking.
The distinction is important because there is a long history of argument about UG*, whereas most people working on language agree that there is universal grammar. Many people assume that Chomsky means UG* when he writes UG (and in some cases he might actually mean UG* [though not in the passage quoted above]).
Some students of universal grammar study a variety of grammars to extract generalizations called linguistic universals, often in the form of "If X holds true, then Y occurs." These have been extended to a variety of traits, such as the phonemes found in languages, the word orders which languages choose, and the reasons why children exhibit certain linguistic behaviors.
Later linguists who have influenced this theory include Chomsky and Richard Montague, who developed their versions of the theory as they considered issues of the argument from the poverty of the stimulus arising from the constructivist approach to linguistic theory. The application of the idea of universal grammar to the study of second language acquisition (SLA) is represented mainly in the work of McGill linguist Lydia White.
Syntacticians generally hold that there are parametric points of variation between languages, although heated debate occurs over whether UG constraints are essentially universal due to being "hard-wired" (Chomsky's principles and parameters approach), a logical consequence of a specific syntactic architecture (the generalized phrase structure approach) or the result of functional constraints on communication (the functionalist approach).
Relation to the evolution of language
In an article titled "The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?", Hauser, Chomsky, and Fitch present the three leading hypotheses for how language evolved and brought humans to the point where they have a universal grammar.

The first hypothesis states that the faculty of language in the broad sense (FLb) is strictly homologous to animal communication. This means that homologous aspects of the faculty of language exist in non-human animals.
The second hypothesis states that the FLb is a derived, uniquely human, adaptation for language. This hypothesis holds that individual traits were subject to natural selection and came to be specialized for humans.
The third hypothesis states that only the faculty of language in the narrow sense (FLn) is unique to humans. It holds that while mechanisms of the FLb are present in both human and non-human animals, the computational mechanism of recursion evolved recently and solely in humans. This is the hypothesis that most closely aligns with the typical theory of universal grammar championed by Chomsky.
History
The idea of a universal grammar can be traced back to Roger Bacon's observations in his c. 1245 Overview of Grammar and c. 1268 Greek Grammar that all languages are built upon a common grammar, even though it may undergo incidental variations, and to the 13th century speculative grammarians who, following Bacon, postulated universal rules underlying all grammars. The concept of a universal grammar or language was at the core of the 17th century projects for philosophical languages. There is a Scottish school of universal grammarians from the 18th century, as distinguished from the philosophical language project, which included authors such as James Beattie, Hugh Blair, James Burnett, James Harris, and Adam Smith. The article on grammar in the first edition of the Encyclopædia Britannica (1771) contains an extensive section titled "Of Universal Grammar". The idea rose to prominence and influence in modern linguistics with theories from Chomsky and Montague in the 1950s–1970s, as part of the "linguistics wars".
During the early 20th century, in contrast, language was usually understood from a behaviourist perspective, suggesting that language acquisition, like any other kind of learning, could be explained by a succession of trials, errors, and rewards for success. In other words, children learned their mother tongue by simple imitation, through listening and repeating what adults said. For example, when a child says "milk" and the mother smiles and gives her some as a result, the child finds this outcome rewarding, which enhances the child's language development.
Chomsky's theory
Chomsky argued that the human brain contains a limited set of constraints for organizing language. This implies in turn that all languages have a common structural basis: the set of rules known as "universal grammar". Speakers proficient in a language know which expressions are acceptable in their language and which are unacceptable. The key puzzle is how speakers come to know these restrictions of their language, since expressions that violate those restrictions are not present in the input and are not indicated as unacceptable. Chomsky argued that this poverty of stimulus means that Skinner's behaviourist perspective cannot explain language acquisition. The absence of negative evidence, that is, evidence that an expression is part of a class of ungrammatical sentences in a given language, is the core of his argument. For example, in English, an interrogative pronoun like what cannot be related to a predicate within a relative clause:
- *"What did John meet a man who sold?"
Presence of creole languages
The presence of creole languages is sometimes cited as further support for this theory, especially by Bickerton's controversial language bioprogram theory. Creoles are languages that develop and form when disparate societies come together and are forced to devise a new system of communication. The system used by the original speakers is typically an inconsistent mix of vocabulary items, known as a pidgin. As these speakers' children begin to acquire their first language, they use the pidgin input to effectively create their own original language, known as a creole. Unlike pidgins, creoles have native speakers (those with acquisition from early childhood) and make use of a full, systematic grammar.

According to Bickerton, the idea of universal grammar is supported by creole languages because certain features are shared by virtually all languages in the category. For example, their default point of reference in time (expressed by bare verb stems) is not the present moment but the past. Using pre-verbal auxiliaries, they uniformly express tense, aspect, and mood. Negative concord occurs, but it affects the verbal subject (as opposed to the object, as it does in languages like Spanish). Another similarity among creoles is that questions are created simply by changing the intonation of a declarative sentence, not its word order or content.
However, extensive work by Carla Hudson-Kam and Elissa Newport suggests that creole languages may not support a universal grammar at all. In a series of experiments, Hudson-Kam and Newport looked at how children and adults learn artificial grammars. They found that children tend to ignore minor variations in the input when those variations are infrequent, and reproduce only the most frequent forms. In doing so, they tend to standardize the language that they hear around them. Hudson-Kam and Newport hypothesize that in a pidgin-development situation (and in the real-life situation of a deaf child whose parents are or were disfluent signers), children systematize the language they hear based on the probability and frequency of forms, rather than on the basis of a universal grammar. Further, it seems to follow that creoles would share features with the languages from which they are derived, and thus look similar in terms of grammar.
Many researchers of universal grammar argue against the concept of relexification, which holds that a language replaces its lexicon almost entirely with that of another. This idea goes against the universalist view that grammar is innate.
Criticisms
Geoffrey Sampson maintains that universal grammar theories are not falsifiable and are therefore pseudoscientific. He argues that the grammatical "rules" linguists posit are simply post-hoc observations about existing languages, rather than predictions about what is possible in a language. Similarly, Jeffrey Elman argues that the unlearnability of languages assumed by universal grammar is based on a too-strict, "worst-case" model of grammar that is not in keeping with any actual grammar. In keeping with these points, James Hurford argues that the postulate of a language acquisition device (LAD) essentially amounts to the trivial claim that languages are learnt by humans, and thus that the LAD is less a theory than an explanandum looking for theories.

Morten H. Christiansen and Nick Chater have argued that the relatively fast-changing nature of language would prevent the slower-changing genetic structures from ever catching up, undermining the possibility of a genetically hard-wired universal grammar. Instead of an innate universal grammar, they claim, "apparently arbitrary aspects of linguistic structure may result from general learning and processing biases deriving from the structure of thought processes, perceptuo-motor factors, cognitive limitations, and pragmatics".
Hinzen summarizes the most common criticisms of universal grammar:
- Universal grammar has no coherent formulation and is indeed unnecessary.
- Universal grammar is in conflict with biology: it cannot have evolved by standardly accepted neo-Darwinian evolutionary principles.
- There are no linguistic universals: universal grammar is refuted by abundant variation at all levels of linguistic organization, which lies at the heart of the human faculty of language.
Language acquisition researcher Michael Ramscar has suggested that when children erroneously expect an ungrammatical form that then never occurs, the repeated failure of expectation serves as a form of implicit negative feedback that allows them to correct their errors over time, as when children correct overgeneralizations such as goed to went through repeated failure. This implies that word learning is a probabilistic, error-driven process, rather than a process of fast mapping, as many nativists assume.
In the domain of field research, the Pirahã language is claimed to be a counterexample to the basic tenets of universal grammar. This research has been led by Daniel Everett. Among other things, this language is alleged to lack all evidence for recursion, including embedded clauses, as well as quantifiers and colour terms. According to the writings of Everett, the Pirahã showed these linguistic shortcomings not because they were simple-minded, but because their culture—which emphasized concrete matters in the present and also lacked creation myths and traditions of art making—did not necessitate it. Some other linguists have argued, however, that some of these properties have been misanalyzed, and that others are actually expected under current theories of universal grammar. Other linguists have attempted to reassess Pirahã to see if it did indeed use recursion. In a corpus analysis of the Pirahã language, linguists failed to disprove Everett's arguments against universal grammar and the lack of recursion in Pirahã. However, they also stated that there was "no strong evidence for the lack of recursion either" and they provided "suggestive evidence that Pirahã may have sentences with recursive structures".
Daniel Everett has gone as far as claiming that universal grammar does not exist. In his words, "universal grammar doesn't seem to work, there doesn't seem to be much evidence for [it]. And what can we put in its place? A complex interplay of factors, of which culture, the values human beings share, plays a major role in structuring the way that we talk and the things that we talk about." Michael Tomasello, a developmental psychologist, also supports this claim, arguing that "although many aspects of human linguistic competence have indeed evolved biologically, specific grammatical principles and constructions have not. And universals in the grammatical structure of different languages have come from more general processes and constraints of human cognition, communication, and vocal-auditory processing, operating during the conventionalization and transmission of the particular grammatical constructions of particular linguistic communities."