The Universal Networking Language (UNL)
is designed to establish a simple foundation for representing the most
central aspects of information and meaning in a machine- and
human-language-independent form. As a language-independent formalism,
UNL aims to code, store, disseminate and retrieve information
independently of the original language in which it was expressed. In
this sense, UNL seeks to provide tools for overcoming the language
barrier in a systematic way.
At first glance, UNL seems to be a kind of interlingua, into
which source texts are converted before being translated into target
languages. It can, in fact, be used for this purpose, and very
efficiently, too. However, its real strength is knowledge representation
and its primary objective is to provide an infrastructure for handling
knowledge that already exists or can exist in any given language.
Nevertheless, it is important to note that at present it would be
foolish to claim to represent the “full” meaning of any word, sentence,
or text for any language. Subtleties of intention and interpretation
make the “full meaning,” however we might conceive it, too variable and
subjective for any systematic treatment. Thus UNL avoids the pitfalls
of trying to represent the “full meaning” of sentences or texts,
targeting instead the “core” or “consensual” meaning most often
attributed to them. In this sense, much of the subtlety of poetry,
metaphor, figurative language, innuendo, and other complex, indirect
communicative behaviors is beyond the current scope and goals of UNL.
Instead, UNL targets direct communicative behavior and literal meaning
as a tangible, concrete basis for most human communication in practical,
day-to-day settings.
Structure
In
the UNL approach, information conveyed by natural language is
represented sentence by sentence as a hypergraph composed of a set of
directed binary labeled links (referred to as relations) between nodes or hypernodes (the Universal Words, or simply UWs), which stand for concepts. UWs can also be annotated with attributes representing context information.
As an example, the English sentence ‘The sky was blue?!’ can be represented in UNL as follows:
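One plausible rendering in UNL notation, reconstructed from the component-by-component description in the next paragraph (attribute order and the exact surface syntax may vary across versions of the UNL Specs), is:

```
aoj(blue(icl>color).@entry.@past.@interrogative.@exclamation, sky(icl>natural world).@def)
```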
In the example above, "sky(icl>natural world)" and
"blue(icl>color)", which represent individual concepts, are UWs;
"aoj" (= attribute of an object) is a directed binary semantic relation
linking the two UWs; and "@def", "@interrogative", "@past",
"@exclamation" and "@entry" are attributes modifying UWs.
UWs are intended to represent universal concepts, but are
expressed in English words or in any other natural language in order to
be humanly readable. They consist of a "headword" (the UW root) and a
"constraint list" (the UW suffix between parentheses), where the
constraints are used to disambiguate the general concept conveyed by the
headword. The set of UWs is organized in the UNL Ontology, in which
high-level concepts are related to lower-level ones through the
relations "icl" (= is a kind of), "iof" (= is an instance of) and "equ"
(= is equal to).
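As an illustration of how these ontological links can be used, the sketch below (plain Python, not an official UNL tool; the concept names and the `is_a` helper are invented for the example) checks whether one concept is a kind of another by walking "icl" links upward:

```python
# Toy fragment of an is-a hierarchy: lower-level concept -> higher-level concept.
# The entries are illustrative, not taken from the real UNL Ontology.
icl = {
    "sky(icl>natural world)": "natural world",
    "blue(icl>color)": "color",
    "color": "attribute",
}

def is_a(concept: str, ancestor: str) -> bool:
    """Return True if `ancestor` is reachable from `concept` via "icl" links."""
    while concept in icl:
        concept = icl[concept]
        if concept == ancestor:
            return True
    return False

print(is_a("blue(icl>color)", "attribute"))    # blue -> color -> attribute
print(is_a("sky(icl>natural world)", "color"))
```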
Relations are intended to represent semantic links between words
in every existing language. They can be ontological (such as "icl" and
"iof," referred to above), logical (such as "and" and "or"), and
thematic (such as "agt" = agent, "ins" = instrument, "tim" = time, "plc"
= place, etc.). There are currently 46 relations in the UNL Specs. They
jointly define the UNL syntax.
Attributes represent information that cannot be conveyed by UWs
and relations. Normally, they represent information concerning time
("@past", "@future", etc.), reference ("@def", "@indef", etc.), modality
("@can", "@must", etc.), focus ("@topic", "@focus", etc.), and so on.
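Putting the three pieces together, a UNL graph is a set of labeled, directed binary relations between attribute-bearing UWs. The following is a minimal sketch of how the example sentence might be modeled in code; the `UW` class and its field names are illustrative assumptions, not part of the UNL Specs:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class UW:
    """A Universal Word: headword, disambiguating constraints, and attributes."""
    headword: str
    constraints: tuple = ()              # e.g. ("icl>color",)
    attributes: frozenset = frozenset()  # e.g. {"@def"}

    def __str__(self):
        cons = f"({','.join(self.constraints)})" if self.constraints else ""
        attrs = "".join(f".{a}" for a in sorted(self.attributes))
        return f"{self.headword}{cons}{attrs}"

# 'The sky was blue?!' as a single directed, labeled binary relation:
sky = UW("sky", ("icl>natural world",), frozenset({"@def"}))
blue = UW("blue", ("icl>color",),
          frozenset({"@entry", "@past", "@interrogative", "@exclamation"}))
graph = {("aoj", blue, sky)}  # (relation label, source UW, target UW)

for label, source, target in graph:
    print(f"{label}({source}, {target})")
```

Attributes are printed in alphabetical order here; a real UNL serializer would follow the ordering conventions of the Specs.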
Within the UNL Programme, the process of representing natural language sentences in UNL graphs is called UNLization, and the process of generating natural language sentences out of UNL graphs is called NLization.
UNLization, which involves natural language analysis and understanding,
is intended to be carried out semi-automatically (i.e., by humans with
computer aids); and NLization is intended to be carried out fully
automatically.
History
The UNL Programme started in 1996, as an initiative of the Institute of Advanced Studies of the United Nations University
in Tokyo, Japan. In January 2001, the United Nations University set up
an autonomous organization, the UNDL Foundation, to be responsible for
the development and management of the UNL Programme. The foundation, a
non-profit international organisation, has an independent identity from
the United Nations University, although it has special links with the
UN. It inherited from the UNU/IAS the mandate of implementing the UNL
Programme so that it can fulfil its mission.
The programme has already crossed important milestones. The
overall architecture of the UNL System has been developed with a set of
basic software and tools necessary for its functioning. These are being
tested and improved. A vast amount of linguistic resources from the
various native languages already under development, as well as from the
UNL expression, has been accumulated in the last few years. Moreover,
the technical infrastructure for expanding these resources is already in
place, thus facilitating the participation of many more languages in
the UNL system from now on. A growing number of scientific papers and
academic dissertations on the UNL are being published every year.
The most visible accomplishment so far is the recognition by the
Patent Cooperation Treaty (PCT) of the innovative character and
industrial applicability of the UNL, which was obtained in May 2002
through the World Intellectual Property Organization (WIPO). Acquiring
the patents (US patents 6,704,700 and 7,107,206) for the UNL is a
completely novel achievement within the United Nations.
A heritage language is a minority language (either immigrant or indigenous)
learned by its speakers at home as children, but never fully developed
because of insufficient input from the social environment: in fact, the
community of speakers grows up with a dominant language in which they become more competent. Polinsky & Kagan describe it as a continuum (drawing on Valdés's definition of heritage language) that ranges from fluent speakers to individuals who barely speak the home language.
In some countries or cultures, where one's mother tongue is determined by
one's ethnic group, a heritage language would be linked to the
native language.
The term can also refer to the language of a person's family or
community that the person does not speak or understand, but identifies
with culturally.
Definitions and use
Heritage language is a language which is predominantly spoken by "nonsocietal" groups and linguistic minorities.
In various fields, such as foreign language education and linguistics, the definitions of heritage language
become more specific and divergent. In foreign language education,
heritage language is defined in terms of a student's upbringing and
functional proficiency in the language: a student raised in a home where
a non-majority language is spoken is a heritage speaker of that
language if they possess some proficiency in it.
Under this definition, individuals who have some cultural connection
with the language but do not speak it are not considered heritage
students. This restricted definition became popular in the mid-1990s
with the publication of Standards for Foreign Language Learning by the American Council on the Teaching of Foreign Languages.
Among linguists, heritage language is an end-state language that
is defined based on the temporal order of acquisition and, often, the
language dominance in the individual. A heritage speaker acquires the heritage language as their first language through natural input in the home environment and acquires the majority language as a second language,
usually when they start school and talk about different topics with
people in school, or by exposure through media (written texts, internet,
popular culture etc.).
As exposure to the heritage language decreases and exposure to the
majority language increases, the majority language becomes the
individual’s dominant language and acquisition of the heritage language
changes. The results of these changes can be seen in divergence of the heritage language from monolingual norms in the areas of phonology, lexical knowledge (knowledge of vocabulary or words), morphology, syntax, semantics and code-switching,
although mastery of the heritage language may vary from purely
receptive skills in only informal spoken language to native-like fluency.
Controversy in definition
As
stated by Polinsky and Kagan: "The definition of a heritage speaker in
general and for specific languages continues to be debated. The debate
is of particular significance in such languages as Chinese, Arabic, and languages of India and the Philippines,
where speakers of multiple languages or dialects are seen as heritage
speakers of a single standard language taught for geographic, cultural
or other reasons (Mandarin Chinese, Classical Arabic, Hindi, or Tagalog, respectively)."
One idea that prevails in the literature is that "[heritage] languages include indigenous languages
that are often endangered. . . as well as world languages that are
commonly spoken in many other regions of the world (Spanish in the
United States, Arabic in France)".
However, that view is not shared universally. In Canada, for example,
First Nations languages are not classified as heritage languages by some
groups whereas they are so classified by others.
The label heritage is given to a language based
principally on the social status of its speakers and not necessarily on
any linguistic property. Thus, while Spanish typically comes in second in terms of native speakers worldwide and has official status in a number of countries, it is considered a heritage language in the English-dominant United States and Canada. Outside the United States and Canada, heritage language definitions and use vary.
Speakers of the same heritage language raised in the same
community may differ significantly in terms of their language abilities,
yet be considered heritage speakers under this definition. Some
heritage speakers may be highly proficient in the language, possessing
several registers,
while other heritage speakers may be able to understand the language
but not produce it. Other individuals who simply have a cultural
connection with a minority language but do not speak it may consider it
to be their heritage language.
It is held by some that ownership does not necessarily depend on
usership: “Some Aboriginal people distinguish between usership and
ownership. There are even those who claim that they own a language
although they only know one single word of it: its name.”
Proficiency
Heritage learners
have a fluent command of the dominant language and are comfortable
using it in formal settings because of their exposure to the language
through formal education.
Their command of the heritage language, however, varies widely. Some
heritage learners may lose some fluency in the first language after they
begin formal education in the dominant language.
Others may use the heritage language consistently at home and with
family but receive little or no formal training in the heritage language
and thus may struggle with literacy skills or with using it in broader
settings outside of the home.
An additional factor affecting heritage language acquisition is
whether learners are willing or reluctant to learn the
heritage language.
One factor that has been shown to influence the loss of fluency
in the heritage language is age. Studies have shown that younger
bilingual children are more susceptible to fluency loss than older
bilingual children.
The older the child is when the dominant language is introduced, the
less likely the child is to lose the ability to use the heritage
language.
This is because older children have had more exposure to, and more
experience using, the heritage language, which is therefore more likely
to remain their primary language.
Researchers found that this phenomenon primarily deals with the
memory network of an individual. Once a memory network is organized, it
is difficult for the brain to reorganize information contrary to the
initial information, because the previous information was processed
first.
This phenomenon becomes a struggle for adults trying to learn a
different language. Once individuals have learned a language fluently,
the grammatical rules and pronunciation of that first language heavily
influence their learning of a new one.
An emerging effective way of measuring the proficiency of a
heritage speaker is by speech rate. A study of gender restructuring in
heritage Russian showed that heritage speakers fell into two groups:
those who maintained the three-gender system and those who radically
reanalyzed the system as a two-gender system. The heritage speakers who
reanalyzed the three-gender system as a two-gender system had a strong
correlation with a slower speech rate. The correlation is
straightforward—lower proficiency speakers have more difficulty
accessing lexical items; therefore, their speech is slowed down.
Although speech rate has been shown to be an effective way of
measuring proficiency of heritage speakers, some heritage speakers are
reluctant to produce any heritage language whatsoever. Lexical
proficiency is an alternative method that is also effective in measuring
proficiency.
In a study with heritage Russian speakers, there was a strong
correlation between the speaker's knowledge of lexical items (measured
using a basic word list of about 200) and the speaker's control over
grammatical knowledge such as agreement, temporal marking, and
embedding.
Some heritage speakers explicitly study the language to gain
additional proficiency. The learning trajectories of heritage speakers
are markedly different from the trajectories of second language learners
with little or no previous exposure to a target language. For instance,
heritage learners typically show a phonological advantage over second
language learners in both perception and production of the heritage
language, even when their exposure to the heritage language was
interrupted very early in life.
Heritage speakers also tend to distinguish, rather than conflate,
easily confusable sounds in the heritage language and the dominant
language more reliably than second language learners. In morphosyntax as well, heritage speakers have been found to be more native-like than second language learners, although they are typically significantly different from native speakers.
Many linguists frame this change in heritage language acquisition as “incomplete acquisition” or "attrition."
"Incomplete acquisition," loosely defined by Montrul, is "the outcome
of language acquisition that is not complete in childhood."
In this incomplete acquisition, there are particular properties of the
language that were not able to reach age-appropriate levels of
proficiency after the dominant language has been introduced. Attrition,
as defined by Montrul, is the loss of a certain property of a language
after one has already mastered it with native-speaker level accuracy.
These two cases of language loss have been used by Montrul and many
other linguists to describe the change in heritage language acquisition.
However, this is not the only viewpoint of linguists to describe
heritage language acquisition.
One argument against incomplete acquisition is that the input
that heritage speakers receive is different from monolinguals (the input
may be affected by cross-generational attrition, among other factors),
thus the comparison of heritage speakers against monolinguals is weak.
This argument by Pascual and Rothman claims that the acquisition of the
heritage language is therefore not incomplete, but complete and simply
different from monolingual acquisition of a language.
Another argument calls for shifting the focus from the result of
incomplete acquisition of a heritage language to the process of heritage
language acquisition. In this argument, the crucial factor in changes
to heritage language acquisition is the extent to which the heritage
speaker activates and processes the heritage language.
This new model thus moves away from language acquisition that is
dependent on the exposure to input of the language and moves towards
dependence on the frequency of processing for production and
comprehension of the heritage language.
Some colleges and universities offer courses prepared for
speakers of heritage languages. For example, students who grow up
learning some Spanish in the home may enroll in a course that will build
on their Spanish abilities.
A first language, native language or mother/father/parent tongue (also known as arterial language or L1), is a language that a person has been exposed to from birth or within the critical period. In some countries, the term native language or mother tongue refers to the language of one's ethnic group rather than one's first language.
Sometimes, the term "mother tongue" or "mother language" (or "father tongue" / "father language")
is used for the language that a person learned as a child (usually from
their parents). Children growing up in bilingual homes can, according
to this definition, have more than one mother tongue or native language.
The first language of a child is part of that child's personal, social and cultural identity.
Another impact of the first language is that it fosters the reflection
on, and learning of, successful social patterns of acting and speaking,
and it largely accounts for differences in linguistic competence. While some argue that there is no such thing as a "native speaker" or a "mother tongue", it is important
to understand the key terms as well as to understand what it means to
be a "non-native" speaker, and the implications that can have on one's
life. Research suggests that while a non-native speaker may develop
fluency in a targeted language after about two years of immersion, it
can take between five and seven years for that child to be on the same
working level as their native speaking counterparts.
One
of the more widely accepted definitions of native speakers is that they
were born in a particular country (and) raised to speak the language of
that country during the critical period of their development.
The person qualifies as a "native speaker" of a language by being born
and immersed in the language during youth, in a family in which the
adults shared a similar language experience to the child.
Native speakers are considered to be an authority on their given
language because of their natural acquisition process regarding the
language, as opposed to having learned the language later in life. That
is achieved by personal interaction with the language and speakers of
the language. Native speakers will not necessarily be knowledgeable
about every grammatical rule of the language, but they will have good
"intuition" of the rules through their experience with the language.
The designation "native language", in its general usage, is
thought to be imprecise and subject to various interpretations that are
biased linguistically, especially with respect to bilingual children
from ethnic minority groups. Many scholars
have given definitions of 'native language' based on common usage, the
emotional relation of the speaker towards the language, and even its
dominance in relation to the environment. However, all three criteria
lack precision. For many children whose home language differs from the
language of the environment (the 'official' language), it is debatable
which language is their "native language".
Defining "native language"
Based
on origin: the language(s) one learned first (the language(s) in which
one has established the first long-lasting verbal contacts).
Based on internal identification: the language(s) one identifies with/as a speaker of.
Based on external identification: the language(s) one is identified with/as a speaker of, by others.
Based on competence: the language(s) one knows best.
Based on function: the language(s) one uses most.
In some countries, such as Kenya, India,
and various East Asian and Central Asian countries, "mother language"
or "native language" is used to indicate the language of one's ethnic group
in both common and journalistic parlance ("I have no apologies for not
learning my mother tongue"), rather than one's first language. Also, in Singapore, "mother tongue" refers to the language of one's ethnic group regardless of actual proficiency, and the "first language" refers to English, which was established on the island under the British Empire, and is the lingua franca
for most post-independence Singaporeans because of its use as the
language of instruction in government schools and as a working language.
In the context of population censuses conducted on the Canadian population, Statistics Canada defines mother tongue as "the first language learned at home in childhood and still understood by the individual at the time of the census."
It is quite possible that the first language learned is no longer a
speaker's dominant language. That includes young immigrant children
whose families have moved to a new linguistic environment as well as
people who learned their mother tongue as a young child at home (rather
than the language of the majority of the community), who may have lost,
in part or in totality, the language they first acquired. According to Ivan Illich, the term "mother tongue" was first used by Catholic monks to designate a particular language they used, instead of Latin,
when they were "speaking from the pulpit". That is, the "holy mother
the Church" introduced this term and colonies inherited it from
Christianity as a part of colonialism. J. R. R. Tolkien, in his 1955 lecture "English and Welsh",
distinguishes the "native tongue" from the "cradle tongue". The latter
is the language one learns during early childhood, and one's true
"native tongue" may be different, possibly determined by an inherited
linguistic taste
and may later in life be discovered by a strong emotional affinity to a
specific dialect (Tolkien personally confessed to such an affinity to
the Middle English of the West Midlands in particular).
Children brought up speaking more than one language can have more than one native language, and be bilingual or multilingual. By contrast, a second language is any language that one speaks other than one's first language.
A
related concept is bilingualism. One definition is that a person is
bilingual if they are equally proficient in two languages. Someone who
grows up speaking Spanish and then learns English for four years is
bilingual only if they speak the two languages with equal fluency. Peal
and Lambert were the first to test only "balanced" bilinguals—that is, a
child who is completely fluent in two languages and feels that neither
is their "native" language because they grasp both so perfectly. This
study found that balanced bilinguals perform significantly better in
tasks that require flexibility (they constantly shift between the two
known languages depending on the situation), are more aware of the
arbitrary nature of language, and choose word associations based on
logical rather than phonetic preferences.
Multilingualism
One can have two or more native languages, thus being a native bilingual or indeed multilingual.
The order in which these languages are learned is not necessarily the
order of proficiency. For instance, if a French-speaking couple have a
child who learned French first but then grew up in an English-speaking
country, the child would likely be most proficient in English. Other
examples are India, Indonesia, the Philippines, Kenya, Malaysia, Singapore, and South Africa, where most people speak more than one language.
Defining "native speaker"
Defining
what constitutes a native speaker is difficult, and there is no test
which can identify one. It is not known whether native speakers are a
defined group of people, or if the concept should be thought of as a
perfect prototype to which actual speakers may or may not conform.
An article titled "The Native Speaker: An Achievable Model?" published by the Asian EFL Journal
states that there are six general principles that relate to the
definition of "native speaker". The principles, according to the study,
are typically accepted by language experts across the scientific field. A
native speaker is defined according to the following guidelines:
The individual acquired the language in early childhood and maintains the use of the language.
The individual has intuitive knowledge of the language.
The individual is able to produce fluent, spontaneous discourse.
The individual is communicatively competent in different social contexts.
The individual identifies with or is identified by a language community.
Universal grammar (UG), in modern linguistics, is the theory of the genetic component of the language faculty, usually credited to Noam Chomsky. The basic postulate of UG is that a certain set of structural rules are innate to humans, independent of sensory experience. With more linguistic stimuli received in the course of psychological development, children then adopt specific syntactic rules that conform to UG. It is sometimes known as "mental grammar", and stands contrasted with other "grammars", e.g. prescriptive, descriptive and pedagogical. The advocates of this theory emphasize and partially rely on the poverty of the stimulus (POS) argument and the existence of some universal properties of natural human languages.
However, the latter has not been firmly established, as some linguists
have argued languages are so diverse that such universality is rare.
It is a matter of empirical investigation to determine precisely what
properties are universal and what linguistic capacities are innate.
Argument
The theory of universal grammar proposes that if human beings are brought up under normal conditions (not those of extreme sensory deprivation), then they will always develop language with certain properties (e.g., distinguishing nouns from verbs, or distinguishing function words from content words).
The theory proposes that there is an innate, genetically determined
language faculty that knows these rules, making it easier and faster for
children to learn to speak than it otherwise would be.
This faculty does not know the vocabulary of any particular language
(so words and their meanings must be learned), and there remain several
parameters which can vary freely among languages (such as whether
adjectives come before or after nouns) which must also be learned.
Evidence in favor of this idea can be found in studies like Valian
(1986), which show that children of surprisingly young ages understand
syntactic categories and their distribution before this knowledge shows
up in production.
As Chomsky puts it, "Evidently, development of language in the
individual must involve three factors: genetic endowment, which sets
limits on the attainable languages, thereby making language acquisition
possible; external data, converted to the experience that selects one or
another language within a narrow range; [and] principles not specific
to the Faculty of Language."
Occasionally, aspects of universal grammar seem describable in
terms of general details regarding cognition. For example, if a
predisposition to categorize events and objects as different classes of
things is part of human cognition, and directly results in nouns and
verbs showing up in all languages, then it could be assumed that rather
than this aspect of universal grammar being specific to language, it is
more generally a part of human cognition. To distinguish properties of
languages that can be traced to other facts regarding cognition from
properties of languages that cannot, the abbreviation UG* can be used.
UG is the term often used by Chomsky for those aspects of the human
brain which cause language to be the way that it is (i.e. are universal
grammar in the sense used here), but here for the purposes of
discussion, it is used for those aspects which are furthermore specific
to language (thus UG, as Chomsky uses it, is just an abbreviation for
universal grammar, but UG* as used here is a subset of universal
grammar).
In the same article, Chomsky casts the theme of a larger research
program in terms of the following question: "How little can be
attributed to UG while still accounting for the variety of 'I-languages'
attained, relying on third factor principles?"
(I-languages meaning internal languages, the brain states that
correspond to knowing how to speak and understand a particular language,
and third factor principles meaning "principles not specific to the
Faculty of Language" in the previous quote).
Chomsky has speculated that UG might be extremely simple and
abstract, for example only a mechanism for combining symbols in a
particular way, which he calls "merge". The following quote shows that Chomsky does not use the term "UG" in the narrow sense UG* suggested above.
"The conclusion that merge falls within UG holds whether such
recursive generation is unique to FL (faculty of language) or is
appropriated from other systems."
In other words, merge is seen as part of UG because it causes
language to be the way it is, universal, and is not part of the
environment or general properties independent of genetics and
environment. Merge is part of universal grammar whether it is specific
to language or whether, as Chomsky suggests, it is also used, for
example, in mathematical thinking.
The distinction is the result of the long history of argument
about UG*: whereas some people working on language agree that there is
universal grammar, many people assume that Chomsky means UG* when he
writes UG (and in some cases he might actually mean UG* [though not in
the passage quoted above]).
Some students of universal grammar study a variety of grammars to extract generalizations called linguistic universals,
often in the form of "If X holds true, then Y occurs." These have been
extended to a variety of traits, such as the phonemes found in
languages, the word orders which different languages choose, and the
reasons why children exhibit certain linguistic behaviors.
Other linguists who have influenced this theory include Richard Montague, who developed his version of this theory as he considered issues of the argument from poverty of the stimulus
to arise from the constructivist approach to linguistic theory. The
application of the idea of universal grammar to the study of second
language acquisition (SLA) is represented mainly in the work of McGill
linguist Lydia White.
Syntacticians generally hold that there are parametric points of
variation between languages, although heated debate occurs over whether
UG constraints are essentially universal due to being "hard-wired"
(Chomsky's principles and parameters approach), a logical consequence of a specific syntactic architecture (the generalized phrase structure approach) or the result of functional constraints on communication (the functionalist approach).
Relation to the evolution of language
In an article entitled "The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?" Hauser, Chomsky, and Fitch present the three leading hypotheses for how
language evolved and brought humans to the point where they have a
universal grammar.
The first hypothesis states that the faculty of language in the
broad sense (FLb) is strictly homologous to animal communication. This
means that homologous aspects of the faculty of language exist in
non-human animals.
The second hypothesis states that the FLb is a derived and
uniquely human adaptation for language. This hypothesis holds that
individual traits were subject to natural selection and came to be
specialized for humans.
The third hypothesis states that only the faculty of language in
the narrow sense (FLn) is unique to humans. It holds that while
mechanisms of the FLb are present in both human and non-human animals,
the computational mechanism of recursion is recently evolved solely in
humans. This is the hypothesis which most closely aligns to the typical theory of universal grammar championed by Chomsky.
History
The
term "universal grammar" predates Noam Chomsky, but pre-Chomskyan ideas
of universal grammar are different. For Chomsky, UG is "[the] theory of
the genetically based language faculty", which makes UG a theory of language acquisition, and part of the innateness hypothesis.
Earlier grammarians and philosophers thought about universal grammar in
the sense of a universally shared property or grammar of all languages.
The closest analogs to their understanding of universal grammar in the
late 20th century are Greenberg's linguistic universals.
The idea of a universal grammar can be traced back to Roger Bacon's observations in his c. 1245 Overview of Grammar and c. 1268 Greek Grammar that all languages are built upon a common grammar, even though it may undergo incidental variations; and the 13th century speculative grammarians
who, following Bacon, postulated universal rules underlying all
grammars. The concept of a universal grammar or language was at the core
of the 17th century projects for philosophical languages. An influential work in that time was Grammaire générale by Claude Lancelot and Antoine Arnauld, who built on the works of René Descartes. They tried to describe a general grammar for languages, coming to the conclusion that grammar has to be universal.
There is a Scottish school of universal grammarians from the 18th
century, as distinguished from the philosophical language project, which
included authors such as James Beattie, Hugh Blair, James Burnett, James Harris, and Adam Smith. The article on grammar in the first edition of the Encyclopædia Britannica (1771) contains an extensive section titled "Of Universal Grammar".
This tradition was continued in the late 19th century by Wilhelm Wundt and in the early 20th century by linguist Otto Jespersen.
Jespersen disagreed with early grammarians on their formulation of
"universal grammar", arguing that they tried to derive too much from
Latin, and that a UG based on Latin was bound to fail considering the
breadth of worldwide linguistic variation.
He does not fully dispense with the idea of a "universal grammar", but
reduces it to universal syntactic categories or super-categories, such
as number, tenses, etc.
Jespersen does not discuss whether these properties come from facts
about general human cognition or from a language specific endowment
(which would be closer to the Chomskyan formulation). As this work
predates molecular genetics, he does not discuss the notion of a genetically conditioned universal grammar.
During the rise of behaviorism, the idea of a universal grammar
(in either sense) was discarded. In the early 20th century, language was
usually understood from a behaviourist
perspective, suggesting that language acquisition, like any other kind
of learning, could be explained by a succession of trials, errors, and
rewards for success.
In other words, children learned their mother tongue by simple
imitation, by listening to and repeating what adults said. For
example, when a child says "milk" and the mother smiles and gives her
child milk as a result, the child finds this outcome rewarding,
thus enhancing the child's language development.
UG returned to prominence and influence in modern linguistics with the
theories of Chomsky and Montague in the 1950s–1970s, as part of the "linguistics wars".
In 2016 Chomsky and Berwick co-wrote their book Why Only Us,
in which they defined both the minimalist program and the strong
minimalist thesis, and updated their approach to UG theory
accordingly. According to Berwick and Chomsky, the strong minimalist thesis
states that "The optimal situation would be that UG reduces to the
simplest computational principles which operate in accord with
conditions of computational efficiency. This conjecture is ... called
the Strong Minimalist Thesis (SMT)."
The significance of the SMT is that it shifts the previous emphasis
on universal grammar to the operation which Chomsky and Berwick now call
"merge". "Merge" is defined in their 2016 book when they state "Every
computational system has embedded within it somewhere an operation that
applies to two objects X and Y already formed, and constructs from them a
new object Z. Call this operation Merge." SMT dictates that "Merge will
be as simple as possible: it will not modify X or Y or impose any
arrangement on them; in particular, it will leave them unordered, an
important fact... Merge is therefore just set formation: Merge of X and Y
yields the set {X, Y}."
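Merge as described here is nothing more than binary set formation applied recursively. A minimal sketch in Python (the `merge` helper and the example lexical items are illustrative, not part of any linguistic formalism):

```python
# Merge, as Berwick and Chomsky describe it: take two already-formed
# objects X and Y and build the unordered set {X, Y}, without modifying
# either object or imposing any order on them.
def merge(x, y):
    return frozenset([x, y])

# Applying Merge recursively yields hierarchical (nested-set) structure
# with no linear order, e.g. for "read the book":
the_book = merge("the", "book")          # the set {the, book}
read_the_book = merge("read", the_book)  # the set {read, {the, book}}

# Unordered: Merge(X, Y) and Merge(Y, X) produce the same object.
assert merge("the", "book") == merge("book", "the")
```

Using `frozenset` rather than `set` keeps the merged objects hashable, so they can themselves be members of later merges, which is what allows the recursion.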
Chomsky's theory
Chomsky argued that the human brain
contains a limited set of constraints for organizing language. This
implies in turn that all languages have a common structural basis: the
set of rules known as "universal grammar".
Speakers proficient in a language know which expressions are
acceptable in their language and which are unacceptable. The key puzzle
is how speakers come to know these restrictions of their language, since
expressions that violate those restrictions never appear in the
input marked as ungrammatical. Chomsky argued that this poverty of stimulus
means that Skinner's behaviourist perspective cannot explain language
acquisition. The absence of negative evidence—evidence that an
expression is part of a class of ungrammatical sentences in a given
language—is the core of his argument. For example, in English, an interrogative pronoun like what cannot be related to a predicate within a relative clause:
*"What did John meet a man who sold?"
Such expressions are not available to language learners: they are, by
hypothesis, ungrammatical. Speakers of the local language do not use
them, nor do they point them out as unacceptable to language learners.
Universal grammar offers an explanation for the presence of the poverty
of the stimulus, by making certain restrictions into universal characteristics of human languages. Language learners are consequently never tempted to generalize in an illicit fashion.
Presence of creole languages
The presence of creole languages is sometimes cited as further support for this theory, especially by Bickerton's controversial language bioprogram theory.
Creoles are languages that develop and form when disparate societies
come together and are forced to devise a new system of communication.
The system used by the original speakers is typically an inconsistent
mix of vocabulary items, known as a pidgin.
As these speakers' children begin to acquire their first language, they
use the pidgin input to effectively create their own original language,
known as a creole. Unlike pidgins, creoles have native speakers (those with acquisition from early childhood) and make use of a full, systematic grammar.
According to Bickerton, the idea of universal grammar is
supported by creole languages because certain features are shared by
virtually all in the category. For example, their default point of
reference in time (expressed by bare verb stems) is not the present
moment, but the past. Using pre-verbal auxiliaries, they uniformly express tense, aspect, and mood. Negative concord occurs, but it affects the verbal subject (as opposed to the object, as it does in languages like Spanish).
Another similarity among creoles can be seen in the fact that questions
are created simply by changing the intonation of a declarative
sentence, not its word order or content.
However, extensive work by Carla Hudson-Kam and Elissa Newport
suggests that creole languages may not support a universal grammar at
all. In a series of experiments, Hudson-Kam and Newport looked at how
children and adults learn artificial grammars. They found that children
tend to ignore minor variations in the input when those variations are
infrequent, and reproduce only the most frequent forms. In doing so,
they tend to standardize the language that they hear around them.
Hudson-Kam and Newport hypothesize that in a pidgin-development
situation (and in the real-life situation of a deaf child whose parents
are or were disfluent signers), children systematize the language they
hear, based on the probability and frequency of forms, and not that
which has been suggested on the basis of a universal grammar.
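The regularization effect Hudson-Kam and Newport describe can be caricatured as the difference between probability matching and frequency maximization. A toy sketch, with invented form frequencies standing in for inconsistent pidgin-like input:

```python
from collections import Counter
import random

# Invented, inconsistent input: one form dominates, minor variants are rare.
input_forms = ["ta"] * 70 + ["ka"] * 20 + ["po"] * 10

def adult_like(forms, n=1000, seed=0):
    """Probability matching: reproduce forms at roughly input frequencies."""
    rng = random.Random(seed)
    return Counter(rng.choice(forms) for _ in range(n))

def child_like(forms, n=1000):
    """Regularization: ignore infrequent variants, produce the majority form."""
    majority = Counter(forms).most_common(1)[0][0]
    return Counter([majority] * n)

# Children's output is systematic where the input was not:
print(child_like(input_forms))   # all "ta"
print(adult_like(input_forms))   # mixed output, roughly tracking 70/20/10
```

The point of the caricature is that the child's output is a consistent grammar even though no consistent grammar was present in the input, which is the systematization effect the experiments report.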
Further, it seems to follow that creoles would share features with the
languages from which they are derived, and thus look similar in terms of
grammar.
Many researchers of universal grammar argue against the concept of relexification,
which says that a language replaces its lexicon almost entirely with
that of another. This goes against universalist ideas of a universal
grammar, on which grammar is innate.
Criticisms
Geoffrey Sampson maintains that universal grammar theories are not falsifiable and are therefore pseudoscientific.
He argues that the grammatical "rules" linguists posit are simply
post-hoc observations about existing languages, rather than predictions
about what is possible in a language. Similarly, Jeffrey Elman
argues that the unlearnability of languages assumed by universal
grammar is based on a too-strict, "worst-case" model of grammar, that is
not in keeping with any actual grammar. In keeping with these points,
James Hurford argues that the postulate of a language acquisition device
(LAD) essentially amounts to the trivial claim that languages are
learnt by humans, and thus, that the LAD is less a theory than an explanandum looking for theories.
Morten H. Christiansen
and Nick Chater have argued that the relatively fast-changing nature of
language would prevent the slower-changing genetic structures from ever
catching up, undermining the possibility of a genetically hard-wired
universal grammar. Instead of an innate universal grammar, they claim,
"apparently arbitrary aspects of linguistic structure may result from
general learning and processing biases deriving from the structure of
thought processes, perceptuo-motor factors, cognitive limitations, and
pragmatics".
Hinzen summarizes the most common criticisms of universal grammar:
Universal grammar has no coherent formulation and is indeed unnecessary.
Universal grammar is in conflict with biology: it cannot have evolved by standardly accepted neo-Darwinian evolutionary principles.
There are no linguistic universals: universal grammar is refuted by
abundant variation at all levels of linguistic organization, which lies
at the heart of the human faculty of language.
In addition, it has been suggested that people learn about
probabilistic patterns of word distributions in their language, rather
than hard and fast rules (see Distributional hypothesis).
For example, children overgeneralize the past-tense marker "-ed" and
conjugate irregular verbs incorrectly, producing forms like goed and eated, and correct these errors over time.
It has also been proposed that the poverty of the stimulus problem can
be largely avoided, if it is assumed that children employ similarity-based generalization
strategies in language learning, generalizing about the usage of new
words from similar words that they already know how to use.
Language acquisition
researcher Michael Ramscar has suggested that when children erroneously
expect an ungrammatical form that then never occurs, the repeated
failure of expectation serves as a form of implicit negative feedback that allows them to correct their errors over time, as when children revise overgeneralizations such as goed to went. This implies that word learning is a probabilistic, error-driven process, rather than a process of fast mapping, as many nativists assume.
In the domain of field research, the Pirahã language is claimed to be a counterexample to the basic tenets of universal grammar. This research has been led by Daniel Everett. Among other things, this language is alleged to lack all evidence for recursion, including embedded clauses, as well as quantifiers and colour terms.
According to the writings of Everett, the Pirahã showed these
linguistic shortcomings not because they were simple-minded, but because
their culture—which emphasized concrete matters in the present and also
lacked creation myths and traditions of art making—did not necessitate
it.
Some other linguists have argued, however, that some of these
properties have been misanalyzed, and that others are actually expected
under current theories of universal grammar.
Other linguists have attempted to reassess Pirahã to see if it did
indeed use recursion. In a corpus analysis of the Pirahã language,
linguists failed to disprove Everett's arguments against universal
grammar and the lack of recursion in Pirahã. However, they also stated
that there was "no strong evidence for the lack of recursion either" and
they provided "suggestive evidence that Pirahã may have sentences with
recursive structures".
Daniel Everett has argued that even if a universal grammar is not
impossible in principle, it should not be accepted because we have
equally or more plausible theories that are simpler. In his words,
"universal grammar doesn't seem to work, there doesn't seem to be much
evidence for [it]. And what can we put in its place? A complex interplay
of factors, of which culture, the values human beings share, plays a
major role in structuring the way that we talk and the things that we
talk about." Michael Tomasello,
a developmental psychologist, also supports this claim, arguing that
"although many aspects of human linguistic competence have indeed
evolved biologically, specific grammatical principles and constructions
have not. And universals in the grammatical structure of different
languages have come from more general processes and constraints of human
cognition, communication, and vocal-auditory processing, operating
during the conventionalization and transmission of the particular
grammatical constructions of particular linguistic communities."
The main purpose of theories of second-language acquisition (SLA) is to shed light on how people who already know one language learn a second language. The field of second-language acquisition involves various contributions, such as linguistics, sociolinguistics, psychology, cognitive science, neuroscience, and education.
These multiple fields in second-language acquisition can be grouped as
four major research strands: (a) linguistic dimensions of SLA, (b)
cognitive (but not linguistic) dimensions of SLA, (c) socio-cultural
dimensions of SLA, and (d) instructional dimensions of SLA. While the
orientation of each research strand is distinct, they have in common
that they can help identify conditions that facilitate
successful language learning. Acknowledging the contributions of each
perspective and the interdisciplinarity of the field, more and
more second-language researchers are now taking a wider lens when
examining the complexities of second language acquisition.
History
As second-language acquisition began as an interdisciplinary field, it is hard to pin down a precise starting date.
However, there are two publications in particular that are seen as
instrumental to the development of the modern study of SLA: (1) Corder's 1967 essay The Significance of Learners' Errors, and (2) Selinker's 1972 article Interlanguage.
Corder's essay rejected a behaviorist account of SLA and suggested that
learners made use of intrinsic internal linguistic processes;
Selinker's article argued that second-language learners possess their
own individual linguistic systems that are independent from both the
first and second languages.
In the 1970s the general trend in SLA was for research exploring the ideas of Corder and Selinker, and refuting behaviorist theories of language acquisition. Examples include research into error analysis, studies in transitional stages of second-language ability, and the "morpheme studies" investigating the order in which learners acquired linguistic features. The 70s were dominated by naturalistic studies of people learning English as a second language.
By the 1980s, the theories of Stephen Krashen had become the prominent paradigm in SLA. In his theories, often collectively known as the Input Hypothesis, Krashen suggested that language acquisition is driven solely by comprehensible input,
language input that learners can understand. Krashen's model was
influential in the field of SLA and also had a large influence on
language teaching, but it left some important processes in SLA
unexplained. Research in the 1980s was characterized by the attempt to
fill in these gaps. Some approaches included White's descriptions of learner competence, and Pienemann's use of speech processing models and lexical functional grammar
to explain learner output. This period also saw the beginning of
approaches based in other disciplines, such as the psychological
approach of connectionism.
In the 2000s research was focused on much the same areas as in
the 1990s, with research split into two main camps of linguistic and
psychological approaches. VanPatten
and Benati do not see this state of affairs as changing in the near
future, pointing to the support both areas of research have in the wider
fields of linguistics and psychology, respectively.
Universal grammar
From the field of linguistics, the most influential theory by far has been Chomsky's theory of Universal Grammar (UG). The core of this theory lies in the existence of an innate universal grammar, grounded in the poverty of the stimulus. The UG model of principles, basic properties which all languages
share, and parameters, properties which can vary between languages, has
been the basis for much second-language research.
From a UG perspective, learning the grammar of a second language is simply a matter of setting the correct parameters. Take the pro-drop parameter, which dictates whether or not sentences must have a subject in order to be grammatically correct. This parameter can have two values: positive, in which case sentences do not necessarily need a subject, and negative, in which case subjects must be present. In German the sentence "Er spricht" (he speaks) is grammatical, but the sentence "Spricht" (speaks) is ungrammatical. In Italian, however, the sentence "Parla" (speaks) is perfectly normal and grammatically correct.
A German speaker learning Italian would only need to deduce that
subjects are optional from the language he hears, and then set his pro-drop
parameter for Italian accordingly. Once he has set all the parameters
in the language correctly, then from a UG perspective he can be said to
have learned Italian, i.e. he will always produce perfectly correct
Italian sentences.
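The parameter-setting picture can be sketched as a trigger-based learner: a binary switch flipped by evidence in the input. The sentence representations and the trigger rule below are invented for illustration, not taken from any specific acquisition model:

```python
# Toy pro-drop parameter learner: the parameter starts at the negative
# setting (overt subjects obligatory) and flips to positive the first
# time the learner hears a grammatical sentence with no overt subject.
def set_pro_drop(sentences):
    pro_drop = False  # negative setting: subjects required
    for s in sentences:
        if s["subject"] is None:  # trigger: a subjectless sentence
            pro_drop = True
    return pro_drop

# German-like input: every sentence has an overt subject ("Er spricht").
german = [{"subject": "er", "verb": "spricht"}]
# Italian-like input: the subject may be dropped ("Parla").
italian = [{"subject": "lei", "verb": "parla"},
           {"subject": None, "verb": "parla"}]

print(set_pro_drop(german))   # False: subjects required
print(set_pro_drop(italian))  # True: pro-drop licensed
```

Note the asymmetry the sketch captures: a single positive example (a subjectless sentence) suffices to flip the switch, which is why no negative evidence is needed on this account.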
Universal Grammar also provides a succinct explanation for much
of the phenomenon of language transfer. Spanish learners of English who
make the mistake "Is raining" instead of "It is raining" have not yet
set their pro-drop parameters correctly and are still using the same setting as in Spanish.
The main shortcoming of Universal Grammar in describing
second-language acquisition is that it does not deal at all with the
psychological processes involved with learning a language. UG
scholarship is only concerned with whether parameters are set or not,
not with how they are set. Schachter (1988) is a useful critique of research testing the role of Universal Grammar in second language acquisition.
Input hypothesis
Learners' most direct source of information
about the target language is the target language itself. When they come
into direct contact with the target language, this is referred to as
"input." When learners process that language in a way that can
contribute to learning, this is referred to as "intake". However, it
must be at a level that is comprehensible to them. In his monitor theory,
Krashen advanced the concept that language input should be at the "i+1"
level, just beyond what the learner can fully understand; this input is
comprehensible, but contains structures that are not yet fully
understood. This has been criticized on the basis that there is no clear
definition of i+1, and that factors other than structural difficulty
(such as interest or presentation) can affect whether input is actually
turned into intake. The concept has been quantified, however, in
vocabulary acquisition research; Nation reviews various studies which
indicate that about 98% of the words in running text should be
previously known in order for extensive reading to be effective.
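Nation's 98% figure is a simple lexical-coverage ratio: the proportion of running-word tokens the reader already knows. A minimal sketch (the sample text and known-word list are invented):

```python
def lexical_coverage(text, known_words):
    """Fraction of tokens in running text that the reader already knows."""
    tokens = text.lower().split()
    known = sum(1 for t in tokens if t in known_words)
    return known / len(tokens)

text = "the cat sat on the mat near the old barn"
known = {"the", "cat", "sat", "on", "mat", "near", "old"}
cov = lexical_coverage(text, known)
print(f"{cov:.0%}")  # 90% here (9 of 10 tokens); Nation's threshold is ~98%
```

Because the ratio is computed over tokens rather than distinct words, frequent function words like "the" count every time they occur, which is why even a small known vocabulary can yield high coverage of running text.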
In his Input Hypothesis, Krashen proposes that language
acquisition takes place only when learners receive input just beyond
their current level of L2 competence. He termed this level of input
“i+1.” However, in contrast to emergentist and connectionist theories,
he follows the innate approach by applying Chomsky's Government and binding theory and concept of Universal grammar
(UG) to second-language acquisition. He does so by proposing a Language
Acquisition Device that uses L2 input to define the parameters of the
L2, within the constraints of UG, and to increase the L2 proficiency of
the learner. In addition, Krashen's (1982) Affective Filter Hypothesis
holds that the acquisition of a second language is halted if the learner
has a high degree of anxiety when receiving input. According to this
concept, a part of the mind filters out L2 input and prevents intake by
the learner, if the learner feels that the process of SLA is
threatening. As mentioned earlier, since input is essential in Krashen’s
model, this filtering action prevents acquisition from progressing.
A great deal of research has taken place on input enhancement,
the ways in which input may be altered so as to direct learners'
attention to linguistically important areas. Input enhancement might
include bold-faced vocabulary words or marginal glosses in a reading text. Research here is closely linked to research on pedagogical effects, and comparably diverse.
Monitor model
Other concepts have also been influential in the speculation about
the processes of building internal systems of second-language
information. Some thinkers hold that language processing handles
distinct types of knowledge. For instance, one component of the Monitor
Model, propounded by Krashen, posits a distinction between “acquisition”
and “learning.”
According to Krashen, L2 acquisition is a subconscious process of
incidentally “picking up” a language, as children do when becoming
proficient in their first languages. Language learning, on the other
hand, is studying, consciously and intentionally, the features of a
language, as is common in traditional classrooms. Krashen sees these two
processes as fundamentally different, with little or no interface
between them. In common with connectionism, Krashen sees input as
essential to language acquisition.
Further, Bialystok and Smith make another distinction in
explaining how learners build and use L2 and interlanguage knowledge
structures.
They argue that the concept of interlanguage should include a
distinction between two specific kinds of language processing ability.
On one hand is learners’ knowledge of L2 grammatical structure and
ability to analyze the target language objectively using that knowledge,
which they term “representation,” and, on the other hand is the ability
to use their L2 linguistic knowledge, under time constraints, to
accurately comprehend input and produce output in the L2, which they
call “control.” They point out that often non-native speakers of a
language have higher levels of representation than their native-speaking
counterparts have, yet have a lower level of control. Finally,
Bialystok has framed the acquisition of language in terms of the
interaction between what she calls “analysis” and “control.”
Analysis is what learners do when they attempt to understand the rules
of the target language. Through this process, they acquire these rules
and can use them to gain greater control over their own production.
Monitoring is another important concept in some theoretical
models of learner use of L2 knowledge. According to Krashen, the Monitor
is a component of an L2 learner's language processing device that uses
knowledge gained from language learning to observe and regulate the
learner's own L2 production, checking for accuracy and adjusting
language production when necessary.
Interaction hypothesis
Long's interaction hypothesis proposes that language acquisition is strongly facilitated by the use of the target language in interaction. Similarly to Krashen's Input Hypothesis, the Interaction Hypothesis claims that comprehensible input
is important for language learning. In addition, it claims that the
effectiveness of comprehensible input is greatly increased when learners
have to negotiate for meaning.
Interactions often result in learners receiving negative evidence.
That is, if learners say something that their interlocutors do not
understand, after negotiation the interlocutors may model the correct
language form. In doing this, learners can receive feedback on their production and on grammar that they have not yet mastered. The process of interaction may also result in learners receiving more input from their interlocutors than they would otherwise. Furthermore, if learners stop to clarify things that they do not understand, they may have more time to process the input they receive. This can lead to better understanding and possibly the acquisition of new language forms. Finally, interactions may serve as a way of focusing learners' attention on a difference between their knowledge of the target language
and the reality of what they are hearing; it may also focus their
attention on a part of the target language of which they are not yet
aware.
Output hypothesis
In the 1980s, Canadian SLA researcher Merrill Swain
advanced the output hypothesis, that meaningful output is as necessary
to language learning as meaningful input. However, most studies have
shown little if any correlation between learning and quantity of output.
Today, most scholars
contend that small amounts of meaningful output are important to
language learning, but primarily because the experience of producing
language leads to more effective processing of input.
Critical Period Hypothesis
In
1967, Eric Lenneberg argued for the existence of a critical period
(approximately 2–13 years old) for the acquisition of a first language.
This has attracted much attention in the realm of second language
acquisition. For instance, Newport (1990) extended the
critical period hypothesis by pointing to the possibility that the age
at which a learner is first exposed to an L2 might also contribute to their second
language acquisition. Indeed, she found a correlation between age
of arrival and second language performance. In this regard, second
language learning might be affected by a learner's maturational state.
Competition model
Some of the major cognitive theories of how learners organize
language knowledge are based on analyses of how speakers of various
languages analyze sentences for meaning. MacWhinney, Bates, and Kliegl
found that speakers of English, German, and Italian showed varying
patterns in identifying the subjects of transitive sentences containing
more than one noun.
English speakers relied heavily on word order; German speakers used
morphological agreement, the animacy status of noun referents, and
stress; and speakers of Italian relied on agreement and stress.
MacWhinney et al. interpreted these results as supporting the
Competition Model, which states that individuals use linguistic cues to
get meaning from language, rather than relying on linguistic universals.
According to this theory, when acquiring an L2, learners sometimes
receive competing cues and must decide which cue(s) is most relevant for
determining meaning.
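The cue-competition idea can be sketched as a weighted vote over candidate nouns, with language-specific cue weights. The cues, weights, and example sentence below are invented for illustration:

```python
# Toy cue competition: each candidate noun carries cue activations, each
# language weights the cues differently, and the highest-scoring noun
# wins the subject role.
def pick_subject(candidates, cue_weights):
    """candidates: list of (noun, {cue: strength}) pairs."""
    def score(cues):
        return sum(cue_weights.get(cue, 0.0) * v for cue, v in cues.items())
    return max(candidates, key=lambda item: score(item[1]))[0]

# "The cows chases the dog": "cows" is preverbal, but the singular verb
# agrees with "dog", so the two cues compete.
candidates = [
    ("cows", {"preverbal": 1.0, "agreement": 0.0}),
    ("dog",  {"preverbal": 0.0, "agreement": 1.0}),
]

english = {"preverbal": 0.8, "agreement": 0.2}  # word order dominates
italian = {"preverbal": 0.2, "agreement": 0.8}  # agreement dominates

print(pick_subject(candidates, english))  # cows
print(pick_subject(candidates, italian))  # dog
```

The same input yields different interpretations under different weightings, which mirrors the cross-linguistic pattern MacWhinney, Bates, and Kliegl report; an L2 learner's task, on this view, is reweighting the cues.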
Connectionism and second-language acquisition
Connectionism
These findings also relate to Connectionism. Connectionism attempts to
model the cognitive language processing of the human brain, using
computer architectures that make associations between elements of
language, based on frequency of co-occurrence in the language input. Frequency has been found to be a factor in various linguistic domains of language learning.
Connectionism posits that learners form mental connections between
items that co-occur, using exemplars found in language input. From this
input, learners extract the rules of the language through cognitive
processes common to other areas of cognitive skill acquisition. Since
connectionism denies both innate rules and the existence of any innate
language-learning module, L2 input is of greater importance than it is
in processing models based on innate approaches, since, in
connectionism, input is the source of both the units and the rules of
language.
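Connectionism's core claim, that the units and rules of language emerge from co-occurrence frequencies in the input, can be caricatured with a simple bigram-association count (the mini-corpus is invented):

```python
from collections import Counter

# Count how often adjacent word pairs co-occur in the input; on a
# connectionist picture, high-frequency co-occurrence strengthens the
# association between the two items.
def bigram_associations(sentences):
    assoc = Counter()
    for sent in sentences:
        words = sent.split()
        for a, b in zip(words, words[1:]):
            assoc[(a, b)] += 1
    return assoc

corpus = ["the dog barked", "the dog slept", "a cat slept"]
assoc = bigram_associations(corpus)
# "the dog" is the strongest association in this input:
print(assoc.most_common(1))  # [(('the', 'dog'), 2)]
```

A real connectionist model learns graded weights over distributed representations rather than raw counts, but the sketch shows the key dependence on input frequency: the association exists only because the pair co-occurred, with no innate rule supplying it.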
Noticing hypothesis
Attention is another characteristic that some believe to have a role
in determining the success or failure of language processing. Richard Schmidt
states that although explicit metalinguistic knowledge of a language is
not always essential for acquisition, the learner must be aware of L2
input in order to gain from it.
In his “noticing hypothesis,” Schmidt posits that learners must notice
the ways in which their interlanguage structures differ from target
norms. This noticing of the gap allows the learner's internal language
processing to restructure the learner's internal representation of the
rules of the L2 in order to bring the learner's production closer to the
target. In this respect, Schmidt's understanding is consistent with the
ongoing process of rule formation found in emergentism and
connectionism.
Processability
Some theorists and researchers have contributed to the cognitive
approach to second-language acquisition by increasing understanding of
the ways L2 learners restructure their interlanguage knowledge systems
to be in greater conformity to L2 structures. Processability theory
states that learners restructure their L2 knowledge systems in an order
determined by what they are capable of at their stage of development.
For instance, in order to acquire the correct morphological and
syntactic forms for English questions, learners must transform
declarative English sentences. They do so through a series of stages,
consistent across learners. Clahsen proposed that certain processing
principles determine this order of restructuring.
Specifically, he stated that learners first maintain declarative word
order while changing other aspects of the utterances; second, move words
to the beginning and end of sentences; and third, move elements within
main clauses before subordinate clauses.
Automaticity
Thinkers
have produced several theories concerning how learners use their
internal L2 knowledge structures to comprehend L2 input and produce L2
output. One idea is that learners acquire proficiency in an L2 in the
same way that people acquire other complex cognitive skills.
Automaticity is the performance of a skill without conscious control. It
results from the gradual process of proceduralization. In the field of
cognitive psychology, Anderson expounds a model of skill acquisition,
according to which persons use procedures to apply their declarative
knowledge about a subject in order to solve problems.
With repeated practice, these procedures develop into production rules
that the individual can use to solve the problem, without accessing
long-term declarative memory. Performance speed and accuracy improve as
the learner implements these production rules. DeKeyser tested the
application of this model to L2 language automaticity.
He found that subjects developed increasing proficiency in performing
tasks related to the morphosyntax of an artificial language,
Autopractan, and performed on a learning curve typical of the
acquisition of non-language cognitive skills. This evidence conforms to
Anderson's general model of cognitive skill acquisition, supports the
idea that declarative knowledge can be transformed into procedural
knowledge, and tends to undermine Krashen's claim that knowledge gained through language “learning” cannot be used to initiate speech production.
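The "learning curve typical of the acquisition of non-language cognitive skills" mentioned above is usually modeled by the power law of practice, in which reaction times fall steeply with early practice and then level off. The following sketch illustrates that shape; the parameter values and trial counts are hypothetical, not data from DeKeyser's Autopractan study:

```python
# Illustrative power law of practice: RT = a + b * N**(-c), where N is
# the number of practice trials. All parameter values are hypothetical.

def power_law_rt(n_trials, a=400.0, b=1600.0, c=0.5):
    """Predicted reaction time (ms) after n_trials practice trials."""
    return a + b * n_trials ** (-c)

# Speed improves steeply at first, then levels off:
rts = [round(power_law_rt(n)) for n in (1, 4, 16, 64, 256)]
# rts == [2000, 1200, 800, 600, 500]: each gain is smaller than the last
```

The shrinking gains between successive quadruplings of practice are the signature of proceduralization: early practice compiles declarative knowledge into production rules, after which further speedup is marginal.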
Declarative/procedural model
Michael T. Ullman
has used a declarative/procedural model to understand how language
information is stored. This model is consistent with a distinction made
in general cognitive science between the storage and retrieval of facts,
on the one hand, and understanding of how to carry out operations, on
the other. It states that declarative knowledge consists of arbitrary
linguistic information, such as irregular verb forms, that is stored in
the brain's declarative memory. In contrast, knowledge about the rules of a language, such as grammatical word order, is procedural knowledge and is stored in procedural memory. Ullman reviews several psycholinguistic and neurolinguistic studies that support the declarative/procedural model.
Memory and second-language acquisition
Perhaps
certain psychological characteristics constrain language processing.
One area of research is the role of memory. Williams conducted a study
in which he found some positive correlation between verbatim memory
functioning and grammar learning success for his subjects.
This suggests that individuals with less short-term memory capacity
might be limited in performing the cognitive processes needed to
organize and use linguistic knowledge.
Semantic theory
For
the second-language learner, the acquisition of meaning is arguably the
most important task. Meaning is at the heart of a language, not the
exotic sounds or elegant sentence structure. There are several types of
meanings: lexical, grammatical, semantic, and pragmatic. All of these
types of meaning contribute to the acquisition of meaning and, together,
to the learner's integrated command of the second language:
Lexical meaning – meaning that is stored in our mental lexicon;
Grammatical meaning – meaning that comes into consideration when
calculating the meaning of a sentence; usually encoded in inflectional
morphology (e.g., -ed for the past simple, -’s for the possessive);
Semantic meaning – word meaning;
Pragmatic meaning – meaning that depends on context, requires
knowledge of the world to decipher; for example, when someone asks on
the phone, “Is Mike there?” he doesn’t want to know if Mike is
physically there; he wants to know if he can talk to Mike.
Sociocultural theory
The term sociocultural theory was coined by Wertsch in 1985; the theory derives from the work of Lev Vygotsky and the Vygotsky Circle
in Moscow from the 1920s onwards. Sociocultural theory is the notion
that human mental functioning arises from participation in culturally
mediated, socially organized activities.
The central thread of sociocultural theory focuses on diverse social,
historical, cultural, and political contexts where language learning
occurs and how learners negotiate or resist the diverse options that
surround them. More recently, in line with this sociocultural thread,
Larsen-Freeman (2011) created a triangular diagram showing the interplay
of four important concepts in language learning and education: (a) the
teacher, (b) the learner, (c) language or culture, and (d) context.
In this regard, what makes sociocultural theory different from other
theories is that it argues that second-language acquisition is not a
universal process. On the contrary, it views learners as active
participants who interact with others and with the culture of their
environment.
Complex Dynamic Systems Theory
Second-language acquisition has usually been investigated with traditional cross-sectional studies, in which a pre-test/post-test
method is typically used. In the 2000s, however, a novel angle emerged in the
field of second-language research. These studies mainly adopt a Dynamic systems theory perspective to analyse longitudinal time-series data. Scientists such as Larsen-Freeman, Verspoor, de Bot, Lowie, and van Geert claim that second-language acquisition is best captured by a longitudinal case-study research design rather than cross-sectional designs. In these studies, variability is seen as a key indicator of development, or self-organization in Dynamic systems parlance. The interconnectedness of the systems is usually analysed with moving correlations.