Tool Module: Chomsky's Universal Grammar

During the first half of the 20th century, linguists who theorized about the human ability to speak
did so from the behaviourist perspective that prevailed at that time. They therefore held that
language learning, like any other kind of learning, could be explained by a succession of trials,
errors, and rewards for success. In other words, children learned their mother tongue by simple
imitation, listening to and repeating what adults said.
This view was radically questioned, however, by the American linguist Noam Chomsky. For Chomsky, acquiring language cannot be reduced to simply developing an inventory of responses to stimuli, because every sentence that anyone produces can be a totally new combination of words. When we speak, we combine a finite number of elements (the words of our language) to create an infinite number of larger structures (sentences). Moreover, language is governed by a large number of rules and principles, particularly those of syntax, which determine the order of words in sentences. The term "generative grammar" refers to the set of rules that enables us to understand sentences but of which we are usually totally unaware. It is because of generative grammar that everyone says "that's how you say it" rather than "how that's you it say", or that the words "Bob" and "him" cannot mean the same person in the sentence "Bob loves him." but can do so in "Bob knows that his father loves him." (Note in passing that generative grammar has nothing to do with grammar textbooks, whose purpose is simply to explain what is grammatically correct and incorrect in a given language.)
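To make this idea concrete, here is a small illustrative sketch (it is not part of the original module; the rules, vocabulary, and function names are invented for the example) showing how a finite set of generative rules can combine a finite vocabulary into an unbounded variety of novel sentences.

import random

# A toy set of generative rules (purely illustrative; real generative
# grammars are far richer than this).
GRAMMAR = {
    "S":   [["NP", "VP"]],                      # a sentence is a noun phrase plus a verb phrase
    "NP":  [["Det", "N"], ["Det", "N", "PP"]],  # a noun phrase may contain a prepositional phrase...
    "VP":  [["V", "NP"]],
    "PP":  [["P", "NP"]],                       # ...which itself contains a noun phrase (recursion)
    "Det": [["the"], ["a"]],
    "N":   [["child"], ["sentence"], ["language"]],
    "V":   [["hears"], ["produces"]],
    "P":   [["about"], ["near"]],
}

def generate(symbol="S"):
    """Expand a symbol by picking one of its rules at random; words expand to themselves."""
    if symbol not in GRAMMAR:
        return [symbol]
    words = []
    for part in random.choice(GRAMMAR[symbol]):
        words.extend(generate(part))
    return words

# A handful of rules, yet the loop NP -> Det N PP -> ... P NP allows
# sentences of unbounded length that the program has never "heard" before.
for _ in range(3):
    print(" ".join(generate()))

Because one rule can reintroduce another noun phrase inside a prepositional phrase, the same finite rule set yields arbitrarily long, novel sentences, which is precisely the contrast with learning by imitation of a fixed stock of responses.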
Even before the age of 5, children can, without having had
any formal instruction, consistently produce and interpret
sentences that they have never encountered before. It is
this extraordinary ability to use language despite having
had only very partial exposure to the allowable syntactic
variants that led Chomsky to formulate his "poverty of the stimulus" argument, which was the foundation for the new
approach that he proposed in the early 1960s.


In Chomsky's view, the reason that children so easily master the complex operations of language is that they have innate knowledge of certain principles that guide them in developing the grammar of their language. In other words, Chomsky's theory is that language learning is facilitated by a predisposition that our brains have for certain structures of language.
But what language? For Chomsky's theory to hold true, all of the languages in the world must share certain structural properties. And indeed, Chomsky and other generative linguists have shown that the 5000 to 6000 languages in the world, despite their very different grammars, do share a set of syntactic rules and principles. These linguists believe that this universal grammar is innate and is embedded somewhere in the neuronal circuitry of the human brain. That would explain why children can select, from all the sentences that come to their minds, only those that conform to a deep structure encoded in the brain's circuits.
Universal grammar
Universal grammar, then, consists of a set of unconscious constraints that let us decide whether a
sentence is correctly formed. This mental grammar is not necessarily the same for all languages.
But according to Chomskyian theorists, the process by which, in any given language, certain
sentences are perceived as correct while others are not, is universal and independent of meaning.
Thus, we immediately perceive that the sentence "Robert book reads the" is not correct English, even though we have a pretty good idea of what it means. Conversely, we recognize that a sentence such as "Colorless green ideas sleep furiously" is grammatically correct English, even though it is nonsense.
A pair of dice offers a useful metaphor to explain what Chomsky means when he refers to
universal grammar as a set of constraints. Before we throw the pair of dice, we know that the
result will be a number from 2 to 12, but nobody would take a bet on its being 3.143. Similarly, a
newborn baby has the potential to speak any of a number of languages, depending on what
country it is born in, but it will not just speak them any way it likes: it will adopt certain
preferred, innate structures. One way to describe these structures would be that they are not
things that babies and children learn, but rather things that happen to them. Just as babies
naturally develop arms and not wings while they are still in the womb, once they are born they
naturally learn to speak, and not to chirp or neigh.

Observations that support the Chomskyian view of language
Until Chomsky propounded his theory of universal grammar in the 1960s, the empiricist school
that had dominated thinking about language since the Enlightenment held that when children
came into the world, their minds were like a blank slate. Chomsky's theory had the impact of a
large rock thrown into this previously tranquil, undisturbed pond of empiricism.
Subsequent research in the cognitive sciences, which combined the tools of psychology,
linguistics, computer science, and philosophy, soon lent further support to the theory of universal
grammar. For example, researchers found that babies only a few days old could distinguish the
phonemes of any language and seemed to have an innate mechanism for processing the sounds
of the human voice.
Thus, from birth, children would appear to have certain linguistic abilities that predispose them
not only to acquire a complex language, but even to create one from whole cloth if the situation
requires. One example of such a situation dates back to the time of plantations and slavery. On
many plantations, the slaves came from many different places and so had different mother
tongues. They therefore developed what are known as pidgin languages to communicate with
one another. Pidgin languages are not languages in the true sense, because they employ words so chaotically: there is tremendous variation in word order, and very little grammar. But these slaves' children, though exposed to these pidgins at the age when children normally acquire their
first language, were not content to merely imitate them. Instead, the children spontaneously
introduced grammatical complexity into their speech, thus in the space of one generation creating
new languages, known as creoles.
Chomsky and the evolution of language
Many authors, adopting the approach of evolutionary psychology, believe that language has been
shaped by natural selection. In their view, certain random genetic mutations were thus selected
over many thousands of years to provide certain individuals with a decisive adaptive advantage.
Whether the advantage that language provided was in co-ordinating hunting parties, warning of
danger, or communicating with sexual partners remains uncertain, however.
Chomsky, for his part, does not see our linguistic faculties as having originated from any
particular selective pressure, but rather as a sort of fortuitous accident. He bases this view,
among other things, on studies which found that recursivity (the ability to embed one clause inside another, as in "the person who was singing yesterday had a lovely voice") might be the only specifically human component of language. According to the authors of these studies, recursivity originally developed not to help us communicate, but rather to help us solve other problems connected, for example, with numerical quantification or social relations, and humans did not become capable of complex language until recursivity was linked with the other motor and perceptual abilities needed for this purpose. (Thus recursivity would meet the definition of a "spandrel" offered by Stephen Jay Gould.) According to Chomsky and his colleagues, there is
nothing to indicate that this linkage was achieved through natural selection. They believe that it
might simply be the result of some other kind of neuronal reorganization.
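As a purely illustrative sketch (not from the original text; the wording and function name are invented), a single self-referential rule is enough to embed one clause inside another to any depth:

def embed(noun_phrase, depth):
    """Nest a relative clause inside a noun phrase `depth` times."""
    if depth == 0:
        return noun_phrase
    return embed(noun_phrase + " who knew the person", depth - 1)

# Prints:
# "the person had a lovely voice"
# "the person who knew the person had a lovely voice"
# "the person who knew the person who knew the person had a lovely voice"
for d in range(3):
    print(embed("the person", d) + " had a lovely voice")

The point of the sketch is only that recursion lets one finite rule generate structures of unbounded depth; nothing in it depends on the rule having evolved for communication.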

The minimalist program
In the 1990s, Chomsky's research focused on what he called the "minimalist program", which attempted to demonstrate that the brain's language faculties are the minimum faculties that could
be expected, given certain external conditions that are imposed on us independently. In other
words, Chomsky began to place less emphasis on something such as a universal grammar
embedded in the human brain, and more emphasis on a large number of plastic cerebral circuits.
And along with this plasticity would come an infinite number of concepts. The brain would then
proceed to associate sounds and concepts, and the rules of grammar that we observe would in
fact be only the consequences, or side effects, of the way that language works. Analogously, we
can, for example, use rules to describe the way a muscle operates, but these rules do nothing but
explain what happens in the muscle; they do not explain the mechanisms that the brain uses to
generate these rules.
Criticisms of Chomsky's theories
Chomsky thus continues to believe that language is pre-organized in some way or other within
the neuronal structure of the human brain, and that the environment only shapes the contours of
this network into a particular language. His approach thus remains radically opposed to that of
Skinner or Piaget, for whom language is constructed solely through simple interaction with the
environment. This latter, behaviourist model, in which the acquisition of language is nothing but
a by-product of general cognitive development based on sensorimotor interaction with the world,
would appear to have been abandoned as the result of Chomsky's theories.
Since Chomsky first advanced these theories, however, evolutionary biologists have undermined
them with the proposition that it may be only the brain's general abilities that are pre-
organized. These biologists believe that to try to understand language, we must approach it not
from the standpoint of syntax, but rather from that of evolution and the biological structures that
have resulted from it. According to Philip Lieberman, for example, language is not an instinct
encoded in the cortical networks of a "language organ", but rather a learned skill based on a
functional language system distributed across numerous cortical and subcortical structures.
Though Lieberman does recognize that human language is by far the most sophisticated form of
animal communication, he does not believe that it is a qualitatively different form, as Chomsky
claims. Lieberman sees no need to posit a quantum leap in evolution or a specific area of the
brain that would have been the seat of this innovation. On the contrary, he says that language can
be described as a neurological system composed of several separate functional abilities.
For Lieberman and other authors, such as Terrence Deacon, it is the neural circuits of this
system, and not some "language organ", that constitute a genetically predetermined set that
limits the possible characteristics of a language. In other words, these authors believe that our
ancestors invented modes of communication that were compatible with the brains natural
abilities. And the constraints inherent in these natural abilities would then have manifested
themselves in the universal structures of language.
Another approach that offers an alternative to Chomsky's universal grammar is generative
semantics, developed by linguist George Lakoff of the University of California at Berkeley. In
contrast to Chomsky, for whom syntax is independent of such things as meaning, context,
knowledge, and memory, Lakoff shows that semantics, context, and other factors can come into
play in the rules that govern syntax. In addition, metaphor, which earlier authors saw as a simple
linguistic device, becomes for Lakoff a conceptual construct that is essential and central to the
development of thought.
Lastly, even among those authors who embrace Chomsky's universal grammar, there are various
conflicting positions, in particular about how this universal grammar may have emerged. Steven
Pinker, for instance, takes an adaptationist position that departs considerably from the exaptation
thesis proposed by Chomsky.