Language without Grammar
to appear in the Handbook of Cognitive Linguistics and Second Language Acquisition, edited by Nick Ellis & Peter Robinson, published by Erlbaum
University of Hawaii
Sentences have systematic properties. Subjects occur in a structurally
higher position than direct objects. Only some word orders are acceptable. Verbs
agree with certain nominals, but not others. Relative clauses are formed in
particular ways. Reflexive pronouns have a narrowly circumscribed set of
possible antecedents. And so forth.
The factors to which emergentists turn for their explanations vary
considerably, ranging from features of physiology and perception, to processing
and working memory, to pragmatics and social interaction, to properties of the
input and of the learning mechanisms.
This sort of approach offers a way to think about language without
grammar. What it basically says is that language and languages are the way they
are because of what happens when words with particular properties are assembled
in real time in the course of actual speech and comprehension. A preliminary
illustration of how this might work involves the design of sentence structure.
Responsibility for the actual mechanics of sentence formation falls to a
computational system, which operates on words and morphemes drawn from the
lexicon, combining them in particular ways to construct phrases and sentences.
The computational system corresponds roughly to what one might think of as syntax.
The particular computational system that I propose is indistinguishable in
its structure and functioning from a processor. It operates in a linear manner, it
combines elements, and it checks to make sure that lexical requirements are being
satisfied. However, unlike classic processors, it is entirely unconstrained by
grammatical principles, obeying a single efficiency-related imperative that is
independent of language—it must minimize the burden on working memory, the
pool of resources that supports operations on representations (e.g., Carpenter,
Miyake, and Just, 1994; Robinson, 2002).
As I see them, ‘syntactic structures’ are nothing but a fleeting residual
record of how the computational system goes about combining words.
A metaphor may help clarify this point. Traditional UG-based approaches
to language focus on the ARCHITECTURE of sentences, positing principles that lay
down an intricate innate grammatical blueprint for language. As I see it, though,
there are no architects. There are just carpenters, who design as they build, limited
only by the material available to them (words with particular properties) and by
the need to complete their work as quickly and as efficiently as possible.
On this view then, there is no grammar per se. There is a lexicon that
includes an inventory of words and information about the particular arguments
that they require. And there is a computational system, which is just a processor
that combines words one at a time in a linear fashion. The processor is driven by
efficiency considerations that are designed to ease the burden on working
memory, but it has no special properties beyond this.
This idea runs against long-standing views within linguistics, calling into
question one of the few points on which there is a (near) consensus—the
existence of grammar. This cannot be taken lightly. After all, grammar—and
especially Universal Grammar—offers powerful explanations for a wide and
varied range of problems that arise in the study of syntax, typology, acquisition,
and other areas central to the field. The remainder of this paper is devoted to a
consideration of these matters.