Discussion

Yes, we still need Universal Grammar

Jeffrey Lidz a,b,*, Lila R. Gleitman a,b

a Department of Linguistics, Northwestern University, 2016 Sheridan Road, Evanston, IL 60208, USA
b Department of Psychology, University of Pennsylvania, Philadelphia, PA 19104, USA

Received 16 March 2004; accepted 16 March 2004

Cognition 94 (2004) 85–93
www.elsevier.com/locate/COGNIT
doi:10.1016/j.cognition.2004.03.004

Abstract

In a recent paper [Lidz, J., Gleitman, H., & Gleitman, L. (2003). Understanding how input matters: Verb learning and the footprint of universal grammar. Cognition, 87, 151–178], we provided cross-linguistic evidence in favor of the following linked assertions: (i) verb argument structure is a correlate of verb meaning; (ii) however, argument structure is not directly available to learners as a cue for reconstructing verb meaning, owing to the complexity of form-meaning mappings within and across languages; (iii) a major correlate of argument structure, namely, noun phrase number, is statistically available on the surface in all languages, and serves as a quasi-universal derivative cue to the meanings of verbs; (iv) this cue is privileged, in the sense that it is used early and selectively by learners, despite within- and cross-language differences in its availability. Goldberg [Goldberg, A. (2004). But do we need Universal Grammar? Comment on Lidz, Gleitman and Gleitman 2003. Cognition] suggests that this cue is not linguistic, that it is too sicklied o’er with exceptions and provisos to be useful to learners, and that conversational conspiracies can in any case serve as the alternative theoretical framework for a theory of predicate learning, and language acquisition more generally. In the present reply, we review and further explicate our original position, to wit: a large part of any generative grammar is a formal statement of the complex alignments between predicate-argument structures and the surface forms (linear strings of words) of sentences. Because the several rules for alignment interact, the surface outcomes reveal individual systematicities only abstractly. Therefore, learning would be impossible if infants could not analyze probabilistically available patterns to recover their principled linguistic sources. This statistics-based discovery procedure is in certain relevant regards specific to language learning. Finally, we argue that while pragmatics and theory-of-mind properties in learner and tutor necessarily frame language acquisition, these have not been shown, and probably cannot be shown, to be sufficient to this computational problem.

© 2004 Elsevier B.V. All rights reserved.

Keywords: Universal grammar; Argument structure; Statistical learning

* Corresponding author. Department of Linguistics, Northwestern University, 2016 Sheridan Road, Evanston, IL 60208, USA.
E-mail address: [email protected] (J. Lidz).

Since the inception of cognitive science, the mystery of language acquisition has served as the central criterion of explanation for a theory of language itself. A linguistic theory must ultimately explain how language acquisition is possible given the level of abstraction that is involved in getting from utterances to grammatical representations; that is, in acquiring the system that underlies form-meaning pairing for any possible sentence of the language (“the grammar”).
By “abstraction” in the present context, we are referring to the recognition and analysis of the complex and variable ways that propositional content lines up with linguistic expression within and across languages. Not to put too fine a point on it, languages are in the business of expressing, in their basic sentences, who did what to whom. Children must discover the lawful machinery by which the language spoken to them accomplishes this. Looked at from the outside, this task seems daunting just because languages do not express the components of a proposition simply or uniformly.

One of the easiest ways to see the complexity in the mapping of language forms to their meanings in this regard is to look at Table 1 of the article accompanying the present one, in which Adele Goldberg takes the present authors to task for unduly mystifying the problem of language learning (Goldberg, 2004). We see immediately in this useful sample of English-language constructions that a speaker can address a listener without making that addressee-reference (the “who” of the matter) explicit at all (“Shut the door!”) and that one can in many instances implicate an event participant without ever explicitly introducing him or her linguistically (“The tiger killed again,” a proposition which implicates some unknown prey). Goldberg’s list of such mapping complexities can be extended just about forever. Still sticking to English, one can speak of some individual’s activity, let us say “eating,” without ever so saying (“John ate and Bill did too,” which seems to omit mention of Bill as an eater), and so forth. There are even lawfully unlawful ways of speaking built into the idiomatic structure of a language (“John kicked the bucket”).¹

These complexities are compounded many times over by the diversity of natural language syntax. Though people speaking various languages evidently express many of the very same propositional thoughts, they do so by use of grammars whose surface properties vary massively. How then can children align the forms with the meanings? This learning problem has led syntacticians working in the tradition of generative grammar to posit abstract categories and mechanisms that are far removed from the surface form of a sentence. At this level of abstraction, languages begin to look more similar (Baker, 1996, 2001; Chomsky, 1965, 1975, 1986).² If only the child learner had access to properties at this level of abstraction, understanding the world-wide, species-wide success of language learning would be easier. By hypothesis, these properties would act as a kind of filter on the input. The learner would thus be constrained to

¹ See Jackendoff (1997), McGinnis (2002), and Nunberg, Sag, and Wasow (1994) for discussion of syntactic and semantic regularities found even in idiomatic expressions.

² Fashions certainly change in the grammatical formalisms designed by linguists to bring the forms into lawful alignment with the meanings, but despite many recent rumors to the contrary, the complexity of these mappings is never removed, only shifted,
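As a rough illustration of the kind of surface statistic at issue in the abstract, one can imagine tabulating how many overt noun phrases accompany each verb across a sample of utterances, and reading the modal count as a cue to that verb's argument structure. The sketch below is a minimal toy example with invented verbs and counts; it is not the procedure, corpus, or model of Lidz, Gleitman, and Gleitman (2003), only an illustration of how such a surface tally might look.

```python
# Toy illustration (not from the article): tabulate overt noun-phrase counts
# per verb from a small, invented sample of parsed utterances, and report the
# modal count as a rough surface cue to each verb's argument structure.
from collections import defaultdict

# Hypothetical (verb, number of overt noun phrases) pairs.
sample = [
    ("kiss", 2), ("kiss", 2), ("kiss", 1),   # mostly two NPs: transitive-like
    ("fall", 1), ("fall", 1), ("fall", 2),   # mostly one NP: intransitive-like
    ("give", 3), ("give", 2), ("give", 3),   # often three NPs: transfer-like
]

# counts[verb][n_nps] = number of utterances with that many overt NPs.
counts = defaultdict(lambda: defaultdict(int))
for verb, n_nps in sample:
    counts[verb][n_nps] += 1

for verb, dist in counts.items():
    total = sum(dist.values())
    modal_n = max(dist, key=dist.get)  # most frequent NP count for this verb
    share = dist[modal_n] / total
    print(f"{verb}: modal NP count = {modal_n} ({share:.0%} of utterances)")
```

The point of the tally is only that noun phrase number is observable on the surface and distributes differently across verb classes; whether and how a learner exploits such a distribution is exactly what the article goes on to argue.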

