Berkeley COMPSCI 182 - The Neural Basis of Thought and Language


The Neural Basis of Thought and Language
Week 14

Administrivia
• Final exam review session Monday, 7–9pm in 310 Soda
• Final in class next Thursday, May 8th
  – Be there on time!
  – Format: closed books, closed notes; short answers, no blue books
• Final paper due on bSpace on Monday, May 11

Questions
1. How does the analyzer use the constructions to parse a sentence?
2. How can we learn new ECG constructions?
3. What are ways to re-organize and consolidate the current grammar?
4. What metric is used to determine when to form a new construction?

Analyzing "You Throw The Ball"
Lexical constructions pair FORM (sound) with MEANING (stuff):
• "you" → schema Addressee, subcase of Human
• "throw" → schema Throw, with roles: thrower, throwee
• "ball" → schema Ball, subcase of Object
• "block" → schema Block, subcase of Object
• "the"
The Thrower-Throw-Object construction imposes ordering constraints (t1 before t2, t2 before t3) and bindings (t2.thrower ↔ t1, t2.throwee ↔ t3), yielding the SemSpec: Addressee, Throw (thrower, throwee), Ball. Do not forget the SemSpec!

Analyzing in ECG
create a recognizer for each construction in the grammar
for each level j (in ascending order)
    repeat
        for each recognizer r in j
            for each position p of utterance
                initiate r starting at p
    until we don't find anything new

Recognizer for the Transitive-Cn
• an example of a level-1 construction is Red-Ball-Cn
• each recognizer looks for its constituents in order (the ordering constraints on the constituents can be a partial ordering)
• e.g. for "I get cup": the lexical constructions are level 0, the agt/v/obj constituents are found at level 1, and the Transitive-Cn completes at level 2

Learning-Analysis Cycle (Chang, 2004)
1. Learner passes input (Utterance + Situation) and current grammar to Analyzer.
2. Analyzer produces SemSpec and Constructional Analysis.
3. Learner updates grammar:
   a. Hypothesize new map.
   b. Reorganize grammar (merge or compose).
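The level-wise analysis loop above can be sketched as a small fixpoint computation over a chart of matched spans. This is a minimal illustration, not Chang's implementation: the `Construction`/`Recognizer` classes, the chart format, and the toy "I get cup" grammar (flattened here to two levels) are all assumptions.

```python
# Sketch of the ECG analyzer's level-wise recognition loop.
# Data structures are illustrative stand-ins for the real analyzer.

class Construction:
    def __init__(self, name, level, constituents):
        self.name = name                  # e.g. "Transitive-Cn"
        self.level = level                # 0 = lexical
        self.constituents = constituents  # ordered constituent names (the word itself at level 0)

class Recognizer:
    """Looks for its construction's constituents, in order, in the chart."""
    def __init__(self, cxn):
        self.cxn = cxn

    def initiate(self, words, p, chart):
        if self.cxn.level == 0:
            # lexical recognizer: match the word form directly
            if p < len(words) and words[p] == self.cxn.constituents[0]:
                yield (self.cxn.name, p, p + 1)
            return
        # higher level: chain adjacent constituent matches starting at p
        pos = p
        for want in self.cxn.constituents:
            hit = next(((n, s, e) for (n, s, e) in chart
                        if n == want and s == pos), None)
            if hit is None:
                return
            pos = hit[2]
        yield (self.cxn.name, p, pos)

def analyze(grammar, words):
    # one recognizer per construction, grouped by level
    levels = {}
    for cxn in grammar:
        levels.setdefault(cxn.level, []).append(Recognizer(cxn))
    chart = set()  # matched (construction, start, end) spans
    for level in sorted(levels):            # for each level, ascending
        changed = True
        while changed:                      # repeat ...
            changed = False
            for r in levels[level]:
                for p in range(len(words)):
                    for match in r.initiate(words, p, chart):
                        if match not in chart:
                            chart.add(match)
                            changed = True  # ... until nothing new is found
    return chart

grammar = [
    Construction("I-Cn", 0, ["I"]),
    Construction("Get-Cn", 0, ["get"]),
    Construction("Cup-Cn", 0, ["cup"]),
    Construction("Transitive-Cn", 1, ["I-Cn", "Get-Cn", "Cup-Cn"]),
]
chart = analyze(grammar, ["I", "get", "cup"])
# the chart contains the three lexical spans plus ("Transitive-Cn", 0, 3)
```

Each pass over a level only adds spans, so the inner loop reaches a fixpoint; higher-level recognizers then build on the completed lower-level chart.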
   c. Reinforce (based on usage).

Learning-Analysis Cycle (Chang, 2004)
• Comprehension: Analyze (Utterance, Situation) into a (possibly partial) analysis using the current Constructions.
• Acquisition: Hypothesize new constructions and Reorganize existing ones.
• Production: Generate an Utterance from (Comm. Intent, Situation).
Together these form usage-based language learning.

Basic Learning Idea
• The learner's current grammar produces a certain analysis for an input sentence.
• The context contains richer information (e.g. bindings) that is unaccounted for in the analysis.
• Find a way to account for these meaning relations (by looking for corresponding form relations).

Initial Single-Word Stage
Lexical constructions pair FORM (sound) with MEANING (stuff):
• "you" → schema Addressee, subcase of Human
• "throw" → schema Throw, with roles: thrower, throwee
• "ball" → schema Ball, subcase of Object
• "block" → schema Block, subcase of Object

New Data: "You Throw The Ball"
The SITUATION supplies more than the single-word grammar accounts for: it contains Self, Addressee, and a Throw event whose thrower is the Addressee and whose throwee is the Ball. The form supplies the relation throw before ball, suggesting a throw-ball map.

Relational Mapping Scenarios
Pair a form relation between Af and Bf with a meaning relation between Am and Bm:
• direct role-filler: "throw ball" → Throw.throwee ↔ ball
• mediated by an intermediate X: "put ball down" → put.mover ↔ ball, down.tr ↔ ball
• mediated by an unexpressed schema Y: "Nomi ball" → possession.possessor ↔ Nomi, possession.possessed ↔ ball
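The mapping step above — pairing an observed form relation with a context binding that the current analysis leaves unaccounted for — can be sketched roughly as follows. All data structures (triples for form relations and role-filler bindings, and the capitalization-based correspondence check) are illustrative assumptions, not the model's actual representation:

```python
# Sketch of relational mapping: find context bindings the SemSpec does not
# account for, and pair each with a corresponding form relation to
# hypothesize a new construction. Representations are illustrative only.

def hypothesize_maps(form_relations, context_bindings, semspec_bindings):
    """Return candidate (form relation, meaning relation) pairs."""
    unaccounted = [b for b in context_bindings if b not in semspec_bindings]
    candidates = []
    for (w1, rel, w2) in form_relations:            # e.g. ("throw", "before", "ball")
        for (schema, role, filler) in unaccounted:  # e.g. ("Throw", "throwee", "Ball")
            # crude correspondence check: each word's meaning appears in the binding
            if w1.capitalize() == schema and w2.capitalize() == filler:
                candidates.append(((w1, rel, w2), (schema, role, filler)))
    return candidates

cands = hypothesize_maps(
    form_relations=[("throw", "before", "ball")],
    context_bindings=[("Throw", "throwee", "Ball")],
    semspec_bindings=[],  # the single-word grammar accounts for no relations yet
)
# cands pairs the form relation with the unaccounted meaning relation,
# i.e. the seed of a throw-ball construction
```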
Merging Similar Constructions
• "throw the block": throw before block; Throw.throwee = Block
• "throw the ball": throw before ball; Throw.throwee = Ball
• "throw-ing the ball": throw before -ing; Throw.aspect = ongoing
Merging the first two yields the generalization throwf before Objectf with THROWm.throwee = OBJECTm, i.e. the THROW-OBJECT construction.

Resulting Construction
construction THROW-OBJECT
  constructional
    constituents
      t : THROW
      o : OBJECT
  form
    tf before of
  meaning
    tm.throwee ↔ om

Composing Co-occurring Constructions
• "ball off": ball before off; Motion m, m.mover = Ball, m.path = Off
• "throw the ball": throw before ball; Throw.throwee = Ball
Composing the two yields throw before ball, ball before off with both sets of bindings, i.e. the THROW-BALL-OFF construction.

Resulting Construction
construction THROW-BALL-OFF
  constructional
    constituents
      t : THROW
      b : BALL
      o : OFF
  form
    tf before bf
    bf before of
  meaning
    evokes MOTION as m
    tm.throwee ↔ bm
    m.mover ↔ bm
    m.path ↔ om
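The merge operation above — generalizing two constructions that differ in a single constituent to their common supertype — can be sketched as follows, assuming an illustrative single-inheritance type lattice (BALL and BLOCK under OBJECT); this is not ECG's actual representation:

```python
# Sketch of the "merge" reorganization: constructions differing in one
# constituent are generalized to that constituent's common supertype.
# The type lattice and flat constituent lists are illustrative assumptions.

SUPERTYPE = {"BALL": "OBJECT", "BLOCK": "OBJECT", "OBJECT": None}

def common_supertype(a, b):
    # walk up from a until we land on something in b's chain to the root
    chain_b = set()
    t = b
    while t is not None:
        chain_b.add(t)
        t = SUPERTYPE.get(t)
    t = a
    while t is not None and t not in chain_b:
        t = SUPERTYPE.get(t)
    return t

def merge(cxn1, cxn2):
    """Merge two constructions' constituent lists slot by slot."""
    return [t1 if t1 == t2 else common_supertype(t1, t2)
            for t1, t2 in zip(cxn1, cxn2)]

throw_ball = ["THROW", "BALL"]
throw_block = ["THROW", "BLOCK"]
merged = merge(throw_ball, throw_block)
# -> ["THROW", "OBJECT"], i.e. the THROW-OBJECT construction
```

Composition would instead concatenate the constituents and union the form and meaning constraints of constructions that co-occur, as in THROW-BALL-OFF.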
Size of Grammar
• Size of the grammar G is the sum of the size of each construction:
    size(G) = Σc∈G size(c)
• Size of each construction c is:
    size(c) = nc + mc + Σe∈c length(e)
  where
  – nc = number of constituents in c
  – mc = number of constraints in c
  – length(e) = slot chain length of element reference e

Example: The Throw-Ball Cxn
construction THROW-BALL
  constructional
    constituents
      t : THROW
      b : BALL
  form
    tf before bf
  meaning
    tm.throwee ↔ bm
size(THROW-BALL) = 2 + 2 + (2 + 3) = 9

Complexity of Data Given Grammar
• Complexity of the data D given grammar G is the sum of the analysis score of each input token d:
    complexity(D|G) = Σd∈D score(d)
• Analysis score of each input token d is:
    score(d) = Σc∈d ( weightc + Σr∈c |typer| ) + η·heightd + semfitd
  where
  – c is a construction used in the analysis of d
  – weightc ≈ relative frequency of c
  – |typer| = number of ontology items of type r used
  – heightd = height of the derivation graph
  – semfitd = semantic fit provided by the analyzer

Minimum Description Length
• Choose grammar G to minimize cost(G|D):
  – cost(G|D) = α · size(G) + β · complexity(D|G)
  – Approximates Bayesian learning: cost(G|D) ≈ 1/posterior probability ≈ 1/P(G|D)
• Size of grammar: size(G) ≈ 1/prior ≈ 1/P(G)
  – favors fewer/smaller constructions and roles; isomorphic mappings
• Complexity of data given grammar: complexity(D|G) ≈ 1/likelihood ≈ 1/P(D|G)
  – favors simpler analyses (fewer, more likely constructions)
  – based on derivation length + score of derivation

Final Remark
• The goal here is to build a cognitively plausible model of language learning.
• A very different game than one
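Returning to the MDL metric: the size and cost computations above can be sketched directly. Representing slot chains as lists of path elements, and the sample α, β weights and complexity value, are assumptions for illustration:

```python
# Sketch of the MDL scoring from the slides: size(G) sums, per
# construction, the constituent count, constraint count, and slot-chain
# lengths of its element references; cost(G|D) trades grammar size
# against data complexity. Representations here are illustrative.

def size_cxn(n_constituents, n_constraints, slot_chains):
    # size(c) = n_c + m_c + sum of length(e) over element references e
    return n_constituents + n_constraints + sum(len(c) for c in slot_chains)

def size_grammar(cxns):
    # size(G) = sum of size(c) over constructions c in G
    return sum(size_cxn(*c) for c in cxns)

def cost(grammar_size, data_complexity, alpha=1.0, beta=1.0):
    # cost(G|D) = alpha * size(G) + beta * complexity(D|G)
    return alpha * grammar_size + beta * data_complexity

# THROW-BALL: 2 constituents (t, b), 2 constraints ("tf before bf" and
# "tm.throwee <-> bm"); element references tf, bf (length 1 each) and
# tm.throwee (length 2), bm (length 1) give slot-chain total 2 + 3.
throw_ball = (2, 2, [["tf"], ["bf"], ["tm", "throwee"], ["bm"]])
g_size = size_grammar([throw_ball])   # 2 + 2 + (2 + 3) = 9
total = cost(g_size, 5, alpha=1.0, beta=2.0)  # illustrative complexity of 5
```

A larger grammar lowers complexity(D|G) by licensing shorter derivations, so minimizing the weighted sum decides when forming a new construction pays for itself.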

