Berkeley COMPSCI 182 - The Neural Basis of Thought and Language

The Neural Basis of Thought and Language
Final Review Session

Administrivia
• Final in class Thursday, May 8th
  – Be there on time!
  – Format:
    • closed books, closed notes
    • short answers, no blue books
• A9 due tonight
• Final paper due on Monday, May 12

Resources
• Textbook
• Class slides
• Section slides
• Joe Makin's class notes from 2006
  – on notes page

The Second Half
[Diagram: levels of abstraction, from Cognition and Language through Computation, Structured Connectionism, and Computational Neurobiology down to Biology, locating the midterm and final topics: Motor Control, Metaphor, Grammar, Bailey Model, KARMA, ECG, Reinforcement Learning, ECG Learning, Bayes Nets.]

Overview
• Bailey Model
  – feature structures
  – Bayesian model merging
• Bayes Nets
• KARMA
  – X-schemas, frames
  – aspect
  – event-structure metaphor
  – inference
• FrameNet
  – frames
  – image schemas
• Reinforcement Learning
• ECG
  – SemSpecs
  – parsing
  – constructions
  – learning algorithm

Bailey's VerbLearn Model
• Three levels of representation:
  1. cognitive: words, concepts
  2. computational: f-structs, x-schemas
  3. connectionist: structured models, learning rules
• Input: labeled hand motions (f-structs)
• Learning:
  1. the correct number of senses for each verb,
  2. the relevant features in each sense, and
  3. the probability distributions on each included feature
• Execution: perform a hand motion based on a label

Bayes Nets
• Probability
• Bayes' Rule / Product Rule
  – P(x,y) = P(x) P(y|x) = P(y) P(x|y)
  – P(x|y) = P(x) P(y|x) / P(y)
• Write a factored distribution: P(x,y,z,...) = P(x) P(y|x) ...
• Infer distributions over variables given evidence
  – variable elimination (by summation: P(x) = Σ_y P(x,y))
• Temporal Bayes Nets
  – pretend variables in different time slices

Event Structure Metaphor
• States are Locations
• Changes are Movements
• Causes are Forces
• Causation is Forced Movement
• Actions are Self-propelled Movements
• Purposes are Destinations
• Means are Paths
• Difficulties are Impediments to Motion
• External Events are Large, Moving Objects
• Long-term, Purposeful Activities are Journeys

Ego Moving versus Time Moving
Results:
  PRIME           Meeting is Friday   Meeting is Monday
  Ego Moving      73.3%               26.7%
  Object Moving   30.8%               69.2%

KARMA
• DBN to represent target domain knowledge
• Metaphor maps link target and source domains
• X-schema to represent source domain knowledge

X-Schemas
• Active representation
• Has hierarchical actions
  – defined by network structure
• Actions have structure (e.g. ready, iterating, ongoing, failed, complete)
  – defined by network structure
• Properly-designed nets will be goal-directed
  – take best actions to reach goal, given current context
  – related to "reinforcement learning"

Reinforcement Learning
• unsupervised learning
• learn behaviors
• reward
  – discounts
• use estimated future value:
  V(s) = max_a Q(s,a)
  Q(s,a) = E[r(s,a) + γ V(s')]
• Learning methods
  – Value iteration
  – Q-learning
• Biology
  – dopamine = reward difference
    • only for reward, not punishment
  – non-exponential discounting
    • preference switching

Language
• Grammar
  – Syntax
• Semantics
• Metaphor
• Simulation
• Unification

Grammar
• A grammar is a set of rules defining a formal language
• A common example is the Context-Free Grammar, with rules of the form
    α → β
  – α: a single non-terminal
  – β: any combination of terminals and non-terminals

Example grammar:
  S → NP VP
  NP → Det Noun | ProperNoun
  VP → Verb NP | Verb PP
  PP → Preposition NP
  Noun → kiwi | orange | store
  ProperNoun → Pat | I
  Det → a | an | the
  Verb → ate | went | shop
  Preposition → to | at

Sentence generation: Pat ate the kiwi
• start from S and apply any applicable rules
• forward expansion:
  S
  NP VP
  Det Noun VP    ProperNoun VP
  a Noun VP    an Noun VP    the Noun VP
  a kiwi VP    a orange VP    a store VP    …

Unification Grammar
• Basic idea: capture agreement features for each non-terminal in feature structures
• Enforce constraints on these features using unification rules

Feature structures:
  Pat   [agreement: [number: SG, person: 3rd]]
  I     [agreement: [number: SG, person: 1st]]
  went  [agreement: [ ]]
  shop  [agreement: [number: , person: 1st]]

Unification rules:
  VP → Verb NP    VP.agreement ↔ Verb.agreement
  S → NP VP       NP.agreement ↔ VP.agreement

Poverty and Opulence
• Poverty of the stimulus
  – coined to suggest how little information children have to learn from
• Opulence of the substrate
  – opulence = "richness"
  – coined in response, to suggest how much background information children have

Analysis Process
[Diagram: an utterance ("Harry walked into the café.") is analyzed against the constructions into a Semantic Specification; simulation, drawing on general knowledge, then updates the belief state.]

The INTO construction
  construction INTO
    subcase of Spatial-Relation
    form
      self_f.orth ← "into"
    meaning: Trajector-Landmark
      evokes Container as cont
      evokes Source-Path-Goal as spg
      trajector ↔ spg.trajector
      landmark ↔ cont
      cont.interior ↔ spg.goal
      cont.exterior ↔ spg.source

The Spatial-Phrase construction
  construction SPATIAL-PHRASE
    constructional
      constituents
        sr : Spatial-Relation
        lm : Ref-Expr
    form
      sr_f before lm_f
    meaning
      sr_m.landmark ↔ lm_m

The Directed-Motion construction
  construction DIRECTED-MOTION
    constructional
      constituents
        a : Ref-Expr
        m : Motion-Verb
        p : Spatial-Phrase
    form
      a_f before m_f
      m_f before p_f
    meaning
      evokes Directed-Motion as dm
      self_m.scene ↔ dm
      dm.agent ↔ a_m
      dm.motion ↔ m_m
      dm.path ↔ p_m

  schema Directed-Motion
    roles
      agent : Entity
      motion : Motion
      path : SPG

Do not forget the SemSpec!

What exactly is simulation?
• Belief update and/or X-schema execution
[Diagram: the WALK x-schema (ready → start → ongoing → finish → done, with an iterate loop) linked to beliefs such as hungry, meeting, café, time of day, and at goal.]

Learning-Analysis Cycle (Chang, 2004)
1. Learner passes input (Utterance + Situation) and current grammar to Analyzer.
2. Analyzer produces SemSpec and Constructional Analysis.
3. Learner updates grammar:
   a. Hypothesize new map.
   b. Reorganize grammar (merge or compose).
   c. Reinforce (based on usage).

Three ways to get new constructions
• Relational mapping
  – throw the ball ⇒ THROW < BALL
• Merging
  – throw the block, throwing the ball ⇒ THROW < OBJECT
• Composing
  – throw the ball, ball off, you throw the ball off ⇒ THROW < BALL < OFF

Grammar merging
• How can we measure description length?
  – complicated rules are bad
  – lots of rules are bad
• measure "derivation length"
  – alpha * size(rules) + derivationCost(rules, sentences)

How do you learn… the meanings of spatial
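The product rule, variable elimination, and Bayes' rule points from the Bayes Nets review can be made concrete with a tiny two-variable net. This is a minimal sketch: the X → Y structure and all the numbers are invented for illustration.

```python
# Minimal illustration of the product rule and variable elimination
# on a two-variable net X -> Y, with made-up probabilities.

# Prior P(x) and conditional P(y|x), each over binary values {0, 1}.
P_x = {0: 0.6, 1: 0.4}
P_y_given_x = {0: {0: 0.9, 1: 0.1},   # P(y | x=0)
               1: {0: 0.3, 1: 0.7}}   # P(y | x=1)

# Product rule: joint P(x, y) = P(x) P(y|x).
P_xy = {(x, y): P_x[x] * P_y_given_x[x][y]
        for x in P_x for y in (0, 1)}

# Variable elimination by summation: P(y) = sum_x P(x, y).
P_y = {y: sum(P_xy[(x, y)] for x in P_x) for y in (0, 1)}

# Bayes' rule: P(x | y=1) = P(x) P(y=1|x) / P(y=1).
P_x_given_y1 = {x: P_x[x] * P_y_given_x[x][1] / P_y[1] for x in P_x}

print(P_y[1])            # 0.6*0.1 + 0.4*0.7 = 0.34
print(P_x_given_y1[1])   # 0.4*0.7 / 0.34
```

The same summing-out step, repeated over a good elimination ordering, is exactly what variable elimination does in larger nets.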
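The value equations from the Reinforcement Learning review, V(s) = max_a Q(s,a) and Q(s,a) = E[r(s,a) + γ V(s')], can be sketched as value iteration on a toy problem. The two-state chain, its actions, and its rewards are invented for the example.

```python
# Value iteration on a toy deterministic MDP, directly applying
#   Q(s,a) = r(s,a) + gamma * V(s')   and   V(s) = max_a Q(s,a).

gamma = 0.9
states = ["start", "goal"]
# (state, action) -> (reward, next_state); "goal" is absorbing.
model = {
    ("start", "move"): (0.0, "goal"),
    ("start", "stay"): (-1.0, "start"),
    ("goal",  "stay"): (1.0, "goal"),
}

V = {s: 0.0 for s in states}
for _ in range(200):  # repeat the backup until values converge
    V = {s: max(r + gamma * V[s2]
                for (s0, a), (r, s2) in model.items() if s0 == s)
         for s in states}

# V(goal) -> 1/(1-gamma) = 10, and V(start) -> 0 + gamma*10 = 9,
# so the greedy action in "start" is "move".
print(V)
```

Q-learning estimates the same Q values from sampled transitions instead of a known model, which is one of the two learning methods named on the slide.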
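The forward-expansion procedure from the Grammar slides ("start from S and apply any applicable rules") can be sketched as breadth-first rewriting of the leftmost non-terminal. The grammar is the one from the slides; the function name and the length bound are my own additions.

```python
from collections import deque

# The toy CFG from the slides: nonterminal -> list of alternative RHSs.
rules = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "Noun"], ["ProperNoun"]],
    "VP": [["Verb", "NP"], ["Verb", "PP"]],
    "PP": [["Preposition", "NP"]],
    "Noun": [["kiwi"], ["orange"], ["store"]],
    "ProperNoun": [["Pat"], ["I"]],
    "Det": [["a"], ["an"], ["the"]],
    "Verb": [["ate"], ["went"], ["shop"]],
    "Preposition": [["to"], ["at"]],
}

def derivable(sentence, max_len=6):
    """Forward expansion: breadth-first rewriting of the leftmost
    non-terminal, starting from S, until the sentence appears."""
    target = tuple(sentence.split())
    queue, seen = deque([("S",)]), set()
    while queue:
        form = queue.popleft()
        if form == target:
            return True
        if form in seen or len(form) > max_len:
            continue
        seen.add(form)
        # expand the leftmost non-terminal in every possible way
        for i, sym in enumerate(form):
            if sym in rules:
                for alt in rules[sym]:
                    queue.append(form[:i] + tuple(alt) + form[i + 1:])
                break
    return False

print(derivable("Pat ate the kiwi"))   # True
print(derivable("kiwi ate Pat"))       # False: "kiwi" alone is not an NP
```

Note that blind forward expansion also generates ungrammatical-sounding strings like "a orange VP", exactly as the slide's expansion shows; that is the motivation for adding agreement features in the Unification Grammar section.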
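The agreement checks from the Unification Grammar slides can be illustrated with a minimal unifier over flat feature structures. This is a simplification (full feature-structure unification recurses into nested structures); the dict encoding and FAIL convention are my own.

```python
# A minimal unifier for flat feature structures, illustrating how a
# unification rule such as  S -> NP VP, NP.agreement <-> VP.agreement
# accepts or rejects a combination. None plays the role of FAIL.

FAIL = None

def unify(fs1, fs2):
    """Merge two flat feature structures: unspecified features unify
    with anything; conflicting values make unification fail."""
    result = dict(fs1)
    for feat, val in fs2.items():
        if feat in result and result[feat] != val:
            return FAIL
        result[feat] = val
    return result

# Agreement features from the slides; a missing feature is unconstrained.
pat  = {"number": "SG", "person": "3rd"}
i    = {"number": "SG", "person": "1st"}
went = {}                        # "went" agrees with any subject
shop = {"person": "1st"}         # "shop": person 1st, number left open

print(unify(i, shop))    # succeeds: "I shop"
print(unify(pat, shop))  # None: "*Pat shop" fails agreement
print(unify(pat, went))  # succeeds: "Pat went"
```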
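The description-length measure from the grammar-merging slide, alpha * size(rules) + derivationCost(rules, sentences), can be sketched as follows. The size measure (symbol count) and the hand-supplied derivation costs are illustrative assumptions, since real derivation costs would come from running the analyzer; the THROW rules echo the merging example above.

```python
# Sketch of the description-length trade-off behind grammar merging:
#   cost = alpha * size(rules) + derivationCost(rules, sentences).
# size(rules) here counts symbols in the rules; per-sentence derivation
# costs are supplied by hand. All numbers are illustrative.

def description_length(rules, derivation_costs, alpha=1.0):
    size = sum(1 + len(rhs) for _, rhs in rules)  # LHS symbol + RHS symbols
    return alpha * size + sum(derivation_costs)

# A grammar with separate, specific rules ("complicated rules are bad,
# lots of rules are bad") ...
specific = [("S", ("THROW", "BALL")), ("S", ("THROW", "BLOCK"))]
# ... versus a merged grammar with one general rule, which makes each
# derivation slightly longer because the object must still be chosen.
merged = [("S", ("THROW", "OBJECT"))]

print(description_length(specific, [1, 1]))  # 1.0*6 + 2 = 8.0
print(description_length(merged,   [2, 2]))  # 1.0*3 + 4 = 7.0: merge wins
```

Raising alpha penalizes grammar size more heavily and so favors merging; lowering it favors keeping specific rules with short derivations.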

