MIT 6.863J - Natural Language Processing
6.863J Natural Language Processing
Lecture 18: the meaning of it all, #4 (or #42)
Instructor: Robert C. Berwick

The Menu Bar
• Administrivia:
  • Lab 4a out April 14 – the last lab before the final project
• Agenda:
  • Scoping ambiguities & computation – solutions
  • Quantifier raising (QR)
  • Cooper storage
  • Keller storage
  • "Hole" semantics
  • Prelude to discourse representation theory

Montague's approach (& other current linguistic theory)
• Rule of Quantifier Raising – like the movement of other phrases
• The landing site is a position at the head of Sbar (a function or operator position)
• Combine with an indexed pronoun (alternatively: an empty element or trace) instead of the quantifying NP
• When the placeholder has moved high enough in the tree to give the scope we need, replace it by the quantifying NP

Example: every person loves a woman
The QR'd derivation, node by node (the raised NP combines at Sbar):
• loves (transitive verb): λX.λy.X@λx.LOVE(y,x)
• her-3 (NP): λP.P@z3
• loves her-3 (VP): λy.LOVE(y,z3)
• Every person (NP): λP.∀x(person(x) → P@x)
• Every person loves her-3 (S): λz3.∀x(person(x) → LOVE(x,z3))
• A woman (NP), raised by QR: λP.∃y(woman(y) & P@y)
• Every person loves a woman (Sbar): ∃y(woman(y) & ∀x(person(x) → LOVE(x,y)))
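To make the composition concrete, here is a minimal sketch, not part of the course's Lab materials: Python closures stand in for the lambda terms, strings stand in for the first-order formulas, and "@" on the slides is ordinary function application. The helper names (every_person, a_woman, loves) and the string encoding are assumptions made purely for illustration.

```python
# NP meanings are generalized quantifiers: they take a property P
# (a function from a variable name to a formula string) and return a formula.
every_person = lambda P: f"forall x (person(x) -> {P('x')})"
a_woman      = lambda P: f"exists y (woman(y) & {P('y')})"

# Transitive verb, as on the slide: λX.λy.X@λx.LOVE(y,x)
loves = lambda X: lambda subj: X(lambda obj: f"LOVE({subj},{obj})")

# Surface scope: the object NP stays in situ inside the VP.
surface = every_person(loves(a_woman))
print(surface)   # forall x (person(x) -> exists y (woman(y) & LOVE(x,y)))

# QR: leave the placeholder z3 in object position (her-3 = λP.P@z3),
# build the S, then let the raised NP take scope over λz3.S at Sbar.
s = lambda z3: every_person(loves(lambda P: P(z3)))
inverse = a_woman(s)
print(inverse)   # exists y (woman(y) & forall x (person(x) -> LOVE(x,y)))
```

Applying the subject directly to the VP gives the surface (∀ over ∃) reading; abstracting over the placeholder z3 and letting "a woman" apply to the result reproduces the QR (∃ over ∀) reading shown at Sbar.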
Why do we have to solve this?
• Readings aren't always logically independent
• Direct construction doesn't give us the right ambiguities
• Example (demo): every customer in a restaurant eats a big kahuna burger
  forall A ((exists B (restaurant(B) & in(A,B)) & customer(A)) -> exists C ((big(C) & (kahuna(C) & burger(C))) & eat(A,C)))

Here they all are…
1. forall A ((exists B (restaurant(B) & in(A,B)) & customer(A)) -> exists C ((big(C) & (kahuna(C) & burger(C))) & eat(A,C)))
2. forall A ((in(A,B) & customer(A)) -> exists C (restaurant(C) & exists D ((big(D) & (kahuna(D) & burger(D))) & eat(A,D))))
3. forall A ((in(A,B) & customer(A)) -> exists C (restaurant(C) & exists D ((big(D) & (kahuna(D) & burger(D))) & eat(A,D))))
4. forall A ((in(A,B) & customer(A)) -> exists C ((big(C) & (kahuna(C) & burger(C))) & exists D (restaurant(D) & eat(A,C))))
5. exists A ((big(A) & (kahuna(A) & burger(A))) & forall B ((exists C (restaurant(C) & in(B,C)) & customer(B)) -> eat(B,A)))
6. exists A (restaurant(A) & forall B ((in(B,A) & customer(B)) -> exists C ((big(C) & (kahuna(C) & burger(C))) & eat(B,C))))
7. exists A ((big(A) & (kahuna(A) & burger(A))) & forall B ((in(B,C) & customer(B)) -> exists D (restaurant(D) & eat(B,A))))
8. exists A (restaurant(A) & exists B ((big(B) & (kahuna(B) & burger(B))) & forall C ((in(C,A) & customer(C)) -> eat(C,B))))
9. exists A ((big(A) & (kahuna(A) & burger(A))) & exists B (restaurant(B) & forall C ((in(C,B) & customer(C)) -> eat(C,A))))

Montague approach
• The idea of having a 'dummy' semantic representation that we use when needed is basically right…
• But the way it is used here is not smart from a modular-engineering or computational-design standpoint
• We don't want to futz with the grammar – we only want to add this combinatory mechanism on top of existing grammars
• Storage methods move the QR idea from the syntax into the semantics
• Cooper storage & Keller storage

Cooper storage
History: cf. W. Woods and the LUNAR system
Key ideas:
• Associate each node of the parse tree with a store
• A store contains the core semantic representation together with the quantifiers associated with nodes lower in the tree
• After the sentence is parsed, the store is used to generate the scoped representations
• The order in which the store is retrieved determines the different scopings (cf. also PP attachment…)

Formally, stores
• A store is an n-place sequence
• Stores are written with angle brackets < and >
• The first item of the sequence is the core semantic representation
• Subsequent elements are pairs (β, i) where β is the semantic representation of an NP (that is, another lambda expression) and i is an index
• An index is a label that picks out a free variable in the core semantic representation

Use of the store
• Quantified noun phrases can repackage the information that the store contains. More precisely:
• Storage (Cooper): If the store <φ, (β, j), …, (β', k)> is a semantic representation for a quantified NP, then the store <λP.P@zi, (φ, i), (β, j), …, (β', k)>, where i is some unique index, is also a representation for that NP

Let's try it
• Every person loves a woman

Tree for this, showing indices
• Every person (NP): <λQ.Q@z6, (λP.∀x(person(x) → P@x), 6)>
• a woman (NP): <λQ.Q@z7, (λP.∃y(woman(y) & P@y), 7)>
• loves (transitive verb): λX.λu.X@λv.LOVE(u,v)
• loves a woman (VP): <λu.LOVE(u,z7), (λP.∃y(woman(y) & P@y), 7)>
• Every person loves a woman (S): <LOVE(z6,z7), (λP.∀x(person(x) → P@x), 6), (λP.∃y(woman(y) & P@y), 7)>

Retrieval 1
• We want the ordinary scoped representation. How do we get it?
• Remove one of the indexed binding operators from the store
• Combine it with the core representation
• The result is a new core representation
• Continue until the store has just one element

Or precisely
• Retrieval: Let σ1 and σ2 be (possibly empty) sequences of binding operators. If the store <φ, σ1, (β, i), σ2> is associated with an expression of category S, then the store <β@λzi.φ, σ1, σ2> is also associated with this expression.
• Informally: pull out an indexed quantified NP and apply it

Let's see how it works
Start from <LOVE(z6,z7), (λP.∀x(person(x) → P@x), 6), (λP.∃y(woman(y) & P@y), 7)>
• Apply the retrieval rule to this store, pulling the 1st quantifier out:
  <λP.∀x(person(x) → P@x)@λz6.LOVE(z6,z7), (λP.∃y(woman(y) & P@y), 7)>
• Beta-convert (lambda-apply) to simplify:
  <∀x(person(x) → LOVE(x,z7)), (λP.∃y(woman(y) & P@y), 7)>
• Pull the 2nd quantifier (the last one remaining):
  <λP.∃y(woman(y) & P@y)@λz7.∀x(person(x) → LOVE(x,z7))>
• Result:
  <∃y(woman(y) & ∀x(person(x) → LOVE(x,y)))>
• How do we get the other reading?

Are we ok?
• Cooper storage gives a lot of freedom: quantifiers can be retrieved in any order
• The only constraint is the use of co-indexed variables
• Is this too much rope?
• Mia knows every owner of a hash bar

Nested NPs cause a problem
• Store:
  <Know(Mia,z2), (λP.∀y(owner(y) & Of(y,z1) → P@y), 2), (λQ.∃x(hashbar(x) & Q@x), 1)>
• Pull 2:
  <∀y(owner(y) & Of(y,z1) → Know(Mia,y)), (λQ.∃x(hashbar(x) & Q@x), 1)>
• Pull 1:
  <∃x(hashbar(x) & ∀y(owner(y) & Of(y,x) → Know(Mia,y)))>
• But pulling 1 before 2 goes wrong: z1 is still buried inside the stored operator for index 2, so it ends up free in the result; this is the problem Keller storage is designed to fix
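Stepping back to the simpler example, here is a minimal sketch of the retrieval loop under the same string-based conventions as the earlier sketch (an illustration, not the course's code): the store is a core formula string plus a list of (binding operator, index) pairs, one retrieval step implements β@λzi.φ by substituting for the placeholder zi, and iterating over retrieval orders enumerates the scopings of "every person loves a woman".

```python
from itertools import permutations

# Store at the S node, as on the slide:
# <LOVE(z6,z7), (λP.∀x(person(x) -> P@x), 6), (λP.∃y(woman(y) & P@y), 7)>
core = "LOVE(z6,z7)"
binding_operators = [
    (lambda P: f"forall x (person(x) -> {P('x')})", 6),
    (lambda P: f"exists y (woman(y) & {P('y')})",   7),
]

def retrieve(phi, beta, i):
    """One retrieval step: φ becomes β@λz_i.φ, beta-reduced.
    λz_i.φ is simulated by substituting the chosen variable for z_i."""
    return beta(lambda v: phi.replace(f"z{i}", v))

# Every retrieval order gives one scoping; the operator pulled last
# ends up with widest scope.
for order in permutations(binding_operators):
    phi = core
    for beta, i in order:
        phi = retrieve(phi, beta, i)
    print(phi)

# Pull 6 then 7: exists y (woman(y) & forall x (person(x) -> LOVE(x,y)))
# Pull 7 then 6: forall x (person(x) -> exists y (woman(y) & LOVE(x,y)))
```

With a nested store like the one for "Mia knows every owner of a hash bar", this naive substitution also exposes the free-variable problem above: if the operator indexed 1 is retrieved first, z1 never gets replaced.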

