UCI ICS 278 - Lecture 15: Question Answering via the Web (the AskMSR System)

Question-Answering via the Web: the AskMSR System

Note: these viewgraphs were originally developed by Professor Nick Kushmerick, University College Dublin, Ireland. These copies are intended only for use for review in ICS 278.

Outline: Question-Answering; Question-Answering on the Web; AskMSR; "Traditional" approach (warning: straw man?); AskMSR: Shallow approach; AskMSR: Details; Step 1: Rewrite queries; Query rewriting; Query rewriting - weights; Step 2: Query search engine; Step 3: Mining N-Grams; Mining N-Grams; Step 4: Filtering N-Grams; Step 5: Tiling the Answers; Experiments; Results [summary]; Example; Open Issues; AskMSR's Conclusions

Slide 2: Question-Answering
• Users want answers, not documents
 – Databases → Information Retrieval → Information Extraction → Question Answering → Intelligent Personal Electronic Librarian
• Active research over the past few years, coordinated by the US government's "TREC" competitions
• Recent intense interest from security services ("What is Bin Laden's bank account number?")

Slide 3: Question-Answering on the Web
• The Web is a potentially enormous "data set" for data mining
 – e.g., 4.3 billion Web pages indexed by Google
• Example: the AskMSR Web question answering system ("answer mining")
• Users pose relatively simple questions, e.g., "Who killed Abraham Lincoln?"
• Simple parsing is used to reformulate the question as a "template answer"
• Search engine results are used to find answers (redundancy helps)
• The system is surprisingly accurate (on simple questions)
• The key contributor to the system's success is massive data (rather than better algorithms)
• Reference: Dumais et al. (2002), "Web question answering: Is more always better?", Proceedings of SIGIR'02

Slide 4: AskMSR
• "Web Question Answering: Is More Always Better?"
 – Dumais, Banko, Brill, Lin, Ng (Microsoft, MIT, Berkeley)
• Q: "Where is the Louvre located?"
• We want "Paris", or "France", or "75058 Paris Cedex 01", or a map
• We don't just want URLs
(Adapted from: COMP-4016 ~ Computer Science Department ~ University College Dublin ~ www.cs.ucd.ie/staff/nick ~ © Nicholas Kushmerick 2002)

Slide 5: "Traditional" approach (warning: straw man?)
• Traditional deep natural-language-processing approach
 – Full parse of documents and question
 – Rich knowledge of vocabulary, cause/effect, and common sense enables sophisticated semantic analysis
• That machinery is essential to find the year Lincoln died from a document like this:
 – "The non-Canadian, non-Mexican president of a North American country whose initials are AL and who was killed by John Wilkes Booth died ten revolutions of the earth around the sun after 1855."

Slide 6: AskMSR: Shallow approach
• Just ignore those documents, and look for ones like this instead:
 – (Figure not included in this preview: an example Web snippet that states the answer directly.)

Slide 7: AskMSR: Details
• (Figure not included in this preview: the system pipeline, steps 1 to 5, described on the following slides.)

Slide 8: Step 1: Rewrite queries
• Intuition: the user's question is often syntactically quite close to sentences that contain the answer
 – "Where is the Louvre Museum located?" / "The Louvre Museum is located in Paris."
 – "Who created the character of Scrooge?" / "Charles Dickens created the character of Scrooge."

Slide 9: Query rewriting
• Classify the question into seven categories
 – Who is/was/are/were…?
 – When is/did/will/are/were…?
 – Where is/are/were…?
• a. Category-specific transformation rules, e.g., "For Where questions, move 'is' to all possible locations":
 – "Where is the Louvre Museum located" generates "is the Louvre Museum located", "the is Louvre Museum located", "the Louvre is Museum located", "the Louvre Museum is located", "the Louvre Museum located is"
 – (The paper does not give full details!)
• b. Expected answer "datatype" (e.g., Date, Person, Location, …)
 – "When was the French Revolution?" → DATE
• Hand-crafted classification/rewrite/datatype rules (could they be automatically learned?)
• Some rewrites are nonsense, but who cares? It's only a few more queries to Google.

Slide 10: Query rewriting - weights
• One wrinkle: some query rewrites are more reliable than others
• "Where is the Louvre Museum located?"
 – +"the Louvre Museum is located" (weight 5: if we get a match, it's probably right)
 – +Louvre +Museum +located (weight 1: lots of non-answers could come back too)
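To make the rewrite step and its weighting concrete, here is a minimal Python sketch of the "Where is…?" rule above, assuming the weights from the previous slide (5 for exact-phrase rewrites, 1 for the bag-of-words back-off). The function name and the stop-word handling are illustrative choices, not the paper's exact rules.

```python
# Minimal sketch of AskMSR-style query rewriting for a "Where is ...?" question.
# The weights (5 for an exact-phrase rewrite, 1 for the bag-of-words back-off)
# follow the slide above; everything else here is an illustrative assumption.

def rewrite_where_question(question: str):
    """Return a list of (query, weight) rewrites for 'Where is X [located]?'."""
    words = question.rstrip("?").split()
    assert words[0].lower() == "where" and words[1].lower() == "is"
    rest = words[2:]  # e.g. ['the', 'Louvre', 'Museum', 'located']

    rewrites = []
    # a. Move "is" to every possible position and quote the result as an exact phrase.
    for i in range(len(rest) + 1):
        phrase = " ".join(rest[:i] + ["is"] + rest[i:])
        rewrites.append((f'"{phrase}"', 5))      # reliable if it matches
    # b. Back-off query: just require all content words (drop "the").
    backoff = " ".join("+" + w for w in rest if w.lower() != "the")
    rewrites.append((backoff, 1))                # lots of non-answers possible
    return rewrites

for query, weight in rewrite_where_question("Where is the Louvre Museum located?"):
    print(weight, query)
```

Running this on the Louvre question reproduces the five phrase rewrites shown on the query-rewriting slide plus the low-weight "+Louvre +Museum +located" back-off.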
Slide 11: Step 2: Query search engine
• Throw all rewrites at a Web-wide search engine
• Retrieve the top N answers (100?)
• For speed, rely only on the search engine's "snippets", not the full text of the actual documents

Slide 12: Step 3: Mining N-Grams
• Unigram, bigram, trigram, … N-gram: a list of N adjacent terms in a sequence
• E.g., "Web Question Answering: Is More Always Better"
 – Unigrams: Web, Question, Answering, Is, More, Always, Better
 – Bigrams: Web Question, Question Answering, Answering Is, Is More, More Always, Always Better
 – Trigrams: Web Question Answering, Question Answering Is, Answering Is More, Is More Always, More Always Better

Slide 13: Mining N-Grams
• Simple: enumerate all N-grams (N = 1, 2, 3, say) in all retrieved snippets
• Use a hash table and other fancy footwork to make this efficient
• Weight of an n-gram: its occurrence count, with each occurrence weighted by the "reliability" (weight) of the rewrite that fetched the document
• Example: "Who created the character of Scrooge?"
 – Dickens: 117
 – Christmas Carol: 78
 – Charles Dickens: 75
 – Disney: 72
 – Carl Banks: 54
 – A Christmas: 41
 – Christmas Carol: 45
 – Uncle: 31

Slide 14: Step 4: Filtering N-Grams
• Each question type is associated with one or more "data-type filters" (regular expressions)
 – When… → Date
 – Where… → Location
 – Who… → Person
 – What… → (depends on the question)
• Boost the score of n-grams that match the regexp
• Lower the score of n-grams that don't match the regexp
• Details omitted from the paper…

Slide 15: Step 5: Tiling the Answers
• Example: "Dickens" (score 20), "Charles Dickens" (score 15), and "Mr Charles" (score 10) are merged, discarding the old n-grams, into "Mr Charles Dickens" (score 45)
• Procedure: tile the highest-scoring n-gram with overlapping n-grams; repeat until no more overlap

Slide 16: Experiments
• Used the TREC-9 standard query data set
• Standard performance metric: MRR (mean reciprocal rank)
 – Systems give their "top 5 answers"
 – Score = 1/R, where R is the rank of the first right answer
 – Rank 1: 1; rank 2: 0.5; rank 3: 0.33; rank 4: 0.25; rank 5: 0.2; rank 6+: 0

Slide 17: Results [summary]
• Standard TREC contest test-bed: ~1M documents, 900 questions
 – E.g., "Who is president of Bolivia?"
 – E.g., "What is the exchange rate between England and the US?"
• The technique doesn't do too well (though it would have placed in the top 9 of ~30 participants!)
 – MRR = 0.262 (i.e., the right answer ranked about #4-#5)
 – Why?
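As a rough illustration of Steps 3 to 5 above (mining n-grams from snippets, weighting them by rewrite reliability, and tiling overlapping candidates), here is a short Python sketch. The data structures, the greedy merge rule (concatenate on word overlap, add the scores), and the example snippets are my reading of the slides, not the actual AskMSR implementation.

```python
# Rough sketch of Steps 3-5: mine 1- to 3-grams from search-engine snippets,
# weight each occurrence by the reliability of the rewrite that retrieved the
# snippet, and greedily tile overlapping n-grams into longer answers.

from collections import defaultdict

def mine_ngrams(snippets_with_weights, max_n=3):
    """snippets_with_weights: list of (snippet_text, rewrite_weight) pairs."""
    scores = defaultdict(float)
    for text, weight in snippets_with_weights:
        words = text.split()
        for n in range(1, max_n + 1):
            for i in range(len(words) - n + 1):
                scores[" ".join(words[i:i + n])] += weight
    return dict(scores)

def overlap_merge(a, b):
    """If a suffix of a equals a prefix of b, return the tiled string, else None."""
    aw, bw = a.split(), b.split()
    for k in range(min(len(aw), len(bw)), 0, -1):
        if aw[-k:] == bw[:k]:
            return " ".join(aw + bw[k:])
    return None

def tile_answers(scores):
    """Repeatedly merge overlapping n-grams, starting from the highest-scoring."""
    candidates = dict(scores)
    merged_something = True
    while merged_something:
        merged_something = False
        ranked = sorted(candidates, key=candidates.get, reverse=True)
        for a in ranked:
            for b in ranked:
                if a == b:
                    continue
                tiled = overlap_merge(a, b) or overlap_merge(b, a)
                if tiled:
                    score = candidates.pop(a) + candidates.pop(b)
                    candidates[tiled] = candidates.get(tiled, 0) + score
                    merged_something = True
                    break
            if merged_something:
                break
    return sorted(candidates.items(), key=lambda kv: -kv[1])

# Mining: n-grams from two hypothetical snippets retrieved by rewrites
# with weights 5 and 1.
counts = mine_ngrams([("Charles Dickens created the character of Scrooge", 5),
                      ("Scrooge was created by Dickens", 1)])
print(counts["Dickens"], counts["Charles Dickens"])   # 6.0 5.0

# Tiling: reproduces the slide's example, where "Dickens" (20),
# "Charles Dickens" (15) and "Mr Charles" (10) tile into "Mr Charles Dickens" (45).
print(tile_answers({"Dickens": 20, "Charles Dickens": 15, "Mr Charles": 10})[0])
```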
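Finally, a small sketch of the MRR metric used in the Experiments and Results slides: each question scores 1/R, where R is the rank (1 to 5) of the first correct answer among the system's top five, and 0 if none of the five is correct; MRR is the mean over all questions. The helper name and the use of None for "no correct answer in the top 5" are illustrative conventions, not part of the TREC specification.

```python
# Sketch of the MRR scoring described on the Experiments slide.

def mean_reciprocal_rank(first_correct_ranks):
    """first_correct_ranks: per-question rank of the first right answer, or None."""
    scores = [1.0 / r if r is not None and r <= 5 else 0.0
              for r in first_correct_ranks]
    return sum(scores) / len(scores)

# Three questions: answered at rank 1, at rank 4, and not answered in the top 5.
print(mean_reciprocal_rank([1, 4, None]))   # (1 + 0.25 + 0) / 3 = 0.4166...
```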

