Berkeley COMPSCI 188 - Lecture 16: Bayes’ Nets III

CS 188: Artificial Intelligence
Spring 2006
Lecture 16: Bayes' Nets III
3/16/2006
Dan Klein – UC Berkeley

Today
- Last time: Bayes nets; conditional independence
- This time: more conditional independence; inference to answer queries

Reachability (the Bayes' Ball)
- Correct algorithm:
  - Start at the source node
  - Try to reach the target node by search
  - States: a node, together with the arc used to reach it
  - Successor function:
    - Unobserved nodes: move to any child; move to any parent if coming from a child (or from the start)
    - Observed nodes: move from parent to parent only
- If you can't reach a node, it is conditionally independent of the start node

Example
- Variables:
  - R: it's raining
  - T: there is traffic
  - D: the roof drips
  - S: I'm sad
- Questions: which independencies hold among these variables given various evidence? (The slides work the answers on the network diagram.)

Summary
- Bayes nets compactly encode joint distributions
- Guaranteed independencies of distributions can be deduced from the BN graph structure
- The Bayes' ball algorithm (aka d-separation) finds them
- A Bayes net may have other independencies that are not detectable until you inspect its specific distribution

Inference
- Inference: calculating some statistic from a joint probability distribution
- Examples:
  - Posterior marginal probability, e.g. P(Q | e)
  - Most likely explanation, e.g. argmax_q P(q | e)

Reminder: Alarm Network
- The standard burglary/earthquake network: an alarm (A) is triggered by a burglary (B) or an earthquake (E), and John (J) and Mary (M) may call when it sounds

Atomic Inference
- Given unlimited time, inference in BNs is easy
- Recipe:
  - State the marginal probabilities you want
  - Figure out ALL the atomic probabilities you need
  - Calculate and combine them
- Where did we use the BN structure? We didn't!

Inference by Enumeration
- Atomic inference is extremely slow!
- Slightly clever way to save work: move the sums as far right as possible
- Example: P(B | j, m) ∝ P(B) Σ_e P(e) Σ_a P(a | B, e) P(j | a) P(m | a)

Evaluation Tree
- View the nested sums as a computation tree
- There is still repeated work: we calculate P(m | a) P(j | a) twice, etc.

Variable Elimination: Idea
- Lots of redundant work in the computation tree!
- We can save time if we cache all partial results
- This is the basic idea behind variable elimination

Basic Objects
- Track objects called factors
- The initial factors are the local CPTs
- During elimination, we create new factors
- Anatomy of a factor: the variables introduced, the variables summed out, and the argument variables (always non-evidence variables); a factor over two binary arguments D and E holds 4 numbers, one for each joint value

Basic Operations
- First basic operation: joining factors
  - Combining two factors is just like a database join
  - Build a factor over the union of the two domains
- Second basic operation: marginalization
  - Take a factor and sum out a variable
  - This shrinks the factor to a smaller one: a projection operation

General Variable Elimination
- Query: P(Q | E_1 = e_1, ..., E_k = e_k)
- Start with the initial factors: the local CPTs, instantiated by the evidence
- While there are still hidden variables (neither Q nor evidence):
  - Pick a hidden variable H
  - Join all factors mentioning H
  - Project out (sum over) H
- Finally, join all remaining factors and normalize
- The slides work an example on the alarm network, choosing to eliminate A first
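The atomic-inference recipe and the enumeration speedup can be made concrete. The sketch below answers P(B | j, m) on the alarm network by summing atomic probabilities and normalizing; the CPT numbers are the standard textbook values for this network, assumed here since the slides show the network only as a figure.

```python
# Inference by enumeration on the burglary/alarm network.
# CPT values are the standard textbook numbers (an assumption --
# the lecture slides reference the network only by name).
from itertools import product

P_B = {True: 0.001, False: 0.999}
P_E = {True: 0.002, False: 0.998}
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}  # P(a | B, E)
P_J = {True: 0.90, False: 0.05}                     # P(j | A)
P_M = {True: 0.70, False: 0.01}                     # P(m | A)

def joint(b, e, a, j, m):
    """Atomic probability P(b, e, a, j, m): a product of local CPT entries."""
    pa = P_A[(b, e)]
    return (P_B[b] * P_E[e]
            * (pa if a else 1 - pa)
            * (P_J[a] if j else 1 - P_J[a])
            * (P_M[a] if m else 1 - P_M[a]))

def query_B_given_jm():
    """P(B | j, m): sum atomic probabilities over hidden E, A, then normalize."""
    unnorm = {}
    for b in (True, False):
        unnorm[b] = sum(joint(b, e, a, True, True)
                        for e, a in product((True, False), repeat=2))
    z = sum(unnorm.values())
    return {b: p / z for b, p in unnorm.items()}
```

Pushing the sums over e and a inward, as the enumeration slide suggests, avoids recomputing the P(j | a) P(m | a) products; variable elimination takes that idea further by caching partial results as factors.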


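The two factor operations that variable elimination is built from, join and marginalization, can be sketched with a minimal factor type over binary variables. The `Factor` class and the toy numbers in the usage below are illustrative assumptions, not the lecture's notation.

```python
# A minimal sketch of the two variable-elimination primitives:
# join (multiply factors over the union of their domains, like a
# database join) and marginalization (sum a variable out; projection).
from itertools import product

class Factor:
    def __init__(self, variables, table):
        self.variables = list(variables)  # argument variables, in order
        self.table = dict(table)          # assignment tuple -> value

def join(f, g):
    """Build a factor over the union of the domains of f and g."""
    vs = f.variables + [v for v in g.variables if v not in f.variables]
    table = {}
    for assign in product((True, False), repeat=len(vs)):
        env = dict(zip(vs, assign))
        fv = f.table[tuple(env[v] for v in f.variables)]
        gv = g.table[tuple(env[v] for v in g.variables)]
        table[assign] = fv * gv
    return Factor(vs, table)

def marginalize(f, var):
    """Sum out `var`, shrinking the factor to a smaller one."""
    idx = f.variables.index(var)
    vs = [v for v in f.variables if v != var]
    table = {}
    for assign, val in f.table.items():
        key = assign[:idx] + assign[idx + 1:]
        table[key] = table.get(key, 0.0) + val
    return Factor(vs, table)

# Illustrative use: join P(A) with P(B | A), then project out A to get P(B).
pA = Factor(["A"], {(True,): 0.6, (False,): 0.4})
pBgA = Factor(["A", "B"], {(True, True): 0.9, (True, False): 0.1,
                           (False, True): 0.2, (False, False): 0.8})
pB = marginalize(join(pA, pBgA), "A")
```

General variable elimination is then a loop: pick a hidden variable H, join all factors mentioning H, and marginalize H out, finally joining the remaining factors and normalizing.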