CS 188: Artificial Intelligence
Spring 2006
Lecture 16: Bayes' Nets III
3/16/2006
Dan Klein – UC Berkeley

Today
- Last time:
  - Bayes nets
  - Conditional independence
- Today:
  - More conditional independence
  - Inference to answer queries

Reachability (the Bayes' Ball)
- Correct algorithm:
  - Start at the source node
  - Try to reach the target by search
  - States: a node, along with the previous arc
  - Successor function:
    - Unobserved nodes: to any child; to any parent if coming from a child (or the start)
    - Observed nodes: from parent to parent
- If you can't reach a node, it's conditionally independent of the start node
[Figure: shaded/unshaded node triples showing when the ball passes]

Example
[Figure: network over R, T, B, D, L, T'; answers to the queried independencies: Yes, Yes]

Example
- Variables:
  - R: Raining
  - T: Traffic
  - D: Roof drips
  - S: I'm sad
- Questions:
[Figure: network over T, S, D, R; answer: Yes]

Summary
- Bayes nets compactly encode joint distributions
- Guaranteed independencies of distributions can be deduced from the BN graph structure
- The Bayes' ball algorithm (aka d-separation)
- A Bayes net may have other independencies that are not detectable until you inspect its specific distribution

Inference
- Inference: calculating some statistic from a joint probability distribution
- Examples:
  - Posterior marginal probability: P(Q | e1, ..., ek)
  - Most likely explanation: argmax_q P(Q = q | e1, ..., ek)
[Figure: network over R, T, B, D, L, T']

Reminder: Alarm Network
[Figure: the burglary/earthquake alarm network with its CPTs]

Atomic Inference
- Given unlimited time, inference in BNs is easy
- Recipe:
  - State the marginal probabilities you want
  - Figure out ALL the atomic probabilities you need
  - Calculate and combine them
- Example:
[Figure: worked example]

Example
- Where did we use the BN structure?
- We didn't!

Example
[Figure: worked calculation]

Example
- Normalize
[Figure: worked calculation]

Inference by Enumeration
- Atomic inference is extremely slow!
- Slightly clever way to save work:
  - Move the sums as far right as possible
- Example:
[Figure: derivation]

Example
[Figure: worked example]

Evaluation Tree
- View the nested sums as a computation tree
- Still repeated work: calculate P(m | a) P(j | a) twice, etc.
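As a concrete illustration of the enumeration recipe, here is a minimal Python sketch computing P(B | j, m) in the alarm network. The CPT numbers are the standard textbook (AIMA) alarm values — an assumption, since the lecture's own figures are not in this transcript:

```python
# Inference by enumeration for P(B | J=j, M=m) in the classic alarm network.
# CPT numbers follow the standard textbook (AIMA) alarm example -- an
# assumption; the lecture's exact figures may differ.

P_B = {True: 0.001, False: 0.999}                    # P(B = b)
P_E = {True: 0.002, False: 0.998}                    # P(E = e)
P_A = {(True, True): 0.95, (True, False): 0.94,      # P(A = true | B, E)
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}                      # P(J = true | A)
P_M = {True: 0.70, False: 0.01}                      # P(M = true | A)

def posterior_burglary(j=True, m=True):
    """P(B | J=j, M=m): sum the full joint over the hidden variables E and A."""
    scores = {}
    for b in (True, False):
        total = 0.0
        for e in (True, False):
            for a in (True, False):
                pa = P_A[(b, e)] if a else 1 - P_A[(b, e)]
                pj = P_J[a] if j else 1 - P_J[a]
                pm = P_M[a] if m else 1 - P_M[a]
                total += P_B[b] * P_E[e] * pa * pj * pm
        scores[b] = total
    z = scores[True] + scores[False]                 # normalize at the end
    return {b: s / z for b, s in scores.items()}

# posterior_burglary()[True] comes out to about 0.284
```

Note that the terms P(j | a) and P(m | a) are recomputed on every pass through the inner loop — exactly the repeated work the evaluation tree makes visible and variable elimination avoids by caching.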
Variable Elimination: Idea
- Lots of redundant work in the computation tree!
- We can save time if we cache all partial results
- This is the basic idea behind variable elimination

Basic Objects
- Track objects called factors
- Initial factors are local CPTs
- During elimination, create new factors
- Anatomy of a factor:
  - Variables introduced
  - Variables summed out
  - Argument variables, always non-evidence variables
  - 4 numbers, one for each value of D and E

Basic Operations
- First basic operation: joining factors
- Combining two factors:
  - Just like a database join
  - Build a factor over the union of the domains
- Example:
[Figure: join example]

Basic Operations
- Second basic operation: marginalization
- Take a factor and sum out a variable
  - Shrinks a factor to a smaller one
  - A projection operation
- Example:
[Figure: marginalization example]

Example
[Figure: worked elimination]

Example
[Figure: worked elimination]

General Variable Elimination
- Query: P(Q | e1, ..., ek)
- Start with initial factors:
  - Local CPTs (but instantiated by evidence)
- While there are still hidden variables (not Q or evidence):
  - Pick a hidden variable H
  - Join all factors mentioning H
  - Project out H
- Join all remaining factors and normalize

Example
- Choose A

Example
- Choose
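The factor machinery described above — join as a database-style pointwise product, marginalization as projection, and the general elimination loop — can be sketched in a few lines. The `Factor` class, the `eliminate` helper, and the R → T → L chain with its CPT numbers are all illustrative assumptions, not taken from the lecture:

```python
from itertools import product

class Factor:
    """A factor: a table over an ordered tuple of (boolean) variables."""
    def __init__(self, vars, table):
        self.vars = tuple(vars)      # e.g. ('R', 'T')
        self.table = dict(table)     # maps value tuples to numbers

    def join(self, other):
        """Pointwise product over the union of the domains (a database-style join)."""
        vars = self.vars + tuple(v for v in other.vars if v not in self.vars)
        table = {}
        for vals in product((True, False), repeat=len(vars)):
            assign = dict(zip(vars, vals))
            a = self.table[tuple(assign[v] for v in self.vars)]
            b = other.table[tuple(assign[v] for v in other.vars)]
            table[vals] = a * b
        return Factor(vars, table)

    def sum_out(self, var):
        """Marginalize (project) `var` away, shrinking the factor."""
        i = self.vars.index(var)
        vars = self.vars[:i] + self.vars[i + 1:]
        table = {}
        for vals, p in self.table.items():
            key = vals[:i] + vals[i + 1:]
            table[key] = table.get(key, 0.0) + p
        return Factor(vars, table)

def eliminate(factors, hidden):
    """General VE loop: for each hidden H, join all factors mentioning H, project H out."""
    factors = list(factors)
    for h in hidden:
        touching = [f for f in factors if h in f.vars]
        rest = [f for f in factors if h not in f.vars]
        joined = touching[0]
        for f in touching[1:]:
            joined = joined.join(f)
        factors = rest + [joined.sum_out(h)]
    result = factors[0]                       # join all remaining factors
    for f in factors[1:]:
        result = result.join(f)
    z = sum(result.table.values())            # and normalize
    return Factor(result.vars, {k: v / z for k, v in result.table.items()})

# Toy chain R -> T -> L (made-up numbers): query P(L), eliminating R then T.
f_R = Factor(('R',), {(True,): 0.1, (False,): 0.9})
f_T = Factor(('R', 'T'), {(True, True): 0.8, (True, False): 0.2,
                          (False, True): 0.1, (False, False): 0.9})
f_L = Factor(('T', 'L'), {(True, True): 0.3, (True, False): 0.7,
                          (False, True): 0.1, (False, False): 0.9})
p_L = eliminate([f_R, f_T, f_L], hidden=['R', 'T'])
# P(L=true) = 0.17 * 0.3 + 0.83 * 0.1 = 0.134, since P(T=true) = 0.1*0.8 + 0.9*0.1 = 0.17
```

Each pass through the loop replaces several factors with one smaller one, so partial results are computed once and reused — the caching idea the slides motivate with the evaluation tree.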