Bayesian Networks: Inference
Machine Learning 10-701/15-781
Carlos Guestrin, Carnegie Mellon University
March 21st, 2007
© 2005-2007 Carlos Guestrin

Handwriting recognition
- Character recognition, e.g., kernel SVMs
- (figure: examples of handwritten characters)

Handwriting recognition 2
- (figure only)

Factored joint distribution: Preview
- BN over Flu, Allergy, Sinus, Headache, Nose
- P(F, A, S, H, N) = P(F) P(A) P(S | F, A) P(H | S) P(N | S)

Key: Independence assumptions
- (Flu, Allergy, Sinus, Headache, Nose network)
- Knowing Sinus separates the other variables from each other

The independence assumption
- Local Markov Assumption: a variable X is independent of its non-descendants given its parents
- (Flu, Allergy, Sinus, Headache, Nose network)

Explaining away
- Recall the local Markov assumption: a variable X is independent of its non-descendants given its parents
- In the Flu/Allergy/Sinus/Headache/Nose network, Flu and Allergy are independent a priori, but they become dependent once Sinus (or one of its descendants) is observed

The Representation Theorem: Joint distribution to BN
- The BN encodes independence assumptions
- If the conditional independencies in the BN are a subset of the conditional independencies in P, then we obtain the joint probability distribution as the product of the CPTs

A general Bayes net
- Set of random variables
- Directed acyclic graph (encodes independence assumptions)
- CPTs
- Joint distribution: P(X1, ..., Xn) = ∏_i P(Xi | Pa_Xi)

How many parameters in a BN?
- Discrete variables X1, ..., Xn
- Graph: defines the parents of each Xi, Pa_Xi
- CPTs: P(Xi | Pa_Xi)
- The parameter count is exponential in the number of parents of each variable, not in n

Another example
- Variables: B (Burglar), E (Earthquake), A (Burglar alarm), N (Neighbor calls), R (Radio report)
- Both burglars and earthquakes can set off the alarm
- If the alarm sounds, a neighbor may call
- An earthquake may be announced on the radio

Independencies encoded in a BN
- We said: all you need is the local Markov assumption, (Xi ⊥ NonDescendants_Xi | Pa_Xi)
- But then we talked about other (in)dependencies, e.g., explaining away
- What are the independencies encoded by a BN?
  - The only assumption is local Markov
  - But many others can be derived using the algebra of conditional independencies

Understanding independencies in BNs: BNs with 3 nodes
- (Recall the local Markov assumption: a variable X is independent of its non-descendants given its parents)
- Indirect causal effect: X → Z → Y
- Indirect evidential effect: X ← Z ← Y
- Common cause: X ← Z → Y
- Common effect (v-structure): X → Z ← Y

Understanding independencies in BNs: Some examples
- (figure: example graph over nodes A, B, C, D, E, F, G, H, I, J, K)

An active trail: Example
- (figure: example graph over nodes A-H, with some variables observed)
- When are A and H independent?

Active trails formalized
- A path X1 - X2 - ... - Xk is an active trail when variables O ⊆ {X1, ..., Xn} are observed if, for each consecutive triplet in the trail, one of the following holds:
  - X_{i-1} → X_i → X_{i+1} and X_i is not observed (X_i ∉ O)
  - X_{i-1} ← X_i ← X_{i+1} and X_i is not observed (X_i ∉ O)
  - X_{i-1} ← X_i → X_{i+1} and X_i is not observed (X_i ∉ O)
  - X_{i-1} → X_i ← X_{i+1} and X_i is observed (X_i ∈ O), or one of its descendants is observed

Active trails and independence
- Theorem: variables Xi and Xj are independent given Z ⊆ {X1, ..., Xn} if there is no active trail between Xi and Xj when the variables in Z are observed (a small checker for this is sketched below)
- (figure: example graph over nodes A-K)

The BN Representation Theorem
- If the conditional independencies in the BN are a subset of the conditional independencies in P, then we obtain the joint probability distribution as the product of the CPTs
  - Important because: every P has at least one BN structure G
- If the joint probability distribution factorizes according to the BN, then the conditional independencies in the BN are a subset of the conditional independencies in P
  - Important because: we can read independencies of P from the BN structure G
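The active-trail definition turns directly into a small d-separation checker. The sketch below is not from the slides: the path-enumeration approach, the function names, and the use of the Flu/Allergy/Sinus/Headache/Nose network are my own illustrative choices, and it is only meant for tiny graphs since it enumerates every undirected path.

```python
# Minimal d-separation check via the active-trail rules (illustrative sketch).
# The network is the Flu/Allergy/Sinus/Headache/Nose example, given as child -> parents.
parents = {
    "Flu": [], "Allergy": [],
    "Sinus": ["Flu", "Allergy"],
    "Headache": ["Sinus"], "Nose": ["Sinus"],
}

def children(node):
    return [c for c, ps in parents.items() if node in ps]

def descendants(node):
    out, stack = set(), [node]
    while stack:
        for c in children(stack.pop()):
            if c not in out:
                out.add(c)
                stack.append(c)
    return out

def undirected_paths(src, dst, path=None):
    """Enumerate simple paths between src and dst, ignoring edge directions."""
    path = path or [src]
    if src == dst:
        yield path
        return
    neighbors = set(parents[src]) | set(children(src))
    for nxt in neighbors - set(path):
        yield from undirected_paths(nxt, dst, path + [nxt])

def is_active(path, observed):
    """Apply the four triplet rules from the 'Active trails formalized' slide."""
    for a, b, c in zip(path, path[1:], path[2:]):
        collider = (b in children(a)) and (b in children(c))  # a -> b <- c
        if collider:
            # v-structure: active only if b or one of its descendants is observed
            if not ({b} | descendants(b)) & observed:
                return False
        else:
            # chain or common cause: blocked if the middle node is observed
            if b in observed:
                return False
    return True

def d_separated(x, y, observed):
    """X and Y are independent given 'observed' if no undirected path is active."""
    return not any(is_active(p, set(observed)) for p in undirected_paths(x, y))

# Explaining away: Flu and Allergy are independent a priori...
print(d_separated("Flu", "Allergy", []))            # True
# ...but become dependent once a descendant of Sinus is observed.
print(d_separated("Flu", "Allergy", ["Headache"]))  # False
```

On this network the sketch reproduces explaining away, as shown by the two prints. A practical implementation would use the Bayes-ball reachability algorithm rather than enumerating paths, which is exponential in general.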
Learning Bayes nets
- Four settings: known vs. unknown structure × fully observable vs. missing data
- Data x(1), ..., x(m) → learn the structure and the parameters (the CPTs P(Xi | Pa_Xi))

Learning the CPTs
- Data x(1), ..., x(m)
- For each discrete variable Xi, estimate P(Xi | Pa_Xi) from counts in the data (see the counting sketch at the end of this section)

What you need to know
- Bayesian networks
  - Semantics of a BN: conditional independence assumptions
  - Representation: a compact representation for large probability distributions (not an algorithm)
  - Variables, graph, CPTs
- Why BNs are useful
- Learning CPTs from fully observable data
- Play with the applet

General probabilistic inference
- Query: P(X | e)
- Using Bayes rule: P(X | e) = P(X, e) / P(e)
- Normalization: compute P(X, e) for each value of X, then renormalize
- (Flu/Allergy/Sinus/Headache/Nose network)

Marginalization
- Sum out the unobserved variables; e.g., with evidence Nose = t, computing P(Flu, Nose = t) requires summing over Sinus (and the other unobserved variables)

Probabilistic inference example
- (Flu/Allergy/Sinus/Headache/Nose network, evidence Nose = t)
- Inference seems exponential in the number of variables
- Actually, inference in graphical models is NP-hard

Fast probabilistic inference example: Variable elimination
- (same network, evidence Nose = t)
- Potential for exponential reduction in computation

Understanding variable elimination: Exploiting distributivity
- For the chain Flu → Sinus → Nose with evidence Nose = t, push the sum inside the product: P(F, N = t) = P(F) Σ_S P(S | F) P(N = t | S)

Understanding variable elimination: Order can make a HUGE difference
- (Flu/Allergy/Sinus/Headache/Nose network, evidence Nose = t)

Understanding variable elimination: Another example
- (network over Pharmacy, Sinus, Headache, Nose; evidence Nose = t)

Variable elimination algorithm
- Given a BN and a query P(X | e) ∝ P(X, e)   [IMPORTANT]
- Instantiate the evidence e
- Choose an ordering on the variables, e.g., X1, ..., Xn
- For i = 1 to n, if Xi ∉ {X, e}:
  - Collect the factors f1, ..., fk that include Xi
  - Generate a new factor by eliminating (summing out) Xi from these factors
  - Variable Xi has been eliminated
- Normalize P(X, e) to obtain P(X | e)
- (a code sketch of this procedure appears at the end of this section)

Complexity of variable elimination: Poly-tree graphs
- Variable elimination order: start from the leaves up, i.e., find a topological order and eliminate variables in reverse order
- Linear in the number of variables, versus exponential

Complexity of variable elimination: Graphs with loops
- Exponential in the number of variables in the largest factor generated

Complexity of variable elimination: Tree-width
- Moralize the graph: connect the parents of each node into a clique and remove edge directions
- Complexity of VE: only exponential in the tree-width (roughly, the maximum node cut) of the moralized graph

Example
- Large tree-width with a small number of parents
- Compact representation ≠ easy inference

Choosing an elimination order
- Choosing the best order is NP-complete (reduction from MAX-Clique)
- Many good heuristics exist, some with guarantees
- Ultimately, we can't beat the NP-hardness of inference: even the optimal order can lead to exponential variable elimination computation
- In practice, variable elimination is often very effective; many, many, many approximate inference approaches are available when variable elimination is too expensive

Most likely explanation (MLE)
- Query over Flu, Allergy, Sinus, Headache, ...
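The variable elimination algorithm above can be implemented compactly once factors are represented explicitly. The following is a minimal sketch, not code from the course: the factor representation, the helper names (make_factor, multiply, sum_out, restrict), and the CPT numbers for the Flu/Allergy/Sinus/Headache/Nose network are illustrative assumptions. It answers P(Flu | Nose = true) by instantiating the evidence, eliminating the remaining variables in a fixed order, and normalizing.

```python
from itertools import product

# A factor is (variables, table): table maps a tuple of values (aligned with
# `variables`) to a number. All variables here are binary: True/False.

def make_factor(variables, fn):
    """Tabulate fn over all assignments to `variables`."""
    table = {vals: fn(dict(zip(variables, vals)))
             for vals in product([True, False], repeat=len(variables))}
    return (tuple(variables), table)

def multiply(f, g):
    """Pointwise product of two factors over the union of their variables."""
    fv, ft = f
    gv, gt = g
    union = tuple(dict.fromkeys(fv + gv))  # order-preserving union
    table = {}
    for vals in product([True, False], repeat=len(union)):
        a = dict(zip(union, vals))
        table[vals] = ft[tuple(a[v] for v in fv)] * gt[tuple(a[v] for v in gv)]
    return (union, table)

def sum_out(var, f):
    """Eliminate `var` from factor f by summing over its values."""
    fv, ft = f
    keep = tuple(v for v in fv if v != var)
    table = {}
    for vals, p in ft.items():
        key = tuple(val for v, val in zip(fv, vals) if v != var)
        table[key] = table.get(key, 0.0) + p
    return (keep, table)

def restrict(f, var, value):
    """Instantiate evidence var = value in factor f."""
    fv, ft = f
    if var not in fv:
        return f
    keep = tuple(v for v in fv if v != var)
    table = {tuple(val for v, val in zip(fv, vals) if v != var): p
             for vals, p in ft.items()
             if dict(zip(fv, vals))[var] == value}
    return (keep, table)

# Hypothetical CPTs for the Flu/Allergy/Sinus/Headache/Nose network (numbers made up).
factors = [
    make_factor(["F"], lambda a: 0.1 if a["F"] else 0.9),                 # P(F)
    make_factor(["A"], lambda a: 0.2 if a["A"] else 0.8),                 # P(A)
    make_factor(["S", "F", "A"],                                          # P(S | F, A)
                lambda a: (0.9 if (a["F"] or a["A"]) else 0.05) if a["S"]
                else (0.1 if (a["F"] or a["A"]) else 0.95)),
    make_factor(["H", "S"],                                               # P(H | S)
                lambda a: (0.7 if a["S"] else 0.1) if a["H"]
                else (0.3 if a["S"] else 0.9)),
    make_factor(["N", "S"],                                               # P(N | S)
                lambda a: (0.8 if a["S"] else 0.05) if a["N"]
                else (0.2 if a["S"] else 0.95)),
]

def variable_elimination(factors, query, evidence, order):
    # 1. Instantiate the evidence.
    for var, val in evidence.items():
        factors = [restrict(f, var, val) for f in factors]
    # 2. Eliminate every variable that is neither the query nor evidence.
    for var in order:
        if var == query or var in evidence:
            continue
        related = [f for f in factors if var in f[0]]
        if not related:
            continue
        rest = [f for f in factors if var not in f[0]]
        prod = related[0]
        for f in related[1:]:
            prod = multiply(prod, f)
        factors = rest + [sum_out(var, prod)]
    # 3. Multiply the remaining factors and normalize.
    result = factors[0]
    for f in factors[1:]:
        result = multiply(result, f)
    total = sum(result[1].values())
    return {vals[0]: p / total for vals, p in result[1].items()}

# P(Flu | Nose = true) under the made-up CPTs.
print(variable_elimination(factors, "F", {"N": True}, ["A", "S", "H", "N", "F"]))
```

The elimination order is passed in explicitly; as the slides stress, that order determines the size of the largest intermediate factor and therefore the cost.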
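For fully observed data, the "Learning the CPTs" step above reduces to maximum-likelihood estimation by counting: P(Xi = xi | Pa_Xi = pa) = Count(xi, pa) / Count(pa). The sketch below is illustrative only; the four-record dataset and the learn_cpts helper are made up, and the structure is the Flu/Allergy/Sinus/Headache/Nose example from the slides.

```python
from collections import Counter

# Hypothetical fully observed dataset: one dict x(j) per record (values made up).
data = [
    {"Flu": True,  "Allergy": False, "Sinus": True,  "Headache": True,  "Nose": True},
    {"Flu": False, "Allergy": True,  "Sinus": True,  "Headache": False, "Nose": True},
    {"Flu": False, "Allergy": False, "Sinus": False, "Headache": False, "Nose": False},
    {"Flu": True,  "Allergy": False, "Sinus": True,  "Headache": True,  "Nose": False},
]

# The BN structure: child -> tuple of parents (Pa_Xi).
structure = {
    "Flu": (), "Allergy": (),
    "Sinus": ("Flu", "Allergy"),
    "Headache": ("Sinus",), "Nose": ("Sinus",),
}

def learn_cpts(data, structure):
    """Maximum-likelihood CPTs from fully observed data:
    P(Xi = xi | Pa_Xi = pa) = Count(xi, pa) / Count(pa)."""
    cpts = {}
    for var, parents in structure.items():
        joint = Counter()    # counts of (parent values, child value)
        parent = Counter()   # counts of parent values alone
        for x in data:
            pa = tuple(x[p] for p in parents)
            joint[(pa, x[var])] += 1
            parent[pa] += 1
        cpts[var] = {key: n / parent[key[0]] for key, n in joint.items()}
    return cpts

cpts = learn_cpts(data, structure)
print(cpts["Headache"])   # maps (parent values, child value) -> estimated probability
```

Parent configurations that never occur in the data get no entry here; in practice one would add pseudo-counts (a prior) to smooth the estimates.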