Required readings from Koller & Friedman:
- Representation: 2.1, 2.2
- Inference: 5.1, 6.1, 6.2, 6.7.1
- Optional: 2.3, 5.2, 5.3, 6.3, 6.7.2

Slide 1: Bayesian Networks: Representation (cont.) and Inference
Machine Learning 10-701/15-781
Carlos Guestrin, Carnegie Mellon University
March 22nd, 2006

Slide 2: Announcements
- One-page project proposal due now.
- We'll go over the midterm in this week's recitation.
- Homework 4 out later today, due April 5th (two weeks from today).

Slide 3: Handwriting recognition
- Character recognition, e.g., kernel SVMs.
[figure: handwritten character samples]

Slide 4: Handwriting recognition 2

Slide 5: "Car starts" BN
- 18 binary attributes.
- Inference: P(BatteryAge | Starts = f) naively sums 2^18 terms. Why so fast?
- Not impressed? The HailFinder BN has more than 3^54 = 58,149,737,003,040,059,690,390,169 terms.

Slide 6: Factored joint distribution: preview
[figure: BN over Flu, Allergy, Sinus, Headache, Nose]

Slide 7: The independence assumption
[figure: BN over Flu, Allergy, Sinus, Headache, Nose]
- Local Markov Assumption: a variable X is independent of its non-descendants given its parents.

Slide 8: Explaining away
[figure: BN over Flu, Allergy, Sinus, Headache, Nose]
- Local Markov Assumption: a variable X is independent of its non-descendants given its parents.

Slide 9: Chain rule and the joint distribution
[figure: BN over Flu, Allergy, Sinus, Headache, Nose]
- Local Markov Assumption: a variable X is independent of its non-descendants given its parents.
- For this network: P(F, A, S, H, N) = P(F) P(A) P(S | F, A) P(H | S) P(N | S).

Slide 10: Two trivial special cases
- Edgeless graph: every variable is independent of the rest, so P(X1, ..., Xn) = ∏_i P(Xi).
- Fully connected graph: no independence assumptions; the factorization is just the ordinary chain rule.

Slide 11: The Representation Theorem: joint distribution to BN
- The BN encodes independence assumptions.
- If the conditional independencies in the BN are a subset of the conditional independencies in P, we obtain the factored joint probability distribution P(X1, ..., Xn) = ∏_i P(Xi | Pa_Xi).

Slide 12: Real Bayesian network applications
- Diagnosis of lymph-node disease
- Speech recognition (Microsoft Office and Windows): http://www.research.microsoft.com/research/dtg
- Study of the human genome
- Robot mapping
- Robots to identify meteorites to study
- Modeling fMRI data
- Anomaly detection
- Fault diagnosis
- Modeling sensor-network data

Slide 13: A general Bayes net
- Set of random variables.
- Directed acyclic graph.
- CPTs.
- Encodes independence assumptions.
- Joint distribution: P(X1, ..., Xn) = ∏_i P(Xi | Pa_Xi).

Slide 14: Another example
- Variables: B (Burglar), E (Earthquake), A (Burglar alarm), N (Neighbor calls), R (Radio report).
- Both burglars and earthquakes can set off the alarm.
- If the alarm sounds, a neighbor may call.
- An earthquake may be announced on the radio.

Slide 15: Another example: building the BN
- B (Burglar), E (Earthquake), A (Burglar alarm), N (Neighbor calls), R (Radio report).

Slide 16: Independencies encoded in a BN
- We said all you need is the local Markov assumption: (Xi ⊥ NonDescendants_Xi | Pa_Xi).
- But then we talked about other (in)dependencies, e.g., explaining away.
- What are the independencies encoded by a BN? The only assumption is local Markov, but many others can be derived using the algebra of conditional independencies.

Slide 17: Understanding independencies in BNs: BNs with 3 nodes
- Indirect causal effect: X → Z → Y.
- Indirect evidential effect: X ← Z ← Y.
- Common cause: X ← Z → Y.
- Common effect: X → Z ← Y.
- Local Markov Assumption: a variable X is independent of its non-descendants given its parents.

Slide 18: Understanding independencies in BNs: some examples
[figure: BN over nodes A through K]

Slide 19: An active trail: example
[figure: BN over nodes A through H]
- When are A and H independent?

Slide 20: Active trails formalized
- A path X1 - X2 - ... - Xk is an active trail when variables O ⊆ {X1, ..., Xn} are observed if, for each consecutive triplet in the trail, one of the following holds:
  - X_{i-1} → X_i → X_{i+1}, and X_i is not observed (X_i ∉ O);
  - X_{i-1} ← X_i ← X_{i+1}, and X_i is not observed (X_i ∉ O);
  - X_{i-1} ← X_i → X_{i+1}, and X_i is not observed (X_i ∉ O);
  - X_{i-1} → X_i ← X_{i+1}, and X_i is observed (X_i ∈ O), or one of its descendants is observed.

Slide 21: Active trails and independence
- Theorem: variables Xi and Xj are independent given Z ⊆ {X1, ..., Xn} if there is no active trail between Xi and Xj when the variables in Z are observed.
[figure: BN over nodes A through K]
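The active-trail rules on slide 20 amount to a reachability test, so it may help to see them executed. Below is a minimal Python sketch (the slides provide no code; the function names d_separated and ancestors and the dict-of-parents graph encoding are illustrative choices of ours). It follows the standard "Bayes-ball" style traversal over (node, direction-of-arrival) states, where each case of the active-trail definition becomes one expansion rule.

```python
from collections import deque

def ancestors(parents, nodes):
    """Return 'nodes' together with all of their ancestors."""
    result, stack = set(), list(nodes)
    while stack:
        n = stack.pop()
        if n not in result:
            result.add(n)
            stack.extend(parents[n])
    return result

def d_separated(parents, x, y, observed):
    """True iff no active trail connects x and y given 'observed'.

    parents: dict mapping every node to the list of its parents.
    Implements the four active-trail cases from slide 20 via a
    breadth-first search over (node, direction-of-arrival) states.
    """
    children = {n: [] for n in parents}
    for n, ps in parents.items():
        for p in ps:
            children[p].append(n)
    obs = set(observed)
    # A collider X -> Z <- Y is active iff Z or a descendant of Z is
    # observed, i.e. iff Z is an ancestor of some observed node.
    collider_active = ancestors(parents, obs)

    visited, frontier = set(), deque([(x, 'up')])
    while frontier:
        node, direction = frontier.popleft()
        if (node, direction) in visited:
            continue
        visited.add((node, direction))
        if node == y:
            return False  # reached y along an active trail
        if direction == 'up' and node not in obs:
            # arrived from a child: chains <- <- and forks <- -> stay active
            for p in parents[node]:
                frontier.append((p, 'up'))
            for c in children[node]:
                frontier.append((c, 'down'))
        elif direction == 'down':
            if node not in obs:
                # arrived from a parent: chain -> -> stays active
                for c in children[node]:
                    frontier.append((c, 'down'))
            if node in collider_active:
                # collider -> node <- : active, trail turns back upward
                for p in parents[node]:
                    frontier.append((p, 'up'))
    return True
```

On the lecture's running network (Flu and Allergy pointing into Sinus, which points into Headache and Nose), this reproduces explaining away:

```python
parents = {'Flu': [], 'Allergy': [], 'Sinus': ['Flu', 'Allergy'],
           'Headache': ['Sinus'], 'Nose': ['Sinus']}
print(d_separated(parents, 'Flu', 'Allergy', []))           # True
print(d_separated(parents, 'Flu', 'Allergy', ['Sinus']))    # False: explaining away
print(d_separated(parents, 'Headache', 'Nose', ['Sinus']))  # True: common cause observed
```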
Slide 22: The BN Representation Theorem
- Joint distribution to BN: if the conditional independencies in the BN are a subset of the conditional independencies in P, then we obtain the joint probability distribution P(X1, ..., Xn) = ∏_i P(Xi | Pa_Xi). Important because: every P has at least one BN structure G.
- BN to joint distribution: if the joint probability distribution factors as ∏_i P(Xi | Pa_Xi), then the conditional independencies in the BN are a subset of the conditional independencies in P. Important because: we can read independencies of P from the BN structure G.

Slide 23: Simpler BNs
- A distribution can be represented by many BNs.
- A simpler BN requires fewer parameters.

Slide 24: Learning Bayes nets
- Four regimes: known vs. unknown structure, fully observable vs. missing data.
- From data x^(1), ..., x^(m), learn the structure and the parameters (the CPTs P(Xi | Pa_Xi)).

Slide 25: Learning the CPTs
- Data: x^(1), ..., x^(m).
- For each discrete variable Xi, estimate P_MLE(xi | pa_xi) = Count(xi, pa_xi) / Count(pa_xi).

Slide 26: What you need to know
- Bayesian networks: a compact representation for large probability distributions (not an algorithm).
- Semantics of a BN: conditional independence assumptions.
- Representation: variables, graph, CPTs.
- Why BNs are useful.
- Learning CPTs from fully observable data.
- Play with the applet.

Slide 27: General probabilistic inference
[figure: BN over Flu, Allergy, Sinus, Headache, Nose]
- Query: P(X | e).
- Using Bayes rule: P(X | e) = P(X, e) / P(e).
- Normalization: P(X | e) ∝ P(X, e).

Slide 28: Marginalization
- Sum out the unwanted variable, e.g., P(Sinus, Allergy = t) = Σ_Flu P(Flu, Allergy = t, Sinus).

Slide 29: Probabilistic inference example
[figure: BN over Flu, Allergy, Sinus, Headache, Nose with evidence Nose = t]

Slide 30: Inference seems exponential in the number of variables
- Inference is NP-hard; actually #P-complete.
- Reduction from 3-SAT, e.g., (X1 ∨ X2 ∨ X3) ∧ (X2 ∨ X3 ∨ X4).

Slide 31: Inference unlikely to be efficient in general, but...
- Fast probabilistic inference example: variable elimination.
[figure: BN over Flu, Allergy, Sinus, Headache, Nose with Nose = t]
- Potential for exponential reduction in computation.

Slide 32: Understanding variable elimination: exploiting distributivity
- On the chain Flu → Sinus → Nose with evidence Nose = t:
  Σ_F Σ_S P(F) P(S | F) P(N = t | S) = Σ_S P(N = t | S) Σ_F P(F) P(S | F).

Slide 33: Understanding variable elimination: order can make a HUGE difference
[figure: BN over Flu, Allergy, Sinus, Headache, Nose with Nose = t]

Slide 34: Understanding variable elimination: intermediate results
[figure: BN over Flu, Allergy, Sinus, Headache, Nose with Nose = t]
- Intermediate results are probability distributions.

Slide 35: Understanding variable elimination: another example
[figure: BN over Pharmacy, Sinus, Headache, Nose with Nose = t]

Slide 36: Pruning irrelevant variables
[figure: BN over Flu, Allergy, Sinus, Headache, Nose with Nose = t]
- Prune all non-ancestors of the query variables.

Slide 37: Variable elimination algorithm
- Given a BN and a query P(X | e) ∝ P(X, e):
  1. Instantiate the evidence e.
  2. IMPORTANT: prune non-ancestors of {X, e}.
  3. Choose an ordering on the variables, e.g., X1, ..., Xn.
  4. For i = 1 to n, if Xi ∉ {X, e}:
     - Collect the factors f1, ..., fk that include Xi.
     - Generate a new factor by eliminating (summing out) Xi from these factors.
     - Variable Xi has been eliminated.
  5. Normalize P(X, e) to obtain P(X | e).

Slide 38: Complexity of variable elimination: polytree graphs
- Variable elimination order: start from the leaves up, i.e., find a topological order and eliminate variables in reverse order.
- Linear in the number of variables (versus exponential).

Slide 39: Complexity of variable elimination: graphs with loops
- Exponential in the number of variables in the largest factor generated.

Slide 40: Complexity of variable elimination: tree-width
- Moralize the graph: connect parents into a clique and remove edge directions.
- Complexity of variable elimination: only exponential in the tree-width.
- Tree-width is maximum node cut +1.
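The counting estimate on slide 25 is concrete enough to sketch in code. The following Python sketch (the function name learn_cpt and the dict-per-row data encoding are our own illustrative choices, and the four-row dataset is invented purely for the demo) computes the maximum-likelihood CPT P_MLE(xi | pa_xi) = Count(xi, pa_xi) / Count(pa_xi) from fully observed data:

```python
from collections import Counter

def learn_cpt(data, child, parents):
    """MLE of P(child | parents) from fully observed data (slide 25):
    P(x_i | pa_i) = Count(x_i, pa_i) / Count(pa_i).

    data: list of dicts mapping variable name -> observed value.
    Returns: {parent_assignment_tuple: {child_value: probability}}.
    """
    joint, marginal = Counter(), Counter()
    for row in data:
        pa = tuple(row[p] for p in parents)
        joint[(pa, row[child])] += 1
        marginal[pa] += 1
    cpt = {}
    for (pa, val), count in joint.items():
        cpt.setdefault(pa, {})[val] = count / marginal[pa]
    return cpt

# Invented four-row dataset, just to exercise the counts:
data = [{'Flu': 't', 'Sinus': 't'}, {'Flu': 't', 'Sinus': 't'},
        {'Flu': 'f', 'Sinus': 't'}, {'Flu': 'f', 'Sinus': 'f'}]
print(learn_cpt(data, 'Sinus', ['Flu']))
# -> {('t',): {'t': 1.0}, ('f',): {'t': 0.5, 'f': 0.5}}
```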
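The variable elimination algorithm of slide 37 can likewise be sketched with factors as explicit tables. This is a minimal illustrative Python implementation, not the course's reference code: a factor is a (variable list, table) pair, evidence is instantiated by dropping inconsistent rows, and each elimination step multiplies the factors that mention a variable and sums that variable out. The CPT numbers in the demo are invented.

```python
def restrict(factor, evidence):
    """Drop table rows inconsistent with the evidence assignment."""
    vars_, table = factor
    kept = {a: p for a, p in table.items()
            if all(a[vars_.index(v)] == val
                   for v, val in evidence.items() if v in vars_)}
    return (vars_, kept)

def multiply(f1, f2):
    """Pointwise product of two factors on the union of their variables."""
    v1, t1 = f1
    v2, t2 = f2
    extra = [v for v in v2 if v not in v1]
    table = {}
    for a1, p1 in t1.items():
        for a2, p2 in t2.items():
            # rows must agree on the shared variables
            if all(a1[v1.index(v)] == a2[v2.index(v)] for v in v2 if v in v1):
                key = a1 + tuple(a2[v2.index(v)] for v in extra)
                table[key] = p1 * p2
    return (v1 + extra, table)

def sum_out(factor, var):
    """Eliminate var by summing the table over its values."""
    vars_, table = factor
    i = vars_.index(var)
    new = {}
    for a, p in table.items():
        key = a[:i] + a[i+1:]
        new[key] = new.get(key, 0.0) + p
    return (vars_[:i] + vars_[i+1:], new)

def variable_elimination(factors, query, evidence, order):
    """P(query | evidence) by eliminating variables in 'order' (slide 37)."""
    factors = [restrict(f, evidence) for f in factors]  # instantiate e
    for var in order:
        if var == query or var in evidence:
            continue
        relevant = [f for f in factors if var in f[0]]
        if not relevant:
            continue
        product = relevant[0]
        for f in relevant[1:]:
            product = multiply(product, f)
        factors = [f for f in factors if var not in f[0]]
        factors.append(sum_out(product, var))
    result = factors[0]
    for f in factors[1:]:
        result = multiply(result, f)
    # evidence variables have one consistent value left; collapse them
    for v in [v for v in result[0] if v != query]:
        result = sum_out(result, v)
    z = sum(result[1].values())
    return {a[0]: p / z for a, p in result[1].items()}
```

On a Flu → Sinus → Nose chain with invented CPTs, querying P(Flu | Nose = t):

```python
P_F = (['Flu'], {('t',): 0.1, ('f',): 0.9})
P_S = (['Sinus', 'Flu'], {('t', 't'): 0.8, ('f', 't'): 0.2,
                          ('t', 'f'): 0.2, ('f', 'f'): 0.8})
P_N = (['Nose', 'Sinus'], {('t', 't'): 0.7, ('f', 't'): 0.3,
                           ('t', 'f'): 0.1, ('f', 'f'): 0.9})
print(variable_elimination([P_F, P_S, P_N], 'Flu', {'Nose': 't'},
                           ['Sinus', 'Nose', 'Flu']))
# -> roughly {'t': 0.227, 'f': 0.773}
```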