Graphical Models
Aarti Singh (slides courtesy: Carlos Guestrin)
Machine Learning 10-701/15-781, Nov 15, 2010

Directed - Bayesian Networks
- Compact representation for a joint probability distribution
- Bayes Net = Directed Acyclic Graph (DAG) + Conditional Probability Tables (CPTs)
- Distribution factorizes according to the graph <=> distribution satisfies the local Markov independence assumptions: x_k is independent of its non-descendants given its parents pa_k

Directed - Bayesian Networks
- The graph encodes local independence assumptions (local Markov assumptions)
- Other independence assumptions can be read off the graph using d-separation
- Distribution factorizes according to the graph <=> distribution satisfies all independence assumptions found by d-separation (F = I)
- Does the graph capture all independencies? Yes, for almost all distributions that factorize according to the graph. More in 10-708.

D-separation
- a is d-separated from b by c  =>  a ⊥ b | c
- Three important configurations:
  - Causal direction: a -> c -> b
  - Common cause: a <- c -> b
  - V-structure (explaining away): a -> c <- b

Undirected - Markov Random Fields
- Popular in statistical physics, computer vision, sensor networks, social networks, protein-protein interaction networks
- Example - image denoising: x_i = true value at pixel i, y_i = observed noisy value

Conditional independence properties
- No directed edges
- Conditional independence <=> graph separation
- A, B, C: non-intersecting sets of nodes
- A ⊥ B | C if all paths between nodes in A and B are blocked, i.e., every such path contains a node z in C

Factorization
- The joint distribution factorizes according to the graph:
  P(x) = (1/Z) Π_C ψ_C(x_C)
  where ψ_C is an arbitrary positive function over clique C, and Z = Σ_x Π_C ψ_C(x_C)
- Clique: e.g. x_C = {x_1, x_2}; maximal clique: e.g. x_C = {x_2, x_3, x_4}
- The normalization constant Z is typically NP-hard to compute

MRF Example
- Often ψ_C(x_C) = exp(-E(x_C)), where E(x_C) is the energy of the clique - e.g., lower if the variables in the clique take similar values

MRF Example - Ising model
- Cliques are edges: x_C = {x_i, x_j}, with binary variables x_i ∈ {-1, +1}
- ψ_ij(x_i, x_j) = exp(x_i x_j): exponent +1 if x_i = x_j, -1 if x_i ≠ x_j
- The probability of an assignment is higher if neighbors x_i and x_j are the same
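To make the edge potentials concrete, here is a minimal sketch of the Ising model, assuming a hypothetical 2x3 pixel grid and unit coupling; only the potential ψ_ij(x_i, x_j) = exp(x_i x_j) comes from the slide. It also computes Z by brute force, which is feasible only because the model is tiny - the "typically NP-hard" point above.

```python
import itertools
from math import exp

# Hypothetical 2x3 grid of spins x_i in {-1, +1}; nodes 0-2 top row, 3-5 bottom.
edges = [(0, 1), (1, 2), (3, 4), (4, 5),   # horizontal neighbors
         (0, 3), (1, 4), (2, 5)]           # vertical neighbors
n = 6

def unnormalized(x):
    # Product over edge cliques of exp(x_i * x_j): larger when neighbors agree.
    return exp(sum(x[i] * x[j] for i, j in edges))

# Partition function by enumerating all 2^n assignments -- feasible only for
# tiny models; in general computing Z is NP-hard.
Z = sum(unnormalized(x) for x in itertools.product([-1, +1], repeat=n))

print(unnormalized((+1,) * 6) / Z)                 # all neighbors agree: high
print(unnormalized((+1, -1, +1, -1, +1, -1)) / Z)  # many disagreements: low
```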
Hammersley-Clifford Theorem
- F = set of distributions that factorize according to the graph
- I = set of distributions that respect the conditional independencies implied by graph separation
- I ⊆ F - important because: given the independencies of P, we can get an MRF structure G
- F ⊆ I - important because: we can read the independencies of P from the MRF structure G

What you should know
- Graphical models: directed Bayesian networks, undirected Markov random fields
- A compact representation for large probability distributions - not an algorithm
- Representation of a BN/MRF: variables, graph, CPTs (or potentials)
- Why BNs and MRFs are useful: d-separation (conditional independence) + factorization

Topics in Graphical Models
- Representation: which joint probability distributions does a graphical model represent?
- Inference: how to answer questions about the joint probability distribution
  - Marginal distribution of a node variable
  - Most likely assignment of node variables
- Learning: how to learn the parameters and structure of a graphical model

Inference - possible queries
(Network: Flu -> Sinus <- Allergy; Sinus -> Headache, Sinus -> Nose)
1. Marginal distribution, e.g. P(S); posterior distribution, e.g. P(F | H = 1)
2. Most likely assignment of nodes: arg max_{f,a,s,n} P(F = f, A = a, S = s, N = n | H = 1)

Inference - possible queries
- P(F | H = 1) = P(F, H = 1) / P(H = 1), where P(H = 1) = Σ_f P(F = f, H = 1)
- We will focus on computing P(F, H = 1); the posterior then follows with only a constant factor more effort

Marginalization
- Need to marginalize over the other variables:
  P(S) = Σ_{f,a,n,h} P(f, a, S, n, h)
  P(F, H = 1) = Σ_{a,s,n} P(F, a, s, n, H = 1)   <- 2^3 terms
- To marginalize out n binary variables, we need to sum over 2^n terms
- Inference seems exponential in the number of variables; in fact, inference in graphical models is NP-hard

Bayesian Networks Example
- 18 binary attributes
- Inference: P(BatteryAge | Starts = f) - need to sum over 2^16 terms. Not impressed?
- HailFinder BN: more than 3^54 = 58,149,737,003,040,059,690,390,169 terms

Fast Probabilistic Inference
P(F, H = 1) = Σ_{a,s,n} P(F, a, s, n, H = 1)
            = Σ_{a,s,n} P(F) P(a) P(s | F, a) P(n | s) P(H = 1 | s)
            = P(F) Σ_a P(a) Σ_s P(s | F, a) P(H = 1 | s) Σ_n P(n | s)
- Push sums in as far as possible
- Distributive property: x_1 z + x_2 z = z (x_1 + x_2) - 2 multiplies vs 1 multiply

Fast Probabilistic Inference
P(F, H = 1) = Σ_{a,s,n} P(F) P(a) P(s | F, a) P(n | s) P(H = 1 | s)      <- 8 values x 4 multiplies
            = P(F) Σ_a P(a) Σ_s P(s | F, a) P(H = 1 | s) Σ_n P(n | s),   with Σ_n P(n | s) = 1
            = P(F) Σ_a P(a) Σ_s P(s | F, a) P(H = 1 | s)                 <- 4 values x 1 multiply
            = P(F) Σ_a P(a) g_1(F, a)                                    <- 2 values x 1 multiply
            = P(F) g_2(F)                                                <- 1 multiply
- 32 multiplies vs 7 multiplies: 2^n vs n 2^k, where k = scope of the largest factor
- Potential for exponential reduction in computation

Fast Probabilistic Inference - Variable Elimination
P(F, H = 1) = P(F) Σ_a P(a) Σ_s P(s | F, a) P(H = 1 | s) Σ_n P(n | s),   with Σ_n P(n | s) = 1
- Eliminating s gives Σ_s P(s | F, a) P(H = 1 | s) = P(H = 1 | F, a); eliminating a then gives P(H = 1 | F)
- Potential for exponential reduction in computation

Variable Elimination - order can make a HUGE difference
- Good order (sum out n innermost): P(F, H = 1) = P(F) Σ_a P(a) Σ_s P(s | F, a) P(H = 1 | s) Σ_n P(n | s) - largest factor has scope 2
- Bad order (sum out s before n): Σ_s P(s | F, a) P(n | s) P(H = 1 | s) creates g(F, a, n) - largest factor has scope 3

Variable Elimination - order can make a HUGE difference
- Star graph: Y connected to X_1, X_2, ..., X_n
- Eliminating the X_i first creates only factors g(Y) - scope of the largest factor is 1
- Eliminating Y first creates a factor g(X_1, ..., X_n) - scope of the largest factor is n

Variable Elimination Algorithm
- Given a BN (DAG and CPTs): initial factors p(x_i | pa_i) for i = 1, ..., n
- Given a query P(X | e), where X is a set of query variables. IMPORTANT: instantiate the evidence e, e.g., set H = 1
- Choose an ordering of the variables, e.g., X_1, ..., X_n
- For i = 1 to n: if X_i ∉ {X, e}:
  - Collect the factors g_1, ..., g_k that include X_i
  - Generate a new factor g by eliminating (summing out) X_i from these factors - variable X_i has been eliminated
  - Remove g_1, ..., g_k from the set of factors, and add g
- Normalize P(X, e) to obtain P(X | e)

Complexity for polytree graphs
- Variable elimination order: consider the undirected version, start from the leaves up to find a topological order, and eliminate variables in reverse order
- This does not create any factors bigger than the original CPTs
- For polytrees, inference is linear in the number of variables (vs exponential in general!)

Complexity for graphs with loops
- Loop = undirected cycle
- Linear in …
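The elimination loop above is short enough to sketch in full. Below is a minimal Python sketch of sum-product variable elimination on the Flu/Allergy/Sinus/Headache/Nose network; all CPT numbers are invented for illustration, and only the factorization P(F, A, S, N, H) = P(F) P(A) P(S | F, A) P(N | S) P(H | S) comes from the slides.

```python
import itertools
from functools import reduce

# A factor is (vars, table): `table` maps each tuple of 0/1 values (one per
# variable in `vars`) to a nonnegative number.

def make_factor(vars_, fn):
    return (vars_, {vals: fn(dict(zip(vars_, vals)))
                    for vals in itertools.product([0, 1], repeat=len(vars_))})

def multiply(f, g):
    (fv, ft), (gv, gt) = f, g
    vars_ = fv + tuple(v for v in gv if v not in fv)
    return make_factor(vars_, lambda a: ft[tuple(a[v] for v in fv)] *
                                        gt[tuple(a[v] for v in gv)])

def sum_out(var, f):
    # Eliminate `var` by summing it out of factor f.
    fv, ft = f
    table = {}
    for vals, p in ft.items():
        key = tuple(v for u, v in zip(fv, vals) if u != var)
        table[key] = table.get(key, 0.0) + p
    return (tuple(v for v in fv if v != var), table)

def restrict(f, var, value):
    # Instantiate evidence: keep rows where var == value, then drop var.
    fv, ft = f
    return (tuple(v for v in fv if v != var),
            {tuple(v for u, v in zip(fv, vals) if u != var): p
             for vals, p in ft.items() if vals[fv.index(var)] == value})

# Hypothetical CPTs (all probabilities made up for this sketch).
s1 = {(0, 0): 0.05, (0, 1): 0.50, (1, 0): 0.70, (1, 1): 0.90}  # P(S=1 | F, A)
pF = make_factor(('F',), lambda a: 0.10 if a['F'] else 0.90)
pA = make_factor(('A',), lambda a: 0.20 if a['A'] else 0.80)
pS = make_factor(('F', 'A', 'S'),
                 lambda a: s1[a['F'], a['A']] if a['S'] else 1 - s1[a['F'], a['A']])
pN = make_factor(('S', 'N'), lambda a: (0.8 if a['N'] else 0.2) if a['S']
                                       else (0.3 if a['N'] else 0.7))
pH = make_factor(('S', 'H'), lambda a: (0.7 if a['H'] else 0.3) if a['S']
                                       else (0.1 if a['H'] else 0.9))

# Instantiate evidence H = 1, then eliminate everything except the query F.
factors = [restrict(f, 'H', 1) if 'H' in f[0] else f
           for f in [pF, pA, pS, pN, pH]]
for var in ('N', 'A', 'S'):                      # elimination ordering
    involved = [f for f in factors if var in f[0]]
    factors  = [f for f in factors if var not in f[0]]
    factors.append(sum_out(var, reduce(multiply, involved)))

vars_, table = reduce(multiply, factors)         # factor over F: P(F, H = 1)
Z = sum(table.values())
print({f: p / Z for (f,), p in table.items()})   # posterior P(F | H = 1)
```

Note how eliminating N first reproduces the slide's observation that Σ_n P(n | s) = 1: the resulting factor over S is identically 1 and contributes nothing.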
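As a sanity check on the 2^n-vs-n 2^k claim, the same posterior can be recomputed by brute-force enumeration, exactly as on the Marginalization slide. This snippet reuses the hypothetical factors pF..pH from the sketch above; both methods agree on this five-variable network, but only variable elimination stays tractable as the network grows.

```python
# Brute-force marginalization: sum the full joint over all 2^4 assignments
# of F, A, S, N with H fixed to 1 (vs VE's handful of small-scope sums).
def joint(assign):
    return reduce(lambda acc, f: acc * f[1][tuple(assign[v] for v in f[0])],
                  [pF, pA, pS, pN, pH], 1.0)

pFH1 = [0.0, 0.0]
for f, a, s, n in itertools.product([0, 1], repeat=4):
    pFH1[f] += joint({'F': f, 'A': a, 'S': s, 'N': n, 'H': 1})
print([p / sum(pFH1) for p in pFH1])   # matches the VE posterior above
```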