Graphical Models

Aarti Singh (slides courtesy of Carlos Guestrin)
Machine Learning 10-701/15-781, Nov 10, 2010

Recitation: HMMs, Graphical Models (strongly recommended)
Place: NSH 1507. Note the time: 5-6 pm.

From i.i.d. to dependent data
- HMM: sequential dependence
- Graphical models: general dependence

Applications
- Character recognition, e.g. with kernel SVMs [figure: handwritten characters labeled r and c]
- Webpage classification (Sports, Science, News)
- Speech recognition, diagnosis of diseases, study of the human genome, robot mapping, modeling fMRI data, fault diagnosis, modeling sensor-network data, modeling protein-protein interactions, weather prediction, computer vision, statistical physics, and many, many more

Graphical Models: Key Idea
- Conditional independence assumptions are useful, but Naive Bayes is extreme.
- Graphical models express sets of conditional independence assumptions via graph structure.
- The graph structure plus associated parameters define a joint probability distribution over the set of variables (nodes).
- Two types of graphical models:
  - Directed graphs, aka Bayesian networks
  - Undirected graphs, aka Markov random fields

Topics in Graphical Models
- Representation: which joint probability distributions does a graphical model represent?
- Inference: how to answer questions about the joint probability distribution, e.g. the marginal distribution of a node variable, or the most likely assignment of node variables
- Learning: how to learn the parameters and structure of a graphical model

Conditional Independence
- X is conditionally independent of Y given Z if the probability distribution governing X is independent of the value of Y given the value of Z:
  P(X | Y, Z) = P(X | Z)
- Equivalent to P(X, Y | Z) = P(X | Z) P(Y | Z), and also to P(Y | X, Z) = P(Y | Z).

Directed Graphical Models: Bayesian Networks
- Representation: which joint probability distributions does a graphical model represent?
- Any arbitrary distribution obeys the chain rule:
  P(X1, ..., Xn) = P(X1) P(X2 | X1) ... P(Xn | X1, ..., Xn-1)
  (and more generally, under any ordering of the variables)
- This corresponds to a fully connected directed graph between X1, ..., Xn.
- The absence of edges in a graphical model conveys useful information.
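The chain-rule factorization above can be checked numerically. The toy joint distribution below is randomly generated, purely for illustration:

```python
import itertools
import random

# A randomly generated joint over three binary variables; the numbers are
# hypothetical and serve only to check the chain rule numerically.
random.seed(0)
outcomes = list(itertools.product([0, 1], repeat=3))
w = [random.random() + 0.01 for _ in outcomes]      # strictly positive weights
joint = {o: wi / sum(w) for o, wi in zip(outcomes, w)}

def marginal(keep):
    """Marginalize the joint onto the variable indices in `keep`."""
    out = {}
    for o, p in joint.items():
        key = tuple(o[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

p1, p12 = marginal([0]), marginal([0, 1])

# Chain rule: P(x1, x2, x3) = P(x1) * P(x2 | x1) * P(x3 | x1, x2)
for (x1, x2, x3), p in joint.items():
    chain = (p1[(x1,)]
             * (p12[(x1, x2)] / p1[(x1,)])          # P(x2 | x1)
             * (p / p12[(x1, x2)]))                 # P(x3 | x1, x2)
    assert abs(chain - p) < 1e-12
```

Every joint distribution factorizes this way under any variable ordering; a Bayesian network becomes compact only when conditional independencies let us drop variables from the conditioning sets.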
Bayesian Networks
- A BN is a directed acyclic graph (DAG) that provides a compact representation of the joint distribution.
- Local Markov assumption: a variable X is independent of its non-descendants given its parents (and only the parents).

Bayesian Networks: Example
Suppose we know the following:
- The flu causes sinus inflammation.
- Allergies cause sinus inflammation.
- Sinus inflammation causes a runny nose.
- Sinus inflammation causes headaches.
Causal network: Flu -> Sinus, Allergy -> Sinus, Sinus -> Headache, Sinus -> Nose.

Local Markov assumption, intuitively: if you have no sinus infection, then flu has no influence on headache (flu causes headache, but only through sinus).

Markov independence assumptions (variable: parents; non-descendants; assumption):
- Flu: no parents; non-descendant A; F ⊥ A
- Allergy: no parents; non-descendant F; A ⊥ F
- Sinus: parents {F, A}; no non-descendants other than its parents; no assumption
- Headache: parent S; non-descendants {F, A, N}; H ⊥ {F, A, N} | S
- Nose: parent S; non-descendants {F, A, H}; N ⊥ {F, A, H} | S

Joint distribution
Chain rule:
  P(F, A, S, H, N) = P(F) P(A | F) P(S | F, A) P(H | F, A, S) P(N | F, A, S, H)
Applying the Markov assumptions (A ⊥ F; H ⊥ {F, A} | S; N ⊥ {F, A, H} | S):
  P(F, A, S, H, N) = P(F) P(A) P(S | F, A) P(H | S) P(N | S)

How many parameters in a BN?
- Discrete variables X1, ..., Xn; the DAG defines the parents Pa(Xi) of each Xi.
- CPTs: conditional probability tables P(Xi | Pa(Xi)). E.g. for Xi = S, Pa(Xi) = {F, A}:

  F = f, A = f:  P(S = t) = 0.9,  P(S = f) = 0.1
  F = t, A = f:  P(S = t) = 0.8,  P(S = f) = 0.2
  F = f, A = t:  P(S = t) = 0.7,  P(S = f) = 0.3
  F = t, A = t:  P(S = t) = 0.3,  P(S = f) = 0.7

- With n variables, at most K values per variable, and at most d parents per node: O(n K × K^d) parameters.

Two trivial special cases
- Fully disconnected graph: parents = ∅, non-descendants = {X1, ..., Xi-1, Xi+1, ..., Xn}, so Xi ⊥ {X1, ..., Xi-1, Xi+1, ..., Xn}.
- Fully connected graph: parents = {X1, ..., Xi-1}, non-descendants = ∅; no independence assumptions.

Bayesian Networks: Examples
- Naive Bayes: Y -> X1, X2, X3, X4; Xi ⊥ {X1, ..., Xi-1, Xi+1, ..., Xn} | Y;
  P(X1, ..., Xn, Y) = P(Y) P(X1 | Y) ... P(Xn | Y)
- HMM: S1 -> S2 -> ... -> ST-1 -> ST, with observations O1, O2, ..., OT-1, OT (St -> Ot).

Explaining Away
- Recall the local Markov assumption: a variable X is independent of its non-descendants given its parents (only the parents).
- Marginally, F ⊥ A: P(F | A = t) = P(F).
- But F and A are not independent given S: P(F | A = t, S = t) ≠ P(F | S = t).
- P(F = t | S = t) is high, but P(F = t | A = t, S = t) is not as high, since A = t "explains away" S = t. In fact, P(F = t | A = t, S = t) ≤ P(F = t | S = t).

Independencies encoded in a BN
- We said all you need is the local Markov assumption: Xi ⊥ NonDescendants(Xi) | Pa(Xi).
- But then we talked about other (in)dependencies, e.g. explaining away.
- What are the independencies encoded by a BN? The only assumption is local Markov, but many others can be derived using the algebra of conditional independencies: d-separation.

D-separation: a is d-separated from b by c (a ⊥ b | c). Three important configurations:
- Causal direction, a -> c -> b: a ⊥ b | c.
- Common cause, a <- c -> b: a ⊥ b | c.
- V-structure (explaining away), a -> c <- b: a ⊥ b, but a and b are dependent given c.

D-separation (general definition)
A, B, C are non-intersecting sets of nodes. A is d-separated from B by C (A ⊥ B | C) if all paths between nodes in A and nodes in B are blocked, i.e., every path contains a node z such that either
- the arrows meet head-to-tail (-> z ->) or tail-to-tail (<- z ->) at z, and z is in C; or
- the arrows meet head-to-head (-> z <-) at z, and neither z nor any of its descendants is in C.

D-separation example (graph: a -> e <- f, f -> b, e -> c)
- a ⊥ b | f? Yes: consider z = f (tail-to-tail, in C), or z = e (head-to-head; neither e nor its descendant c is in C).
- a ⊥ b | c? No: consider z = e (head-to-head, and its descendant c is in C, so the path is not blocked).

Representation Theorem
- F: the set of distributions that factorize according to the graph.
- I: the set of distributions that respect the conditional independencies implied by the d-separation properties of the graph.
- I ⊆ F is important because, given the independencies of P, we can get a BN structure G in which P factorizes.
- F ⊆ I is important because we can read the independencies of P from the BN structure G.

Markov Blanket
- Conditioning on its Markov blanket, node i is independent of all other nodes: the only terms of the joint that remain are the ones which involve i.
- The Markov blanket of node i is the set of its parents, children, and co-parents (the other parents of its children).

Undirected Graphical Models: Markov Random Fields
- Popular in the statistical physics and computer vision communities.
- Example: image denoising. xi = true value at pixel i; yi = observed noisy value.
- Conditional independence properties: there are no directed edges.
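The BN factorization and the explaining-away effect on the Flu/Allergy network can be checked by brute-force enumeration. The network structure and the CPT for P(S | F, A) are from the slides; the priors for F and A and the CPTs for H and N are hypothetical numbers chosen only to make the example concrete:

```python
import itertools

p_f, p_a = 0.1, 0.2                                # hypothetical priors
p_s = {(False, False): 0.9, (True, False): 0.8,    # P(S=t | F, A): slide CPT
       (False, True): 0.7, (True, True): 0.3}
p_h = {True: 0.8, False: 0.1}                      # hypothetical P(H=t | S)
p_n = {True: 0.7, False: 0.05}                     # hypothetical P(N=t | S)

def bern(p, v):
    return p if v else 1.0 - p

def joint(f, a, s, h, n):
    # P(F,A,S,H,N) = P(F) P(A) P(S|F,A) P(H|S) P(N|S):
    # 1 + 1 + 4 + 2 + 2 = 10 parameters vs 2^5 - 1 = 31 for a full joint.
    return (bern(p_f, f) * bern(p_a, a) * bern(p_s[(f, a)], s)
            * bern(p_h[s], h) * bern(p_n[s], n))

configs = list(itertools.product([False, True], repeat=5))
assert abs(sum(joint(*c) for c in configs) - 1.0) < 1e-9   # proper distribution

NAMES = ('F', 'A', 'S', 'H', 'N')

def p_flu_given(**ev):
    """P(F = t | evidence), by brute-force enumeration of the joint."""
    match = [c for c in configs
             if all(c[NAMES.index(k)] == v for k, v in ev.items())]
    return sum(joint(*c) for c in match if c[0]) / sum(joint(*c) for c in match)

print(round(p_flu_given(S=True), 3),
      round(p_flu_given(S=True, A=True), 3))   # 0.083 0.045
```

With these numbers, P(F=t | S=t) ≈ 0.083 while P(F=t | A=t, S=t) ≈ 0.045: observing the allergy lowers the posterior probability of flu, which is exactly the explaining-away effect.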
Conditional independence = graph separation
- A, B, C: non-intersecting sets of nodes. A ⊥ B | C if all paths between nodes in A and nodes in B are blocked, i.e., every path contains some node z in C.

Factorization
- The joint distribution factorizes according to the graph:
  P(x) = (1/Z) ∏_C ψ_C(x_C)
  where ψ_C is an arbitrary positive function of the variables x_C in clique C.
- E.g. clique x_C = {x1, x2}; maximal clique x_C = {x2, x3, x4}.
- The normalization constant Z (the partition function) is typically NP-hard to compute.

MRF Example
- Often ψ_C(x_C) = exp(-E(x_C)), where E(x_C) is the energy of the clique, e.g. lower if the variables in the clique …
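The truncated energy formulation above can be made concrete with a standard Ising-style pairwise model for image denoising; the coupling strengths beta and eta and the tiny 1-D "image" below are hypothetical illustration choices:

```python
import itertools

# Ising-style pairwise MRF for denoising a tiny 1-D "image"; the values of
# beta, eta and the observed signal y are hypothetical illustration choices.
beta, eta = 1.0, 0.5
y = [1, 1, -1, 1, 1]                  # observed noisy pixels, in {-1, +1}

def energy(x, y):
    # E(x, y) = -beta * sum_i x_i x_{i+1}   (neighboring-pixel cliques)
    #           -eta  * sum_i x_i y_i       (pixel-observation cliques)
    pair = sum(x[i] * x[i + 1] for i in range(len(x) - 1))
    data = sum(xi * yi for xi, yi in zip(x, y))
    return -beta * pair - eta * data

# Each clique potential psi_C = exp(-E_C) is an arbitrary positive function,
# so the joint is proportional to exp(-E(x, y)); normalizing would require
# summing over all 2^n configurations, which is why Z is typically intractable.
best = min(itertools.product([-1, 1], repeat=len(y)),
           key=lambda x: energy(x, y))
print(best)   # (1, 1, 1, 1, 1): smoothing flips the noisy middle pixel
```

The neighbor term rewards agreement between adjacent pixels and the data term rewards agreement with the observation; here the smoothing term wins and the lowest-energy configuration restores the flipped middle pixel.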
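Unlike d-separation in directed graphs, conditional independence in an MRF reduces to plain graph separation, which is easy to check with a breadth-first search; the 5-node graph below is hypothetical:

```python
from collections import deque

# Conditional independence in an undirected model is plain graph separation:
# A _||_ B | C iff every path from A to B passes through C.
# The 5-node graph below is hypothetical.
edges = [('x1', 'x2'), ('x2', 'x3'), ('x3', 'x4'), ('x2', 'x4'), ('x4', 'x5')]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def separated(A, B, C):
    """True iff the cut set C blocks every path between A and B."""
    seen = set(A)
    frontier = deque(a for a in A if a not in C)
    while frontier:
        u = frontier.popleft()
        for v in adj[u]:
            if v in C or v in seen:
                continue
            if v in B:
                return False        # found an unblocked path into B
            seen.add(v)
            frontier.append(v)
    return True

print(separated({'x1'}, {'x5'}, {'x4'}))   # True: x4 cuts x1 off from x5
print(separated({'x1'}, {'x5'}, {'x3'}))   # False: path x1-x2-x4-x5 avoids x3
```

This is exactly the blocking criterion on the graph-separation slide, without the head-to-head special case that directed graphs need.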