Machine Learning 10-701/15-781, Spring 2008

Graphical Models II: Inference

Eric Xing

Lecture 19, March 31, 2008
Reading: Chap. 8, C.B. book

[Figure: the "Asia" network: Visit to Asia (X1), Smoking (X2), Tuberculosis (X3), Lung Cancer (X4), Bronchitis (X5), Tuberculosis or Cancer (X6), X-Ray Result (X7), Dyspnea (X8)]

Recap of Basic Prob. Concepts

- Joint probability distribution on multiple variables (chain rule):
  $P(X_1, X_2, X_3, X_4, X_5, X_6) = P(X_1) P(X_2 \mid X_1) P(X_3 \mid X_1, X_2) P(X_4 \mid X_1, X_2, X_3) P(X_5 \mid X_1, \ldots, X_4) P(X_6 \mid X_1, \ldots, X_5)$
- If the $X_i$'s are independent, i.e. $P(X_i \mid \cdot) = P(X_i)$, then
  $P(X_1, X_2, X_3, X_4, X_5, X_6) = P(X_1) P(X_2) P(X_3) P(X_4) P(X_5) P(X_6) = \prod_i P(X_i)$
- If the $X_i$'s are conditionally independent, as described by a GM, the joint can be factored into simpler products, e.g.
  $P(X_1, X_2, X_3, X_4, X_5, X_6) = P(X_1) P(X_2 \mid X_1) P(X_3 \mid X_2) P(X_4 \mid X_1) P(X_5 \mid X_4) P(X_6 \mid X_2, X_5)$
  [Figure: the directed graph over X1-X6 that encodes this factorization]

Markov Random Fields

- Structure: an undirected graph
- Meaning: a node is conditionally independent of every other node in the network given its direct neighbors
- Local contingency functions (potentials) and the cliques in the graph completely determine the joint distribution
- They give correlations between variables, but no explicit way to generate samples

Representation

- Defn: an undirected graphical model represents a distribution $P(X_1, \ldots, X_n)$ defined by an undirected graph $H$ and a set of positive potential functions $\psi_c$ associated with the cliques of $H$, s.t.
  $P(x_1, \ldots, x_n) = \frac{1}{Z} \prod_{c \in C} \psi_c(\mathbf{x}_c)$
  where $Z$ is known as the partition function:
  $Z = \sum_{x_1, \ldots, x_n} \prod_{c \in C} \psi_c(\mathbf{x}_c)$
- Also known as Markov Random Fields, or Markov networks
- The potential function can be understood as a contingency function of its arguments, assigning a "pre-probabilistic" score to their joint configuration

GMs are your old friends

- Density estimation: parametric and nonparametric methods
- Regression: linear, conditional mixture, nonparametric
- Classification: generative and discriminative approaches

An (incomplete) genealogy of graphical models

[Figure: genealogy of graphical models; picture by Zoubin Ghahramani and Sam Roweis]

Probabilistic Inference

- We now have compact representations of probability distributions: graphical models
- A GM M describes a unique probability distribution P
- How do we answer queries about P?
- We use "inference" as a name for the process of computing answers to such queries

Query 1: Likelihood

- Most of the queries one may ask involve evidence
- Evidence e is an assignment of values to a set E of variables in the domain
- Without loss of generality, $E = \{X_{k+1}, \ldots, X_n\}$
- Simplest query: compute the probability of the evidence,
  $P(e) = \sum_{x_1} \cdots \sum_{x_k} P(x_1, \ldots, x_k, e)$
- This is often referred to as computing the likelihood of e

Query 2: Conditional Probability

- Often we are interested in the conditional probability distribution of a variable given the evidence:
  $P(X \mid e) = \frac{P(X, e)}{P(e)} = \frac{P(X, e)}{\sum_x P(X = x, e)}$
- This is the a posteriori belief in X given evidence e
- We usually query a subset Y of all domain variables, $X = \{Y, Z\}$, and don't care about the remaining Z:
  $P(Y \mid e) = \sum_z P(Y, Z = z \mid e)$
- The process of summing out the "don't care" variables z is called marginalization, and the resulting $P(Y \mid e)$ is called a marginal probability

Applications of a posteriori Belief

- Prediction: what is the probability of an outcome given the starting condition?
  - here the query node is a descendent of the evidence
- Diagnosis: what is the probability of disease/fault given symptoms?
  - here the query node is an ancestor of the evidence
- Learning under partial observation: fill in the unobserved values under an EM setting (more on this later)
- The directionality of information flow between variables is not restricted by the directionality of the edges in a GM: probabilistic inference can combine evidence from all parts of the network
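To make Queries 1 and 2 concrete, here is a minimal Python sketch, not from the lecture, that computes the likelihood P(e) and the a posteriori belief P(X1 | e) by brute-force enumeration in the six-node network from the recap slide. All variables are assumed binary, and every CPT entry below is an invented illustrative number.

```python
# Minimal sketch: likelihood P(e) and a-posteriori belief P(X1 | X6=1)
# by brute-force enumeration in the six-node network from the recap,
#   P(X1..X6) = P(X1) P(X2|X1) P(X3|X2) P(X4|X1) P(X5|X4) P(X6|X2,X5).
# Assumption: binary variables; all CPT numbers are made up.
from itertools import product

p_x1 = {0: 0.6, 1: 0.4}                      # P(X1)
p_x2 = {(0, 0): 0.7, (0, 1): 0.3,            # P(X2 | X1): key = (x1, x2)
        (1, 0): 0.2, (1, 1): 0.8}
p_x3 = {(0, 0): 0.9, (0, 1): 0.1,            # P(X3 | X2): key = (x2, x3)
        (1, 0): 0.4, (1, 1): 0.6}
p_x4 = {(0, 0): 0.5, (0, 1): 0.5,            # P(X4 | X1): key = (x1, x4)
        (1, 0): 0.1, (1, 1): 0.9}
p_x5 = {(0, 0): 0.8, (0, 1): 0.2,            # P(X5 | X4): key = (x4, x5)
        (1, 0): 0.3, (1, 1): 0.7}
p_x6 = {(0, 0, 0): 0.9, (0, 0, 1): 0.1,      # P(X6 | X2, X5): key = (x2, x5, x6)
        (0, 1, 0): 0.6, (0, 1, 1): 0.4,
        (1, 0, 0): 0.5, (1, 0, 1): 0.5,
        (1, 1, 0): 0.1, (1, 1, 1): 0.9}

def joint(x1, x2, x3, x4, x5, x6):
    """Joint probability from the GM factorization."""
    return (p_x1[x1] * p_x2[(x1, x2)] * p_x3[(x2, x3)] *
            p_x4[(x1, x4)] * p_x5[(x4, x5)] * p_x6[(x2, x5, x6)])

# P(X1, e) for evidence e = {X6 = 1}: sum out the "don't care" variables.
unnormalized = {x1: sum(joint(x1, x2, x3, x4, x5, 1)
                        for x2, x3, x4, x5 in product((0, 1), repeat=4))
                for x1 in (0, 1)}
likelihood = sum(unnormalized.values())            # P(e), the Query-1 quantity
posterior = {x1: p / likelihood for x1, p in unnormalized.items()}
print(likelihood, posterior)                       # P(e) and P(X1 | X6=1)
```

The inner sum ranges over all 2^4 configurations of the don't-care variables; this exponential enumeration is exactly what the elimination algorithm later in the lecture avoids.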
Query 3: Most Probable Assignment

- In this query we want to find the most probable joint assignment (MPA) for some variables of interest
- Such reasoning is usually performed under some given evidence e, ignoring the values of the other variables Z:
  $\mathrm{MPA}(Y \mid e) = \arg\max_y P(y \mid e) = \arg\max_y \sum_z P(y, z \mid e)$
- This is the maximum a posteriori configuration of y

Applications of MPA

- Classification: find the most likely label given the evidence
- Explanation: what is the most likely scenario given the evidence?
- Cautionary note: the MPA of a variable depends on its context, i.e. the set of variables being jointly queried
- Example: MPA of X? MPA of (X, Y)? (a small worked sketch appears at the end of these notes)

  x   y   P(x, y)
  0   0   0.35
  0   1   0.05
  1   0   0.3
  1   1   0.3

Complexity of Inference

- Thm: computing $P(X = x \mid e)$ in a GM is NP-hard
- Hardness does not mean we cannot solve inference
- It implies that we cannot find a general procedure that works efficiently for arbitrary GMs
- For particular families of GMs, we can have provably efficient procedures

Approaches to inference

- Exact inference algorithms:
  - the elimination algorithm
  - the junction tree algorithms (not covered in detail here)
- Approximate inference techniques:
  - stochastic simulation / sampling methods
  - Markov chain Monte Carlo methods
  - variational algorithms (covered in advanced ML courses)

Marginalization and Elimination

- A signal transduction pathway: A → B → C → D → E. What is the likelihood that protein E is active?
- Query: P(e)
  $P(e) = \sum_d \sum_c \sum_b \sum_a P(a, b, c, d, e)$
- A naïve summation needs to enumerate over an exponential number of terms
- By the chain decomposition, we get
  $P(e) = \sum_d \sum_c \sum_b \sum_a P(a) P(b \mid a) P(c \mid b) P(d \mid c) P(e \mid d)$

Elimination on Chains

- Rearranging terms:
  $P(e) = \sum_d \sum_c \sum_b \sum_a P(a) P(b \mid a) P(c \mid b) P(d \mid c) P(e \mid d) = \sum_d \sum_c \sum_b P(c \mid b) P(d \mid c) P(e \mid d) \sum_a P(a) P(b \mid a)$
- Now we can perform the innermost summation, which eliminates A and creates an intermediate factor over B:
  $P(e) = \sum_d \sum_c \sum_b P(c \mid b) P(d \mid c) P(e \mid d)\, p(b)$, where $p(b) = \sum_a P(a) P(b \mid a)$
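The rearrangement above translates directly into code: each innermost summation produces an intermediate factor over a single variable, which is reused by the next summation. Below is a minimal sketch under the same kind of assumptions as before (binary variables, made-up probability tables); it illustrates the idea rather than reproducing the lecture's implementation.

```python
# Minimal sketch of elimination on the chain A -> B -> C -> D -> E.
# Assumption: binary variables; all probability tables are made up.
p_a = {0: 0.7, 1: 0.3}                           # P(A)
p_b_a = {(0, 0): 0.8, (0, 1): 0.2,               # P(B | A): key = (a, b)
         (1, 0): 0.4, (1, 1): 0.6}
p_c_b = {(0, 0): 0.9, (0, 1): 0.1,               # P(C | B)
         (1, 0): 0.3, (1, 1): 0.7}
p_d_c = {(0, 0): 0.6, (0, 1): 0.4,               # P(D | C)
         (1, 0): 0.2, (1, 1): 0.8}
p_e_d = {(0, 0): 0.5, (0, 1): 0.5,               # P(E | D)
         (1, 0): 0.1, (1, 1): 0.9}

def eliminate(prior, conditional):
    """Sum out the parent: returns p(child) = sum_x p(x) p(child | x)."""
    return {child: sum(prior[x] * conditional[(x, child)] for x in prior)
            for child in (0, 1)}

# Each call performs one "innermost summation", creating an intermediate
# factor over a single variable: O(n * k^2) total work for a chain of n
# nodes with k states each, versus O(k^n) for naive enumeration.
p_b = eliminate(p_a, p_b_a)      # sum_a P(a) P(b|a)
p_c = eliminate(p_b, p_c_b)      # sum_b p(b) P(c|b)
p_d = eliminate(p_c, p_d_c)
p_e = eliminate(p_d, p_e_d)
print(p_e)                       # P(E); P(E=1) answers the pathway query
```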
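Finally, the worked sketch promised earlier: the cautionary note from the "Applications of MPA" slide can be verified in a few lines of Python, using the exact probability table from that slide. The most probable value of X alone disagrees with the X-component of the most probable joint assignment (X, Y).

```python
# Sketch of the MPA cautionary note: the MPA of X and the MPA of (X, Y)
# disagree on this joint distribution (table from the lecture slide).
p_xy = {(0, 0): 0.35, (0, 1): 0.05,
        (1, 0): 0.30, (1, 1): 0.30}

# Marginal MPA: sum out Y first, then maximize over X.
p_x = {x: sum(p for (xv, _), p in p_xy.items() if xv == x) for x in (0, 1)}
mpa_x = max(p_x, key=p_x.get)              # x = 1, since P(x=1) = 0.6 > 0.4

# Joint MPA: maximize over (X, Y) directly.
mpa_xy = max(p_xy, key=p_xy.get)           # (x, y) = (0, 0), with P = 0.35

print(p_x, mpa_x, mpa_xy)                  # the two answers disagree on x
```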