Machine Learning 10-701/15-781, Spring 2008

Graphical Models

Eric Xing
Lecture 18, March 26, 2008
Reading: Chap. 8, C. Bishop's book

[Figure: the "Asia" Bayesian network used as the running example --- X1: Visit to Asia, X2: Smoking, X3: Tuberculosis, X4: Lung Cancer, X5: Bronchitis, X6: Tuberculosis or Cancer, X7: X-Ray Result, X8: Dyspnea]

What is a graphical model? --- an example from medical diagnostics

- A possible world for a patient with a lung problem:

[Figure: the Asia network, one node per patient variable]

Recap of Basic Prob. Concepts

- Representation: what is the joint probability distribution P(X1, X2, X3, X4, X5, X6, X7, X8) over multiple variables?
  - How many state configurations in total? --- 2^8
  - Do they all need to be represented explicitly?
  - Do we get any scientific/medical insight?
- Learning: where do we get all these probabilities?
  - Maximum-likelihood estimation? But how much data do we need?
  - Where do we put domain knowledge, in terms of plausible relationships between variables and plausible values of the probabilities?
- Inference: if not all variables are observable, how do we compute the conditional distribution of latent variables given evidence?

Dependencies among variables

[Figure: the Asia network with nodes grouped into Patient Information (X1, X2), Medical Difficulties (X3-X6), and Diagnostic Tests (X7, X8)]

Probabilistic Graphical Models

- Represent dependency structure with a graph
  - Node <-> random variable
  - Edges encode dependencies
  - Absence of an edge -> conditional independence
  - Directed and undirected versions
- Why is this useful?
  - A language for communication
  - A language for computation
  - A language for development
- Origins:
  - Wright, 1920's
  - Independently developed by Spiegelhalter and Lauritzen in statistics and by Pearl in computer science in the late 1980's

Probabilistic Graphical Models, con'd

- If the Xi's are conditionally independent (as described by a PGM), the joint can be factored into a product of simpler terms, e.g.,

  P(X1, X2, X3, X4, X5, X6, X7, X8)
    = P(X1) P(X2) P(X3|X1) P(X4|X2) P(X5|X2) P(X6|X3,X4) P(X7|X6) P(X8|X5,X6)

- Why might we favor a PGM?
  - Representation cost: how many probability statements are needed? Here 2+2+4+4+4+8+4+8 = 36, an 8-fold reduction from 2^8 = 256! (See the sketch below.)
  - Algorithms for systematic and efficient inference/learning computation
    - Exploring the graph structure and the probabilistic (e.g., Bayesian, Markovian) semantics
  - Incorporation of domain knowledge and causal (logical) structures
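To make the representation-cost arithmetic concrete, here is a minimal Python sketch (not from the lecture) that counts the table entries needed by the factored versus the unfactored representation of the Asia network; the `parents` map simply transcribes the factorization above.

```python
# Table-size accounting for the Asia network (all variables binary).
# Factored cost: each node needs a table of size 2^(#parents) * 2;
# unfactored cost: one table over all 2^8 joint configurations.

parents = {
    "X1": [],            # Visit to Asia
    "X2": [],            # Smoking
    "X3": ["X1"],        # Tuberculosis
    "X4": ["X2"],        # Lung Cancer
    "X5": ["X2"],        # Bronchitis
    "X6": ["X3", "X4"],  # Tuberculosis or Cancer
    "X7": ["X6"],        # X-Ray Result
    "X8": ["X5", "X6"],  # Dyspnea
}

factored = sum(2 ** (len(pa) + 1) for pa in parents.values())
full = 2 ** len(parents)

print(f"factored:   {factored} entries")  # 2+2+4+4+4+8+4+8 = 36
print(f"full joint: {full} entries")      # 2^8 = 256
```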
Two types of GMs

- Directed edges give causality relationships (Bayesian Network or Directed Graphical Model)
- Undirected edges simply give (physical or symmetric) correlations between variables (Markov Random Field or Undirected Graphical Model)

Bayesian Network: Factorization Theorem

- Theorem: Given a DAG, the most general form of the probability distribution that is consistent with the graph factors according to "node given its parents":

  P(X) = ∏_{i=1}^{d} P(X_i | X_{π_i})

  where X_{π_i} is the set of parents of X_i, and d is the number of nodes (variables) in the graph.

- For the Asia network this gives exactly the factorization seen earlier:
  P(X1, ..., X8) = P(X1) P(X2) P(X3|X1) P(X4|X2) P(X5|X2) P(X6|X3,X4) P(X7|X6) P(X8|X5,X6)

Bayesian Network: Conditional Independence Semantics

- Structure: DAG
- Meaning: a node is conditionally independent of every other node in the network outside its Markov blanket
- The local conditional distributions (CPDs) and the DAG completely determine the joint distribution
- Gives causality relationships, and facilitates a generative process

[Figure: a node X with parents Y1 and Y2, illustrating the ancestor, parent, child, children's co-parent, and descendent relations]

Local Structures & Independencies

- Common parent (A <- B -> C)
  - Fixing B decouples A and C:
    "given the level of gene B, the levels of A and C are independent"
- Cascade (A -> B -> C)
  - Knowing B decouples A and C:
    "given the level of gene B, the level of gene A provides no extra prediction value for the level of gene C"
- V-structure (A -> C <- B)
  - Knowing C couples A and B, because A can "explain away" B w.r.t. C:
    "if A correlates to C, then the chance for B to also correlate to C will decrease"
- The language is compact, the concepts are rich!

A simple justification

[Figure: the three-node cascade A -> B -> C]

Graph separation criterion

- D-separation criterion for Bayesian networks (D for Directed edges):
  Definition: variables x and y are D-separated (conditionally independent) given z if they are separated in the moralized ancestral graph
- Example: [figure omitted from this extract]

Global Markov properties of DAGs

- X is d-separated (directed-separated) from Z given Y if we can't send a ball from any node in X to any node in Z using the "Bayes-ball" algorithm (plus some boundary conditions); a checker based on the equivalent moralization criterion is sketched at the end of these notes
- Defn: I(G) = all independence properties that correspond to d-separation:

  I(G) = { X ⊥ Z | Y : dsep_G(X; Z | Y) }

- D-separation is sound and complete

Example:

- Complete the I(G) of this graph:

[Figure: a four-node DAG over x1, x2, x3, x4]

Towards quantitative specification of probability distributions

- Separation properties in the graph imply independence properties about the associated variables
- For the graph to be useful, any conditional independence properties we can derive from the graph should hold for the probability distribution that the graph represents
- The Equivalence Theorem:
  For a graph G,
  let D1 denote the family of all distributions that satisfy I(G),
  let D2 denote the family of all distributions that factor according to G.
  Then D1 ≡ D2.

Example

[Figure: a three-node graph over A, B, C]

p(A, B, C) =

Example, con'd

- Evolution: a tree model
  [Figure: a phylogenetic tree --- an ancestor sequence evolving over T years through intermediate states Qh and Qm into observed descendant sequences such as AG, AC]

Example, con'd

- Speech recognition: a Hidden Markov Model
  [Figure: an HMM --- a Markov chain of states X1 -> X2 -> X3 -> ... -> XT, each emitting an observation Y1, Y2, Y3, ..., YT]

Example, con'd

- Genetic pedigree
  [Figure: a pedigree network relating grandparent, parent (F, M), and child (C) allele variables (A0, A1, B0, B1, F0, F1, M0, M1, C0, C1) and the corresponding genotype/phenotype variables (Ag, Bg, Fg, Cg, Sg)]

Conditional probability tables (CPTs)

[Figure: the four-node network A -> C <- B, C -> D]

P(a, b, c, d) = P(a) P(b) P(c|a,b) P(d|c)

  P(a):  a0 = 0.75, a1 = 0.25
  P(b):  b0 = 0.33, b1 = 0.67

  P(c|a,b):       c0     c1
    a0, b0       0.45   0.55
    a0, b1       1      0
    a1, b0       0.9    0.1
    a1, b1       0.7    0.3

  P(d|c):         d0     d1
    c0           0.3    0.7
    c1           0.5    0.5

Example, con'd

[Figure: the same A, B, C, D network]

P(a, b, c, d) =
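As an answer sketch to this last exercise, here is a short Python fragment (mine, not part of the slides) that assembles P(a,b,c,d) = P(a) P(b) P(c|a,b) P(d|c) from the CPTs above, checks that the factored joint normalizes, and does a brute-force enumeration query; values are coded 0/1 as in the superscripts.

```python
from itertools import product

# CPTs transcribed from the tables above; pa[v] = P(A = v), etc.
pa = {0: 0.75, 1: 0.25}
pb = {0: 0.33, 1: 0.67}
pc = {  # pc[(a, b)][c] = P(C = c | A = a, B = b)
    (0, 0): {0: 0.45, 1: 0.55},
    (0, 1): {0: 1.0,  1: 0.0},
    (1, 0): {0: 0.9,  1: 0.1},
    (1, 1): {0: 0.7,  1: 0.3},
}
pd = {  # pd[c][d] = P(D = d | C = c)
    0: {0: 0.3, 1: 0.7},
    1: {0: 0.5, 1: 0.5},
}

def joint(a, b, c, d):
    """P(a, b, c, d) = P(a) P(b) P(c|a,b) P(d|c) -- the factored form."""
    return pa[a] * pb[b] * pc[(a, b)][c] * pd[c][d]

# The factored joint must sum to 1 over all 16 configurations.
total = sum(joint(a, b, c, d) for a, b, c, d in product([0, 1], repeat=4))
print(f"sum over all configs: {total:.6f}")  # -> 1.000000

# Inference by brute-force enumeration, e.g. the marginal P(D = 1):
p_d1 = sum(joint(a, b, c, 1) for a, b, c in product([0, 1], repeat=3))
print(f"P(D=1) = {p_d1:.4f}")
```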
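Finally, the d-separation checker promised above: a hedged Python sketch of the moralized-ancestral-graph test from the "Graph separation criterion" slide (the implementation details are mine, not the lecture's), applied to the A -> C <- B, C -> D network. It reproduces the V-structure behavior: A and B are marginally independent, but conditioning on C (or on its descendant D) couples them.

```python
from itertools import combinations

def d_separated(dag, xs, zs, ys):
    """Test whether xs is d-separated from zs given ys, via the
    moralized ancestral graph criterion:
      1. restrict the DAG to ancestors of xs, zs, ys (inclusive);
      2. moralize: marry co-parents, then drop edge directions;
      3. check separation: no path from xs to zs avoiding ys.
    `dag` maps each node to the list of its parents."""
    # 1. Ancestral subgraph: the query nodes plus all their ancestors.
    relevant = set(xs) | set(zs) | set(ys)
    frontier = list(relevant)
    while frontier:
        node = frontier.pop()
        for p in dag[node]:
            if p not in relevant:
                relevant.add(p)
                frontier.append(p)
    # 2. Moralize: undirected edges = parent-child pairs + co-parent pairs.
    edges = set()
    for node in relevant:
        ps = [p for p in dag[node] if p in relevant]
        for p in ps:
            edges.add(frozenset((p, node)))
        for p, q in combinations(ps, 2):  # "marry" the co-parents
            edges.add(frozenset((p, q)))
    # 3. Reachability from xs without expanding conditioning nodes in ys.
    seen, stack = set(xs), list(xs)
    while stack:
        node = stack.pop()
        if node in ys:
            continue  # the conditioning set blocks paths through this node
        for e in edges:
            if node in e:
                (other,) = e - {node}
                if other not in seen:
                    seen.add(other)
                    stack.append(other)
    return not (seen & set(zs))

# The A -> C <- B, C -> D network from the CPT example:
dag = {"A": [], "B": [], "C": ["A", "B"], "D": ["C"]}

print(d_separated(dag, {"A"}, {"B"}, set()))  # True:  marginally independent
print(d_separated(dag, {"A"}, {"B"}, {"C"}))  # False: C explains away
print(d_separated(dag, {"A"}, {"B"}, {"D"}))  # False: D is a collider's descendant
```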