CMU CS 10708 - Undirected Graphical Models

(Preview covering pages 1-2, 17-19, and 36-37 of the 37-page document.)

Clique Trees 3: Let's get BP right
Undirected Graphical Models
"Here the couples get to swing!"

Graphical Models - 10-708
Carlos Guestrin, Carnegie Mellon University
October 25th, 2006
Readings: K&F 9.1-9.4; K&F 5.1-5.6

Contents: Factor division; Introducing message passing with division; Lauritzen-Spiegelhalter algorithm (a.k.a. belief propagation); Clique tree invariant; Belief propagation and clique tree invariant; Subtree correctness; Clique trees versus VE; Clique tree summary; Announcements; Swinging Couples revisited; Potentials (or factors) in Swinging Couples; Computing probabilities in Markov networks vs. BNs; Normalization for computing probabilities; Factorization in Markov networks; Global Markov assumption in Markov networks; The BN representation theorem; Markov networks representation theorem 1; What about the other direction for Markov networks?; Markov networks representation theorem 2 (Hammersley-Clifford theorem); Representation theorem for Markov networks; Completeness of separation in Markov networks; What are the "local" independence assumptions for a Markov network?; Local independence assumptions for a Markov network; Equivalence of independencies in Markov networks; Minimal I-maps and Markov networks; How about a perfect map?; Unifying properties of BNs and MNs; What you need to know so far about Markov networks; Some common Markov networks and generalizations; Pairwise Markov networks; A very simple vision application; Logarithmic representation; Log-linear Markov network (most common representation); Structure in cliques; Factor graphs; Summary of types of Markov nets.

Slide 2: Factor division
- Let X and Y be disjoint sets of variables, and consider two factors φ1(X,Y) and φ2(Y).
- Their division is the factor ψ(X,Y) = φ1(X,Y) / φ2(Y), with the convention 0/0 = 0.

Slide 3: Introducing message passing with division
- Variable elimination (message passing with multiplication):
  message: δi→j(Sij) = Σ_{Ci∖Sij} π0i · ∏_{k∈N(i)∖{j}} δk→i
  belief: πi(Ci) = π0i · ∏_{k∈N(i)} δk→i
- Message passing with division:
  message: δi→j(Sij) = Σ_{Ci∖Sij} πi(Ci)
  belief update: πj ← πj · δi→j / μij
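The 0/0 = 0 convention in factor division can be made concrete in a few lines. Below is a minimal sketch (not from the lecture), assuming factors over binary variables are stored as numpy arrays whose axes correspond to the variables:

```python
import numpy as np

# Factor division as on the slides: psi(X,Y) = phi1(X,Y) / phi2(Y),
# with the convention 0/0 = 0.  phi1 ranges over (X, Y), phi2 over Y;
# the concrete tables below are invented for the example.
def factor_divide(phi1, phi2):
    # Broadcast phi2(Y) across the X axis and divide entry-wise;
    # 0/0 produces nan, which we then map to 0 per the convention.
    with np.errstate(divide="ignore", invalid="ignore"):
        psi = phi1 / phi2
    psi[np.isnan(psi)] = 0.0
    return psi

phi1 = np.array([[0.5, 0.0],
                 [0.2, 0.0]])   # phi1[x, y]
phi2 = np.array([0.7, 0.0])     # phi2[y]
psi = factor_divide(phi1, phi2)
print(psi)  # the y=1 column is 0 by the 0/0 = 0 convention
```

In BP this convention is safe because a separator entry is only zero when the message that produced it was zero, so the numerator is zero too.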
Slide 4: Lauritzen-Spiegelhalter algorithm (a.k.a. belief propagation)
(Running example clique tree: C1: CD, C2: SE, C3: GDS, C4: GJS.)
- Separator potentials μij, one per edge (the same in both directions); each holds the "last message" and is initialized to 1.
- Message i→j, i.e. "what does i think the separator potential should be?": δi→j(Sij) = Σ_{Ci∖Sij} πi(Ci).
- Belief update for j, pushing πj toward what i thinks about the separator: πj ← πj · δi→j / μij.
- Replace the separator potential: μij ← δi→j.

Slide 5: Clique tree invariant
- Clique tree potential: the product of the clique potentials divided by the separator potentials, π(X) = ∏i πi(Ci) / ∏ij μij(Sij).
- Clique tree invariant: P(X) = π(X).

Slide 6: Belief propagation and the clique tree invariant
- Theorem: the invariant is maintained by the BP algorithm.
- BP reparameterizes the clique potentials and separator potentials.
- At convergence, potentials and messages are marginal distributions.

Slide 7: Subtree correctness
- The message from i to j is informed if all messages into i (other than the one from j) are informed; the definition is recursive, and leaves always send informed messages.
- An informed subtree is one with all incoming messages informed.
- Theorem: the potential of a connected informed subtree T′ is the marginal over scope[T′].
- Corollary: at convergence, the clique tree is calibrated: πi = P(scope[i]) and μij = P(scope[ij]).

Slide 8: Clique trees versus VE
- Clique tree advantages: multi-query settings; incremental updates; pre-computation makes the complexity explicit.
- Clique tree disadvantages: space requirements, since no factors are "deleted"; slower for a single query; local structure in factors may be lost when they are multiplied together into the initial clique potential.

Slide 9: Clique tree summary
- Solves marginal queries for all variables at only twice the cost of a query for a single variable.
- Cliques correspond to maximal cliques in the induced graph.
- Two message-passing approaches: VE (the one that multiplies messages) and BP (the one that divides by the old message).
- Clique tree invariant: the clique tree potential is always the same; we are only reparameterizing the clique potentials.
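To illustrate the Lauritzen-Spiegelhalter updates and the clique tree invariant, here is a hypothetical sketch (not the lecture's code) on the smallest possible clique tree: two cliques over binary variables joined by a one-variable separator. The `send` helper and the clique tables are invented for the example:

```python
import numpy as np

# Two-clique tree over binary variables: C1 = {A, B} -- S = {B} -- C2 = {B, C}.
pi1 = np.array([[1.0, 2.0], [3.0, 4.0]])  # pi1[a, b], initial clique potential
pi2 = np.array([[2.0, 1.0], [1.0, 3.0]])  # pi2[b, c], initial clique potential
mu = np.ones(2)                           # separator potential mu(B), initialized to 1

def send(pi_src, sum_axis, pi_dst, mu, dst_axis):
    # Message: sum out the non-separator variable of the source clique.
    delta = pi_src.sum(axis=sum_axis)
    # Belief update for the destination: multiply by delta / mu along B.
    shape = [1, 1]
    shape[dst_axis] = 2
    pi_dst *= (delta / mu).reshape(shape)
    return delta  # becomes the new separator potential

mu = send(pi1, 0, pi2, mu, 0)   # message C1 -> C2 (sum out A; B is axis 0 of pi2)
mu = send(pi2, 1, pi1, mu, 1)   # message C2 -> C1 (sum out C; B is axis 1 of pi1)

# Invariant: pi1 * pi2 / mu still equals the original unnormalized measure.
orig = np.einsum('ab,bc->abc', [[1., 2.], [3., 4.]], [[2., 1.], [1., 3.]])
recon = np.einsum('ab,bc->abc', pi1, pi2) / mu.reshape(1, 2, 1)
assert np.allclose(recon, orig)
# After both messages the tree is calibrated: both cliques marginalize to mu.
assert np.allclose(pi1.sum(axis=0), mu) and np.allclose(pi2.sum(axis=1), mu)
```

The two assertions check exactly the claims of slides 5-7: the invariant survives every update, and after one upward and one downward pass the clique and separator potentials agree on the separator.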
- Constructing the clique tree for a BN: from an elimination order, or from a triangulated (chordal) graph.
- Running time is exponential (only) in the size of the largest clique, so we can solve exactly problems with thousands (or millions, or more) of variables, provided the cliques have tens of nodes (or fewer).

Slide 10: Announcements
- Recitation tomorrow, don't miss it! Khalid on undirected models.

Slide 11: Swinging Couples revisited
- This example has no perfect map among BNs, but an undirected model will be a perfect map.

Slide 12: Potentials (or factors) in Swinging Couples

Slide 13: Computing probabilities in Markov networks vs. BNs
- In a BN, we can compute the probability of an instantiation by multiplying CPTs.
- In a Markov network, we can only compute ratios of probabilities directly.

Slide 14: Normalization for computing probabilities
- To compute actual probabilities, we must compute the normalization constant, also called the partition function.
- Computing the partition function is hard: we must sum over all possible assignments.
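The "sum over all possible assignments" can be written out directly for a toy model. This brute-force sketch (a made-up three-variable chain, not from the slides) shows why computing the partition function is exponential in the number of variables:

```python
import itertools

# Chain A - B - C with binary variables and two pairwise potentials
# (the tables are invented for the example).
phi_ab = {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 2.0, (1, 1): 1.0}
phi_bc = {(0, 0): 3.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}

# Partition function: sum the product of potentials over all 2^3 assignments.
Z = sum(phi_ab[a, b] * phi_bc[b, c]
        for a, b, c in itertools.product([0, 1], repeat=3))

# With Z in hand, the unnormalized measure becomes an actual probability.
p_000 = phi_ab[0, 0] * phi_bc[0, 0] / Z
print(Z, p_000)
```

For n binary variables the loop has 2^n terms, which is exactly the hardness the slide refers to; the clique tree machinery above is what makes this tractable when the induced cliques are small.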
Slide 15: Factorization in Markov networks
- Given an undirected graph H over variables X = {X1, …, Xn}, a distribution P factorizes over H if there exist subsets D1 ⊆ X, …, Dm ⊆ X, each fully connected in H, and non-negative potentials (or factors) φ1(D1), …, φm(Dm), also known as clique potentials, such that
  P(X1, …, Xn) = (1/Z) ∏i φi(Di), with Z = Σ_x ∏i φi(Di).
- P is then also called a Markov random field over H, or a Gibbs distribution over H.

Slide 16: Global Markov assumption in Markov networks
- A path X1 – … – Xk is active given the observed set of variables Z if none of the Xi ∈ {X1, …, Xk} are observed (i.e., none are in Z).
- Variables X are separated from Y given Z in graph H, written sepH(X; Y | Z), if there is no active path between any X ∈ X and any Y ∈ Y given Z.
- The global Markov assumption for a Markov network H is I(H) = {(X ⊥ Y | Z) : sepH(X; Y | Z)}.

Slide 17: The BN representation theorem
- If the conditional independencies encoded in a BN G are a subset of the conditional independencies in P, then we obtain the joint probability distribution as the product of the CPTs. This is important because the independencies are sufficient to obtain the BN structure G.
- Conversely, if the joint probability distribution is the product of the CPTs of G, then the conditional independencies encoded in G hold in P.
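The separation test sepH(X; Y | Z) on the global Markov slide amounts to a reachability check once the observed nodes are deleted. A small sketch, with a hypothetical four-node graph and a `separated` helper invented for the example:

```python
from collections import deque

# Undirected graph A - B - C and A - D - C (two paths between A and C).
graph = {'A': {'B', 'D'}, 'B': {'A', 'C'}, 'C': {'B', 'D'}, 'D': {'A', 'C'}}

def separated(graph, X, Y, Z):
    # sep_H(X; Y | Z) holds iff no path from X to Y survives once the
    # observed nodes Z are removed: BFS from X, skipping nodes in Z.
    frontier = deque(x for x in X if x not in Z)
    seen = set(frontier)
    while frontier:
        node = frontier.popleft()
        if node in Y:
            return False            # an active path reached Y
        for nbr in graph[node]:
            if nbr not in Z and nbr not in seen:
                seen.add(nbr)
                frontier.append(nbr)
    return True

print(separated(graph, {'A'}, {'C'}, {'B'}))       # False: A - D - C is still active
print(separated(graph, {'A'}, {'C'}, {'B', 'D'}))  # True: both paths are blocked
```

This matches the slide's definition directly: a path is active exactly when none of its nodes are observed, so deleting Z and testing reachability decides separation.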

