CMU CS 10708 - Generalized Belief Propagation

Generalized Belief Propagation
Graphical Models - 10-708
Carlos Guestrin, Carnegie Mellon University
November 12th, 2008
Readings: K&F 10.2, 10.3

More details on loopy BP
- Numerical problem: messages < 1 get multiplied together as we go around the loops, so the numbers can go to zero.
- Fix: normalize each message to sum to one. The normalizer Z_i→j doesn't depend on X_j, so it doesn't change the answer.
- Node "beliefs" (estimates of the marginal probabilities) are the normalized product of all messages arriving at the node.
(Running example: the student network - Coherence, Difficulty, Intelligence, Grade, SAT, Letter, Job, Happy.)

Loopy BP in factor graphs
- Notation: F(i) is the set of factors whose scope includes X_i; for a factor f_j, write Scope[f_j] = Y ∪ {X_i}.
- From node i to factor j: μ_i→j(X_i) ∝ ∏_{k ∈ F(i), k ≠ j} μ_k→i(X_i)
- From factor j to node i: μ_j→i(X_i) ∝ Σ_Y f_j(Y, X_i) ∏_{X_k ∈ Y} μ_k→j(X_k)
- Node belief: b_i(X_i) ∝ ∏_{k ∈ F(i)} μ_k→i(X_i)
- Factor belief: b_j ∝ f_j · ∏_{X_k ∈ Scope[f_j]} μ_k→j(X_k)
(Example: variables A, B, C, D, E with factors over ABC, ABD, BDE, CDE.)

Loopy BP vs. clique trees: two ends of a spectrum
(Figure slide: the student network as a loopy cluster graph and as a clique tree with clusters such as C,D / D,I,G / G,S,I / G,J,S,L / H,G,J.)

Generalized cluster graph
- For a set of factors F: an undirected graph in which each node i is associated with a cluster C_i.
- Family preserving: for each factor f_j ∈ F, ∃ a node i such that Scope[f_j] ⊆ C_i.
- Each edge i-j is associated with a set of variables S_ij ⊆ C_i ∩ C_j.

Running intersection property (generalized)
- A cluster graph satisfies RIP if, whenever X ∈ C_i and X ∈ C_j, ∃ one and only one path from C_i to C_j such that X ∈ S_uv for every edge (u, v) on the path.

Examples of cluster graphs
(Figure slides: two cluster graphs satisfying RIP with different edge sets.)

Generalized BP on cluster graphs satisfying RIP
- Initialization: assign each factor φ to a cluster α(φ) with Scope[φ] ⊆ C_α(φ); initialize each cluster potential to the product of its assigned factors; initialize all messages to 1.
- While not converged, send messages: to send from cluster i to cluster j, multiply cluster i's potential by the messages from all neighbors except j, then sum out the variables not in S_ij.
- Belief of cluster i: its initial potential times all incoming messages.

Cluster graph for loopy BP
(Ordinary loopy BP corresponds to a cluster graph with one cluster per factor plus one singleton cluster per variable.)
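The message equations and the normalization trick can be illustrated with a minimal sum-product sketch in NumPy. Everything here is a hypothetical toy, not the lecture's example: the model is three binary variables with pairwise factors, and the names (`loopy_bp`, `v2f`, `f2v`) are made up. Each message is renormalized to sum to one, which prevents the underflow problem noted above without changing the beliefs.

```python
import numpy as np

# Sum-product loopy BP over pairwise factors; every message is normalized to
# sum to one so products around loops cannot underflow (an assumed toy setup).
def loopy_bp(factors, n_vars=3, iters=100):
    """factors: {(i, j): 2x2 table}; returns a normalized belief per variable."""
    v2f = {(i, e): np.ones(2) for e in factors for i in e}   # variable -> factor
    f2v = {(e, i): np.ones(2) for e in factors for i in e}   # factor -> variable
    for _ in range(iters):
        # Variable -> factor: product of the other incoming factor messages.
        for (i, e) in v2f:
            m = np.ones(2)
            for e2 in factors:
                if i in e2 and e2 != e:
                    m = m * f2v[(e2, i)]
            v2f[(i, e)] = m / m.sum()                 # normalization step
        # Factor -> variable: multiply in the other variable's message, sum it out.
        for (e, i) in f2v:
            j = e[1] if e[0] == i else e[0]
            m = factors[e] @ v2f[(j, e)] if e[0] == i else factors[e].T @ v2f[(j, e)]
            f2v[(e, i)] = m / m.sum()                 # normalization step
    beliefs = []
    for i in range(n_vars):
        b = np.ones(2)
        for e in factors:
            if i in e:
                b = b * f2v[(e, i)]
        beliefs.append(b / b.sum())                   # node belief
    return beliefs

# Three binary variables in a single loop: factors on (0,1), (1,2), (0,2).
rng = np.random.default_rng(0)
loop = {e: rng.uniform(0.5, 2.0, (2, 2)) for e in [(0, 1), (1, 2), (0, 2)]}
beliefs = loopy_bp(loop)
```

On a tree (e.g. dropping the (0, 2) factor) these beliefs coincide with the exact marginals; on the loop they are the usual loopy-BP approximation.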
What if the cluster graph doesn't satisfy RIP?

Region graphs to the rescue
- Generalized cluster graphs that don't satisfy RIP can be handled using region graphs.
- Book: K&F 10.3. Example in your homework!

Revisiting mean field
- Choice of Q: the fully factorized family Q(X) = ∏_i Q_i(X_i).
- Optimization problem: maximize the energy functional over this family.

Announcements
- Recitation tomorrow. HW5 out soon.
- Will not cover relational models this semester. Instead, recommended: Pedro Domingos' tutorial on Markov Logic (Markov logic is one example of a relational probabilistic model), November 14th from 1:00 pm to 3:30 pm in Wean 4623.

Interpretation of the energy functional
- Energy functional: F[P, Q] = E_Q[log P(X)] + H_Q(X), a lower bound on log Z.
- Exact if P = Q.
- View approximate inference as approximating the entropy term H_Q(X).

Entropy of a tree distribution
- For a tree-structured distribution, the joint factors as
  P(X) = ∏_{(i,j)∈E} P(X_i, X_j) / ∏_i P(X_i)^(d_i − 1),
  where d_i is the number of neighbors of X_i.
- So the entropy decomposes: H(P) = Σ_{(i,j)∈E} H(X_i, X_j) − Σ_i (d_i − 1) H(X_i).

Loopy BP & the Bethe approximation
- Bethe approximation of the free energy: plug the tree-entropy decomposition into the energy functional, even on loopy graphs (where it is no longer exact).
- Theorem: If loopy BP converges, the resulting beliefs b_ij and b_i are a stationary point (usually a local maximum) of the Bethe free energy!

GBP & the Kikuchi approximation
- Exact free energy: junction tree. Bethe free energy: loopy BP.
- Kikuchi approximation: the entropy approximation from a generalized cluster graph, giving a spectrum from Bethe to exact.
- Theorem: If GBP converges, the resulting cluster beliefs b_Ci are a stationary point (usually a local maximum) of the Kikuchi free energy!
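The tree-entropy decomposition underlying the Bethe approximation can be checked numerically. This sketch uses a hypothetical three-variable chain X0 - X1 - X2 (an assumption for illustration): the sum of edge entropies minus (d_i − 1)-weighted node entropies should reproduce the exact joint entropy, since on a tree the decomposition is exact.

```python
import numpy as np

def H(p):
    """Entropy (in nats) of a distribution given as an array of probabilities."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Build a tree-structured joint from arbitrary positive pairwise potentials
# (a synthetic chain, not the lecture's example).
rng = np.random.default_rng(1)
f01 = rng.uniform(0.5, 2.0, (2, 2))
f12 = rng.uniform(0.5, 2.0, (2, 2))
joint = np.einsum('ab,bc->abc', f01, f12)
joint /= joint.sum()

# Exact joint entropy.
H_exact = H(joint)

# Tree decomposition: edge entropies minus (d_i - 1)-weighted node entropies.
p01 = joint.sum(axis=2)
p12 = joint.sum(axis=0)
p1 = joint.sum(axis=(0, 2))
# Degrees in the chain: d0 = d2 = 1 (those terms vanish), d1 = 2.
H_tree = H(p01) + H(p12) - (2 - 1) * H(p1)
```

On a loopy graph the same expression is no longer the exact entropy; using it anyway is precisely the Bethe approximation described above.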
What you need to know about GBP
- A spectrum between loopy BP and junction trees: more computation, but typically better answers.
- If the cluster graph satisfies RIP, the equations are very simple; in the general setting they are slightly trickier, but not hard.
- Relates to variational methods: GBP fixed points correspond to local optima of an approximate version of the energy functional.

Parameter Learning in Markov Nets
Graphical Models - 10-708
Carlos Guestrin, Carnegie Mellon University
November 12th, 2008
Readings: K&F 10.2, 10.3

Learning parameters of a BN
- The log likelihood decomposes, so learn each CPT independently, using counts.

Log likelihood for MNs
- For P(x) = (1/Z) ∏_c φ_c(x_c) and data D = {x[1], ..., x[M]}:
  ℓ(φ : D) = Σ_m Σ_c log φ_c(x_c[m]) − M log Z.

Log likelihood doesn't decompose for MNs
- It is a convex problem, so we can find the global optimum!!
- But the term log Z couples all the potentials: it doesn't decompose!!

Derivative of the log likelihood for MNs
- In log-linear form, P(x) ∝ exp(Σ_k θ_k f_k(x)), and the derivative is
  ∂ℓ/∂θ_k = Σ_m f_k(x[m]) − M · E_θ[f_k(X)]
  (empirical feature counts minus expected feature counts under the current model).
- Computing the derivative requires inference!
- Can optimize using gradient ascent, a common approach, or conjugate gradient, Newton's method, ...
- Let's also look at a simpler solution.

Iterative proportional fitting (IPF)
- Setting the derivative to zero gives a fixed-point equation: at the optimum, the model's clique marginals match the empirical ones.
- IPF update: φ_c^(t+1)(x_c) = φ_c^(t)(x_c) · P̂(x_c) / P^(t)(x_c).
- Iterate to converge to the optimal parameters. Each iteration must compute the current model marginals P^(t)(x_c), i.e., run inference.
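The derivative formula (empirical minus expected feature counts) can be sanity-checked on a toy model small enough that "inference" is brute-force enumeration. Everything here is an illustrative assumption, not the lecture's example: four joint configurations, one indicator feature each, and made-up data counts. Gradient ascent drives the gradient to zero, at which point the model expectations match the data.

```python
import numpy as np

# Hypothetical log-linear Markov net, small enough for brute-force inference:
# one indicator feature per joint configuration (4 features).
def model_dist(theta):
    """Brute-force inference: P(x) = exp(theta_x) / Z(theta)."""
    p = np.exp(theta - theta.max())
    return p / p.sum()

def avg_grad(theta, emp):
    """(1/M) * dl/dtheta = empirical feature frequencies - model expectations."""
    return emp - model_dist(theta)

counts = np.array([10.0, 20.0, 30.0, 40.0])   # fake data counts per configuration
emp = counts / counts.sum()

theta = np.zeros(4)
for _ in range(2000):
    theta = theta + 0.5 * avg_grad(theta, emp)  # gradient ascent

# At convergence the gradient vanishes: model expectations match the data.
```

Note that each call to `avg_grad` runs inference (`model_dist`), which is exactly why MN learning is expensive on large models.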
What you need to know about learning MN parameters
- BN parameter learning is easy; MN parameter learning doesn't decompose!
- Learning MN parameters requires inference!
- Apply gradient ascent or IPF iterations to obtain the optimal parameters.
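The IPF iterations mentioned in the summary can be sketched on a hypothetical chain MRF over three binary variables with cliques (0, 1) and (1, 2). The "empirical" distribution is synthetic, and the model's clique marginals are computed by brute force, which is only feasible for tiny models; in general this inner inference step is the expensive part.

```python
import numpy as np

# IPF sketch: phi_c <- phi_c * empirical_marginal(c) / current_model_marginal(c)
rng = np.random.default_rng(2)
emp = rng.uniform(0.1, 1.0, (2, 2, 2))
emp /= emp.sum()                       # fake empirical joint distribution
emp01 = emp.sum(axis=2)                # empirical clique marginals
emp12 = emp.sum(axis=0)

def model_joint(phi01, phi12):
    """Brute-force model distribution for the chain with cliques (0,1), (1,2)."""
    j = np.einsum('ab,bc->abc', phi01, phi12)
    return j / j.sum()

phi01 = np.ones((2, 2))
phi12 = np.ones((2, 2))
for _ in range(100):
    phi01 = phi01 * emp01 / model_joint(phi01, phi12).sum(axis=2)  # match (0,1)
    phi12 = phi12 * emp12 / model_joint(phi01, phi12).sum(axis=0)  # match (1,2)

joint = model_joint(phi01, phi12)
# At the fixed point the model's clique marginals equal the empirical ones.
```

Because this model is decomposable (a chain), IPF actually matches both clique marginals after a single sweep; on non-decomposable models it converges only in the limit.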

