CMU CS 10708 - variational-loopybp-annotated

Mean Field and Variational Methods (finishing off)

Graphical Models - 10708
Carlos Guestrin, Carnegie Mellon University
November 5th, 2008
Readings: K&F 10.1, 10.5

What you need to know so far

- Goal: find an efficient distribution Q that is close to the posterior P.
- Distance: measured in terms of KL divergence.
- KL is asymmetric: in general, D(p||q) ≠ D(q||p).
- Computing the "right" KL, D(P||Q), is intractable, so we use the reverse KL, D(Q||P).

Reverse KL & the partition function

Back to the general case: consider again the definition of D(Q||P), where P is a Markov net with partition function Z,

  P(x) = (1/Z) ∏_j φ_j(u_j),    P̃(x) = ∏_j φ_j(u_j).

[Figure: the running "student" network, with nodes Coherence, Difficulty, Intelligence, Grade, SAT, Letter, Job, Happy.]

Theorem: ln Z = F[P̃, Q] + D(Q||P), where the energy functional F is

  F[P̃, Q] = E_Q[ln P̃(X)] + H_Q(X) = Σ_j E_Q[ln φ_j(U_j)] + H_Q(X).

Understanding reverse KL, the energy functional & the partition function

- Maximizing the energy functional ⟺ minimizing the reverse KL: ln Z is a constant, so maximizing F[P̃, Q] over Q is the same as minimizing D(Q||P).
- Theorem: the energy functional is a lower bound on the log partition function, F[P̃, Q] ≤ ln Z, since D(Q||P) ≥ 0.
- Maximizing the energy functional therefore corresponds to searching for a tight lower bound on the partition function.
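(A quick numerical sanity check of the identity and the bound above. This sketch is not from the slides: the tiny two-variable net and every name in it are made up for illustration, and it assumes numpy.)

```python
import numpy as np

# Tiny pairwise Markov net over binary X1, X2:
# P(x1, x2) = (1/Z) * phi1(x1) * phi2(x2) * phi12(x1, x2)
phi1 = np.array([1.0, 2.0])
phi2 = np.array([3.0, 1.0])
phi12 = np.array([[4.0, 1.0],
                  [1.0, 4.0]])

# Unnormalized measure P~ over the 4 joint states, and the partition function Z.
P_tilde = phi1[:, None] * phi2[None, :] * phi12
Z = P_tilde.sum()
P = P_tilde / Z

# An arbitrary fully factorized Q(x) = Q1(x1) Q2(x2) (the mean field family).
Q1 = np.array([0.3, 0.7])
Q2 = np.array([0.6, 0.4])
Q = Q1[:, None] * Q2[None, :]

# Energy functional F[P~, Q] = E_Q[ln P~(X)] + H_Q(X), and reverse KL D(Q||P).
F = (Q * np.log(P_tilde)).sum() - (Q * np.log(Q)).sum()
D = (Q * np.log(Q / P)).sum()

print(np.log(Z))   # ln Z
print(F + D)       # identical: ln Z = F[P~, Q] + D(Q||P)
print(F)           # strictly smaller: F[P~, Q] <= ln Z
```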
Structured variational approximate inference

- Pick a family of distributions Q that allows exact inference, e.g. the fully factorized (mean field) family.
- Find the Q ∈ Q that maximizes F[P̃, Q].
- For mean field:

  Q(x) = ∏_i Q_i(x_i).

Optimization for mean field

- Constrained optimization (each Q_i must sum to one), solved via Lagrange multipliers: ∃ λ_i such that the problem is equivalent to maximizing

  F[P̃, Q] + Σ_i λ_i (Σ_{x_i} Q_i(x_i) − 1).

- Take the derivative with respect to Q_i(x_i), set it to zero.
- Theorem: Q is a stationary point of the mean field approximation iff for each i

  Q_i(x_i) = (1/Z_i) exp{ Σ_j E_Q[ln φ_j(U_j) | x_i] },

  where Z_i normalizes Q_i.

Understanding the fixed point equation

Theorem: the fixed point above is equivalent to

  Q_i(x_i) = (1/Z_i) exp{ Σ_{j : X_i ∈ Scope[φ_j]} E_{U_j ~ Q}[ln φ_j(U_j, x_i)] },

where Scope[φ_j] = U_j ∪ {X_i}. In other words, Q_i only needs to consider the factors that intersect X_i.

[Figure: student network again, highlighting the factors that touch a single node.]

There are many stationary points!

A very simple approach for finding one stationary point (see the sketch after this list):

- Initialize Q (e.g., randomly or smartly).
- Set all variables to unprocessed.
- Pick an unprocessed variable X_i:
  - update Q_i with the fixed point equation;
  - set variable i as processed;
  - if Q_i changed, set the neighbors of X_i to unprocessed.
- Guaranteed to converge, since each update can only increase F[P̃, Q], which is bounded above by ln Z.
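(A minimal sketch of this coordinate-ascent loop for a pairwise Markov net. It is not from the slides: the data layout — log potentials in dicts, an `unprocessed` set — and all names are assumptions of this write-up.)

```python
import numpy as np

def mean_field(n_states, node_logpot, edge_logpot, edges, tol=1e-8):
    """Coordinate-ascent mean field for a pairwise Markov net.

    node_logpot: dict i -> (n_states,) array, ln phi_i
    edge_logpot: dict (i, j) -> (n_states, n_states) array, ln phi_ij
                 with rows indexed by x_i and columns by x_j
    edges:       list of (i, j) pairs
    """
    nbrs = {i: [] for i in node_logpot}
    for (i, j) in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)

    def logpot(i, j):
        # orient ln phi_ij so that rows index x_i
        return edge_logpot[(i, j)] if (i, j) in edge_logpot else edge_logpot[(j, i)].T

    # Initialize Q to uniform (random or "smart" inits also work).
    Q = {i: np.full(n_states, 1.0 / n_states) for i in node_logpot}
    unprocessed = set(node_logpot)
    while unprocessed:
        i = unprocessed.pop()
        # Fixed point: Q_i(x_i) ∝ exp{ sum over factors touching X_i of E_Q[ln phi] }
        log_q = node_logpot[i].copy()
        for j in nbrs[i]:
            log_q += logpot(i, j) @ Q[j]   # E_{X_j ~ Q_j}[ln phi_ij(x_i, X_j)]
        new_Qi = np.exp(log_q - log_q.max())
        new_Qi /= new_Qi.sum()
        if np.abs(new_Qi - Q[i]).max() > tol:
            Q[i] = new_Qi
            unprocessed.update(nbrs[i])    # a change invalidates the neighbors
    return Q
```

Each update can only increase the energy functional, so the loop terminates once no Q_i moves by more than `tol`.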
More general structured approximations

- Mean field is a very naive approximation.
- Consider a more general form for Q; the only assumption is that exact inference is doable over Q.
- Theorem: a stationary point of the energy functional satisfies a very similar update rule, with the expectations now taken under the structured Q.

[Figure: student network with a structured approximating distribution.]

What you need to know about variational methods

- Structured variational method: select a form for the approximate distribution, then minimize the reverse KL.
- Equivalent to maximizing the energy functional, i.e., searching for a tight lower bound on the partition function.
- Many possible models for Q: independent (mean field), structured as a Markov net, cluster variational methods.
- Several subtleties are outlined in the book.

Loopy Belief Propagation

Graphical Models - 10708
Carlos Guestrin, Carnegie Mellon University
November 5th, 2008
Readings: K&F 10.2, 10.3

Recall message passing over junction trees

Exact inference:
- generate a junction tree;
- pass messages between neighbors;
- inference is exponential in the size of the largest clique.

[Figure: student network and its junction tree, with cliques C,D / D,I,G / G,S,I / G,J,S,L / H,G,J.]

Belief propagation on tree pairwise Markov nets

- A tree pairwise Markov net is a tree!!! So there is no need to create a junction tree.
- Message passing, in its general form:

  m_{i→j}(x_j) = Σ_{x_i} φ_i(x_i) φ_{ij}(x_i, x_j) ∏_{k ∈ N(i)\{j}} m_{k→i}(x_i),

  where N(i) are the neighbors of i in the pairwise Markov net.
- Theorem: on a tree, this converges to the true probabilities.

Loopy belief propagation on pairwise Markov nets

What if we apply BP in a graph with loops? Send messages between pairs of nodes in the graph, and hope for the best. What happens?
- Evidence goes around the loops multiple times.
- It may not converge.
- If it converges, it is usually overconfident about probability values.
- But it often gives reasonable, or at least useful, answers, especially if you only care about the MPE rather than the actual probabilities.

More details on loopy BP

- Numerical problem: messages < 1 get multiplied together as we go around the loops, so the numbers can go to zero. Fix: normalize each message to sum to one,

  m_{i→j}(x_j) = (1/Z_{i→j}) Σ_{x_i} φ_i(x_i) φ_{ij}(x_i, x_j) ∏_{k ∈ N(i)\{j}} m_{k→i}(x_i).

  Z_{i→j} doesn't depend on x_j, so normalizing doesn't change the answer.
- Computing node "beliefs" (estimates of the marginals):

  b_i(x_i) ∝ φ_i(x_i) ∏_{k ∈ N(i)} m_{k→i}(x_i).

An example of running loopy BP

[Figure: an example run of loopy BP; the plots are not recoverable from this preview.]

Convergence

If you have tried to send all messages and the beliefs haven't changed (by much), you have converged.

(Non-)convergence of loopy BP

- Loopy BP can oscillate! The oscillations may be small, but they can also be really bad.
- Typically: if the factors are closer to uniform, loopy BP does well (converges); if the factors are closer to deterministic, loopy BP doesn't behave well.
- One approach that helps: damping. The new message is an average of the old message and the freshly computed one,

  m_{i→j}^{new} = (1/2) m_{i→j}^{old} + (1/2) m̂_{i→j},

  or more generally a convex combination of the two. Damping often gives better convergence, but when damping is required to get convergence, the result is often bad.
- (Graphs from Murphy et al. '99.)
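(A compact sketch of normalized, damped loopy BP on a pairwise Markov net, combining the three ideas above. Not from the slides; the containers and names — `node_pot`, `edge_pot`, `damping`, a fixed synchronous schedule — are assumptions of this write-up.)

```python
import numpy as np

def loopy_bp(n_states, node_pot, edge_pot, edges, n_iters=50, damping=0.5):
    """Damped, normalized loopy BP on a pairwise Markov net.

    node_pot: dict i -> (n_states,) array, phi_i
    edge_pot: dict (i, j) -> (n_states, n_states) array, phi_ij
              with rows indexed by x_i and columns by x_j
    """
    nbrs = {i: [] for i in node_pot}
    for (i, j) in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)

    def pot(i, j):
        # orient phi_ij so that rows index x_i
        return edge_pot[(i, j)] if (i, j) in edge_pot else edge_pot[(j, i)].T

    # All directed messages start uniform.
    msg = {(i, j): np.full(n_states, 1.0 / n_states)
           for i in nbrs for j in nbrs[i]}

    for _ in range(n_iters):
        new_msg = {}
        for (i, j) in msg:
            # m_{i->j}(x_j) = sum_{x_i} phi_i phi_ij prod_{k in N(i)\{j}} m_{k->i}
            pre = node_pot[i].copy()
            for k in nbrs[i]:
                if k != j:
                    pre *= msg[(k, i)]
            m = pre @ pot(i, j)
            m /= m.sum()                 # normalize: keeps messages from underflowing
            # damping: mix the old message with the freshly computed one
            new_msg[(i, j)] = damping * msg[(i, j)] + (1.0 - damping) * m
        msg = new_msg

    # Beliefs: b_i(x_i) ∝ phi_i(x_i) prod_{k in N(i)} m_{k->i}(x_i)
    beliefs = {}
    for i in nbrs:
        b = node_pot[i].copy()
        for k in nbrs[i]:
            b *= msg[(k, i)]
        beliefs[i] = b / b.sum()
    return beliefs
```

With `damping=0.5` this is exactly the averaging rule from the slide; `damping=0.0` recovers undamped loopy BP.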
Loopy BP in factor graphs

What if we don't have a pairwise Markov net? Two options:
1. Transform the model into a pairwise Markov net.
2. Run loopy BP directly on a factor graph.

[Figure: factor graph over variables A, B, C, D, E with factors ABC, ABD, BDE, CDE.]

In a factor graph there are two kinds of messages: from node to factor, and from factor to node.

Loopy BP in factor graphs (the updates)

- From node i to factor j, where F(i) is the set of factors whose scope includes X_i:

  m_{i→j}(x_i) = ∏_{k ∈ F(i)\{j}} m_{k→i}(x_i).

- From factor j to node i, where Scope[φ_j] = Y ∪ {X_i}:

  m_{j→i}(x_i) = Σ_y φ_j(y, x_i) ∏_{X_k ∈ Y} m_{k→j}(x_k).

- Beliefs:
  - node: b_i(x_i) ∝ ∏_{j ∈ F(i)} m_{j→i}(x_i);
  - factor: b_j(y) ∝ φ_j(y) ∏_{X_i ∈ Scope[φ_j]} m_{i→j}(x_i).

What you need to know about loopy BP

- Application of belief propagation in loopy graphs.
- It doesn't always converge: damping can help, and good message-passing schedules can help.
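(The two updates above translate almost directly into code. The sketch below is not from the slides: it assumes factors are given as (scope, table) pairs with one array axis per variable in the scope, and all names are illustrative.)

```python
import numpy as np

def factor_graph_bp(n_states, factors, n_iters=50):
    """Loopy BP on a factor graph.

    factors: list of (scope, table) pairs; scope is a tuple of variable
             ids and table has shape (n_states,) * len(scope).
    """
    # F(i): indices of the factors whose scope includes X_i
    F = {}
    for j, (scope, _) in enumerate(factors):
        for i in scope:
            F.setdefault(i, []).append(j)

    uniform = np.full(n_states, 1.0 / n_states)
    to_factor = {(i, j): uniform.copy() for i in F for j in F[i]}
    to_node = {(j, i): uniform.copy() for i in F for j in F[i]}

    for _ in range(n_iters):
        # node -> factor: product of incoming factor messages, except from j
        for (i, j) in to_factor:
            m = np.ones(n_states)
            for k in F[i]:
                if k != j:
                    m *= to_node[(k, i)]
            to_factor[(i, j)] = m / m.sum()
        # factor -> node: weight the table by incoming node messages,
        # then sum out every variable except X_i
        for (j, i) in to_node:
            scope, table = factors[j]
            t = table.copy()
            for axis, k in enumerate(scope):
                if k != i:
                    shape = [1] * len(scope)
                    shape[axis] = n_states      # broadcast m_{k->j} along axis k
                    t = t * to_factor[(k, j)].reshape(shape)
            other_axes = tuple(a for a, k in enumerate(scope) if k != i)
            m = t.sum(axis=other_axes)
            to_node[(j, i)] = m / m.sum()

    # node beliefs: b_i(x_i) ∝ prod_{j in F(i)} m_{j->i}(x_i)
    beliefs = {}
    for i in F:
        b = np.ones(n_states)
        for j in F[i]:
            b *= to_node[(j, i)]
        beliefs[i] = b / b.sum()
    return beliefs
```

On a tree-structured factor graph this reduces to exact inference; with loops it inherits all the convergence caveats above.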