CMU CS 10708 - Lecture


School of Computer Science
Approximate Inference: Loopy Belief Propagation and Variants
Probabilistic Graphical Models (10-708), Lecture 16, Nov 12, 2007
Eric Xing
Reading: KF Chap. 12
[Figure: a signaling-pathway graphical model (Receptor A/B, Kinase C/D/E, TF F, Gene G/H) with variables X1-X8.]

An Ising model on a 2-D image
- Nodes encode hidden information (patch identity).
- They receive local information from the image (brightness, color).
- Information is propagated through the graph over its edges.
- Edges encode "compatibility" between nodes.
[Figure: image-patch example, "air or water?"]

Why Approximate Inference?
- The tree-width of an N x N grid graph is O(N).
- N can be a huge number (~1000s of pixels).
- Exact inference would be too expensive.
- The model is the Ising distribution

  p(X) = \frac{1}{Z} \exp\Big\{ \sum_{i<j} \theta_{ij} X_i X_j + \sum_i \theta_{i0} X_i \Big\}

Variational Methods
- For a distribution p(X | \theta) associated with a complex graph, computing the marginal (or conditional) probability of arbitrary random variable(s) is intractable.
- Variational methods formulate probabilistic inference as an optimization problem:

  f^* = \arg\max_{f \in S} F(f) \quad \text{or} \quad \arg\min_{f \in S} F(f)

  e.g., f is a tractable probability distribution, or a solution to certain probabilistic queries.

The Objective
- Call the actual distribution P:

  P(X) = \frac{1}{Z} \prod_{f_a \in F} f_a(X_a)

- We wish to find a distribution Q that is a "good" approximation to P.
- Recall the definition of KL-divergence:

  KL(Q_1 \| Q_2) = \sum_X Q_1(X) \log \frac{Q_1(X)}{Q_2(X)}

- KL(Q_1 || Q_2) >= 0
- KL(Q_1 || Q_2) = 0 iff Q_1 = Q_2
- But KL(Q_1 || Q_2) != KL(Q_2 || Q_1)

Which KL?
- Computing KL(P || Q) requires inference on P!
- But KL(Q || P) can be computed without performing inference on P:

  KL(Q \| P) = \sum_X Q(X) \log \frac{Q(X)}{P(X)}
             = \sum_X Q(X) \log Q(X) - \sum_X Q(X) \log P(X)
             = -H_Q(X) - E_Q \log P(X)

  and, using P(X) = \frac{1}{Z} \prod_{f_a \in F} f_a(X_a),

  KL(Q \| P) = -H_Q(X) - \sum_{f_a \in F} E_Q \log f_a(X_a) + \log Z

The Objective
- We will call

  F(P, Q) = -H_Q(X) - \sum_{f_a \in F} E_Q \log f_a(X_a)

  the "Energy Functional" (also called the Gibbs Free Energy), so that KL(Q || P) = F(P, Q) + \log Z.
- Since KL(Q || P) >= 0 with equality iff Q = P, we have F(P, Q) >= F(P, P) = -\log Z.

The Energy Functional
- Consider the functional F(P, Q) = -H_Q(X) - \sum_{f_a \in F} E_Q \log f_a(X_a).
- \sum_{f_a \in F} E_Q \log f_a(X_a) can be computed if we have the marginals of Q over each factor f_a.
- H_Q = -\sum_X Q(X) \log Q(X) is harder! It requires a summation over all possible assignments of X.
- Computing F(P, Q) is therefore hard in general.
- Approach 1: approximate F(P, Q) with an easy-to-compute \hat{F}(P, Q).

Tree Energy Functionals (pairwise form)
- Consider a tree-structured distribution. It can be written in terms of its edge and node marginals (beliefs):

  b(x) = \prod_{(ij) \in E} b_{ij}(x_i, x_j) \prod_i b_i(x_i)^{1 - d_i}

  where d_i is the degree of node i.
- Its entropy involves a summation over edges and vertices only and is therefore easy to compute:

  H_{tree} = -\sum_{(ij) \in E} \sum_{x_i, x_j} b_{ij}(x_i, x_j) \ln b_{ij}(x_i, x_j) + \sum_i (d_i - 1) \sum_{x_i} b_i(x_i) \ln b_i(x_i)

- The corresponding free energy, with edge factors f_{ij} and node factors f_i, is

  F_{Tree} = -H_{tree} - \sum_{(ij) \in E} \sum_{x_i, x_j} b_{ij}(x_i, x_j) \ln f_{ij}(x_i, x_j) - \sum_i \sum_{x_i} b_i(x_i) \ln f_i(x_i)

[Figure: an 8-node tree X1-X8, with F_Tree expanded as a sum of edge terms minus repeated node terms.]

Tree Energy Functionals (factor-graph form)
- For a tree-structured factor graph, the distribution factors over factor and variable beliefs:

  b(x) = \prod_a b_a(x_a) \prod_i b_i(x_i)^{1 - d_i}

  where d_i is the number of factors neighboring variable i.
- The entropy again involves only factor and variable marginals:

  H_{tree} = -\sum_a \sum_{x_a} b_a(x_a) \ln b_a(x_a) + \sum_i (d_i - 1) \sum_{x_i} b_i(x_i) \ln b_i(x_i)

- And the free energy is

  F_{Tree} = \sum_a \sum_{x_a} b_a(x_a) \ln \frac{b_a(x_a)}{f_a(x_a)} - \sum_i (d_i - 1) \sum_{x_i} b_i(x_i) \ln b_i(x_i)

[Figure: the same 8-node tree drawn as a factor graph, with the analogous decomposition of F_Tree.]
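The identity F(P, Q) >= F(P, P) = -log Z, together with the tree form of the free energy, can be checked numerically. The following is a minimal sketch (my own illustration, not code from the lecture) that evaluates F_Tree at the exact marginals of a tiny 3-node binary chain and compares it with -log Z; all names (psi_12, kl_term, ...) are illustrative assumptions.

# Check that F_Tree evaluated at the exact marginals of a tree equals -log Z.
# Chain x1 - x2 - x3 with edge factors f_12, f_23 and node factors f_1, f_2, f_3.
import itertools
import numpy as np

rng = np.random.default_rng(0)
psi_12 = rng.uniform(0.5, 2.0, size=(2, 2))            # edge factor f_12(x1, x2)
psi_23 = rng.uniform(0.5, 2.0, size=(2, 2))            # edge factor f_23(x2, x3)
psi_1, psi_2, psi_3 = (rng.uniform(0.5, 2.0, size=2) for _ in range(3))  # node factors

# Brute-force joint and partition function Z (feasible only for a tiny model).
p = np.zeros((2, 2, 2))
for x1, x2, x3 in itertools.product(range(2), repeat=3):
    p[x1, x2, x3] = (psi_12[x1, x2] * psi_23[x2, x3]
                     * psi_1[x1] * psi_2[x2] * psi_3[x3])
Z = p.sum()
p /= Z

# Exact beliefs: node marginals b_i and factor marginals b_a.
b1, b2, b3 = p.sum(axis=(1, 2)), p.sum(axis=(0, 2)), p.sum(axis=(0, 1))
b12, b23 = p.sum(axis=2), p.sum(axis=0)

def kl_term(b, f):
    """sum_x b(x) ln(b(x)/f(x)) -- one factor's contribution to F_Tree."""
    return float(np.sum(b * (np.log(b) - np.log(f))))

def neg_ent(b):
    """sum_x b(x) ln b(x) -- negative entropy of a node belief."""
    return float(np.sum(b * np.log(b)))

# F_Tree = sum_a sum_{x_a} b_a ln(b_a/f_a) - sum_i (d_i - 1) sum_{x_i} b_i ln b_i,
# where d_i counts the factors touching x_i: d1 = d3 = 2, d2 = 3 in this chain.
factor_part = (kl_term(b12, psi_12) + kl_term(b23, psi_23)
               + kl_term(b1, psi_1) + kl_term(b2, psi_2) + kl_term(b3, psi_3))
node_part = (2 - 1) * neg_ent(b1) + (3 - 1) * neg_ent(b2) + (2 - 1) * neg_ent(b3)

F_tree = factor_part - node_part
print(F_tree, -np.log(Z))   # on a tree these agree (up to floating-point error)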
Bethe Approximation to the Gibbs Free Energy
- For a general graph, choose \hat{F}(P, Q) = F_{Bethe}:

  F_{Bethe} = \sum_a \sum_{x_a} b_a(x_a) \ln \frac{b_a(x_a)}{f_a(x_a)} - \sum_i (d_i - 1) \sum_{x_i} b_i(x_i) \ln b_i(x_i)
            = -\sum_a \sum_{x_a} b_a(x_a) \ln f_a(x_a) - H_{Bethe}

  H_{Bethe} = -\sum_a \sum_{x_a} b_a(x_a) \ln b_a(x_a) + \sum_i (d_i - 1) \sum_{x_i} b_i(x_i) \ln b_i(x_i)

- Called the "Bethe approximation" after the physicist Hans Bethe.
- Equal to the exact Gibbs free energy when the factor graph is a tree.
- In general (on a loopy graph), H_Bethe is not the same as the true entropy H_Q.
[Figure: a loopy 8-node example X1-X8, with F_Bethe written as a sum of factor terms minus node terms, some node terms now counted more than once.]

Bethe Approximation
- Pros:
  - Easy to compute, since the entropy term involves sums only over pairs and single variables.
- Cons:
  - \hat{F}(P, Q) = F_{Bethe} may or may not be closely related to the true F(P, Q).
  - It can, in general, be greater than, equal to, or less than F(P, Q).
- Strategy: optimize over the beliefs b_a(x_a).
  - For discrete beliefs: constrained optimization with Lagrange multipliers.
  - For continuous beliefs: no general formula yet.
  - The resulting iterations do not always converge.

From GM to factor graphs
- Undirected graph (Markov random field):

  P(x) = \frac{1}{Z} \prod_{(ij)} \psi_{ij}(x_i, x_j) \prod_i \psi_i(x_i)

- Directed graph (Bayesian network):

  P(x) = \prod_i P(x_i \mid x_{\mathrm{parents}(i)})

- Both can be written as factor graphs, with factor nodes for the interactions and variable nodes for the variables.

Recall: beliefs and messages in a factor graph
- Variable "beliefs" are built from incoming "messages":

  b_i(x_i) \propto f_i(x_i) \prod_{a \in N(i)} m_{a \to i}(x_i)

- Factor beliefs:

  b_a(X_a) \propto f_a(X_a) \prod_{i \in N(a)} \prod_{c \in N(i) \setminus a} m_{c \to i}(x_i)

- The "belief" is the BP approximation of the marginal probability.

Bethe Free Energy for a factor graph

  F_{Bethe} = \sum_a \sum_{x_a} b_a(x_a) \ln \frac{b_a(x_a)}{f_a(x_a)} - \sum_i (d_i - 1) \sum_{x_i} b_i(x_i) \ln b_i(x_i)

Constrained Minimization of the Bethe Free Energy
- Add Lagrange multipliers for the normalization and marginalization constraints:

  L = F_{Bethe} + \sum_i \gamma_i \Big\{ 1 - \sum_{x_i} b_i(x_i) \Big\} + \sum_a \sum_{i \in N(a)} \sum_{x_i} \lambda_{ai}(x_i) \Big\{ b_i(x_i) - \sum_{X_a \setminus x_i} b_a(X_a) \Big\}

- Setting \partial L / \partial b_i(x_i) = 0 gives

  b_i(x_i) \propto \exp\Big( \frac{1}{d_i - 1} \sum_{a \in N(i)} \lambda_{ai}(x_i) \Big)

- Setting \partial L / \partial b_a(X_a) = 0 gives

  b_a(X_a) \propto \exp\Big( -E_a(X_a) + \sum_{i \in N(a)} \lambda_{ai}(x_i) \Big)

  where E_a(X_a) = -\ln f_a(X_a) is the factor energy.

Bethe = BP on a factor graph
- Identify

  \lambda_{ai}(x_i) = \ln \prod_{b \in N(i) \setminus a} m_{b \to i}(x_i)

  to obtain the BP equations:

  b_i(x_i) \propto f_i(x_i) \prod_{a \in N(i)} m_{a \to i}(x_i)
  b_a(X_a) \propto f_a(X_a) \prod_{i \in N(a)} \prod_{c \in N(i) \setminus a} m_{c \to i}(x_i)

- Again, the belief is the BP approximation of the marginal probability.

BP Message-update Rules (a sum-product algorithm)
- Using b_i(x_i) = \sum_{X_a \setminus x_i} b_a(X_a), we get

  m_{a \to i}(x_i) = \sum_{X_a \setminus x_i} f_a(X_a) \prod_{j \in N(a) \setminus i} \prod_{b \in N(j) \setminus a} m_{b \to j}(x_j)

Belief Propagation on trees
- BP message-update rule (pairwise form):

  M_{i \to j}(x_j) \propto \sum_{x_i} \psi_{ij}(x_i, x_j)\, \psi_i(x_i) \prod_{k \in N(i) \setminus j} M_{k \to i}(x_i)

  where \psi_{ij} encodes compatibilities (interactions) and \psi_i encodes external evidence.
- Beliefs:

  b_i(x_i) \propto \psi_i(x_i) \prod_k M_{k \to i}(x_i)

- BP on trees always converges to the exact marginals (cf. the junction tree algorithm).

Belief Propagation on loopy graphs
- The same message-update rules are applied.
- BP may not converge, or may converge to a wrong solution.

Loopy Belief Propagation
- If BP is used on graphs with loops, messages may ...
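The pairwise message-update rule above is straightforward to implement directly. Below is a minimal sketch (not from the slides) of synchronous loopy BP on a 4-node cycle with binary variables, compared against brute-force marginals; the graph, potentials, names, and convergence threshold are illustrative assumptions, and, as the slides note, convergence is not guaranteed on loopy graphs.

# Synchronous sum-product (loopy BP) on a 4-node cycle with binary variables.
#   M_{i->j}(x_j) ∝ sum_{x_i} psi_ij(x_i, x_j) psi_i(x_i) prod_{k in N(i)\j} M_{k->i}(x_i)
#   b_i(x_i)      ∝ psi_i(x_i) prod_{k in N(i)} M_{k->i}(x_i)
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]                         # a single loop
psi_node = {i: rng.uniform(0.5, 2.0, size=2) for i in range(n)}  # external evidence
psi_edge = {e: rng.uniform(0.5, 2.0, size=(2, 2)) for e in edges}  # compatibilities

def edge_pot(i, j):
    """psi_ij(x_i, x_j), regardless of the stored orientation of the edge."""
    return psi_edge[(i, j)] if (i, j) in psi_edge else psi_edge[(j, i)].T

nbrs = {i: [j for e in edges for j in e if i in e and j != i] for i in range(n)}
# Directed messages M[(i, j)](x_j), initialized to uniform.
M = {(i, j): np.ones(2) / 2 for i in range(n) for j in nbrs[i]}

for it in range(200):
    new_M = {}
    for (i, j) in M:
        prod = psi_node[i].copy()
        for k in nbrs[i]:
            if k != j:
                prod *= M[(k, i)]            # incoming messages except from j
        msg = edge_pot(i, j).T @ prod        # sum over x_i
        new_M[(i, j)] = msg / msg.sum()      # normalize for numerical stability
    delta = max(np.abs(new_M[k] - M[k]).max() for k in M)
    M = new_M
    if delta < 1e-8:                          # stop if messages stabilize (not guaranteed)
        break

beliefs = []
for i in range(n):
    b = psi_node[i].copy()
    for k in nbrs[i]:
        b *= M[(k, i)]
    beliefs.append(b / b.sum())

# Brute-force marginals for comparison (feasible only for tiny models).
joint = np.zeros((2,) * n)
for x in itertools.product(range(2), repeat=n):
    val = np.prod([psi_node[i][x[i]] for i in range(n)])
    for (i, j) in edges:
        val *= psi_edge[(i, j)][x[i], x[j]]
    joint[x] = val
joint /= joint.sum()
exact = [joint.sum(axis=tuple(a for a in range(n) if a != i)) for i in range(n)]

for i in range(n):
    print(i, beliefs[i], exact[i])   # BP beliefs are only approximate on loopy graphs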

