EE/Ma 127b Lecture 6
April 9, 2007
Copyright © 2007 by R. J. McEliece

Outline
• Applications of the Forward-Backward Algorithm
• Markov Chain Probabilities
• Hidden Markov Models

Markov Chain Probabilities

What is the probability that the particle passes through vertex h? Or edge bf?

Some General Questions

Pr{Particle passed through v} = Σ_{P: A →v→ B} π(P) = μ_v(A, B).

Pr{Particle passed through e} = Σ_{P: A →e→ B} π(P) = μ_e(A, B).

• Here π(P) is the product of the edge probabilities along P.

The Equivalent Stochastic Matrices

Π1 =       b     c
      A   1/3   2/3

Π2 =       d     e     f
      b   3/4    0    1/4
      c   1/3   2/3    0

Π3 =       g     h     j
      d    0    1/3   2/3
      e   1/2   1/6   1/3
      f    0    1/4   3/4

Π4 =       k     m
      g    1     0
      h   3/4   1/4
      j    0     1

Π5 =       B
      k    1
      m    1

Here are the α's

α0 = ( A: 1 )
α1 = Π1 = ( b: 1/3, c: 2/3 )
α2 = Π1 Π2 = ( d: 17/36, e: 16/36, f: 3/36 )
α3 = Π1 Π2 Π3 = ( g: 96/432, h: 109/432, j: 227/432 )
α4 = Π1 Π2 Π3 Π4 = ( k: 79/192, m: 113/192 )
α5 = Π1 Π2 Π3 Π4 Π5 = ( B: 1 )

And here are the β's (!)

β1 = Π1 Π2 Π3 Π4 Π5 = ( A: 1 )
β2 = Π2 Π3 Π4 Π5 = ( b: 1, c: 1 )
β3 = Π3 Π4 Π5 = ( d: 1, e: 1, f: 1 )
β4 = Π4 Π5 = ( g: 1, h: 1, j: 1 )
β5 = Π5 = ( k: 1, m: 1 )
β6 = ( B: 1 )

(Since every Πi is stochastic, each β in this example is identically 1.)

A Second Application: A Hidden Markov Chain

Suppose our Markovian particle is hidden from view, but as it follows a random path

P = e_1 e_2 ··· e_N,   where init e_1 = A, fin e_i = init e_{i+1} for i = 1, ..., N − 1, and fin e_N = B,

it produces "evidence"

Y = Y_1 Y_2 ··· Y_N,

where Y_i is a random variable which is a "noisy" version of the edge e_i ∈ E_{i−1,i}.

What is the Evidence?

The evidence is stochastically described by a series of transition probability matrices p_i(y | e), i = 1, ..., N:

Pr{Y_i = y | e_i = e} = p_i(y | e).

We call the observed values (y*_1, ..., y*_N) the "evidence," denoted E.

The Secret Is to Use Bayes' Rule

Pr{S | T} = Pr{T | S} Pr{S} / Pr{T}.

(Because Pr{T} Pr{S | T} = Pr{S} Pr{T | S} = Pr{S, T}.)

Example

• 1% of the population has DD.
• There is a test for DD that is 95% reliable (i.e., Pr{pos | DD} = .95 and Pr{pos | no DD} = .05).
• Suppose Pete tests positive for DD. What is the probability that Pete actually has DD?

Pr{DD | pos} = Pr{pos | DD} Pr{DD} / Pr{pos}
Pr{no DD | pos} = Pr{pos | no DD} Pr{no DD} / Pr{pos}.

Doing the Math

Pr{DD | pos} ∝ Pr{pos | DD} Pr{DD}
Pr{no DD | pos} ∝ Pr{pos | no DD} Pr{no DD}.

Pr{pos | DD} Pr{DD} = (.95)(.01) = .0095
Pr{pos | no DD} Pr{no DD} = (.05)(.99) = .0495

Pr{DD | pos} = .0095 / (.0095 + .0495) = .161
Pr{no DD | pos} = .0495 / (.0095 + .0495) = .839.

What Can We Infer about P from E?

By Bayes' rule,

Pr{P | E} = Pr{P} Pr{E | P} / Pr{E} = the a posteriori probability of P.

P* = argmax_P Pr{P | E}, found by Viterbi's algorithm (the "max-product" form).

What Else Can We Infer about P from E?

• Did P pass through v?

Pr{v ∈ P | E} = Σ_{P: A →v→ B} Pr{P | E} ∝ Σ_{P: A →v→ B} Pr{P} Pr{E | P}.

• Did P pass through e?

Pr{e ∈ P | E} = Σ_{P: A →e→ B} Pr{P | E} ∝ Σ_{P: A →e→ B} Pr{P} Pr{E | P}.

• In both cases we need to know Pr{P} (the priors) and Pr{E | P} (the likelihoods) for every path P from A to B:

Pr{P} = ∏_{i=1}^{N} π(e_i)    (prior)

Pr{E | P} = ∏_{i=1}^{N} p_i(y*_i | e_i).    (likelihood)

And So ...

If we define the edge weights for e ∈ E_{i−1,i} as

γ(e) = π(e) p_i(y*_i | e),    γ(P) = ∏_{i=1}^{N} γ(e_i),

then this is a job for the FBA!

Pr{v ∈ P | E} ∝ Σ_{P: A →v→ B} γ(P) = α(v) β(v)

Pr{e ∈ P | E} ∝ Σ_{P: A →e→ B} γ(P) = α(init e) γ(e) β(fin e).
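To make the forward-backward computation concrete, here is a minimal Python sketch (mine, not from the lecture) that recomputes the α's and β's of the five-stage example above from Π1, ..., Π5. The names `Pi`, `alpha`, and `beta` are my own; since that example has no evidence, the edge weights γ(e) reduce to the transition probabilities π(e), and α(v)β(v) is the vertex-passage probability μ_v(A, B).

```python
from fractions import Fraction as F

# Stage-by-stage transition matrices Pi_1 ... Pi_5 from the slides above,
# stored as {row_vertex: {column_vertex: probability}}.
Pi = [
    {"A": {"b": F(1, 3), "c": F(2, 3)}},
    {"b": {"d": F(3, 4), "e": F(0), "f": F(1, 4)},
     "c": {"d": F(1, 3), "e": F(2, 3), "f": F(0)}},
    {"d": {"g": F(0), "h": F(1, 3), "j": F(2, 3)},
     "e": {"g": F(1, 2), "h": F(1, 6), "j": F(1, 3)},
     "f": {"g": F(0), "h": F(1, 4), "j": F(3, 4)}},
    {"g": {"k": F(1), "m": F(0)},
     "h": {"k": F(3, 4), "m": F(1, 4)},
     "j": {"k": F(0), "m": F(1)}},
    {"k": {"B": F(1)}, "m": {"B": F(1)}},
]

# Forward pass: alpha[i][v] = total weight of paths from A to vertex v at level i.
alpha = [{"A": F(1)}]
for M in Pi:
    prev, cur = alpha[-1], {}
    for u, row in M.items():
        for v, p in row.items():
            cur[v] = cur.get(v, F(0)) + prev.get(u, F(0)) * p
    alpha.append(cur)

# Backward pass: beta[i][v] = total weight of paths from vertex v at level i to B.
beta = [None] * (len(Pi) + 1)
beta[len(Pi)] = {"B": F(1)}
for i in range(len(Pi) - 1, -1, -1):
    nxt = beta[i + 1]
    beta[i] = {u: sum(p * nxt.get(v, F(0)) for v, p in row.items())
               for u, row in Pi[i].items()}

# With no evidence, alpha(v) * beta(v) = Pr{particle passes through v}.
print(alpha[3])                      # 96/432, 109/432, 227/432 (Fractions print reduced)
print(alpha[3]["h"] * beta[3]["h"])  # Pr{path passes through h} = 109/432
```

Running the same two loops with γ(e) = π(e) p_i(y*_i | e) in place of π(e) gives the unnormalized posteriors α(v)β(v) and α(init e)γ(e)β(fin e) of the formulas just above.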
=0 1 20 1/16 1/161 1/16 1/162 1/16

