Stanford CS 262 - Hidden Markov Models

Contents (slide titles):
Hidden Markov Models
Outline for our next topic
Example: The Dishonest Casino Player
Question # 1 – Evaluation
Question # 2 – Decoding
Question # 3 – Learning
The dishonest casino model
A HMM is memory-less
Definition of a hidden Markov model
Slide 10
Slide 11
A parse of a sequence
Generating a sequence by the model
Likelihood of a parse
Example: the dishonest casino
Slide 16
Slide 17
Slide 18
Slide 19
Slide 20
The three main questions on HMMs
Let's not be confused by notation
Problem 1: Decoding
Decoding
Decoding – main idea
The Viterbi Algorithm
Slide 27
Viterbi Algorithm – a practical detail
Example
Problem 2: Evaluation
Slide 31
A couple of questions
Evaluation
The Forward Algorithm
The Forward Algorithm – derivation
Slide 36
Relation between Forward and Viterbi
Motivation for the Backward Algorithm
The Backward Algorithm – derivation
The Backward Algorithm
Computational Complexity
Posterior Decoding
Slide 43
Slide 44
Viterbi, Forward, Backward

Hidden Markov Models

[Figure: trellis diagram: K hidden states per position, emitting x1, x2, x3, …; the hidden path π1, π2, … selects one state per position.]

Outline for our next topic

• Hidden Markov models – the theory
• Probabilistic interpretation of alignments using HMMs

Later in the course:
• Applications of HMMs to biological sequence modeling and discovery of features such as genes

Example: The Dishonest Casino Player

A casino has two dice:
• Fair die: P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6
• Loaded die: P(1) = P(2) = P(3) = P(4) = P(5) = 1/10, P(6) = 1/2

The casino player switches between the fair and loaded die randomly, on average once every 20 turns.

Game:
1. You bet $1
2. You roll (always with a fair die)
3. The casino player rolls (maybe with the fair die, maybe with the loaded die)
4. Highest number wins $2

Question # 1 – Evaluation

GIVEN: a sequence of rolls by the casino player
1245526462146146136136661664661636616366163616515615115146123562344

QUESTION: How likely is this sequence, given our model of how the casino works?

This is the EVALUATION problem in HMMs. (For this sequence: Prob = 1.3 × 10^-35.)

Question # 2 – Decoding

GIVEN: a sequence of rolls by the casino player
1245526462146146136136661664661636616366163616515615115146123562344

QUESTION: What portion of the sequence was generated with the fair die, and what portion with the loaded die?

This is the DECODING question in HMMs. (Answer, as marked over the sequence on the slide: FAIR, then LOADED, then FAIR.)

Question # 3 – Learning

GIVEN: a sequence of rolls by the casino player
1245526462146146136136661664661636616366163616515615115146123562344

QUESTION: How "loaded" is the loaded die? How "fair" is the fair die? How often does the casino player change from fair to loaded, and back?

This is the LEARNING question in HMMs. (The slide's estimate for the loaded die: Prob(6) = 64%.)

The dishonest casino model

[Figure: two-state diagram: FAIR and LOADED, with transition probabilities FAIR→LOADED = 0.05, LOADED→FAIR = 0.05, FAIR→FAIR = 0.95, LOADED→LOADED = 0.95.]

Emission probabilities:
P(1|F) = P(2|F) = P(3|F) = P(4|F) = P(5|F) = P(6|F) = 1/6
P(1|L) = P(2|L) = P(3|L) = P(4|L) = P(5|L) = 1/10, P(6|L) = 1/2
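To have something concrete to compute with later, here is a minimal sketch of this model as plain Python dictionaries. The probabilities are taken from the slide; the names STATES, START, TRANS, and EMIT are my own, and the uniform start probabilities are the 1/2-each choice the worked examples below use.

```python
# Dishonest casino HMM, transcribed from the slide above.
STATES = ["F", "L"]  # F = fair die, L = loaded die

# Start probabilities a_0i (the worked examples later assume 1/2 each).
START = {"F": 0.5, "L": 0.5}

# Transition probabilities a_ij: switch dice with prob 0.05, stay with 0.95.
TRANS = {
    "F": {"F": 0.95, "L": 0.05},
    "L": {"F": 0.05, "L": 0.95},
}

# Emission probabilities e_i(b): the fair die is uniform; the loaded die
# shows a six half the time.
EMIT = {
    "F": {1: 1/6, 2: 1/6, 3: 1/6, 4: 1/6, 5: 1/6, 6: 1/6},
    "L": {1: 1/10, 2: 1/10, 3: 1/10, 4: 1/10, 5: 1/10, 6: 1/2},
}

# Each row of TRANS and EMIT sums to 1, as the definition below requires.
assert all(abs(sum(row.values()) - 1) < 1e-12 for row in TRANS.values())
assert all(abs(sum(row.values()) - 1) < 1e-12 for row in EMIT.values())
```

Note that the 0.05 switching probability is consistent with the earlier statement that the player switches dice on average once every 20 turns.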
A HMM is memory-less

At each time step t, the only thing that affects future states is the current state πt.

Definition of a hidden Markov model

Definition: A hidden Markov model (HMM) consists of:
• An alphabet Σ = { b1, b2, …, bM }
• A set of states Q = { 1, …, K }
• Transition probabilities between any two states:
  aij = transition probability from state i to state j
  ai1 + … + aiK = 1, for all states i = 1…K
• Start probabilities a0i:
  a01 + … + a0K = 1
• Emission probabilities within each state:
  ek(b) = P( xi = b | πi = k )
  ek(b1) + … + ek(bM) = 1, for all states k = 1…K

(End probabilities ai0, as in Durbin, are not needed here.)

A HMM is memory-less

At each time step t, the only thing that affects future states is the current state πt:

P(πt+1 = k | "whatever happened so far")
  = P(πt+1 = k | π1, π2, …, πt, x1, x2, …, xt)
  = P(πt+1 = k | πt)

A HMM is memory-less

Likewise, at each time step t, the only thing that affects xt is the current state πt:

P(xt = b | "whatever happened so far")
  = P(xt = b | π1, π2, …, πt, x1, x2, …, xt-1)
  = P(xt = b | πt)

A parse of a sequence

Given a sequence x = x1 … xN, a parse of x is a sequence of states π = π1, …, πN.

[Figure: trellis: at each position i there are K candidate states; a parse selects one state per column.]

Generating a sequence by the model

Given a HMM, we can generate a sequence of length n as follows (see the sketch below):

1. Start at state π1 according to probability a0π1
2. Emit letter x1 according to probability eπ1(x1)
3. Go to state π2 according to probability aπ1π2
4. … until emitting xn

[Figure: trellis with a start state 0, showing a transition a02 into state 2, which emits x1 with probability e2(x1), and so on.]
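The numbered procedure above translates almost line for line into code. A minimal sampling sketch, reusing the START/TRANS/EMIT dictionaries from the earlier block (the function name generate is my own):

```python
import random

def generate(n, start, trans, emit):
    """Sample a sequence x and its hidden parse pi, following steps 1-4 above."""
    x, pi = [], []
    # Step 1: pick the first state pi_1 according to the start probabilities a_0i.
    state = random.choices(list(start), weights=list(start.values()))[0]
    for _ in range(n):
        pi.append(state)
        # Step 2: emit a letter according to e_state(b).
        x.append(random.choices(list(emit[state]),
                                weights=list(emit[state].values()))[0])
        # Step 3: move to the next state according to a_ij; repeat until x_n is emitted.
        state = random.choices(list(trans[state]),
                               weights=list(trans[state].values()))[0]
    return x, pi

# For example: rolls, dice = generate(100, START, TRANS, EMIT)
```

In a long sampled parse, runs of consecutive "F" or "L" states should average about 20 positions, matching the 0.95 self-transition probability.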
Likelihood of a parse

Given a sequence x = x1 … xN and a parse π = π1, …, πN, how likely is this scenario, given our HMM?

P(x, π) = P(x1, …, xN, π1, …, πN)
        = P(xN | πN) P(πN | πN-1) … P(x2 | π2) P(π2 | π1) P(x1 | π1) P(π1)
        = a0π1 aπ1π2 … aπN-1πN × eπ1(x1) … eπN(xN)

A compact way to write a0π1 aπ1π2 … aπN-1πN eπ1(x1) … eπN(xN):

Enumerate all parameters aij and ei(b); this gives n parameters θ1, …, θn.
For example: θ1 = a0Fair; θ2 = a0Loaded; …; θ18 = eLoaded(6).
Then count, in x and π, the number of times each parameter j = 1, …, n occurs:

F(j, x, π) = # of times parameter θj occurs in (x, π)

(call F(·,·,·) the feature counts). Then

P(x, π) = ∏j=1…n θj^F(j, x, π) = exp[ Σj=1…n log(θj) F(j, x, π) ]

Example: the dishonest casino

Let the sequence of rolls be:
x = 1, 2, 1, 5, 6, 2, 1, 5, 2, 4

Then, what is the likelihood of
π = Fair, Fair, Fair, Fair, Fair, Fair, Fair, Fair, Fair, Fair?
(say the initial probabilities are a0Fair = 1/2, a0Loaded = 1/2)

1/2 × P(1 | Fair) P(Fair | Fair) P(2 | Fair) P(Fair | Fair) … P(4 | Fair)
= 1/2 × (1/6)^10 × (0.95)^9 = 0.00000000521158647211 ≈ 0.5 × 10^-9

Example: the dishonest casino

So, the likelihood that the die is fair throughout this run is about 0.521 × 10^-9.

OK, but what is the likelihood of
π = Loaded, Loaded, Loaded, Loaded, Loaded, Loaded, Loaded, Loaded, Loaded, Loaded?

1/2 × P(1 | Loaded) P(Loaded | Loaded) … P(4 | Loaded)
= 1/2 × (1/10)^9 × (1/2)^1 × (0.95)^9 = 0.00000000015756235243 ≈ 0.16 × 10^-9

Therefore, it is somewhat more likely that all the rolls were made with the fair die than that they were all made with the loaded die.

Example: the dishonest casino

Now let the sequence of rolls be:
x = 1, 6, 6, 5, 6, 2, 6, 6, 3, 6

What is the likelihood of π = F, F, …, F?
1/2 × (1/6)^10 × (0.95)^9 ≈ 0.5 × 10^-9, same as before.

What is the likelihood of π = L, L, …, L?
1/2 × (1/10)^4 × (1/2)^6 × (0.95)^9 = 0.00000049238235134735 ≈ 0.5 × 10^-7

So, it is 100 times more likely that the die is loaded.
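Both forms of P(x, π) are easy to check numerically. Below is a small sketch that reuses the START/TRANS/EMIT dictionaries defined earlier and reproduces the numbers above; the function names joint_likelihood and feature_likelihood are my own.

```python
from collections import Counter
from math import isclose, prod

def joint_likelihood(x, pi, start, trans, emit):
    """Direct product form: a_{0,pi_1} * prod_i a_{pi_i-1,pi_i} * prod_i e_{pi_i}(x_i)."""
    p = start[pi[0]] * emit[pi[0]][x[0]]
    for i in range(1, len(x)):
        p *= trans[pi[i - 1]][pi[i]] * emit[pi[i]][x[i]]
    return p

def feature_likelihood(x, pi, start, trans, emit):
    """Feature-count form: P(x, pi) = prod_j theta_j ** F(j, x, pi)."""
    # theta maps each parameter (a_0i, a_ij, e_i(b)) to its value.
    theta = {("start", k): p for k, p in start.items()}
    theta.update({("trans", i, j): p for i, row in trans.items() for j, p in row.items()})
    theta.update({("emit", k, b): p for k, row in emit.items() for b, p in row.items()})
    # F counts how many times each parameter occurs in the pair (x, pi).
    F = Counter([("start", pi[0])])
    F.update(("trans", pi[i - 1], pi[i]) for i in range(1, len(pi)))
    F.update(("emit", k, b) for k, b in zip(pi, x))
    return prod(theta[j] ** F[j] for j in F)

x = [1, 2, 1, 5, 6, 2, 1, 5, 2, 4]
print(joint_likelihood(x, ["F"] * 10, START, TRANS, EMIT))   # ~0.52e-9, as on the slide
print(joint_likelihood(x, ["L"] * 10, START, TRANS, EMIT))   # ~0.16e-9

x2 = [1, 6, 6, 5, 6, 2, 6, 6, 3, 6]
print(joint_likelihood(x2, ["L"] * 10, START, TRANS, EMIT))  # ~0.49e-7, ~100x the all-Fair parse

# The two forms agree, as the compact rewriting promises.
assert isclose(joint_likelihood(x, ["F"] * 10, START, TRANS, EMIT),
               feature_likelihood(x, ["F"] * 10, START, TRANS, EMIT))
```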

