Stanford CS 262 - Time Warping Hidden Markov Models

Lecture 4, Thursday April 10, 2003

Contents: Review of Last Lecture • The Four-Russian Algorithm • BLAST, Original Version • PatternHunter • Today • Time Warping • Hidden Markov Models • Outline for our next topic • Example: The Dishonest Casino • Question #1 – Evaluation • Question #2 – Decoding • Question #3 – Learning • The dishonest casino model • Definition of a hidden Markov model • A Hidden Markov Model is memory-less • A parse of a sequence • Likelihood of a parse • The three main questions on HMMs • Let's not be confused by notation

Review of Last Lecture

The Four-Russian Algorithm
[Figure: the alignment DP matrix partitioned into t × t blocks.]

BLAST, Original Version
• Dictionary: all words of length k (k ≈ 11)
• Alignment initiated between words of alignment score ≥ T (typically T = k)
• Alignment: ungapped extensions until the score drops below a statistical threshold
• Output: all local alignments with score above the statistical threshold
[Figure: scanning the database (DB) for word hits against the query.]

PatternHunter
Main features:
• Non-consecutive position words
• Highly optimized
[Figure: hit patterns for consecutive vs. non-consecutive seed positions.]

On a 70% conserved region:
                            Consecutive    Non-consecutive
  Expected # hits:              1.07            0.97
  Prob[at least one hit]:       0.30            0.47

Today
• Time Warping
• Hidden Markov models

Time Warping

Goal: align and compare two trajectories α(t), β(t) in multi-dimensional space, allowing for
• additive random error
• variations in speed from one segment to another

Definition: α(u), β(u) are connected by an approximate continuous time warping (u_0, v_0) if u_0, v_0 are strictly increasing functions on [0, T], and

  α(u_0(t)) ≈ β(v_0(t))   for 0 ≤ t ≤ T.

[Figure: α(t), β(t) and the warping functions u_0(t), v_0(t) on [0, T].]

How do we measure how "good" a time warping is? Let's try

  ∫_0^T w(α(u_0(t)), β(v_0(t))) dt.

However, an equivalent time warping (u_1(s), v_1(s)), obtained by reparametrizing time with s = f(t), f: [0, T] → [0, S], has score

  ∫_0^S w(α(u_1(s)), β(v_1(s))) ds = ∫_0^T w(α(u_0(t)), β(v_0(t))) f'(t) dt.

This is arbitrarily different: the same warp can be given any score by choosing f.

This one works:

  d(u_0, v_0) = ∫_0^T w(α(u_0(t)), β(v_0(t))) · (u_0'(t) + v_0'(t))/2 dt.

To check invariance, let s = f(t) and t = g(s) with g = f⁻¹. From f(g(s)) = s, differentiating gives f'(g(s)) g'(s) = 1, so g'(s) = 1/f'(t). Also u_1(s) = u_0(g(s)), hence u_1'(s) = u_0'(g(s)) g'(s) = u_0'(t) g'(s), and likewise for v_1. Substituting,

  ∫_0^S w(α(u_1(s)), β(v_1(s))) · (u_1'(s) + v_1'(s))/2 ds
    = ∫_0^T w(α(u_0(t)), β(v_0(t))) · (u_0'(t) + v_0'(t))/2 · g'(s) f'(t) dt
    = ∫_0^T w(α(u_0(t)), β(v_0(t))) · (u_0'(t) + v_0'(t))/2 dt,

since g'(s) f'(t) = 1: the derivative factor exactly cancels the Jacobian of the reparametrization.

From continuous to discrete: let's discretize the signals:

  α(t): a = a_0 a_1 … a_M
  β(t): b = b_0 b_1 … b_N

Definition: a, b are connected by an approximate discrete time warping (u, v) if u and v are weakly increasing integer functions on 1 ≤ h ≤ H such that a_{u[h]} ≈ b_{v[h]} for all h = 1…H. Moreover, we require u[0] = v[0] = 0, u[H] = M, and v[H] = N.

[Figure: a warping path through the (M+1) × (N+1) grid of index pairs (u, v), starting at (0, 0).]

Define the possible steps: (Δu, Δv) is the difference of (u, v) between steps h−1 and h:

  (Δu, Δv) ∈ { (1, 0), (1, 1), (0, 1) }

Alternatively:

  (Δu, Δv) ∈ { (2, 0), (1, 1), (0, 2) }

Advantage: every time warp has the same number of steps.

[Figure: from the position at h−1, the three possible positions at h under steps (2, 0), (1, 1), and (0, 2).]

Discrete objective function: for 0 ≤ i = u[h] ≤ M and 0 ≤ j = v[h] ≤ N, define w(i, j) = w(a_i, b_j). Then

  D(u, v) = Σ_h w(u[h], v[h]) · (Δu + Δv)/2.

In the case where we allow (2, 0), (1, 1), and (0, 2) steps, every step has Δu + Δv = 2, so

  D(u, v) = Σ_h w(u[h], v[h]).

Algorithm for optimal discrete time warping:

Initialization:
  D(i, 0) = ½ Σ_{i'<i} w(i', 0)
  D(0, j) = ½ Σ_{j'<j} w(0, j')
  D(1, j) = D(i, 1) = w(i, j) + w(i−1, j−1)

Iteration: for i = 2…M, for j = 2…N:

  D(i, j) = min { D(i−2, j) + w(i, j),
                  D(i−1, j−1) + w(i, j),
                  D(i, j−2) + w(i, j) }
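As a concrete companion to the recurrence above, here is a minimal Python sketch of the discrete time-warping DP under the (2, 0), (1, 1), (0, 2) step set, where the score is simply the sum of w over visited cells. The cost function and the toy signals are illustrative assumptions (the slides leave w abstract), and this sketch folds the boundary cases into one uniform min over valid predecessors rather than reproducing the slides' separate initialization lines.

```python
import numpy as np

def discrete_time_warp(a, b, w):
    """Score of the best discrete time warping of a = a_0..a_M onto
    b = b_0..b_N, using the step set (2,0), (1,1), (0,2), for which
    D(u, v) = sum_h w(u[h], v[h]) (no (du+dv)/2 weighting needed).

    Every step increases i+j by exactly 2, so all warps have the same
    number of steps, and (M, N) is reachable only when M + N is even;
    otherwise the returned score stays infinite.
    """
    M, N = len(a) - 1, len(b) - 1
    INF = float("inf")
    D = np.full((M + 1, N + 1), INF)
    D[0, 0] = 0.0
    for i in range(M + 1):
        for j in range(N + 1):
            if i == 0 and j == 0:
                continue
            best = INF                      # min over valid predecessors
            if i >= 2:
                best = min(best, D[i - 2, j])
            if i >= 1 and j >= 1:
                best = min(best, D[i - 1, j - 1])
            if j >= 2:
                best = min(best, D[i, j - 2])
            if best < INF:
                D[i, j] = best + w(a[i], b[j])
    return D[M, N]

# Example: two noisy, locally stretched copies of the same curve.
t = np.linspace(0, 4, 41)
a = np.sin(t)
b = np.sin(t ** 1.1)  # same curve, time runs at a different rate
print(discrete_time_warp(a, b, lambda x, y: (x - y) ** 2))
```

The squared-difference cost here stands in for the abstract local mismatch cost w; any nonnegative dissimilarity would do.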
Hidden Markov Models

[Figure: HMM trellis — a column of K hidden states (1, 2, …, K) at each position, with emitted symbols x_1, x_2, x_3, … along the bottom.]

Outline for our next topic
• Hidden Markov models – the theory
• Probabilistic interpretation of alignments using HMMs
Later in the course:
• Applications of HMMs to biological sequence modeling and discovery of features such as genes

Example: The Dishonest Casino

A casino has two dice:
• Fair die: P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6
• Loaded die: P(1) = P(2) = P(3) = P(4) = P(5) = 1/10, P(6) = 1/2
The casino player switches back-&-forth between the fair and the loaded die once every 20 turns.

Game:
1. You bet $1
2. You roll (always with a fair die)
3. Casino player rolls (maybe with the fair die, maybe with the loaded die)
4. Highest number wins $2

Question #1 – Evaluation
GIVEN: a sequence of rolls by the casino player
  1245526462146146136136661664661636616366163616515615115146123562344
QUESTION: How likely is this sequence, given our model of how the casino works?
This is the EVALUATION problem in HMMs.

Question #2 – Decoding
GIVEN: the same sequence of rolls
QUESTION: What portion of the sequence was generated with the fair die, and what portion with the loaded die?
This is the DECODING question in HMMs.

Question #3 – Learning
GIVEN: the same sequence of rolls
QUESTION: How "loaded" is the loaded die? How "fair" is the fair die? How often does the casino player change from fair to loaded, and back?
This is the LEARNING question in HMMs.

The dishonest casino model

[Figure: two-state HMM. States FAIR and LOADED, each with self-transition probability 0.95 and switch probability 0.05.]

  P(1|F) = P(2|F) = P(3|F) = P(4|F) = P(5|F) = P(6|F) = 1/6
  P(1|L) = P(2|L) = P(3|L) = P(4|L) = P(5|L) = 1/10, P(6|L) = 1/2

Definition of a hidden Markov model

Definition: A hidden Markov model (HMM)
• Alphabet Σ = { b_1, b_2, …, b_M }
• Set … [preview truncated here]
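To make the dishonest casino concrete, here is a small simulation sketch of the model above in Python. The 0.95/0.05 stay/switch probabilities and the two emission distributions come straight from the slide; the sequence length, starting state, and seed are arbitrary choices for illustration.

```python
import random

# Emission distributions from the slide: fair vs. loaded die.
FAIR   = {face: 1 / 6 for face in range(1, 7)}
LOADED = {1: 0.1, 2: 0.1, 3: 0.1, 4: 0.1, 5: 0.1, 6: 0.5}
EMIT   = {"F": FAIR, "L": LOADED}
STAY   = 0.95  # P(F->F) = P(L->L); the player switches dice with prob. 0.05

def simulate(n, seed=0):
    """Generate n rolls together with the hidden fair/loaded path."""
    rng = random.Random(seed)
    state, rolls, path = "F", [], []   # assume the player starts fair
    for _ in range(n):
        faces, probs = zip(*EMIT[state].items())
        rolls.append(rng.choices(faces, weights=probs)[0])
        path.append(state)
        if rng.random() > STAY:        # switch die with probability 0.05
            state = "L" if state == "F" else "F"
    return "".join(map(str, rolls)), "".join(path)

rolls, path = simulate(70)
print(rolls)   # what you observe
print(path)    # what the casino hides
```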
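The contents list above also includes "A parse of a sequence" and "Likelihood of a parse", slides that fall outside this preview. As a hedged sketch of that idea for the casino model: given observed rolls x and a hypothesized hidden path π, the joint likelihood P(x, π) multiplies a start probability, one emission per roll, and one transition per step. The function name, the uniform 1/2 start probability, and the example parse are assumptions for illustration.

```python
import math

# Model parameters from the dishonest casino slide.
EMIT  = {"F": {f: 1 / 6 for f in range(1, 7)},
         "L": {1: .1, 2: .1, 3: .1, 4: .1, 5: .1, 6: .5}}
TRANS = {("F", "F"): 0.95, ("F", "L"): 0.05,
         ("L", "L"): 0.95, ("L", "F"): 0.05}

def log_likelihood_of_parse(x, pi, log_start=math.log(0.5)):
    """log P(x, pi): assumed uniform start probability, then one emission
    per roll and one transition per consecutive pair of hidden states."""
    ll = log_start + math.log(EMIT[pi[0]][x[0]])
    for i in range(1, len(x)):
        ll += math.log(TRANS[(pi[i - 1], pi[i])])
        ll += math.log(EMIT[pi[i]][x[i]])
    return ll

x  = [1, 2, 6, 6, 6, 3]   # observed rolls
pi = list("FFLLLF")       # one hypothesized fair/loaded parse
print(log_likelihood_of_parse(x, pi))
```

Logs are used instead of raw products so the computation stays numerically stable on long roll sequences.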

