Machine Learning 10-701/15-781, Fall 2006

Hidden Markov Model

Eric Xing
Lecture 15, November 2, 2006
Reading: Chap. 1, 2, C.B. book

Hidden Markov Model: from static to dynamic mixture models

[Figure: a static mixture model (a single hidden label Y_1 generating an observation X_{A_1}, replicated over N i.i.d. samples) next to a dynamic mixture (a chain of hidden states Y_1, Y_2, Y_3, ..., Y_T, each generating its own observation X_{A_1}, X_{A_2}, X_{A_3}, ..., X_{A_T})]

Hidden Markov Models

- The underlying source: genomic entities, dice
- The sequence: poly-NT sequence, sequence of rolls

[Figure: hidden chain Y_1, Y_2, Y_3, ..., Y_T emitting observations X_{A_1}, X_{A_2}, X_{A_3}, ..., X_{A_T}]

Example: The Dishonest Casino

A casino has two dice:
- Fair die: P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6
- Loaded die: P(1) = P(2) = P(3) = P(4) = P(5) = 1/10, P(6) = 1/2
The casino player switches back and forth between the fair and the loaded die once every 20 turns.

Game:
1. You bet $1
2. You roll (always with a fair die)
3. The casino player rolls (maybe with the fair die, maybe with the loaded die)
4. Highest number wins $2

Puzzles Regarding the Dishonest Casino

GIVEN: a sequence of rolls by the casino player

1245526462146146136136661664661636616366163616515615115146123562344

QUESTIONS:
- How likely is this sequence, given our model of how the casino works? This is the EVALUATION problem in HMMs.
- What portion of the sequence was generated with the fair die, and what portion with the loaded die? This is the DECODING question in HMMs.
- How "loaded" is the loaded die? How "fair" is the fair die? How often does the casino player change from fair to loaded, and back? This is the LEARNING question in HMMs.

A Stochastic Generative Model

- Observed sequence: 1 4 3 6 6 4
- Hidden sequence: a parse or segmentation of the observations into the states (A or B) that generated them

[Figure: two emitting states A and B, each with its own emission distribution, generating the observed sequence]

Definition of HMM

- Observation space: an alphabetic set $C = \{c_1, c_2, \dots, c_K\}$ or a Euclidean space $\mathbb{R}^d$
- Index set of hidden states: $I = \{1, 2, \dots, M\}$
- Transition probabilities between any two states:
  $p(y_t^j = 1 \mid y_{t-1}^i = 1) = a_{i,j}$, or $p(y_t \mid y_{t-1}^i = 1) \sim \mathrm{Multinomial}(a_{i,1}, a_{i,2}, \dots, a_{i,M})$, $\forall i \in I$
- Start probabilities: $p(y_1) \sim \mathrm{Multinomial}(\pi_1, \pi_2, \dots, \pi_M)$
- Emission probabilities associated with each state:
  $p(x_t \mid y_t^i = 1) \sim \mathrm{Multinomial}(b_{i,1}, b_{i,2}, \dots, b_{i,K})$, $\forall i \in I$, or in general $p(x_t \mid y_t^i = 1) \sim f(\cdot \mid \theta_i)$, $\forall i \in I$

[Figure: the graphical model $y_1 \to y_2 \to y_3 \to \dots \to y_T$ with emissions $x_{A_1}, x_{A_2}, x_{A_3}, \dots, x_{A_T}$, and the equivalent state automaton]

Probability of a Parse

- Given a sequence $x = x_1 \dots x_T$ and a parse $y = y_1 \dots y_T$
- To find how likely the parse is (given our HMM and the sequence):

Joint probability:
$p(x, y) = p(x_1 \dots x_T, y_1 \dots y_T)$
$\quad = p(y_1)\, p(x_1 \mid y_1)\, p(y_2 \mid y_1)\, p(x_2 \mid y_2) \cdots p(y_T \mid y_{T-1})\, p(x_T \mid y_T)$
$\quad = p(y_1)\, p(y_2 \mid y_1) \cdots p(y_T \mid y_{T-1}) \times p(x_1 \mid y_1)\, p(x_2 \mid y_2) \cdots p(x_T \mid y_T)$
$\quad = p(y_1, \dots, y_T)\, p(x_1 \dots x_T \mid y_1 \dots y_T)$

Let $a_{y_{t-1}, y_t} \overset{\mathrm{def}}{=} \prod_{i,j=1}^{M} a_{ij}^{\,y_{t-1}^i y_t^j}$ and $b_{y_t, x_t} \overset{\mathrm{def}}{=} \prod_{i=1}^{M} \prod_{k=1}^{K} b_{ik}^{\,y_t^i x_t^k}$; then

$p(x, y) = p(y_1) \prod_{t=2}^{T} a_{y_{t-1}, y_t} \prod_{t=1}^{T} b_{y_t, x_t} = \pi_{y_1}\, a_{y_1 y_2} \cdots a_{y_{T-1} y_T}\, b_{y_1, x_1} \cdots b_{y_T, x_T}$

Marginal probability: $p(x) = \sum_{y} p(x, y) = \sum_{y_1} \sum_{y_2} \cdots \sum_{y_T} \pi_{y_1} \prod_{t=2}^{T} a_{y_{t-1}, y_t} \prod_{t=1}^{T} p(x_t \mid y_t)$
Posterior probability: $p(y \mid x) = p(x, y) / p(x)$

The Dishonest Casino Model

[State diagram: two states, FAIR and LOADED; each stays put with probability 0.95 and switches to the other with probability 0.05]
- FAIR: P(1|F) = P(2|F) = P(3|F) = P(4|F) = P(5|F) = P(6|F) = 1/6
- LOADED: P(1|L) = P(2|L) = P(3|L) = P(4|L) = P(5|L) = 1/10, P(6|L) = 1/2

Example: the Dishonest Casino

- Let the sequence of rolls be: x = 1, 2, 1, 5, 6, 2, 1, 6, 2, 4
- Then what is the likelihood of y = Fair, Fair, Fair, Fair, Fair, Fair, Fair, Fair, Fair, Fair?
  (say initial probs $a_{0,\mathrm{Fair}} = a_{0,\mathrm{Loaded}} = 1/2$)
  $= \tfrac{1}{2} \times P(1 \mid \mathrm{Fair})\, P(\mathrm{Fair} \mid \mathrm{Fair})\, P(2 \mid \mathrm{Fair})\, P(\mathrm{Fair} \mid \mathrm{Fair}) \cdots P(4 \mid \mathrm{Fair})$
  $= \tfrac{1}{2} \times (1/6)^{10} \times (0.95)^{9} = 0.00000000521158647211 \approx 5.21 \times 10^{-9}$

Example: the Dishonest Casino

- So the likelihood that the die is fair through this entire run is just $5.21 \times 10^{-9}$.
- OK, but what is the likelihood of y = Loaded, Loaded, Loaded, Loaded, Loaded, Loaded, Loaded, Loaded, Loaded, Loaded?
  $= \tfrac{1}{2} \times P(1 \mid \mathrm{Loaded})\, P(\mathrm{Loaded} \mid \mathrm{Loaded}) \cdots P(4 \mid \mathrm{Loaded})$
  $= \tfrac{1}{2} \times (1/10)^{8} \times (1/2)^{2} \times (0.95)^{9} = 0.00000000078781176215 \approx 0.79 \times 10^{-9}$
- Therefore, it is after all 6.59 times more likely that the die is fair all the way than that it is loaded all the way.
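The two parse likelihoods above can be reproduced with a few lines of code. Below is a minimal sketch in Python (not part of the original slides; the names start, trans, emit, and joint_prob are illustrative choices) that scores a complete parse via $p(x, y) = \pi_{y_1} \prod_{t=2}^{T} a_{y_{t-1} y_t} \prod_{t=1}^{T} b_{y_t, x_t}$ for the casino model.

# Minimal sketch: joint probability p(x, y) of a parse under the casino HMM.
# Parameter values follow the slides; the dictionary names are illustrative.
start = {"F": 0.5, "L": 0.5}                                    # uniform initial probabilities, as on the slide
trans = {("F", "F"): 0.95, ("F", "L"): 0.05,
         ("L", "F"): 0.05, ("L", "L"): 0.95}
emit = {"F": {r: 1 / 6 for r in range(1, 7)},                   # fair die
        "L": {1: 0.1, 2: 0.1, 3: 0.1, 4: 0.1, 5: 0.1, 6: 0.5}}  # loaded die

def joint_prob(x, y):
    """p(x, y) = p(y1) b_{y1,x1} * prod_{t>=2} a_{y_{t-1},y_t} b_{y_t,x_t}."""
    p = start[y[0]] * emit[y[0]][x[0]]
    for t in range(1, len(x)):
        p *= trans[(y[t - 1], y[t])] * emit[y[t]][x[t]]
    return p

x = [1, 2, 1, 5, 6, 2, 1, 6, 2, 4]
print(joint_prob(x, ["F"] * 10))   # ~5.21e-09: all rolls from the fair die
print(joint_prob(x, ["L"] * 10))   # ~7.88e-10: all rolls from the loaded die

Dividing the two outputs recovers the fair-vs-loaded odds quoted on the slide, and summing joint_prob over all $2^{10}$ possible parses would give the marginal $p(x)$, which is exactly what the forward algorithm later computes efficiently.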
Example: the Dishonest Casino

- Let the sequence of rolls be: x = 1, 6, 6, 5, 6, 2, 6, 6, 3, 6
- Now what is the likelihood of y = F, F, ..., F?
  $= \tfrac{1}{2} \times (1/6)^{10} \times (0.95)^{9} \approx 5.21 \times 10^{-9}$, same as before
- What is the likelihood of y = L, L, ..., L?
  $= \tfrac{1}{2} \times (1/10)^{4} \times (1/2)^{6} \times (0.95)^{9} = 0.00000049238235134735 \approx 5 \times 10^{-7}$
- So it is about 100 times more likely that the die is loaded.

Three Main Questions on HMMs

1. Evaluation
   GIVEN: an HMM M, and a sequence x
   FIND: Prob(x | M)
   ALGO: Forward
2. Decoding
   GIVEN: an HMM M, and a sequence x
   FIND: the sequence y of states that maximizes, e.g., P(y | x, M), or the most probable subsequence of states
   ALGO: Viterbi, Forward-backward
3. Learning
   GIVEN: an HMM M with unspecified transition/emission probabilities, and a sequence x
   FIND: parameters $\theta = (\pi_i, a_{ij}, \eta_{ik})$ that maximize $P(x \mid \theta)$
   ALGO: Baum-Welch (EM)

(Minimal code sketches of the forward and Viterbi recursions for the casino model appear below, after the biology examples.)

Applications of HMMs

- Some early applications of HMMs:
  - finance, but we never saw them
  - speech recognition
  - modelling ion channels
- In the mid- to late 1980s, HMMs entered genetics and molecular biology, and they are now firmly entrenched.
- Some current applications of HMMs to biology:
  - mapping chromosomes
  - aligning biological sequences
  - predicting sequence structure
  - inferring evolutionary relationships
  - finding genes in DNA sequence

Typical structure of a gene

[Figure: schematic of the typical structure of a gene]

GENSCAN (Burge & Karlin)

[Figure: the GENSCAN state diagram, with exon states E0, E1, E2, intron states I0, I1, I2, initial/terminal/single-exon states (Ei, Et, Es), 5' UTR, 3' UTR, poly-A signal, promoter, and intergenic region, mirrored for the forward and reverse strands, shown alongside a stretch of raw genomic DNA sequence beginning GAGAACGTGTGAGAGAGAGGCAAGCCGAAAAATCAGCCGC...]
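The evaluation question above, and gene finders built on the same machinery, come down to the forward recursion: $\alpha_t(j) = p(x_1 \dots x_t, y_t = j) = b_{j, x_t} \sum_i \alpha_{t-1}(i)\, a_{ij}$, with $p(x) = \sum_j \alpha_T(j)$. Here is a minimal sketch (not from the original slides; names are illustrative) using the same casino parameters as in the earlier sketch.

# Minimal sketch of the forward algorithm for the two-state casino HMM,
# answering the evaluation question: p(x | M).
start = {"F": 0.5, "L": 0.5}                                    # same casino parameters as above
trans = {("F", "F"): 0.95, ("F", "L"): 0.05,
         ("L", "F"): 0.05, ("L", "L"): 0.95}
emit = {"F": {r: 1 / 6 for r in range(1, 7)},
        "L": {1: 0.1, 2: 0.1, 3: 0.1, 4: 0.1, 5: 0.1, 6: 0.5}}

def forward(x, states):
    # alpha[t][j] = p(x_1..x_t, y_t = j), stored with 0-based list indexing
    alpha = [{j: start[j] * emit[j][x[0]] for j in states}]
    for t in range(1, len(x)):
        alpha.append({j: emit[j][x[t]] *
                         sum(alpha[t - 1][i] * trans[(i, j)] for i in states)
                      for j in states})
    return sum(alpha[-1].values())                              # p(x) = sum_j alpha_T(j)

x = [1, 2, 1, 5, 6, 2, 1, 6, 2, 4]
print(forward(x, ["F", "L"]))   # p(x): sums over all 2^10 parses in O(M^2 T) time

For long sequences one would work in log space or rescale each alpha column to avoid numerical underflow; the ten-roll example here is short enough for plain floating point.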
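Gene finders such as GENSCAN answer the decoding question at genomic scale; in miniature, for the two-state casino, the same job is done by the Viterbi recursion. Below is a minimal sketch (not from the original slides; names are illustrative, and the casino parameters are repeated from the sketches above) that labels each roll of the puzzle-slide sequence with its most probable die.

# Minimal Viterbi sketch for the two-state casino HMM (decoding question):
# find the single most probable state sequence y* = argmax_y p(x, y).
start = {"F": 0.5, "L": 0.5}                                    # same casino parameters as above
trans = {("F", "F"): 0.95, ("F", "L"): 0.05,
         ("L", "F"): 0.05, ("L", "L"): 0.95}
emit = {"F": {r: 1 / 6 for r in range(1, 7)},
        "L": {1: 0.1, 2: 0.1, 3: 0.1, 4: 0.1, 5: 0.1, 6: 0.5}}

def viterbi(x, states):
    # delta[t][j] = max over parses ending in state j of p(x_1..x_t, y_1..y_t)
    delta = [{j: start[j] * emit[j][x[0]] for j in states}]
    back = []                                                   # back[t][j] = best predecessor of state j at time t+1
    for t in range(1, len(x)):
        delta.append({})
        back.append({})
        for j in states:
            best = max(states, key=lambda i: delta[t - 1][i] * trans[(i, j)])
            back[t - 1][j] = best
            delta[t][j] = delta[t - 1][best] * trans[(best, j)] * emit[j][x[t]]
    # backtrack from the most probable final state
    path = [max(states, key=lambda j: delta[-1][j])]
    for t in range(len(x) - 2, -1, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

rolls = [int(c) for c in "1245526462146146136136661664661636616366163616515615115146123562344"]
print("".join(viterbi(rolls, ["F", "L"])))   # one letter (F or L) per roll: the most probable parse

As with the forward sketch, a practical implementation would keep delta in log space to avoid underflow on long sequences.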