Slide 1: Digital Speech Processing - Lecture 20
The Hidden Markov Model (HMM)

Slide 2: Lecture Outline
• Theory of Markov Models
– discrete Markov processes
– hidden Markov processes
• Solutions to the Three Basic Problems of HMMs
– computation of observation probability
– determination of optimal state sequence
– optimal training of model
• Variations of elements of the HMM
– model types
– densities
• Implementation Issues
– scaling
– multiple observation sequences
– initial parameter estimates
– insufficient training data
• Implementation of Isolated Word Recognizer Using HMMs

Slide 3: Stochastic Signal Modeling
• Reasons for interest:
– basis for a theoretical description of signal processing algorithms
– can learn about signal source properties
– models work well in practice in real-world applications
• Types of signal models:
– deterministic, parametric models
– stochastic models

Slide 4: Discrete Markov Processes
A system with N distinct states, {S_1, S_2, ..., S_N}, occupies exactly one state at each time step:

Time (t):  1    2    3    4    5   ...
State:     q_1  q_2  q_3  q_4  q_5 ...

Markov property:
P[q_t = S_j | q_{t-1} = S_i, q_{t-2} = S_k, ...] = P[q_t = S_j | q_{t-1} = S_i]

Slide 5: Properties of State Transition Coefficients
Consider processes where the state transitions are time independent, i.e.,
a_{ij} = P[q_t = S_j | q_{t-1} = S_i],  1 ≤ i, j ≤ N
with
a_{ij} ≥ 0 for all i, j
Σ_{j=1}^{N} a_{ij} = 1 for all i

Slide 6: Example of Discrete Markov Process
Once each day (e.g., at noon), the weather is observed and classified as one of the following:
– State 1: Rain (or Snow; i.e., precipitation)
– State 2: Cloudy
– State 3: Sunny
with state transition probabilities:

A = {a_{ij}} = | 0.4  0.3  0.3 |
               | 0.2  0.6  0.2 |
               | 0.1  0.1  0.8 |

Slide 7: Discrete Markov Process
Problem: Given that the weather on day 1 is sunny, what is the probability (according to the model) that the weather for the next 7 days will be "sunny-sunny-rain-rain-sunny-cloudy-sunny"?

Solution: Define the observation sequence O as:
O = {S_3, S_3, S_3, S_1, S_1, S_3, S_2, S_3}
and calculate P(O | Model).

Slide 8: Discrete Markov Process
P(O | Model) = P[S_3, S_3, S_3, S_1, S_1, S_3, S_2, S_3 | Model]
  = P[S_3] · P[S_3|S_3] · P[S_3|S_3] · P[S_1|S_3] · P[S_1|S_1] · P[S_3|S_1] · P[S_2|S_3] · P[S_3|S_2]
  = π_3 · a_{33} · a_{33} · a_{31} · a_{11} · a_{13} · a_{32} · a_{23}
  = 1 · (0.8)(0.8)(0.1)(0.4)(0.3)(0.1)(0.2)
  = 1.536 × 10^{-4}
where π_i = P[q_1 = S_i], 1 ≤ i ≤ N.

Slide 9: Discrete Markov Process
Problem: Given that the model is in a known state, what is the probability that it stays in that state for exactly d days?

Solution: The observation sequence is d days in state S_i followed by a transition out:
O = {S_i, S_i, S_i, ..., S_i, S_j ≠ S_i}  (days 1, 2, 3, ..., d, d+1)
P(O | Model, q_1 = S_i) = (a_{ii})^{d-1} (1 - a_{ii}) = p_i(d),  d = 1, 2, 3, ...
The expected number of consecutive days spent in state i is:
d̄_i = Σ_{d=1}^{∞} d · p_i(d) = 1 / (1 - a_{ii})
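The chain-rule calculation on slide 8 and the duration statistics on slide 9 are easy to verify numerically. Below is a minimal sketch in plain Python (not part of the original lecture; the function and variable names are my own) that reproduces both results:

    # Weather model from slide 6: states 0 = Rain, 1 = Cloudy, 2 = Sunny.
    A = [
        [0.4, 0.3, 0.3],  # transitions out of Rain
        [0.2, 0.6, 0.2],  # transitions out of Cloudy
        [0.1, 0.1, 0.8],  # transitions out of Sunny
    ]

    def sequence_probability(states, A, pi):
        """P(O | Model) for a fully observed state sequence (chain rule, slide 8)."""
        p = pi[states[0]]
        for prev, cur in zip(states, states[1:]):
            p *= A[prev][cur]
        return p

    # O = {S3, S3, S3, S1, S1, S3, S2, S3}; day 1 is known to be sunny.
    O = [2, 2, 2, 0, 0, 2, 1, 2]
    print(sequence_probability(O, A, pi=[0.0, 0.0, 1.0]))  # 1.536e-04

    # Expected duration in state i (slide 9) is 1 / (1 - a_ii); e.g., Sunny:
    print(1.0 / (1.0 - A[2][2]))  # 5.0 consecutive sunny days on average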
Slide 10: Exercise
Given a single fair coin, i.e., P(H = Heads) = P(T = Tails) = 0.5, which you toss once and observe Tails:
a) What is the probability that the next 10 tosses will produce the sequence {H H T H T T H T T H}?
SOLUTION: For a fair coin with independent tosses, the probability of any specific observation sequence of length 10 (10 tosses) is (1/2)^{10}, since there are 2^{10} such sequences and all are equally probable. Thus:
P(H H T H T T H T T H) = (1/2)^{10}

Slide 11: Exercise
b) What is the probability that the next 10 tosses will produce the sequence {H H H H H H H H H H}?
SOLUTION: Similarly:
P(H H H H H H H H H H) = (1/2)^{10}
Thus a specified run of 10 straight heads is just as likely as a specified sequence of interlaced H's and T's.

Slide 12: Exercise
c) What is the probability that 5 of the next 10 tosses will be tails? What is the expected number of tails over the next 10 tosses?
SOLUTION: The probability of 5 tails in the next 10 tosses is the number of observation sequences with 5 tails and 5 heads (in any order) times the probability of each such sequence:
P(5H, 5T) = C(10, 5) · (1/2)^{10} = 252/1024 ≈ 0.25
since there are C(10, 5) combinations (ways of getting 5 H's and 5 T's) in 10 coin tosses, and each sequence has probability (1/2)^{10}. The expected number of tails in 10 tosses is:
E(number of T in 10 tosses) = Σ_{d=0}^{10} d · C(10, d) · (1/2)^{10} = 5
Thus, on average, there will be 5 H's and 5 T's in 10 tosses, but the probability of exactly 5 H's and 5 T's is only about 0.25.

Slide 13: Coin Toss Models
A series of coin-tossing experiments is performed. The number of coins is unknown; only the result of each toss is revealed. A typical observation sequence is:
O = O_1 O_2 O_3 ... O_T = H H T T T H T T H ... H
Problem: Build an HMM to explain the observation sequence.
Issues:
1. What are the states in the model?
2. How many states should be used?
3. What are the state transition probabilities?

Slides 14-15: Coin Toss Models
(Figure slides: diagrams of candidate coin-toss models.)

Slide 16: Coin Toss Models
Problem: Consider an HMM representation (model λ) of a coin-tossing experiment. Assume a 3-state model (corresponding to 3 different coins) with probabilities:

        State 1   State 2   State 3
P(H)    0.5       0.75      0.25
P(T)    0.5       0.25      0.75

and with all state transition probabilities equal to 1/3. (Assume initial state probabilities of 1/3.)
a) You observe the sequence O = H H H H T H T T T T. What state sequence is most likely? What is the probability of the observation sequence and this most likely state sequence?

Slide 17: Coin Toss Problem Solution
SOLUTION: Given O = H H H H T H T T T T, the most likely state sequence is the one for which the probability of each individual observation is maximum. Thus for each H the most likely state is S_2, and for each T the most likely state is S_3, so the most likely state sequence is:
S = S_2 S_2 S_2 S_2 S_3 S_2 S_3 S_3 S_3 S_3
The probability of O and S (given the model) is:
P(O, S | λ) = (0.75)^{10} (1/3)^{10}

Slide 18: Coin Toss Models
b) What is the probability that the observation sequence came entirely from state 1?
SOLUTION: The state sequence in question is:
Ŝ = S_1 S_1 S_1 S_1 S_1 S_1 S_1 S_1 S_1 S_1
and the probability of O given Ŝ is:
P(O, Ŝ | λ) = (0.50)^{10} (1/3)^{10}
The ratio R of P(O, S | λ) to P(O, Ŝ | λ) is:
R = P(O, S | λ) / P(O, Ŝ | λ) = (3/2)^{10} ≈ 57.67

Slide 19: Coin Toss Models
c) Consider the observation sequence:
O = H T T H T H H T T H
How would your answers to parts a and b change?
SOLUTION: Since O has the same number of H's and T's as before, the answers to parts a and b remain the same: the most likely state is still S_2 for each H and S_3 for each T, and both probabilities depend only on the counts of H's and T's, not on their order.
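The arithmetic in parts a and b above can be checked the same way. Here is a sketch of my own (the names and structure are assumptions, not from the lecture) that scores a joint state/observation path under the 3-coin model of slide 16:

    # Emission probabilities per state, from the table on slide 16.
    emit = {
        1: {'H': 0.50, 'T': 0.50},
        2: {'H': 0.75, 'T': 0.25},
        3: {'H': 0.25, 'T': 0.75},
    }
    pi = 1.0 / 3.0  # uniform initial state probability
    a = 1.0 / 3.0   # uniform state transition probability

    def joint_probability(obs, states):
        """P(O, S | lambda) for an observation string and a state path."""
        p = pi * emit[states[0]][obs[0]]
        for o, s in zip(obs[1:], states[1:]):
            p *= a * emit[s][o]
        return p

    O = "HHHHTHTTTT"
    S_best = [2, 2, 2, 2, 3, 2, 3, 3, 3, 3]  # part a: S2 for each H, S3 for each T
    S_one = [1] * 10                         # part b: every toss from state 1

    p_best = joint_probability(O, S_best)  # (0.75)**10 * (1/3)**10
    p_one = joint_probability(O, S_one)    # (0.50)**10 * (1/3)**10
    print(p_best / p_one)                  # (3/2)**10 ≈ 57.67

Note that joint_probability scores a single state path jointly with O; summing it over all 3^10 possible paths would give the observation likelihood P(O | λ), which is what the forward algorithm of the "three basic problems" in the outline computes efficiently.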

