UB CSE 555 - Hidden Markov Models


Hidden Markov Models and Sequential Data

Outline: Sequential Data · Sound Spectrogram of Spoken Words · Task of Making a Sequence of Decisions · Model Assuming Independence · Markov Model · First-Order Markov Model · Second-Order Markov Model · Introducing Latent Variables · Two Models Described by this Graph · Latent Variable with Three Discrete States · Markov Model for the Production of Spoken Words · First-Order Markov Models (MMs) · First-Order Hidden Markov Models · Hidden Markov Model Computation · Three Basic Problems for HMMs · Problem 1: Evaluation Problem · Evaluation Problem Formula · Computationally Simpler Evaluation Algorithm · HMM Forward · Problem 2: Decoding Problem · Problem 3: Learning Problem · Forward-Backward Algorithm · Backward Probabilities · Backward Evaluation Algorithm · Estimating the aij and bjk · Calculating Improved Estimate of aij · Calculating Improved Estimate of bjk · Learning Algorithm · Convergence · HMM Word Recognition · HMM Spoken Word Recognition · Cursive Word Recognition (not HMM) · Summary: Key Algorithms for HMM

Sequential Data
• Often arise through measurement of time series:
  • Snowfall measurements on successive days in Buffalo
  • Rainfall measurements in Cherrapunji
  • Daily values of a currency exchange rate
  • Acoustic features at successive time frames in speech recognition
• Non-time-series examples:
  • Nucleotide base pairs in a strand of DNA
  • Sequence of characters in an English sentence
  • Parts of speech of successive words

Sound Spectrogram of Spoken Words
• Example: the spoken phrase "Bayes Theorem"
• Plot of the intensity of the spectral coefficients versus time index
• Successive observations of the speech spectrum are highly correlated

Task of Making a Sequence of Decisions
• Processes evolve in time; the state at time t is influenced by the state at time t-1
• In many time-series applications, e.g. financial forecasting, we wish to predict the next value from previous values
• It is impractical to model general dependence of the future on all previous observations: the complexity would grow without limit as the number of observations increases
• Markov models assume dependence only on the most recent observations

Model Assuming Independence
• Simplest model: treat the observations as independent
• Corresponds to a graph without links

Markov Model
• The most general model for a sequence of observations {x_n} uses the product rule to express the joint distribution:
  p(x_1, ..., x_N) = ∏_{n=1}^{N} p(x_n | x_1, ..., x_{n-1})

First-Order Markov Model
• Chain of observations {x_n} in which the distribution p(x_n | x_{n-1}) is conditioned only on the previous observation
• Joint distribution for a sequence of N variables:
  p(x_1, ..., x_N) = p(x_1) ∏_{n=2}^{N} p(x_n | x_{n-1})
• It can be verified (using the product rule above) that
  p(x_n | x_1, ..., x_{n-1}) = p(x_n | x_{n-1})
• If the model is used to predict the next observation, the distribution of the prediction depends only on the preceding observation and is independent of all earlier observations

Second-Order Markov Model
• The conditional distribution of observation x_n depends on the values of the two previous observations x_{n-1} and x_{n-2}:
  p(x_1, ..., x_N) = p(x_1) p(x_2 | x_1) ∏_{n=3}^{N} p(x_n | x_{n-1}, x_{n-2})
• Each observation is influenced by the previous two observations

Introducing Latent Variables
• For each observation x_n, introduce a latent variable z_n
• z_n may be of a different type or dimensionality than the observed variable
• The latent variables form the Markov chain, giving the "state-space model"
• Joint distribution for this model:
  p(x_1, ..., x_N, z_1, ..., z_N) = p(z_1) [∏_{n=2}^{N} p(z_n | z_{n-1})] ∏_{n=1}^{N} p(x_n | z_n)

Two Models Described by this Graph
1. If the latent variables are discrete: Hidden Markov Model (the observed variables in an HMM may be discrete or continuous)
2. If both latent and observed variables are Gaussian: linear dynamical system

Latent Variable with Three Discrete States
• Transition probabilities a_ij are represented by a matrix
• The state-transition diagram is not a graphical model, since its nodes are not separate variables but states of a single variable
• It can be unfolded over time to get a trellis diagram

Markov Model for the Production of Spoken Words
• States represent phonemes
• Production of the word "cat":
  • Represented by the states /k/ /a/ /t/
  • Transitions from /k/ to /a/, from /a/ to /t/, and from /t/ to a silent state
• Although only the correct "cat" sound is represented by the model, other transitions could be introduced, e.g. /k/ followed by /t/

First-Order Markov Models (MMs)
• State at time t: ω(t)
• A particular sequence of length T: ω^T = {ω(1), ω(2), ω(3), ..., ω(T)}, e.g. ω^6 = {ω_1, ω_4, ω_2, ω_2, ω_1, ω_4}
  Note: the system can revisit a state at different steps, and not every state needs to be visited
• The model for the production of any sequence is described by the transition probabilities
  P(ω_j(t+1) | ω_i(t)) = a_ij
  a time-independent probability of being in state ω_j at step t+1 given that the state at step t was ω_i
• There is no requirement that the transition probabilities be symmetric
• A particular model is θ = {a_ij}. Given model θ, the probability that the model generated the sequence ω^6 = {ω_1, ω_4, ω_2, ω_2, ω_1, ω_4} is
  P(ω^6 | θ) = a_14 · a_42 · a_22 · a_21 · a_14
• We can also include the a priori probability of the first state, P(ω(1) = ω_i)
• Discrete states = nodes; transition probabilities = links
• In a first-order discrete-time HMM, at step t the system is in state ω(t); the state at step t+1 is a random function that depends on the state at step t and on the transition probabilities

First-Order Hidden Markov Models
• The perceiver does not have access to the states ω(t); instead we measure properties of the emitted sound
• We therefore need to augment the Markov model to allow for visible states (symbols)
• Visible states (symbols): V^T = {v(1), v(2), v(3), ..., v(T)}, for instance V^6 = {v_5, v_1, v_1, v_5, v_2, v_3}
• In any state ω_j(t), the probability of emitting symbol v_k(t) is b_jk
• Figure: three hidden units in an HMM, with the visible states and their emission probabilities shown in red

Hidden Markov Model Computation
• Finite state machines with transition probabilities

