
Announcements
•Midterm scores (without challenge problem):
–Median 85.5, mean 79, std 16.
–Roughly, 85-100 ~A, 70-85 ~B, <70 ~C.

Hidden Markov Models
•Generative, rather than descriptive, model.
–Objects are produced by a random process.
•Dependencies in the process: some random events influence others.
–Time is the most natural metaphor here.
•The simplest, most tractable model of dependencies is Markov.
•Lecture based on: Rabiner, "A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition."

Markov Chain
•States: S1, ..., SN.
•Discrete time steps 1, 2, ...
•State at time t is q(t).
•Initial state q(1), with pi(i) = P(q(1) = Si).
•P(q(t) = Sj | q(t-1) = Si, q(t-2) = Sk, ...) = P(q(t) = Sj | q(t-1) = Si).
This is what makes it Markov: the next state depends only on the current state.
•Time independence: a(ij) = P(q(t) = Sj | q(t-1) = Si), the same for all t.

Examples
•1D random walk in a finite space.
•1D curve generated by a random walk in orientation.

States of Markov Chain
•Represent the state distribution at time t as a vector:
w(t) = (P(q(t)=S1), P(q(t)=S2), ..., P(q(t)=SN)).
•Put the transitions into a matrix A, with entry (j,i) equal to a(ij).
–A is stochastic, meaning its columns sum to 1: each column is the distribution over next states given the current state.
•Then w(t) = A*w(t-1).

Asymptotic behavior of Markov Chain
•w(n) = A(A(...(A*w(1)))) = A^(n-1)*w(1).
–w(n) converges to the leading eigenvector of A.
•This means the asymptotic behavior is independent of the initial conditions.
•Some special conditions are required:
–Every state must be reachable from every state (the chain is ergodic).
–A periodic Markov chain may not converge.

Hidden Markov Model
•Observation symbols v(1), v(2), ..., v(M).
–We never observe the state, but at each time step the state produces an observation.
•Observation distribution: b(j,k) = P(v(k) at time t | q(t) = Sj).
Note this is also taken to be time independent.
•Example: an HMM that generates contours varying from smooth to rough.

Three problems
•Probability of the observations given the model.
–Used to select a model given observations (e.g., speech recognition).
–Used to refine an estimate of the HMM.
•Given the model and the observations, what were the likely states?
–States may have semantics (rough/smooth contour).
–May provide intuitions about the model.
•Find the model that optimizes the probability of the observations.
–Learning the model.

Probability of Observations
•Solved with dynamic programming.
•Whiteboard (see Rabiner for
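The state-vector recursion w(t) = A*w(t-1) and its convergence to the leading eigenvector can be sketched numerically. This is a minimal illustration with a hypothetical 3-state chain (the transition probabilities below are made up, not from the lecture):

```python
# Markov chain state evolution: w(t) = A @ w(t-1), with A arranged
# column-stochastic so that entry A[j, i] = a(ij) = P(q(t)=Sj | q(t-1)=Si).
# The 3-state transition matrix here is a made-up example.
import numpy as np

A = np.array([[0.8, 0.1, 0.2],
              [0.1, 0.7, 0.3],
              [0.1, 0.2, 0.5]])   # each column sums to 1

w = np.array([1.0, 0.0, 0.0])     # start deterministically in S1
for _ in range(100):              # iterate: w(n) = A^(n-1) w(1)
    w = A @ w

# For an ergodic chain, w(n) converges to the leading eigenvector of A
# (eigenvalue 1), independent of the initial distribution.
vals, vecs = np.linalg.eig(A)
stationary = np.real(vecs[:, np.argmax(np.real(vals))])
stationary /= stationary.sum()    # normalize to a probability vector
```

Starting from any other initial w gives the same limit, which is the point of the "asymptotic behavior" slide.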
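The first of the three problems, P(observations | model), is solved by the forward algorithm, the dynamic program alluded to on the last slide (Rabiner's Section III.A). A minimal sketch with a hypothetical 2-state, 2-symbol model (all numbers below are illustrative, not from the lecture):

```python
# Forward algorithm: P(O | model) by dynamic programming over
# alpha[t, j] = P(o(1..t), q(t) = Sj | model).
# The 2-state model parameters here are made up for illustration.
import numpy as np

pi = np.array([0.6, 0.4])            # pi(i) = P(q(1) = Si)
a  = np.array([[0.7, 0.3],           # a[i, j] = P(q(t)=Sj | q(t-1)=Si)
               [0.4, 0.6]])
b  = np.array([[0.9, 0.1],           # b[j, k] = P(v(k) | q(t) = Sj)
               [0.2, 0.8]])

def forward(obs):
    """Return P(obs | model) via the forward recursion."""
    alpha = pi * b[:, obs[0]]        # initialization: alpha(1, j)
    for o in obs[1:]:                # induction: sum over previous states
        alpha = (alpha @ a) * b[:, o]
    return alpha.sum()               # terminate: sum over final states
```

The recursion costs O(N^2 T), versus O(N^T) for summing over all state sequences directly, which is why dynamic programming is essential here.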
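The second problem, recovering the likely states given the model and observations, is solved by the closely related Viterbi dynamic program (Rabiner's Section III.B), which replaces the forward algorithm's sums with maxima. A sketch using the same kind of hypothetical 2-state model as above:

```python
# Viterbi decoding: the most probable state sequence given the
# observations, by dynamic programming with backpointers.
# The 2-state model parameters here are made up for illustration.
import numpy as np

pi = np.array([0.6, 0.4])            # pi(i) = P(q(1) = Si)
a  = np.array([[0.7, 0.3],           # a[i, j] = P(q(t)=Sj | q(t-1)=Si)
               [0.4, 0.6]])
b  = np.array([[0.9, 0.1],           # b[j, k] = P(v(k) | q(t) = Sj)
               [0.2, 0.8]])

def viterbi(obs):
    """Return the most probable state sequence for obs."""
    delta = pi * b[:, obs[0]]        # best path probability ending in each state
    back = []                        # backpointers for path recovery
    for o in obs[1:]:
        trans = delta[:, None] * a   # trans[i, j]: best path at i extended to j
        back.append(trans.argmax(axis=0))
        delta = trans.max(axis=0) * b[:, o]
    path = [int(delta.argmax())]     # best final state, then trace back
    for ptr in reversed(back):
        path.append(int(ptr[path[-1]]))
    return path[::-1]
```

In the rough/smooth contour example from the slides, the decoded states would label each stretch of the contour as rough or smooth.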



UMD CMSC 828 - Hidden Markov Models
