Markov Chains

A Markov chain is characterized by the Markov property and its one-step transition probabilities:

$\Pr\{X_{n+1}=j \mid X_n=i,\ X_{n-1}=i_{n-1},\ \ldots,\ X_0=i_0\} = \Pr\{X_{n+1}=j \mid X_n=i\} = P_{ij}$

with $P_{ij} \ge 0$ and $\sum_j P_{ij} = 1$ for all $i$, together with the initial distribution $p_i = \Pr\{X_0 = i\}$, $i = 0, 1, 2, \ldots$. The n-step transition probabilities are $P_{ij}^{(n)} = \Pr\{X_{m+n}=j \mid X_m=i\}$.

Matrix notation: The matrix of one-step transition probabilities is $\mathbf{P} = (P_{ij})$. The matrix of n-step transition probabilities is $\mathbf{P}^{(n)} = \mathbf{P}\mathbf{P}\cdots\mathbf{P} = \mathbf{P}^n$ (n times). The distribution at time n is $p_j^{(n)} = \Pr\{X_n = j\}$, with $\mathbf{p}^{(n)} = \mathbf{p}\,\mathbf{P}^n$ where $\mathbf{p} = (p_0, p_1, p_2, \ldots)$.

Absorbing chains: The one-step transition probability matrix, partitioned into transient, Q, and absorbing states:

$\mathbf{P} = \begin{pmatrix} Q & R \\ 0 & I \end{pmatrix}$

The expected times spent in each transient state before being absorbed: $N = (I - Q)^{-1}$.
The expected times to absorption: $T = N\mathbf{1}$.
The probabilities of absorption: $B = NR$.

Limiting probabilities: For regular chains (for some n, all elements of $\mathbf{P}^n$ are > 0),

$\lim_{n\to\infty} P_{ij}^{(n)} = \pi_j, \qquad \pi_j = \sum_{k=0}^{N} \pi_k P_{kj}, \qquad \sum_{j=0}^{N} \pi_j = 1, \qquad j = 0, 1, 2, \ldots, N$

Classification of states:
Accessible: state j is accessible from state i if $P_{ij}^{(n)} > 0$ for some $n \ge 0$. Notation: $i \rightarrow j$.
Communication: states i and j communicate if each state is accessible from the other. Notation: $i \leftrightarrow j$. Two states that communicate are in the same class. A Markov chain with only one class is irreducible.
Periodicity: state i is said to have period $d(i)$ if $d(i)$ is the greatest common divisor of all integers $n \ge 1$ for which $P_{ii}^{(n)} > 0$. If $d(i) = 1$, the chain is aperiodic.

Probability of first return to state i at step n, given the chain starts in state i:

$f_{ii}^{(n)} = \Pr\{X_n = i,\ X_k \ne i \text{ for } k = 1, 2, \ldots, n-1 \mid X_0 = i\}$

Connection with the n-step probabilities:

$P_{ii}^{(n)} = \sum_{k=0}^{n} f_{ii}^{(k)}\, P_{ii}^{(n-k)}$

Probability that a process starting in i returns to i at some time:
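The absorbing-chain quantities N, T, and B above are direct matrix computations. A minimal NumPy sketch, using a hypothetical symmetric gambler's-ruin chain on states {0, 1, 2, 3} (states 0 and 3 absorbing, up/down moves each with probability 1/2) as the example:

```python
import numpy as np

# Hypothetical example chain: symmetric gambler's ruin on {0, 1, 2, 3},
# with 0 and 3 absorbing. Transient part Q (states 1, 2) and the block R
# of transitions from transient states into the absorbing states (0, 3).
Q = np.array([[0.0, 0.5],
              [0.5, 0.0]])
R = np.array([[0.5, 0.0],
              [0.0, 0.5]])

# Fundamental matrix N = (I - Q)^{-1}: entry (i, j) is the expected number
# of visits to transient state j before absorption, starting from i.
N = np.linalg.inv(np.eye(2) - Q)

# Expected times to absorption: T = N * 1 (row sums of N).
T = N @ np.ones(2)

# Absorption probabilities: B = N R; each row sums to 1.
B = N @ R

print(T)  # [2. 2.]
print(B)  # [[2/3, 1/3], [1/3, 2/3]]
```

Starting from either interior state, the expected time to absorption is 2 steps, and the gambler is absorbed at the nearer boundary with probability 2/3.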
$f_{ii} = \sum_{n=0}^{\infty} f_{ii}^{(n)} = \lim_{N\to\infty} \sum_{n=0}^{N} f_{ii}^{(n)}$

Recurrent: state i is recurrent if and only if $f_{ii} = 1$, or equivalently if and only if $\sum_{n} P_{ii}^{(n)} = \infty$.
Transient: state i is transient if and only if $f_{ii} < 1$, or equivalently if and only if $\sum_{n} P_{ii}^{(n)} < \infty$.

Theorem: For a recurrent, irreducible, aperiodic Markov chain,
a) $\lim_{n\to\infty} P_{ii}^{(n)} = \dfrac{1}{\sum_{n=0}^{\infty} n f_{ii}^{(n)}} = \dfrac{1}{m_i}$
b) $\lim_{n\to\infty} P_{ji}^{(n)} = \lim_{n\to\infty} P_{ii}^{(n)}$ for all states j.

Stationary probability distribution: A set of probabilities $\pi_i \ge 0$ for $i \ge 0$ that satisfy:

$\pi_j = \sum_{k=0}^{\infty} \pi_k P_{kj}, \qquad \sum_{j=0}^{\infty} \pi_j = 1, \qquad j = 0, 1, 2, \ldots$
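For a finite chain, the stationary equations $\pi = \pi P$, $\sum_j \pi_j = 1$ form a linear system: replace one (redundant) balance equation with the normalization constraint and solve. A minimal sketch, assuming a hypothetical 3-state regular chain chosen for illustration:

```python
import numpy as np

# Hypothetical 3-state regular chain (all entries of P^n are eventually > 0).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# pi P = pi  <=>  (P^T - I) pi^T = 0. The balance equations are linearly
# dependent, so drop one row and append the normalization sum(pi) = 1.
n = P.shape[0]
A = np.vstack([(P.T - np.eye(n))[:-1], np.ones(n)])
b = np.zeros(n)
b[-1] = 1.0
pi = np.linalg.solve(A, b)

# For a regular chain, every row of P^n converges to pi (limiting
# probabilities equal the stationary distribution).
Pn = np.linalg.matrix_power(P, 60)
print(pi)
```

The check that `Pn` has (numerically) identical rows equal to `pi` is exactly the limiting-probability statement for regular chains above.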


ISU STAT 432 - Formulas MarkovChains
