Markov Chains

A Markov chain is characterized by the Markov property and stationary one-step transition probabilities:

$$\Pr\{X_{n+1}=j \mid X_n=i,\ X_{n-1}=i_{n-1},\ \dots,\ X_0=i_0\} = \Pr\{X_{n+1}=j \mid X_n=i\} = P_{ij}, \qquad n = 0, 1, 2, \dots$$

with $P_{ij} \ge 0$ and $\sum_j P_{ij} = 1$ for every state $i$.

Matrix notation: The matrix of one-step transition probabilities is $\mathbf{P} = (P_{ij})$. The matrix of $n$-step transition probabilities is $\mathbf{P}^{(n)} = \underbrace{\mathbf{P}\,\mathbf{P}\cdots\mathbf{P}}_{n \text{ times}} = \mathbf{P}^n$. If $\mathbf{p} = (p_0, p_1, p_2, \dots)$ with $p_j = \Pr\{X_0 = j\}$ is the initial distribution, then $\Pr\{X_n = j\}$ is the $j$-th component of $\mathbf{p}\,\mathbf{P}^n$.

Absorbing chains: The one-step transition probability matrix partitioned into transient, $Q$, and absorbing, $R$, states:

$$\mathbf{P} = \begin{pmatrix} Q & R \\ 0 & I \end{pmatrix}$$

The expected times spent in each transient state before being absorbed: $N = (I - Q)^{-1}$. The expected times to absorption: $T = N\mathbf{1}$. The probabilities of absorption: $B = NR$.

Limiting probabilities: Regular chains (for some $n$, all elements of $\mathbf{P}^n$ are $> 0$):

$$\pi_j = \lim_{n \to \infty} P^n_{ij}, \qquad \pi_j = \sum_{k=0}^{N} \pi_k P_{kj}, \qquad \sum_{j=0}^{N} \pi_j = 1, \qquad j = 0, 1, 2, \dots, N$$

Classification of states:

Accessible: state $j$ is accessible from state $i$ if $P^n_{ij} > 0$ for some $n \ge 0$. Notation: $i \to j$.

Communication: states $i$ and $j$ communicate if each state is accessible from the other. Notation: $i \leftrightarrow j$. Two states that communicate are in the same class. A Markov chain with only one class is irreducible.

Periodicity: state $i$ is said to have period $d(i)$ if $d(i)$ is the greatest common divisor of all integers $n \ge 1$ for which $P^n_{ii} > 0$. If $d(i) = 1$, the chain is aperiodic.

Probability of first return to state $i$ at step $n$, given the chain starts in $i$:

$$f^n_{ii} = \Pr\{X_n = i,\ X_k \ne i \text{ for } k = 1, 2, \dots, n-1 \mid X_0 = i\}$$

Connection with the $n$-step probabilities (with $f^0_{ii} = 0$ and $P^0_{ii} = 1$):

$$P^n_{ii} = \sum_{k=0}^{n} f^k_{ii}\, P^{n-k}_{ii}$$

Probability that a process starting in $i$ returns to $i$ at some time:

$$f_{ii} = \sum_{n=0}^{\infty} f^n_{ii} = \lim_{N \to \infty} \sum_{n=0}^{N} f^n_{ii}$$

Recurrent: state $i$ is recurrent if and only if $f_{ii} = 1$, or equivalently if and only if $\sum_n P^n_{ii} = \infty$.

Transient: state $i$ is transient if and only if $f_{ii} < 1$, or equivalently if and only if $\sum_n P^n_{ii} < \infty$.

Theorem: For a recurrent irreducible aperiodic Markov chain,

a) $\displaystyle \lim_{n \to \infty} P^n_{ii} = \frac{1}{\sum_{n=0}^{\infty} n f^n_{ii}} = \frac{1}{m_i}$

b) $\displaystyle \lim_{n \to \infty} P^n_{ji} = \lim_{n \to \infty} P^n_{ii}$ for all states $j$.

Stationary probability distribution: a set of probabilities $\pi_i \ge 0$ that satisfy the following:

$$\pi_j = \sum_{k=0}^{\infty} \pi_k P_{kj}, \qquad \sum_{j=0}^{\infty} \pi_j = 1, \qquad j = 0, 1, 2, \dots$$
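The absorbing-chain quantities $N = (I - Q)^{-1}$, $T = N\mathbf{1}$, and $B = NR$ can be checked on a small example. The sketch below uses a fair gambler's-ruin chain on states $\{0, 1, 2, 3\}$ (an illustrative assumption, not a chain from these notes), with exact arithmetic via the standard-library `fractions` module:

```python
from fractions import Fraction as F

# Illustrative example (not from the notes): fair gambler's ruin on
# states {0, 1, 2, 3}; states 0 and 3 are absorbing, 1 and 2 transient.
half = F(1, 2)
Q = [[F(0), half],   # transient-to-transient block (states 1, 2)
     [half, F(0)]]
R = [[half, F(0)],   # transient-to-absorbing block (to states 0, 3)
     [F(0), half]]

# Fundamental matrix N = (I - Q)^{-1}, written out for the 2x2 case.
a, b = 1 - Q[0][0], -Q[0][1]
c, d = -Q[1][0], 1 - Q[1][1]
det = a * d - b * c
N = [[d / det, -b / det],
     [-c / det, a / det]]

# Expected time to absorption T = N*1 (row sums of N),
# absorption probabilities B = N*R.
T = [N[i][0] + N[i][1] for i in range(2)]
B = [[N[i][0] * R[0][j] + N[i][1] * R[1][j] for j in range(2)]
     for i in range(2)]

print(N)  # [[4/3, 2/3], [2/3, 4/3]]: expected visits to each transient state
print(T)  # [2, 2]: expected steps to absorption from states 1 and 2
print(B)  # [[2/3, 1/3], [1/3, 2/3]]: absorption probabilities into 0 and 3
```

From state 1 the chain is absorbed at 0 with probability $2/3$ and at 3 with probability $1/3$, after 2 steps on average, matching the classical gambler's-ruin result.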
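The limiting and stationary distributions for a regular chain can likewise be sketched numerically: the rows of $\mathbf{P}^n$ converge to $\pi$, and $\pi$ satisfies $\pi_j = \sum_k \pi_k P_{kj}$. The 2-state chain below is a made-up example, not one from these notes:

```python
# Illustrative regular 2-state chain (all entries of P are already > 0).
P = [[0.7, 0.3],
     [0.4, 0.6]]

def mat_mul(A, B):
    """Multiply two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# n-step transition matrix P^n by repeated multiplication.
Pn = P
for _ in range(50):
    Pn = mat_mul(Pn, P)

# For a regular chain, every row of P^n converges to the limiting
# distribution pi, so pi no longer depends on the starting state.
pi = Pn[0]
print([round(x, 6) for x in pi])  # [0.571429, 0.428571], i.e. (4/7, 3/7)

# pi is also stationary: pi_j = sum_k pi_k * P[k][j].
check = [pi[0] * P[0][j] + pi[1] * P[1][j] for j in range(2)]
print(all(abs(check[j] - pi[j]) < 1e-9 for j in range(2)))  # True
```

Both rows of $\mathbf{P}^n$ agree to within $10^{-9}$ after about 50 steps here, since the convergence is geometric in the second-largest eigenvalue of $\mathbf{P}$ (0.3 for this example).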