Yale STAT 251 - Summary of Markov Chain facts


Summary of Markov Chain facts
23 February 2009
Statistics 251/551, spring 2009. © David Pollard.

1 Classification of states

For a Markov chain $\{X_n : n = 0, 1, 2, \dots\}$ with a (countable or finite) state space $S$ define
$$T_i = \inf\{n \ge 1 : X_n = i\} \qquad\text{and}\qquad N_i = \sum_{n \in \mathbb{N}} 1\{X_n = i\}.$$
Note that $E_\mu N_i = \sum_{n \in \mathbb{N}} P_\mu\{X_n = i\}$.

Define $i \to j$ to mean $P_i\{T_j < \infty\} > 0$, and $i \leftrightarrow j$ to mean that both $i \to j$ and $j \to i$.

Suppose the chain has transition probabilities $P(i, j)$. A stationary measure is a set of nonnegative numbers $\{\lambda_i : i \in S\}$ for which
$$\lambda_k = \sum_{j \in S} \lambda_j P(j, k) \qquad\text{for each } k \in S.$$
If, in addition, $\sum_{i \in S} \lambda_i = 1$ then the $\lambda_i$'s are called a stationary probability distribution. For such a distribution, $P_\lambda\{X_n = i\} = \lambda_i$ for all $n \in \mathbb{N}$ and all $i \in S$. If $\sum_{i \in S} \lambda_i < \infty$ then a stationary measure can be rescaled to define a stationary probability distribution. Sometimes there are stationary measures that cannot be standardized in this way.

1.1 Transience

A state $i$ is said to be transient if $P_i\{T_i < \infty\} < 1$.

(i) For a transient state, $E_i N_i < \infty$ and $P_i\{N_i < \infty\} = 1$.

(ii) If $i \leftrightarrow j$ and $i$ is transient then so is $j$.

1.2 Recurrence

A state $i$ is said to be recurrent if $P_i\{T_i < \infty\} = 1$.

(i) For a recurrent state, $P_i\{N_i = \infty\} = 1$ and $E_i N_i = \infty$.

(ii) If $i \leftrightarrow j$ and $i$ is recurrent then so is $j$.

(iii) If $\{\pi_j : j \in S\}$ is a stationary probability distribution with $\pi_i > 0$ then the state $i$ is recurrent. [In fact, $i$ is positive recurrent; see Chang Notes Theorem 1.41.]

(iv) If $i$ is recurrent then $\lambda_j := E_i\{\text{number of visits to state } j \text{ up to time } T_i\}$ defines a stationary measure with $\lambda_i = 1$. [See Section 3.]

1.3 Null Recurrence

A recurrent state $i$ is said to be null recurrent if $E_i T_i = \infty$.

1.4 Positive Recurrence

A recurrent state $i$ is said to be positive recurrent if $E_i T_i < \infty$.

(i) If $i$ is a positive recurrent state then there exists a stationary probability distribution with $\pi_i = 1/E_i T_i$. [See Section 3.] If the chain is irreducible then the stationary distribution is unique.

(ii) If $i \leftrightarrow j$ and $i$ is positive recurrent then so is $j$. [See Section 2.]

1.5 Periodicity

The period of a state $i$ is defined as the greatest common divisor of
$$\{n \in \mathbb{N} : P_i\{X_n = i\} > 0\}.$$

(i) If $i \leftrightarrow j$ then $i$ and $j$ have the same period. [cf. Chang Notes page 16.]

(ii) If state $i$ has period $d$ then $P_i\{X_{md} = i\} > 0$ for large enough integers $m$. [cf. Chang Notes page 30.]

1.6 Basic limit theorem

If a chain is irreducible, positive recurrent, and aperiodic then

(i) There exists a unique stationary probability distribution $\pi$.

(ii) For every initial distribution $\mu$,
$$\sum_{i \in S} \bigl|P_\mu\{X_n = i\} - \pi_i\bigr| \to 0 \qquad\text{as } n \to \infty.$$
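The facts in Section 1 can be checked numerically on a small example. The following is a minimal sketch in Python with NumPy; the 3-state transition matrix and the initial distribution are invented purely for illustration. It computes the stationary probability distribution as the normalized left eigenvector of $P$ for eigenvalue 1, then confirms the convergence asserted by the basic limit theorem.

```python
import numpy as np

# A small irreducible, aperiodic transition matrix (made up for illustration).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.1, 0.4, 0.5],
])

# Stationary distribution: solve pi = pi P, i.e. take the left eigenvector of P
# for eigenvalue 1 and rescale so that its entries sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
pi = pi / pi.sum()

# Basic limit theorem: for any initial distribution mu,
#   sum_i |P_mu{X_n = i} - pi_i|  ->  0   as n -> infinity.
mu = np.array([1.0, 0.0, 0.0])                    # start in state 0
for n in (1, 5, 10, 50):
    dist_n = mu @ np.linalg.matrix_power(P, n)    # row vector P_mu{X_n = .}
    print(n, np.abs(dist_n - pi).sum())
```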
2 Proof of assertion 1.4(ii)

Suppose that state $i$ is positive recurrent and the chain starts in state $i$. We are given that $\gamma := E_i T_i < \infty$. We need to show that $E_j T_j < \infty$.

Write $\tau_0 = 0 < \tau_1 < \tau_2 < \dots$ for the times at which the chain is in state $i$. The first cycle starts at time 1 and ends at time $\tau_1$. The second cycle starts at time $\tau_1 + 1$ and ends at time $\tau_2$. And so on. We could also write each $\tau_k$ as a sum $T_i^{(1)} + T_i^{(2)} + \dots + T_i^{(k)}$, where $T_i^{(m)}$ denotes the length of the $m$th cycle. The random variables $T_i^{(1)}, T_i^{(2)}, \dots$ are independent and identically distributed, each with expected value $\gamma$. [The variable $T_i^{(1)}$ is the same as the variable $T_i$ used to define $\gamma$.]

Similarly, define $1 \le \sigma_1 < \sigma_2 < \dots$ for the times at which the chain visits state $j$. Note that $\sigma_1$ is what we have also been denoting by $T_j$.

More subtly, define random variables that pick out the cycles during which there are visits to state $j$:
$$\nu_1 = \text{first } m \text{ for which the chain visits } j \text{ at least once during the } m\text{th cycle},$$
$$\nu_2 = \text{first } m > \nu_1 \text{ for which the chain visits } j \text{ at least once during the } m\text{th cycle}.$$
For the history shown in the picture, note that $\nu_2 = 4$ even though the second visit to state $j$ occurs during the second cycle.

[Figure: timeline of one realization, marking the return times $\tau_1, \tau_2, \tau_3, \tau_4$ to state $i$ and the visit times $\sigma_1, \sigma_2, \sigma_3$ to state $j$, for which $\nu_1 = 2$ and $\nu_2 = 4$.]

The events $\{$chain visits $j$ during the $m$th cycle$\}$ are independent, each with the same $P_i$ probability $\theta$. We must have $\theta > 0$; otherwise the chain could never visit $j$, in violation of the assumption that $i \to j$. Consequently, $\nu_1$ has a geometric($\theta$) distribution,
$$P_i\{\nu_1 = m\} = (1 - \theta)^{m-1}\theta \qquad\text{for } m \in \mathbb{N},$$
with expected value $E_i \nu_1 = 1/\theta$. Similarly, $\nu_2$ is the sum of two independent random variables, each distributed geometric($\theta$), with expected value $E_i \nu_2 = 2/\theta$.

The key idea is that during cycles $1, 2, \dots, \nu_2$ there must be at least two visits to state $j$. That is, we must have $\sigma_2 \le \tau_{\nu_2}$. Moreover, between times $\sigma_1$ and $\sigma_2$ the chain makes an excursion that starts and ends in state $j$. We can hope that a bound on $E_i \tau_{\nu_2}$ will provide a bound on $E_j T_j$.

To get the bound on $E_i \tau_{\nu_2}$, first note that
$$\tau_{\nu_2} = \sum_{k \in \mathbb{N}} T_i^{(k)} 1\{k \le \nu_2\}.$$
You should check that if $\nu_2 = m$ then the sum on the right-hand side reduces to $T_i^{(1)} + T_i^{(2)} + \dots + T_i^{(m)} = \tau_m$. Take expectations, then condition:
$$E_i \tau_{\nu_2} = \sum_{k \in \mathbb{N}} E_i\bigl(T_i^{(k)} \mid k \le \nu_2\bigr)\, P_i\{k \le \nu_2\}.$$
The information $k \le \nu_2$ tells us precisely that among cycles $1, 2, \dots, k - 1$ there has been at most one during which there was one or more visits to state $j$; the event $\{k \le \nu_2\}$ gives information only about the first $k - 1$ cycles, whereas $T_i^{(k)}$ is the length of the $k$th cycle. Independence of what happens from one cycle to the next therefore lets us discard the conditioning information, leaving
$$E_i\bigl(T_i^{(k)} \mid k \le \nu_2\bigr) = E_i T_i^{(k)} = \gamma.$$
The expression for $E_i \tau_{\nu_2}$ simplifies to
$$\gamma \sum_{k \in \mathbb{N}} P_i\{k \le \nu_2\} = \gamma\, E_i\Bigl(\sum_{k \in \mathbb{N}} 1\{k \le \nu_2\}\Bigr) = \gamma\, E_i \nu_2 = 2\gamma/\theta < \infty,$$
because $\sum_{k \in \mathbb{N}} 1\{k \le \nu_2\} = \nu_2$. Thus $E_i \sigma_2 \le E_i \tau_{\nu_2} < \infty$.

Now we have only to extract the excursion from $j$ back to $j$ by conditioning on the value of $\sigma_1$, which we can also write as $T_j$:
$$E_i \sigma_2 = \sum_{n \in \mathbb{N}} P_i\{T_j = n\}\, E_i(\sigma_2 \mid T_j = n).$$
Notice that we don't need a contribution from $T_j = \infty$ because
$$P_i\{T_j = \infty\} = P_i\{\text{chain never visits } j\} = 0.$$
If you write $E_i(\sigma_2 \mid T_j = n)$ as
$$E_i(\sigma_2 \mid X_1 \ne j,\ X_2 \ne j,\ \dots,\ X_{n-1} \ne j,\ X_n = j)$$
you should see, by the Markov property, that $E_i(\sigma_2 \mid T_j = n) = n + E_j T_j$. Note how the first $n$ steps contribute to $\sigma_2$. Thus
$$E_i \sigma_2 = E_j T_j \sum_{n \in \mathbb{N}} P_i\{T_j = n\} + \sum_{n \in \mathbb{N}} P_i\{T_j = n\}\, n = E_j T_j + E_i T_j.$$
Not only can we now conclude that $E_j T_j < \infty$, so that state $j$ is positive recurrent, but also that $E_i T_j < \infty$.

3 Stationary distributions for Markov chains

Suppose $i$ is a recurrent state for a Markov chain. For each $j$ in the state space $S$ define
$$\lambda_j = E_i\{\text{number of visits to state } j \text{ up to time } T_i\} = E_i \sum_{n \in \mathbb{N}} 1\{X_n = j,\ n \le T_i\} = \sum_{n \in \mathbb{N}} P_i\{X_n = j,\ n \le T_i\}.$$
Note that $\lambda_i = 1$ because $X_n \ne i$ for $1 \le n < T_i$ and $X_n = i$ when $n = T_i$.

<1> THEOREM. $\sum_{j \in S} \lambda_j P(j, k) = \lambda_k$ for each state $k$.

PROOF. By definition,
$$\sum_{j \in S} \lambda_j P(j, k) = \sum_{j \in S} \sum_{n \in \mathbb{N}} P_i\{X_n = j,\ n \le T_i\}\, P(j, k). \tag{$*$}$$
The summand is equal to $P_i\{X_n = j,\ X_{n+1} = k,\ n \le T_i\}$ because $P_i\{X_{n+1} = k \mid X_n = j,\ n \le T_i\} = P(j, k)$ by the Markov property.
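The identity $E_i\sigma_2 = E_j T_j + E_i T_j$ at the end of Section 2 can be checked by simulation. The sketch below is a minimal illustration in Python/NumPy, using the same made-up 3-state chain as before and the arbitrary choices $i = 0$, $j = 2$; it estimates $E_j T_j$, $E_i T_j$, and $E_i\sigma_2$ by averaging over independent runs.

```python
import numpy as np

rng = np.random.default_rng(1)

# The same illustrative 3-state chain; states i = 0 and j = 2 chosen arbitrarily.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.1, 0.4, 0.5],
])
i, j = 0, 2
n_runs = 20_000

def hitting_time(start, target):
    """Number of steps for the chain started at `start` to first reach `target`."""
    state, steps = start, 0
    while True:
        state = rng.choice(len(P), p=P[state])
        steps += 1
        if state == target:
            return steps

def second_visit_time(start, target):
    """Time sigma_2 of the second visit to `target`, chain started at `start`."""
    state, steps, visits = start, 0, 0
    while visits < 2:
        state = rng.choice(len(P), p=P[state])
        steps += 1
        if state == target:
            visits += 1
    return steps

E_j_Tj     = np.mean([hitting_time(j, j) for _ in range(n_runs)])
E_i_Tj     = np.mean([hitting_time(i, j) for _ in range(n_runs)])
E_i_sigma2 = np.mean([second_visit_time(i, j) for _ in range(n_runs)])

# The identity derived at the end of Section 2: E_i sigma_2 = E_j T_j + E_i T_j.
print(E_i_sigma2, E_j_Tj + E_i_Tj)
```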
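The construction in Section 3 can also be checked numerically. In the sketch below (same hypothetical chain, with the recurrent state taken to be $i = 0$), $\lambda_j$ is estimated by averaging the number of visits to $j$ up to time $T_i$ over many independent excursions from $i$; the script then verifies Theorem <1> ($\lambda P \approx \lambda$) and fact 1.4(i) ($\pi_i = 1/E_i T_i$ after normalizing $\lambda$).

```python
import numpy as np

rng = np.random.default_rng(2)

# Same illustrative chain; excursions are started from (and stopped at) state i = 0.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.1, 0.4, 0.5],
])
S = P.shape[0]
i = 0
n_excursions = 50_000

visits = np.zeros(S)   # accumulates the number of visits to each j up to time T_i
total_steps = 0        # accumulates T_i over excursions, to estimate E_i T_i

for _ in range(n_excursions):
    state = i
    while True:
        state = rng.choice(S, p=P[state])   # one step of the chain
        visits[state] += 1
        total_steps += 1
        if state == i:                      # the excursion ends at time T_i
            break

lam = visits / n_excursions                 # estimate of lambda_j; lam[i] should be ~1
print("lambda        :", lam)
print("lambda P - lam:", lam @ P - lam)     # Theorem <1>: approximately zero

# Fact 1.4(i): normalizing lambda gives the stationary distribution, with pi_i = 1/E_i T_i.
print("pi from lambda:", lam / lam.sum())
print("1 / E_i T_i   :", n_excursions / total_steps)
```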

