Brandeis MATH 56A - MATH 56A SPRING 2008 STOCHASTIC PROCESSES

MATH 56A SPRING 2008
STOCHASTIC PROCESSES

KIYOSHI IGUSA

Date: March 16, 2008.

Contents
3. Continuous Markov Chains
3.1. Poisson process
3.2. Definitions
3.3. Probability transition matrix

3. Continuous Markov Chains

"Continuous" means continuous time. We still have a countable (and thus discrete) set of states. I started by saying that continuous Markov chains are similar to Poisson processes. Later I discussed when they are transient, positive recurrent, and null recurrent. There is also another interesting possibility: continuous Markov chains can explode!

3.1. Poisson process. I started with the following example. Customers come into a store at a rate of one every 2 1/2 minutes (λ = 2/5 per minute). Assume this is a Poisson process. This means:

(1) (independence) The number of occurrences of the event in one interval of time is independent of the number of occurrences in a different (disjoint) time interval.
(2) (time homogeneous) The rate λ at which the event occurs is constant (independent of the time). For example, the rate is the same at midnight as it is at noon.
(3) Customers arrive one at a time, i.e., the event never occurs twice at exactly the same time. Even if you go with a friend, we assume that there is a high-speed camera, similar to those used on a race track, to determine who enters the store first.

3.1.1. Conversion to Markov chain. We convert this into a Markov chain by letting X_t be the number of occurrences of the Poisson event in the time interval (0, t]. This means that X_0 = 0. The state space is the set of nonnegative integers:

    S = {0, 1, 2, 3, ...}

[Figure: the sample path of X_t is a step function starting at X_0 = 0 at t = 0, jumping from state 0 to 1 at time T_1, from 1 to 2 at time T_1 + T_2, and so on.] In this figure,

    T_1 = 1st time that the event occurs
    T_2 = time between 1st and 2nd occurrence of the event

Each time T_n is an exponential random variable.
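The description above translates directly into a simulation: the holding times T_n are independent exponential variables with rate λ, so a sample path of X_t can be generated by summing exponential gaps. Here is a Python sketch (my own illustration, not from the lecture; the function name is made up) using the store example with λ = 2/5 per minute:

```python
import random

def simulate_poisson_path(lam, t_max, rng):
    """Generate the arrival times of a Poisson process on (0, t_max]
    by summing i.i.d. exponential holding times T_1, T_2, ..."""
    arrival_times = []
    t = 0.0
    while True:
        t += rng.expovariate(lam)  # T_n ~ Exponential(rate lam)
        if t > t_max:
            break
        arrival_times.append(t)
    return arrival_times

rng = random.Random(0)
lam = 2 / 5          # one customer every 2.5 minutes on average
t_max = 10_000.0     # minutes simulated
arrivals = simulate_poisson_path(lam, t_max, rng)

# X_{t_max} = number of occurrences in (0, t_max]; the empirical
# arrival rate should be close to lam = 0.4 per minute.
print(len(arrivals) / t_max)
```

Over a long horizon the empirical rate concentrates near λ, which is the law of large numbers for the counting process.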
3.1.2. Exponential variable. Every Poisson process has an exponential variable associated with it. This is the time T between occurrences of the event. In the Markov chain, T is how long you stay in the state you are in. I proved the following theorem in class:

Theorem 3.1. P(T_1 > t) = P(X_t = 0) = e^{−λt}. (And the same holds for T_2, T_3, ...)

In order to prove this I needed the definition:

Definition 3.2. The rate λ of a Poisson process is defined to be the limit:

    λ = lim_{Δt → 0} P(event occurs in Δt) / Δt

This equation is often written using "little oh":

    P(event occurs in Δt) = λΔt + o(Δt)

So, the probability that it doesn't occur is

    P(event does not occur in Δt) ≈ 1 − λΔt

The definition of the little oh is that it is a quantity that goes to zero faster than the named quantity:

    lim_{Δt → 0} o(Δt) / Δt = 0

If you compare this definition with the definition of the rate λ you will see that it says exactly the same thing.

Proof. We want to calculate the probability P(T_1 > t). This is the probability that the event does not occur in the time interval (0, t]. To calculate this you break up the interval into N equal little pieces, so each piece has length

    Δt = t/N

The i-th piece is

    Δ_i t = ( (i−1)t/N , it/N ]

In order for the event not to occur during the entire interval, it needs to not occur in each of these little intervals Δ_i t.
Since these are independent:

    P(T_1 > t) = ∏_{i=1}^{N} P(event does not occur in Δ_i t)
               ≈ ∏_{i=1}^{N} (1 − λΔt) = (1 − λt/N)^N

The exact value is given by taking a limit:

    P(T_1 > t) = lim_{N → ∞} (1 − λt/N)^N = e^{−λt}    □

From this equation we can find the cdf and pdf of T = T_1. The cumulative distribution function (cdf) of a random variable T is F_T(t) = P(T ≤ t). For the exponential variable, it is:

    F_T(t) = 1 − P(T > t) = 1 − e^{−λt}    if t ≥ 0

(And F_T(t) = 0 if t < 0.) The probability density function (pdf) is the derivative of the cumulative distribution function when the cdf is differentiable:

    f_T(t) = d/dt F_T(t) = d/dt (1 − e^{−λt}) = λe^{−λt}

3.2. Definitions. On the second day I gave more precise definitions. A continuous finite Markov chain has set of states S = {1, 2, ..., n} with transition probabilities given by an infinitesimal generator A. This is an n × n matrix A = (α(i, j)) whose rows add to zero and whose negative entries are all along the diagonal. If the entries are written a_ij = α(i, j), the equations corresponding to these statements are:

(1) a_ij ≥ 0 if i ≠ j
(2) a_i1 + a_i2 + ... + a_in = 0

Example 3.3.

    A = [ −3   1   2 ]
        [  0  −2   2 ]
        [  3   1  −4 ]

This is an infinitesimal generator. The negative numbers are all on the diagonal and the rows add up to zero. The numbers off the diagonal are positive or zero. Notice that the numbers can be greater than 1.

3.2.1. The numbers a_ij are the infinitesimal transition probabilities:

    a_ij = α(i, j) = lim_{t → 0+} P(X_t = j | X_0 = i) / t

I used the following example to explain what this means.

Example 3.4.

    A = [ −6   6 ]    (state 1 = Boston, state 2 = Cambridge)
        [  0   0 ]

This means: the transition from state 1 to state 2 is a Poisson process with rate λ = 6 per year.

Question: If this represents people moving from Boston to Cambridge, how long does a Boston resident expect to remain in Boston?
Answer: 2 months.

The rate λ = 6 does not mean 6 people per year; it means 600% per year! This is the same as 50% per month.
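Example 3.3 can be checked mechanically. Assuming the standard fact that the time-t transition probabilities form the matrix e^{tA} (this is the probability transition matrix of section 3.3, not yet covered here), the limit defining a_ij can also be verified numerically for small t. A Python sketch with a hand-rolled truncated-series matrix exponential:

```python
import math

# Infinitesimal generator from Example 3.3: rows sum to zero,
# negatives on the diagonal, off-diagonal entries >= 0.
A = [[-3.0,  1.0,  2.0],
     [ 0.0, -2.0,  2.0],
     [ 3.0,  1.0, -4.0]]
n = 3

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(M, terms=30):
    """exp(M) by the truncated power series I + M + M^2/2! + ..."""
    result = [[float(i == j) for j in range(n)] for i in range(n)]
    power = [[float(i == j) for j in range(n)] for i in range(n)]
    for k in range(1, terms):
        power = mat_mul(power, M)
        for i in range(n):
            for j in range(n):
                result[i][j] += power[i][j] / math.factorial(k)
    return result

# For small t, P(X_t = j | X_0 = i) = [exp(tA)]_ij, so the ratio
# [exp(tA)]_ij / t should approach a_ij for i != j.
t = 1e-4
P = mat_exp([[t * A[i][j] for j in range(n)] for i in range(n)])
print(P[0][1] / t)  # close to a_12 = 1
```

The two defining properties (1) and (2) can be read off the matrix directly; the numerical limit illustrates why A is called "infinitesimal."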
If half the people leave in one month then you might think the city will be empty in 2 months. But:

    T = time until the 1st occurrence of the event (the jump 1 → 2).

T is an exponential variable with rate λ = 6. This means

    P(T > t) = e^{−λt} = e^{−6t}.

Expanding,

    e^{−λt} = 1 − λt + λ²t²/2! − λ³t³/3! + ...

whose first-order part is 1 − λt.

• When t is very small, t², t³, etc. are negligible, so

    e^{−λt} ≈ 1 − λt
    P(T > t) ≈ 1 − λt    with error o(t)
    P(T ≤ t)/t ≈ λt/t ≈ λ    with error o(t)/t

Since o(t)/t → 0 as t → 0, we get:

    lim_{t → 0+} P(T ≤ t)/t = λ

• t = 1/6 (2 months):

    P(T > t) = e^{−6t} = e^{−6/6} = e^{−1} ≈ 0.368

This is the probability that you don't jump to state 2. I.e., if people are leaving Boston at the rate of 50% per month, then in 2 months 63.2% will have left and 36.8% will remain. The number 0 (the naive guess that nobody remains) is just the first-order approximation:

    e^{−λt} ≈ 1 − λt = 1 − 6(1/6) = 0.

3.2.2. Computation of expected value. The definition of expected value is

    E(T) = ∫ t f_T(t) dt

where, as we computed, the pdf is f_T(t) = λe^{−λt} for t ≥ 0.
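The numbers in the Boston example are easy to check. The short Python sketch below (my own check, not from the lecture) computes P(T > 1/6) exactly and estimates E(T) by Monte Carlo, using the standard fact that an exponential variable with rate λ has mean 1/λ, which is what the integral in §3.2.2 evaluates to:

```python
import math
import random

lam = 6.0             # jumps per year (600% per year)
t = 1.0 / 6.0         # two months, measured in years

# Exact survival probability from Theorem 3.1.
p_stay = math.exp(-lam * t)
print(round(p_stay, 3))   # e^{-1} ≈ 0.368: 36.8% remain after 2 months

# Monte Carlo estimate of E(T); should be near 1/lam year = 2 months.
rng = random.Random(1)
samples = [rng.expovariate(lam) for _ in range(200_000)]
mean_T = sum(samples) / len(samples)
print(12 * mean_T)        # expected stay in months, close to 2
```

Note that the exact answer 36.8% is noticeably different from the first-order guess of 0%, which is the whole point of the example.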

