MATH 56A SPRING 2008
STOCHASTIC PROCESSES

KIYOSHI IGUSA

Date: February 11, 2008.

Contents

1. Finite Markov Chains
1.1. Concept and examples
1.2. Long term behavior
1.3. Invariant probability distribution
1.4. Transient classes
1.5. Canonical form of P
1.6. The substochastic matrix Q
1.7. Transient and recurrent
Homework 1a: Finite Markov Chains
Homework 1b: Leontief Model

1. Finite Markov Chains

1.1. Concept and examples. On the first day I explained the concept behind finite Markov chains, gave the definition and two examples. But first I explained how you convert a higher order difference equation into a first order matrix equation. When we randomize this process we get a finite Markov chain.

1.1.1. reduction to first order. I used the Fibonacci sequence as an example to illustrate how higher order equations can be reduced to first order equations in more variables. The Fibonacci sequence is the sequence

1, 1, 2, 3, 5, 8, 13, ...

given by the second order difference equation

f(n) = f(n − 1) + f(n − 2).

To convert this to first order you let

g(n) := f(n − 1).

Then f(n − 2) = g(n − 1) and the original equation becomes:

f(n) = f(n − 1) + g(n − 1).

Thus (f(n), g(n)) depends only on (f(n − 1), g(n − 1)) and the relation is given by the matrix equation:

(f(n), g(n)) = (f(n − 1), g(n − 1)) \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix}

I explained it like this: You have to make your decision about what to do tomorrow based on the information you have today. You can only use the information that you had yesterday if you recorded it. Thus, every day, you need to record the important information, either on paper or in your computer; otherwise it is lost and won't be available tomorrow.

The Fibonacci sequence, in this first order form, looks like this:

n    : 0  1  2  3  4  5
f(n) : 1  1  2  3  5  8
g(n) : 0  1  1  2  3  5

So, on Day 4, the information you have is today's number 5 and the record you kept of yesterday's number 3. You add these to get tomorrow's number 8, and you record the number 5 so that you still have it tomorrow. Each day you look at the information you get that day and the information that was recorded from the past. So, this process is realistic and makes sense.
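As a quick sanity check on this first-order form (a small sketch added here, not part of the original notes), the following Python snippet iterates the state vector (f(n), g(n)) by repeatedly multiplying it on the right by the 2 × 2 matrix above. The function name fib_state is just a label chosen for this illustration.

```python
import numpy as np

# Update matrix for the first-order form of the Fibonacci recurrence:
# (f(n), g(n)) = (f(n-1), g(n-1)) A   with   A = [[1, 1], [1, 0]].
A = np.array([[1, 1],
              [1, 0]])

def fib_state(n):
    """Return the state vector (f(n), g(n)), starting from
    (f(0), g(0)) = (1, 0) as in the table above."""
    state = np.array([1, 0])      # the "record" kept from day to day
    for _ in range(n):
        state = state @ A         # row vector times the update matrix
    return state

# Reproduces the table: f(n) = 1, 1, 2, 3, 5, 8 and g(n) = 0, 1, 1, 2, 3, 5.
for n in range(6):
    f, g = fib_state(n)
    print(n, f, g)
```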
1.1.2. concept. "A stochastic process is a random process which evolves with time." This definition is too broad for a careful, complete mathematical analysis, especially at the beginning. We want to start with simple models that we can analyze and understand completely. Then we will go to more and more general models, adding complexities one step at a time.

A Markov chain is a stochastic process which has four simplifying assumptions:

(1) There are only finitely many states. For example, in the Kermack-McKendrick model there were only 3 states: S, I, R. I also used the example of the Brandeis campus. If we made the movement of people on campus into a Markov process, then the set of states would be the buildings (plus one for the outside). Your exact location, for example which room you were in, is disregarded.

(2) Time is discrete. Time is a nonnegative integer (starting at t = 0). For example, for movement of people on campus, people are only allowed to move from building to building on the hour. Or, we only record or notice which building people are in at 1pm, 2pm, 3pm, etc.

(3) You forget the past. What happens at time n + 1 depends only on the situation at time n. Which building you are in at 2pm depends only on which building you were in at 1pm. If you add more states (more variables), you can keep track of information from the past and still satisfy the "forget the past" rule.

(4) Rules of movement do not change with time. If, at 2pm, everyone in building 2 moves to building 5, then the same thing will happen at 3pm, 4pm, etc. The Fibonacci sequence, or any first order recurrence, has this property.

I used two examples to illustrate these principles.

1.1.3. mouse example (from "Markov Chains ..." by Pierre Brémaud).

In this example, a mouse is randomly moving from room to room. The cat and the cheese do not move. But if the mouse goes into the cat's room, he never comes out. If he reaches the cheese, he also does not come out. This will be a Markov chain with the following details.

(1) There are 5 states (the five rooms). I numbered them: 1, 2, 3, 4, 5.

(2) The mouse moves in integer time, say every minute.

(3) The mouse does not remember which room he was in before. Every minute he picks an adjacent room at random, possibly going back to the room he was just in.

(4) The probabilities do not change with time. For example, whenever the mouse is in room 3 he will go next to room 2, 4, or 5 with equal probability.

The mouse moves according to the transition probabilities

p(i, j) = P(the mouse goes to room j when he is in room i).

These probabilities form a matrix called the transition matrix, with rows and columns indexed by the rooms 1 through 5:

P = (p(i, j)) = \begin{pmatrix}
0 & 1/2 & 0 & 1/2 & 0 \\
1/2 & 0 & 1/2 & 0 & 0 \\
0 & 1/3 & 0 & 1/3 & 1/3 \\
0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 1
\end{pmatrix}

I pointed out the two important properties of this matrix:

(1) Every row adds up to 1. This is because the mouse has to go somewhere or stay where he is. When all of the possibilities are listed and they are mutually exclusive, the probabilities must add up to 1.

(2) The entries are nonnegative and at most 1 (because they are probabilities).

1.1.4. students example (from "Finite Markov Chains" by Kemeny and Snell).

Each year, the students at a certain college either flunk out, repeat the year, or go on to the next year with the following probabilities:

p = P(flunking out of school)
q = P(repeating a year)
r = P(passing to the next year)

The first step is to determine what the states are. Then find the transition matrix. Later we can answer other questions, such as: What is the probability that a Freshman will eventually graduate? And how long will it take?

There are 6 states: the student is either

(1) Freshman
(2) Sophomore
(3) Junior
(4) Senior
(5) graduated
(6) flunked out

The transition matrix, with rows and columns indexed by these 6 states, is

P = (p(i, j)) = \begin{pmatrix}
q & r & 0 & 0 & 0 & p \\
0 & q & r & 0 & 0 & p \\
0 & 0 & q & r & 0 & p \\
0 & 0 & 0 & q & r & p \\
0 & 0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 0 & 1
\end{pmatrix}

It is important to notice that the rows add up to 1.
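To make the two examples concrete, here is a short Python sketch (added here, not part of the original notes) that builds both transition matrices and checks the two properties pointed out above: every row sums to 1 and every entry lies between 0 and 1. The numerical values p = 0.1, q = 0.2, r = 0.7 for the students example are made up purely for this sketch; the notes keep p, q, r symbolic.

```python
import numpy as np

# Mouse example (1.1.3): rooms 1..5, stored here with 0-based indices.
# Rooms 4 (cat) and 5 (cheese) are absorbing: once entered, the mouse stays.
P_mouse = np.array([
    [0,   1/2, 0,   1/2, 0  ],
    [1/2, 0,   1/2, 0,   0  ],
    [0,   1/3, 0,   1/3, 1/3],
    [0,   0,   0,   1,   0  ],
    [0,   0,   0,   0,   1  ],
])

# Students example (1.1.4): states Freshman, Sophomore, Junior, Senior,
# graduated, flunked out.  The values of p, q, r are made up for this
# sketch; any nonnegative p, q, r with p + q + r = 1 would do.
p, q, r = 0.1, 0.2, 0.7
P_students = np.array([
    [q, r, 0, 0, 0, p],
    [0, q, r, 0, 0, p],
    [0, 0, q, r, 0, p],
    [0, 0, 0, q, r, p],
    [0, 0, 0, 0, 1, 0],
    [0, 0, 0, 0, 0, 1],
])

for name, P in [("mouse", P_mouse), ("students", P_students)]:
    assert np.allclose(P.sum(axis=1), 1.0), name   # property (1): rows sum to 1
    assert np.all((P >= 0) & (P <= 1)), name       # property (2): entries in [0, 1]

# One random step of the mouse starting in room 1 (0-based index 0),
# sampling the next room from the first row of P_mouse.
rng = np.random.default_rng(0)
next_room = rng.choice(5, p=P_mouse[0]) + 1        # convert back to 1-based labels
print("mouse moved from room 1 to room", next_room)
```

Questions such as the probability that the mouse reaches the cheese, or that a Freshman eventually graduates, are taken up later in the notes; the sketch above only checks that the two matrices are genuine transition matrices.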

