MATH 56A SPRING 2008
STOCHASTIC PROCESSES

KIYOSHI IGUSA

Contents

Introduction
  Example
  Martingale
  Markov process
  Goal of the course
0. Differential and difference equations
  0.1. Linear differential equations in one variable
  0.2. Kermack-McKendrick
  0.3. Systems of first order equations
  0.4. Linear difference equations
Homework 0

Date: January 23, 2008.

Introduction

In the first lecture, I discussed the concept of a stochastic process and gave a very quick introduction to two of the main concepts in this course: martingales and Markov processes. I also tried to convey the flavor and philosophy of the course.

A stochastic process is defined to be a random process which evolves with time. For example, if you toss two dice then you get the numbers 2 through 12 with a certain fixed probability distribution. This is standard probability theory. An example of a stochastic process might be: toss two dice and get a total of X_1. Then toss that many dice and get a total of X_2, and so on. As time goes on you will need a lot of dice!

Example. The next example I gave was the question: What is the probability that your family name will survive? The answer I got was 0. I.e., with probability 1, everyone on Earth will have the same last name. This is the male version: you get your last name from your father. But you get your mitochondria from your mother. The female version is that everybody on Earth will eventually have the same mitochondria, which is true!

The setup for the population extinction problem (which we will study more carefully later) is the following. Start at time t = 0 with a male population of N_0. N_t is the male population after t generations. X_1 is the number of male offspring from the first man, X_2 is the number of male offspring from the second man, etc. Then the number of males in the next generation will be

  N_1 = X_1 + X_2 + ... + X_{N_0}.

Assume that the X_i are independent, identically distributed (i.i.d.) random variables. In particular, they all have the same expected value:

  E(X_1) = E(X_2) = ... = μ.

So E(N_1) = μ N_0. This repeats and we get

  E(N_t) = μ^t N_0.

This is exponential growth.

Martingale. One thing that is good to do is to make a "martingale":

  M_t = N_t / μ^t.

Then E(M_t) = N_0. This is constant, and that makes M_t a martingale. (A martingale is a stochastic process which you expect to have the same value tomorrow as it has today.)

The Martingale Convergence Theorem now tells us that M_t converges to M_∞. On page 119 of our book, it says that if μ > 1 then E(M_∞) = E(M_0) (= N_0 in this case). But in class I argued that μ = 1 and that the expected value of M_∞ will be zero!

Markov process. For this I converted the problem into a Markov process. This is defined to be a system in which there is a fixed set of states, and from each state there is a fixed probability of going to each other state. For example, in the random walk the states are the integer points on the real line. If you are at any point, the probability of going to the left one space is 1/2 and the probability of going to the right one space is 1/2.

For the population problem, the states are 0, 1, 2, 3, 4, ... and you are in state N_t at time t. Given N_t men in generation t, there is a certain probability of every possible number of males in the next generation. So, we have a Markov process.
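(The following simulation is my own illustration and is not part of the notes.) A quick way to see the extinction claim numerically is to simulate the branching process. The Python sketch below assumes a hypothetical offspring distribution with mean μ = 1, the critical case argued in class; all names in it are made up for the example. It estimates how often the male line dies out and checks that the sample average of M_t = N_t / μ^t stays near N_0, as the martingale property predicts.

    import random

    # Hypothetical offspring distribution (an assumption, not from the lecture):
    # each man has 0, 1, or 2 male offspring with these probabilities; mean mu = 1.
    OFFSPRING = {0: 0.25, 1: 0.50, 2: 0.25}
    MU = sum(k * p for k, p in OFFSPRING.items())

    def next_generation(n, rng):
        """N_{t+1} = X_1 + ... + X_{N_t}: one i.i.d. draw per man alive now."""
        kids = rng.choices(list(OFFSPRING), weights=list(OFFSPRING.values()), k=n)
        return sum(kids)

    def run(n0, generations, rng):
        """Return one trajectory N_0, N_1, ... of the branching process."""
        traj = [n0]
        for _ in range(generations):
            traj.append(next_generation(traj[-1], rng))
            if traj[-1] == 0:          # state 0 is absorbing: extinction
                break
        return traj

    if __name__ == "__main__":
        rng = random.Random(0)
        runs = [run(n0=10, generations=200, rng=rng) for _ in range(2000)]
        extinct = sum(tr[-1] == 0 for tr in runs) / len(runs)
        # M_t = N_t / mu**t is a martingale, so its average over many runs
        # should stay near N_0 = 10 even though most individual runs die out.
        t = 20
        m_t = [tr[t] / MU ** t if len(tr) > t else 0.0 for tr in runs]
        print("mu =", MU)
        print("fraction extinct within 200 generations:", extinct)
        print("sample mean of M_t at t = 20:", sum(m_t) / len(m_t))

With μ = 1 the fraction of extinct runs keeps creeping toward 1 as the number of generations grows, while the sample mean of M_t stays near 10. This is exactly the tension between "M_∞ = 0 almost surely" and "E(M_t) = N_0 for every finite t" discussed above.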
We will learn that, in a Markov system, there are only two types of states: "recurrent" and "transient". A recurrent state is one that you keep coming back to with probability one. The only recurrent state is 0 (extinction). All other states must be transient, which means you only go there a finite number of times. I will explain this later in the course.

Since all finite states except 0 are transient, the Markov process will "almost surely" (a.s.) go to 0 or ∞. Almost surely means "with probability one." I did not explain why infinity is not possible. In any case, the answer I got was M_∞ = 0, a.s.

Goal of the course. We will look more carefully at this and other examples. But the main example that I am interested in is the Black-Scholes equation. Some of you already know this equation from economics, where it is usually derived using a binomial distribution. We will do a more serious analysis of this equation using stochastic integration. Since this is the last topic in the book, we need to cover the entire book! We will go very fast, skipping some of the things at the beginning so that we can get to the end.

0. Differential and difference equations

We have two days to go over the basics of linear differential equations. Differential equations is a one-semester course and we don't have time to cover it in detail. However, to do Markov chains you just need to understand how first order linear differential equations work.

0.1. Linear differential equations in one variable. In the first lecture, I discussed linear differential equations (Diffeq's) in one variable of arbitrary order. I presented the problem and the complete solution but without proof. The missing proofs are appended at the end so that these notes will faithfully represent the style and content of the lectures.

The problem in degree d = 2 is to find a function y = f(t) so that

(0.1)  y'' + ay' + by + c = 0

where a, b, c are constants. (The degree, or order, is the number of times that the variable is differentiated. In this case the degree is 2 since we have y''.)

0.1.1. Particular solution. A (one) solution y = f_0(t) of this equation is called a particular solution. It is really easy to find:

  y = f_0(t) = -c/b.

This is a constant function. Its derivative (and higher derivatives) are zero: y' = y'' = 0. So, when you plug it into Equation (0.1) you get

  0 + 0 + by + c = 0  ⇒  y = -c/b.

If b = 0 then the answer is

  y = f_0(t) = (-c/a) t.

This is also easy to see: y' = -c/a and y'' = 0. So,

  y'' + ay' + by + c = 0 + a(-c/a) + 0 + c = -c + c = 0.

Now, suppose you have another solution y = f(t).

0.1.2. Homogeneous equation.

Lemma 0.1. If f_0(t), f(t) are two solutions of the differential equation then the difference

  y = f(t) - f_0(t)

is a solution of the homogeneous equation

  y'' + ay' + by = 0.

(This is the original equation minus the constant term c.)

This lemma implies that every solution of Equation (0.1) has the form f(t) = f_0(t) plus a solution of the homogeneous equation, so what remains is to solve the homogeneous equation.
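(The following check is my own addition, not part of the notes.) Since the claims in 0.1.1 and Lemma 0.1 are purely algebraic, they can be verified symbolically. The short SymPy sketch below confirms that the constant -c/b solves Equation (0.1), that (-c/a)t solves it when b = 0, and that the difference of two solutions satisfies the homogeneous equation.

    import sympy as sp

    t = sp.symbols("t")
    a, b, c = sp.symbols("a b c")
    f, f0 = sp.Function("f"), sp.Function("f0")

    def lhs(u):
        """Left-hand side of Equation (0.1) applied to an expression u in t."""
        return sp.diff(u, t, 2) + a * sp.diff(u, t) + b * u + c

    # 0.1.1: the constant -c/b solves (0.1); (-c/a)*t solves it when b = 0.
    print(sp.simplify(lhs(-c / b)))                    # prints 0
    print(sp.simplify(lhs(-c / a * t).subs(b, 0)))     # prints 0

    # Lemma 0.1: applying the homogeneous operator y'' + a y' + b y to the
    # difference f - f0 gives lhs(f) - lhs(f0), because the constant c cancels.
    # So if f and f0 both solve (0.1), their difference solves y'' + a y' + b y = 0.
    g = f(t) - f0(t)
    homog = sp.diff(g, t, 2) + a * sp.diff(g, t) + b * g
    print(sp.simplify(homog - (lhs(f(t)) - lhs(f0(t)))))   # prints 0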

