MATH 56A: STOCHASTIC PROCESSES
CHAPTER 9

Date: December 4, 2006.

9. Stochastic integration

I will continue with the intuitive description of stochastic integrals that I started last week.

9.0. The idea. I already talked about the probabilistic and the analytic approach to Brownian motion. Stochastic integrals combine these methods. A key idea is Lévy's quadratic variation, which is used in Kunita and Watanabe's [2] reformulation of stochastic integration.

9.0.1. Quadratic variation. We want to define the stochastic integral
\[
Z_t = \int_0^t Y_s \, dX_s
\]
where $X_s$ is Brownian motion in $\mathbb{R}^1$. However, there is a big problem because $dX_s$ has unbounded variation. (The Riemann sum converges if and only if the function has bounded variation.) In other words,
\[
\int |dX_s| := \lim_{\delta t \to 0} \sum |X_{t_{i+1}} - X_{t_i}| = \infty.
\]
Fortunately, we can still define the stochastic integral because the "quadratic variation" of $X_t$ (denoted $\langle X \rangle_t$) is bounded:

Theorem 9.1 (Lévy). With probability one,
\[
\langle X \rangle_t = \int_0^t (dX_s)^2 := \lim_{\delta t \to 0} \sum (X_{t_{i+1}} - X_{t_i})^2 = t.
\]

Proof. (p. 207) It is easy to see that the quadratic variation is approximately equal to $t$, since the summands have expected value
\[
E\big((X_{t_{i+1}} - X_{t_i})^2\big) = t_{i+1} - t_i = \delta t,
\]
so the sum has expected value
\[
E\Big(\sum (X_{t_{i+1}} - X_{t_i})^2\Big) = \sum \delta t = t.
\]
The variance of each summand is
\[
\operatorname{Var}\big((X_{t_{i+1}} - X_{t_i})^2\big)
= E\big((X_{t_{i+1}} - X_{t_i})^4\big) - E\big((X_{t_{i+1}} - X_{t_i})^2\big)^2
= 3\,\delta t^2 - \delta t^2 = 2\,\delta t^2,
\]
so the sum has variance
\[
\operatorname{Var}\Big(\sum (X_{t_{i+1}} - X_{t_i})^2\Big) = \sum 2\,\delta t^2 = 2t\,\delta t \to 0.
\]
This means that, in the limit, the sum has zero variance and is therefore not random. The value of this limit is almost surely equal to its expected value, which is $t$. $\square$

(The fourth moment is an easy calculation. The moment generating function of the standard normal distribution is
\[
E(e^{Xt}) = \int e^{xt - x^2/2} \, \frac{dx}{\sqrt{2\pi}}
= e^{t^2/2} \int e^{-(x-t)^2/2} \, \frac{dx}{\sqrt{2\pi}} = e^{t^2/2}.
\]
The coefficient of $t^{2n}$ in $E(e^{Xt})$ is $E(X^{2n})/(2n)!$ and the coefficient of $t^{2n}$ in $e^{t^2/2}$ is $1/(n!\,2^n)$. Therefore, for $X \sim N(0,1)$,
\[
E(X^{2n}) = \frac{(2n)!}{n!\,2^n} = (2n-1)!! := 1 \cdot 3 \cdot 5 \cdot 7 \cdots (2n-1).
\]
You need to multiply by $\sigma^{2n}$ when $X \sim N(0,\sigma^2)$; with $n = 2$ and $\sigma^2 = \delta t$ this gives the fourth moment $3\,\delta t^2$ used above.)

This theorem is usually written in the differential form
\[
(dX_t)^2 = dt. \tag{9.1}
\]
For an arbitrary increment $\delta t$ of $t$ this is
\[
(\delta X_t)^2 := (X_{t+\delta t} - X_t)^2 = \delta t + o_{\mathrm{eff}}(\delta t), \tag{9.2}
\]
where I labeled the error term as an effective little-oh. The usual little-oh means $o(\delta t)/\delta t \to 0$ as $\delta t \to 0$. Effective little-oh means: if you take $N \approx 1/\delta t$ independent copies of $o_{\mathrm{eff}}(\delta t)$, you get
\[
\sum_{1/\delta t \text{ copies}} o_{\mathrm{eff}}(\delta t) \to 0 \quad \text{as } \delta t \to 0. \tag{9.3}
\]
These three equations (9.1), (9.2), (9.3) summarize the statement and proof of Lévy's theorem on the quadratic variation of Brownian motion.

9.0.2. Itô's formula. Using quadratic variation we can "prove" Itô's formula.

Suppose that we have a particle density function $f(x)$ for $x \in \mathbb{R}$ and $X_t$ is Brownian motion. The probabilistic argument said that we should look for the expected present value of $f$ at the future position $X_t$. So we assume that $f(x)$ is not time dependent; it only varies with position $x$. Do you remember the following formula?
\[
f(X_{t+\delta t}) - f(X_t) = f'(X_t)(X_{t+\delta t} - X_t) + \tfrac{1}{2} f''(X_t)(X_{t+\delta t} - X_t)^2 + o(\delta t)
\]
This can be abbreviated:
\[
\delta f(X_t) = f'(X_t)\,\delta X_t + \tfrac{1}{2} f''(X_t)(\delta X_t)^2 + o(\delta t).
\]
Use quadratic variation, $(\delta X_t)^2 = \delta t + o_{\mathrm{eff}}(\delta t)$. Then
\[
\delta f(X_t) = f'(X_t)\,\delta X_t + \tfrac{1}{2} f''(X_t)\,\delta t + o_{\mathrm{eff}}(\delta t).
\]
Now take the sum from $0$ to $t$. (We need to change $t$ above to $s$ so that $s$ can be the variable going from $0$ to $t$: $0 \le s \le t$.)
\[
f(X_t) - f(X_0) = \sum f'(X_s)\,\delta X_s + \sum \tfrac{1}{2} f''(X_s)\,\delta s + \sum o_{\mathrm{eff}}(\delta s).
\]
Now take the limit as $\delta s \to 0$. The last term goes to zero by (9.3), and we get Itô's formula:
\[
f(X_t) - f(X_0) = \int_0^t f'(X_s)\, dX_s + \int_0^t \tfrac{1}{2} f''(X_s)\, ds. \tag{9.4}
\]
Here the stochastic integral is
\[
\int_0^t f'(X_s)\, dX_s := \lim_{\delta s \to 0} \sum f'(X_s)\,\delta X_s.
\]
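Both Theorem 9.1 and formula (9.4) are statements about limits of Riemann-type sums, so they can be checked numerically. The following is a minimal simulation sketch (not part of the original notes): it samples a Brownian path on a fine grid, checks that the quadratic variation $\sum(\delta X)^2$ is close to $t$, and compares the two sides of Itô's formula for the test function $f(x) = \sin x$. The grid size, random seed, and choice of $f$ are illustrative assumptions, not anything fixed by the text.

```python
# Sketch: Levy's quadratic variation and Ito's formula on a simulated path.
import numpy as np

rng = np.random.default_rng(0)

t_final = 1.0
n_steps = 200_000                 # grid resolution (illustrative choice)
dt = t_final / n_steps

# Brownian increments: X_{t_{i+1}} - X_{t_i} ~ N(0, dt), independent.
dX = rng.normal(0.0, np.sqrt(dt), size=n_steps)
X = np.concatenate(([0.0], np.cumsum(dX)))     # path with X_0 = 0

# (i) Quadratic variation: sum of squared increments should be close to t.
quad_var = np.sum(dX**2)
print(f"quadratic variation ~ {quad_var:.4f}  (Theorem 9.1: converges to t = {t_final})")

# (ii) Ito's formula (9.4) for f(x) = sin x, f'(x) = cos x, f''(x) = -sin x.
f, fp = np.sin, np.cos
fpp = lambda x: -np.sin(x)

lhs = f(X[-1]) - f(X[0])
stoch_int = np.sum(fp(X[:-1]) * dX)            # left-endpoint Riemann sum for int f'(X) dX
drift_int = 0.5 * np.sum(fpp(X[:-1])) * dt     # ordinary Riemann sum for (1/2) int f''(X) ds
rhs = stoch_int + drift_int

print(f"f(X_t) - f(X_0)                 = {lhs:+.5f}")
print(f"int f'(X) dX + (1/2) int f'' ds = {rhs:+.5f}")
```

With these numbers both discrepancies should be small, and refining the grid (larger n_steps) should shrink them further, in line with Theorem 9.1 and (9.4).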
9.0.3. Discussion. Why is this not a proof of Itô's formula? The main thing is that we haven't defined the stochastic integral
\[
Z_t = \int_0^t Y_s \, dX_s.
\]
We only showed that the traditional "limit of Riemann sums" definition makes sense and gives something which satisfies Itô's formula in the special case when $Y_t = f'(X_t)$ is the derivative of a twice differentiable function of standard Brownian motion $X_t$. In general we need the integral to be defined for predictable stochastic processes $Y_s$. This means $Y_s$ must be $\mathcal{F}_s$-measurable and left continuous. Some people (e.g., our book) take $Y_s$ to be right continuous. However, following my "bible" [4], it makes more intuitive sense to have the information ($X_t$ and $\mathcal{F}_t$) be right continuous, while processes $Y_t$ based on this information should be predictable.

9.1. Discrete stochastic integrals. Stochastic integrals are constructed in three steps. First you have discrete time and finite state space (a finite Markov process). Then you have continuous time and finite state space (a continuous Markov chain). Then you take a limit.

The important properties of the construction are visible at each step:
• The construction is linear.
• The result is a martingale $Z_t$.
• $Z_t^2 - \langle Z \rangle_t$ is also a martingale, where $\langle Z \rangle_t$ is the quadratic variation of $Z_t$.

Compare this with what you know about Brownian motion:
(1) $X_t$ is a martingale.
(2) $X_t^2 - t$ is also a martingale.
(3) $\langle X \rangle_t = t$ by Lévy's theorem, which we just proved.

9.1.1. Set-up. Take the simple random walk on $\mathbb{Z}$. This gives a martingale $X_n$ with $X_0 = 0$ and increments $X_{n+1} - X_n = \pm 1$ with equal probability. Suppose that $Y_n$ is a predictable process, i.e., $Y_n$ is $\mathcal{F}_{n-1}$-measurable. The discrete integral is
\[
Z_n := \sum_{i=1}^n Y_i (X_i - X_{i-1}) = \sum_{i=1}^n Y_i \, \delta X_i.
\]
(This is supposed to resemble $\int Y \, dX$.)

The idea is that, at time $n$, you place a bet $Y_{n+1}$ that $X_n$ will increase. The money that you win or lose at that step is
\[
Y_{n+1}(X_{n+1} - X_n).
\]
Since you cannot see the future, $Y_{n+1}$ is only $\mathcal{F}_n$-measurable.

9.1.2. Linearity. This construction satisfies the following linearity condition:
\[
\sum (aY_i + bV_i)\,\delta X_i = a \sum Y_i\,\delta X_i + b \sum V_i\,\delta X_i.
\]
In short, $Z_n$ is a linear function of $\{Y_i\}$.

9.1.3. Martingale.

Theorem 9.2. $Z_n$ is a martingale and $Z_0 = 0$.

Proof. This is easy to verify:
\[
E(Z_n \mid \mathcal{F}_{n-1}) = Z_{n-1} + E\big(Y_n(X_n - X_{n-1}) \mid \mathcal{F}_{n-1}\big).
\]
Since $Y_n$ is $\mathcal{F}_{n-1}$-measurable, the last term vanishes:
\[
E\big(Y_n(X_n - X_{n-1}) \mid \mathcal{F}_{n-1}\big) = Y_n\, E(X_n - X_{n-1} \mid \mathcal{F}_{n-1}) = 0.
\]
So
\[
E(Z_n \mid \mathcal{F}_{n-1}) = Z_{n-1}. \qquad \square
\]

9.1.4. Quadratic variation. The quadratic variation of $Z_n$ is just the sum of squares of differences:
\[
\langle Z \rangle_n := \sum_{i=1}^n (Z_i - Z_{i-1})^2 = \sum Y_i^2,
\]
since these differences are
\[
Z_i - Z_{i-1} = Y_i(X_i - X_{i-1}) = \pm Y_i.
\]

Theorem 9.3. Suppose that $E(Y_i^2) < \infty$ for each $i$. Then $(Z_n)^2 - \langle Z \rangle_n$ is a martingale. In particular,
\[
\operatorname{Var}(Z_n) = E(Z_n^2) = \sum_{i=1}^n E(Y_i^2).
\]

Proof. The difference between $Z_n^2$ and the quadratic variation of $Z_n$ is just the sum of the cross terms:
\[
Z_n^2 - \langle Z \rangle_n = \sum_{i \ne j} Y_i\,\delta X_i\, Y_j\,\delta X_j.
\]
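The discrete construction in 9.1.1–9.1.4 is easy to experiment with. Here is a minimal sketch (not from the notes) that runs many simple random walks, uses a made-up predictable betting rule ("double the bet after a down step", with $Y_1 = 1$), and checks empirically that $Z_n$ behaves as Theorems 9.2 and 9.3 predict: $E(Z_n) \approx 0$ and $\operatorname{Var}(Z_n) \approx E(\langle Z \rangle_n) = \sum E(Y_i^2)$. All parameters and the betting rule are illustrative assumptions.

```python
# Sketch: discrete stochastic integral Z_n = sum_i Y_i (X_i - X_{i-1})
# for the simple random walk, with a predictable (made-up) betting rule.
import numpy as np

rng = np.random.default_rng(1)
n_steps, n_paths = 50, 100_000

# Random-walk increments: X_i - X_{i-1} = +/-1 with equal probability.
dX = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))

Z = np.zeros(n_paths)      # running discrete integral Z_n
qv = np.zeros(n_paths)     # running quadratic variation <Z>_n = sum Y_i^2
Y = np.ones(n_paths)       # first bet Y_1 (F_0-measurable: a constant)

for i in range(n_steps):
    Z += Y * dX[:, i]      # win or lose Y_{i+1} * (X_{i+1} - X_i)
    qv += Y**2
    # The NEXT bet uses only what has been observed so far: this is
    # the predictability requirement (the bet for step i+2 is F_{i+1}-measurable).
    Y = np.where(dX[:, i] < 0, 2.0, 1.0)   # "double after a loss" (illustrative rule)

print(f"E(Z_n)   ~ {Z.mean():+.3f}   (Theorem 9.2: martingale started at 0)")
print(f"Var(Z_n) ~ {Z.var():.2f}")
print(f"E(<Z>_n) ~ {qv.mean():.2f}   (Theorem 9.3: should match Var(Z_n))")
```

For this particular rule $E(Y_i^2) = 2.5$ for $i \ge 2$, so both printed quantities should come out near $1 + 49 \cdot 2.5 = 123.5$.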

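One more illustrative sketch, under the same assumptions as the previous one: it makes concrete the remark in 9.1.1 that the bet must not see the future. A predictable bet (here, simply reusing the previous increment) still gives a martingale, while a "cheating" bet equal to the very increment it multiplies turns every term into $(\pm 1)^2 = 1$, so $Z_n = n$ and the martingale property fails. This is exactly why the integrand is required to be predictable.

```python
# Sketch: predictable vs. non-predictable integrands for the random walk.
import numpy as np

rng = np.random.default_rng(2)
n_steps, n_paths = 50, 100_000
dX = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))

# Predictable bet: reuse the PREVIOUS increment (Y_1 = 1), so the bet at
# step i is F_{i-1}-measurable.
Y_pred = np.concatenate([np.ones((n_paths, 1)), dX[:, :-1]], axis=1)
Z_pred = np.sum(Y_pred * dX, axis=1)

# "Cheating" bet: the bet equals the increment it multiplies (it peeks at
# the future), so every term is (+/-1)^2 = 1 and Z_n = n deterministically.
Z_cheat = np.sum(dX * dX, axis=1)

print(f"predictable Y : E(Z_n) ~ {Z_pred.mean():+.3f}  (still a martingale)")
print(f"cheating Y    : E(Z_n) ~ {Z_cheat.mean():.1f}   (= n = {n_steps}, not a martingale)")
```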
