STOR 556 Time Series Data Analysis, Spring 2022
Lecture date: January 27                                Scribe: Younghoon Kim

Lecture 6: Autoregressive models

Last time we derived candidate causal and non-causal solutions to the AR(1) equation. We now show that these solutions are stationary. Consider, more generally,

$X_t = \sum_j a_j Z_{t-j}, \qquad \{Z_t\} \sim \mathrm{WN}(0, \sigma_Z^2).$   (1)

Obviously $EX_t = \sum_j a_j EZ_{t-j} = 0$. For the autocovariance function (ACVF),

$\mathrm{Cov}(X_t, X_{t+h}) = E(X_t X_{t+h}) = \sum_{j_1} \sum_{j_2} a_{j_1} a_{j_2} E(Z_{t-j_1} Z_{t+h-j_2}) = \sigma_Z^2 \sum_j a_j a_{j+h} =: \gamma_X(h),$   (2)

since $E(Z_{t-j_1} Z_{t+h-j_2}) = \sigma_Z^2$ if $t - j_1 = t + h - j_2$ (i.e., $j_2 = j_1 + h$) and $= 0$ if $t - j_1 \neq t + h - j_2$. Note that (2) does not depend on $t$. The model (1) is called a linear time series model.

Now let us go back to the causal solution of the AR(1) equation where $|\phi| < 1$: $X_t = \sum_{j \geq 0} \phi^j Z_{t-j}$. Here $a_j = \phi^j$ for $j \geq 0$ and $a_j = 0$ for $j < 0$. So, for $h \geq 0$,

$\gamma_X(h) = \sigma_Z^2 \sum_{j \geq 0} \underbrace{\phi^j}_{a_j} \underbrace{\phi^{j+h}}_{a_{j+h}} = \sigma_Z^2 \phi^h \sum_{j \geq 0} \phi^{2j} = \sigma_Z^2 \frac{\phi^h}{1 - \phi^2},$   (3)

using $\frac{1}{1-a} = 1 + a + a^2 + \cdots$ if $|a| < 1$. Also $\gamma_X(-h) = \gamma_X(h)$, since $\mathrm{Cov}(X_t, X_{t-h}) = \mathrm{Cov}(X_{t-h}, X_t)$.

For the non-causal solution when $|\phi| > 1$, rewrite the solution $X_t = -\sum_{j \geq 1} \phi^{-j} Z_{t+j}$ as $X_t = -\sum_{k \leq -1} \phi^{k} Z_{t-k}$. This leads, for $h \geq 0$, to

$\gamma_X(h) = \sigma_Z^2 \sum_{j \leq -1-h} \phi^{j} \phi^{j+h} = \sigma_Z^2 \frac{\phi^{-2}}{1 - \phi^{-2}}\, \phi^{-h} = \sigma_Z^2 \frac{\phi^{-h}}{\phi^2 - 1}.$   (4)

Note that by replacing $\phi$ by $1/\phi$ and $\sigma_Z^2$ by $\sigma_Z^2/\phi^2$, one has the equivalence of (3) and (4). For this reason, we focus on the $|\phi| < 1$ case for identification purposes.

Recall that a general polynomial of order $n$ has $n$ roots, but they may be complex valued. For a complex number $z = z_1 + i z_2$, one can visualize it on a plane as in the left panel of Figure 1.

[Figure 1: Illustration of the representation on a complex plane.]

The circle centered at the origin is called the unit circle. Now think about the linear polynomial $\phi(z) = 1 - \phi z$ for the AR(1) model. Its root is $1/\phi$. So the root for $|\phi| < 1$ lies outside the unit circle; similarly, for the $|\phi| > 1$ case, the root is placed inside the unit circle. This is illustrated in the right panel of Figure 1.

Here is another, simpler derivation of the autocovariance function of the AR(1) model. Consider $X_t = \phi X_{t-1} + Z_t$, multiply both sides by $X_{t-1}$ and take $E$. This gives us

$\gamma_X(1) = \phi \gamma_X(0),$

since $X_{t-1} = Z_{t-1} + \phi Z_{t-2} + \cdots$ is uncorrelated with $Z_t$. Likewise, multiplying both sides by $X_{t-2}$ yields

$\gamma_X(2) = \phi \gamma_X(1) = \phi^2 \gamma_X(0).$

In general,

$\gamma_X(h) = \phi^h \gamma_X(0).$   (5)

Finally, multiplying both sides by $X_t$ leads to

$\gamma_X(0) = \phi \gamma_X(1) + \sigma_Z^2.$   (6)

Combining (5) and (6) gives us $\gamma_X(0) = \sigma_Z^2/(1 - \phi^2)$. Finally, we get $\gamma_X(h) = \sigma_Z^2 \phi^h/(1 - \phi^2)$.

We introduce another perspective on the solution of the AR(1) model. Consider the case $|\phi| < 1$:

$X_t = \phi X_{t-1} + Z_t \;\Leftrightarrow\; X_t - \phi X_{t-1} = Z_t \;\Leftrightarrow\; (I - \phi B) X_t = Z_t,$

so that

$X_t = \frac{1}{I - \phi B} Z_t = (I + \phi B + \phi^2 B^2 + \cdots) Z_t = Z_t + \phi Z_{t-1} + \phi^2 Z_{t-2} + \cdots.$

For $|\phi| > 1$, $(I - \phi B) X_t = Z_t$ and

$X_t = \frac{1}{I - \phi B} Z_t = \frac{-\phi^{-1} B^{-1}}{I - \phi^{-1} B^{-1}} Z_t = -\phi^{-1} B^{-1} \left(I + \phi^{-1} B^{-1} + \phi^{-2} B^{-2} + \cdots\right) Z_t = -\phi^{-1} Z_{t+1} - \phi^{-2} Z_{t+2} - \phi^{-3} Z_{t+3} - \cdots.$

One can use this argument for more general models. E.g., consider AR(2). Then one has

$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + Z_t, \quad \text{i.e.,} \quad \underbrace{(I - \phi_1 B - \phi_2 B^2)}_{\phi(B)} X_t = Z_t,$   (7)

where $\phi(z) = 1 - \phi_1 z - \phi_2 z^2$. If we denote the roots of this polynomial by $r_1, r_2$, then $\phi(z) = \left(1 - \frac{1}{r_1} z\right)\left(1 - \frac{1}{r_2} z\right)$. This gives $\phi(B) = \left(I - \frac{1}{r_1} B\right)\left(I - \frac{1}{r_2} B\right)$, so that (7) becomes

$X_t = \frac{1}{\left(I - \frac{1}{r_1} B\right)\left(I - \frac{1}{r_2} B\right)} Z_t.$   (8)

If the roots of $\phi(z)$ are outside the unit circle, one can show that (8) defines a causal stationary solution to the AR(2) equation.

AR models can also be viewed from the perspective of differential equations. Consider the AR(1) model $x_t = \phi x_{t-1} + z_t$. If the time steps are small, one can think of $x_t - x_{t-1}$ as a first derivative of $x_t$ with respect to $t$. That is, by subtracting $x_{t-1}$ on both sides, we get a first-order differential equation:

$x_t - x_{t-1} = (\phi - 1) x_{t-1} + z_t, \quad \text{that is,} \quad x_t' \approx (\phi - 1) x_t + z_t.$

The solution of the first-order differential equation behaves like an increasing or decreasing exponential function. However, the second-order differential equation obtained from an AR(2) model,

$x_t'' = a_1 x_t' + a_2 x_t + z_t,$

can have a different behavior from a solution of the first-order differential equation. In particular, if the two complex-valued roots are outside the unit circle, the solution exhibits a harmonic oscillation.
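As a small illustration of the last two points (the root condition for AR(2) and the oscillatory behavior when the roots are complex), here is a minimal NumPy sketch; it is not part of the original lecture, and the coefficients $\phi_1 = 1.5$, $\phi_2 = -0.75$ are an arbitrary choice assumed only for this example.

```python
import numpy as np

# Sketch: locate the roots of phi(z) = 1 - phi1*z - phi2*z^2 to check the AR(2)
# causality condition, then simulate the model to see the cyclical behavior that
# complex roots outside the unit circle produce.

phi1, phi2 = 1.5, -0.75          # example coefficients chosen to give complex roots
sigma_z = 1.0

# numpy.roots takes coefficients in decreasing degree: -phi2*z^2 - phi1*z + 1
roots = np.roots([-phi2, -phi1, 1.0])
print("roots of phi(z):", roots)
print("outside unit circle:", np.all(np.abs(roots) > 1))   # True -> causal stationary solution

# Simulate X_t = phi1*X_{t-1} + phi2*X_{t-2} + Z_t with a burn-in period
rng = np.random.default_rng(0)
n, burn = 500, 200
z = rng.normal(0.0, sigma_z, size=n + burn)
x = np.zeros(n + burn)
for t in range(2, n + burn):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + z[t]
x = x[burn:]

def sample_acf(series, max_lag):
    """Sample autocorrelations up to max_lag."""
    series = series - series.mean()
    denom = np.dot(series, series)
    return np.array([np.dot(series[:len(series) - h], series[h:]) / denom
                     for h in range(max_lag + 1)])

print(np.round(sample_acf(x, 12), 3))
```

With these coefficients the roots are complex and lie outside the unit circle, so the sample autocorrelations decay like a damped cosine rather than a plain exponential.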
As we can see through the lecture material, AR(2) can exhibit not only both positive and negative correlations, but also cyclical behavior.

Autoregressive moving average models

Now we look at the ARMA(p, q) model; $q$ refers to the order of the moving average part. For AR(p), we take $q = 0$. The ARMA(p, q) equation is

$X_t = \phi_1 X_{t-1} + \cdots + \phi_p X_{t-p} + Z_t + \theta_1 Z_{t-1} + \theta_2 Z_{t-2} + \cdots + \theta_q Z_{t-q}.$

But why don't we focus on the moving average model? This relates to the homework problem. Let us consider the MA(1) model with $|\theta| < 1$:

$X_t = Z_t + \theta Z_{t-1} = (I + \theta B) Z_t.$   (9)

Then

$\frac{1}{I + \theta B} X_t = Z_t, \quad \text{i.e.,} \quad X_t = \theta X_{t-1} - \theta^2 X_{t-2} + \theta^3 X_{t-3} - \cdots + Z_t.$

In practice, if we truncate the infinite number of terms to an AR(3) as in the display above, the difference from MA(1) will be small. Thus an autoregressive model is sufficient, but note that the MA model may be more parsimonious (the simpler model is better).

The autocorrelation function of (9) can be shown to be

$\rho_X(h) = \begin{cases} 1, & h = 0, \\ \theta/(1 + \theta^2), & h = 1, \\ 0, & h = 2, 3, \ldots, \end{cases}$   (10)

which looks simple. However, while estimation of $\phi$ in the AR(1) model $X_t = \phi X_{t-1} + Z_t$ is straightforward, the estimation of $\theta$ in the MA(1) model $X_t = Z_t + \theta Z_{t-1}$ is more delicate. This is because $z_t$ is latent. For example, suppose we have observations $x_1, x_2, \ldots, x_T$. Then we want to estimate $\theta$ from

$x_1 = z_1 + \theta z_0 \;\Rightarrow\; z_1 = x_1$ (take $z_0 = 0$),
$x_2 = z_2 + \theta z_1 \;\Rightarrow\; z_2 = x_2 - \theta x_1$,
$x_3 = z_3 + \theta z_2 \;\Rightarrow\; z_3 = x_3 - \theta x_2 + \theta^2 x_1$,
$\vdots$
$x_T = z_T + \theta z_{T-1} \;\Rightarrow\; z_T = x_T - \theta x_{T-1} + \cdots + (-\theta)^{T-1} x_1$,

so that to solve $\min_\theta \sum_{t=1}^T z_t(\theta)^2$ is to minimize a polynomial of degree $2(T-1)$ in $\theta$, which is numerically difficult. In contrast, for the AR(1) model, $\sum_{t=2}^T (x_t - \phi x_{t-1})^2$ is a quadratic polynomial in $\phi$.

Note that a simple estimator of $\theta$ for the MA(1) model is that …
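To make the contrast between AR(1) and MA(1) estimation concrete, here is a minimal numerical sketch; it is not part of the notes, and the plain grid search is only an illustrative stand-in for a proper numerical optimizer. It minimizes $\sum_t z_t(\theta)^2$ via the recursion $z_t = x_t - \theta z_{t-1}$ with $z_0 = 0$, next to the closed-form AR(1) least-squares estimator.

```python
import numpy as np

# Sketch: MA(1) estimation requires reconstructing the latent z_t's recursively and
# minimizing a high-degree polynomial in theta numerically; AR(1) conditional least
# squares is a quadratic problem with an explicit solution.

rng = np.random.default_rng(1)
T = 400
theta_true = 0.6
z = rng.normal(size=T + 1)
x = z[1:] + theta_true * z[:-1]          # MA(1): X_t = Z_t + theta*Z_{t-1}

def ma1_sse(theta, x):
    """Sum of squared reconstructed innovations, taking z_0 = 0 as in the notes."""
    z_prev, sse = 0.0, 0.0
    for xt in x:
        zt = xt - theta * z_prev         # z_t = x_t - theta*z_{t-1}
        sse += zt ** 2
        z_prev = zt
    return sse

# Grid search over theta in (-1, 1); the point is that there is no closed form.
grid = np.linspace(-0.99, 0.99, 399)
theta_hat = grid[np.argmin([ma1_sse(th, x) for th in grid])]
print("MA(1) estimate of theta:", round(theta_hat, 3))

# AR(1) conditional least squares has the explicit solution
# phi_hat = sum(x_t * x_{t-1}) / sum(x_{t-1}^2), since the objective is quadratic.
phi_true = 0.6
x_ar = np.zeros(T)
for t in range(1, T):
    x_ar[t] = phi_true * x_ar[t - 1] + rng.normal()
phi_hat = np.dot(x_ar[1:], x_ar[:-1]) / np.dot(x_ar[:-1], x_ar[:-1])
print("AR(1) estimate of phi:", round(phi_hat, 3))
```

The MA(1) objective has no closed-form minimizer, which is exactly the numerical difficulty noted above, whereas the AR(1) objective is quadratic in $\phi$ and is solved in one line.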

