STOR 556: Time Series Data Analysis, Spring 2022
Lecture 5
Lecture date: January 25. Scribe: Younghoon Kim

Stationary time series models:

We focus on the random (irregular) fluctuations left after the classical decomposition, which can still exhibit dependence over time. We look at stationary time series models, whose statistical properties do not change over time. For example,

$X_t = Z_t + Z_{t-1}$, where $Z_t$ is i.i.d. noise.

A typical example of a non-stationary model is the random walk

$X_t = Z_t + Z_{t-1} + \ldots + Z_1 = X_{t-1} + Z_t$,

because its variability increases over time.

Stationarity means that the mean does not depend on time $t$, and that the covariance of $X_t$ and $X_{t+h}$ does not depend on $t$ but possibly depends on the lag $h$. We call these covariances of $X_t$ and $X_{t+h}$ the autocovariance function (ACVF) of $X$; "auto" refers to both variables $X_t$ and $X_{t+h}$ coming from the same series $X$. Note that

$\mathrm{Cov}(X_s, X_t) = \gamma_X(t - s)$, $\quad \mathrm{Cov}(X_t, X_t) = \mathrm{Var}(X_t) = \gamma_X(0)$.

For the autocorrelation function (ACF) of $X$, obviously $\rho_X(0) = 1$ at lag $h = 0$, and it is symmetric: $\rho_X(h) = \rho_X(-h)$. The ACVF has the same property, since (assuming $\mathbb{E}X_t = 0$ for simplicity)

$\gamma_X(h) = \mathbb{E}(X_t X_{t+h}) = \mathbb{E}(X_{t+h} X_t) = \gamma_X(-h)$.

Lastly, non-negative definiteness here means that if you collect $(X_1, \ldots, X_T)$ into a vector, its covariance matrix, viewed as a matrix of these covariances, is non-negative definite.

Sample quantities:

For stationary models, because their properties do not change over time, we replace expected values such as $\mathbb{E}X_t$ or $\mathbb{E}X_t X_{t+h}$ by averages of the data over time. The sample ACVF from the class notes is

$\hat\gamma(h) = \frac{1}{T} \sum_{t=1}^{T-h} (x_{t+h} - \bar{x})(x_t - \bar{x})$.  (1)

[Figure 1: Illustration of the example.]

An example of a correlogram is depicted in the left panel of Figure 1. Note that the summation in (1) runs only up to $T - h$, which decreases in $h$, because we can form only $T - h$ pairs of data; one can illustrate this through the middle panel of Figure 1. So, as the lag $h$ gets larger, $\hat\gamma(h)$ becomes harder to estimate because we lack data.
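The sample ACVF in (1) translates directly into code. A minimal sketch in Python (the function names `sample_acvf` and `sample_acf` are my own, not from the notes):

```python
import numpy as np

def sample_acvf(x, h):
    """Sample autocovariance at lag h >= 0, as in (1).

    Note the denominator is T, not T - h, even though only
    T - h pairs of observations enter the sum.
    """
    x = np.asarray(x, dtype=float)
    T = len(x)
    xbar = x.mean()
    return np.sum((x[h:] - xbar) * (x[:T - h] - xbar)) / T

def sample_acf(x, h):
    """Sample autocorrelation: rho_hat(h) = gamma_hat(h) / gamma_hat(0)."""
    return sample_acvf(x, h) / sample_acvf(x, 0)
```

As a sanity check, `sample_acf(x, 0)` is always 1, matching $\rho_X(0) = 1$ above.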
Each point of the correlogram represents the correlation between $x_t$ and $x_{t+h}$, as in the right panel of Figure 1 and as we have seen before. Finally, note that the denominator in the sample ACVF (1) is $T$, not $T - h$.

The sample ACF can be computed for any time series, stationary or not. For example, suppose the model has a trend. Then the corresponding correlogram (the figure is not reproduced here) shows large positive correlations that decay only slowly with the lag. You are asked to think in homework about the case when the model has a seasonal component.

White noise:

Two things are mentioned. One, the blue dashed lines in the correlogram represent 95% confidence intervals for each calculated sample ACF value; these are used for choosing a model. Two, for the time being, we make no distinction between i.i.d. noise and white noise; technically, an i.i.d. model is also a white noise model, but the converse does not hold in general.

Autoregressive series:

A convenient tool for writing an AR model is the backward shift operator $B$, defined by $BX_t = X_{t-1}$. For AR(1),

$X_t = \phi_1 X_{t-1} + Z_t$, where $\phi_1 X_{t-1} = \phi_1 B X_t$,  (2)
$\Leftrightarrow (I - \phi_1 B) X_t = Z_t$ ($I$ is used for "identity": $I X_t = X_t$),
$\Leftrightarrow \phi(B) X_t = Z_t$, where $\phi(z) = 1 - \phi_1 z$.

Consider the AR(1) model and set $\phi = \phi_1$. First, let us discuss the existence of a stationary model (solution) that satisfies the AR(1) equation. Iterating the equation backwards,

$X_t = \phi X_{t-1} + Z_t$
$\;\; = \phi(\phi X_{t-2} + Z_{t-1}) + Z_t = \phi^2 X_{t-2} + \phi Z_{t-1} + Z_t$
$\;\; = \phi^3 X_{t-3} + \phi^2 Z_{t-2} + \phi Z_{t-1} + Z_t$
$\;\; = \ldots$
$\;\; = Z_t + \phi Z_{t-1} + \phi^2 Z_{t-2} + \phi^3 Z_{t-3} + \ldots = \sum_{j=0}^{\infty} \phi^j Z_{t-j}$.  (3)

This candidate solution makes sense when $|\phi| < 1$. It turns out that (i) it is well defined and stationary. Also, (ii) it satisfies the AR(1) equation, since

$X_t = Z_t + \phi Z_{t-1} + \phi^2 Z_{t-2} + \ldots = Z_t + \phi(Z_{t-1} + \phi Z_{t-2} + \ldots) = Z_t + \phi X_{t-1}$.

There is also a stationary solution when $|\phi| > 1$. Rearranging (2) gives $X_{t-1} = \phi^{-1} X_t - \phi^{-1} Z_t$, so that, iterating forwards,

$X_t = \phi^{-1} X_{t+1} - \phi^{-1} Z_{t+1}$
$\;\; = \phi^{-1}(\phi^{-1} X_{t+2} - \phi^{-1} Z_{t+2}) - \phi^{-1} Z_{t+1} = \phi^{-2} X_{t+2} - \phi^{-2} Z_{t+2} - \phi^{-1} Z_{t+1}$
$\;\; = \phi^{-3} X_{t+3} - \phi^{-3} Z_{t+3} - \phi^{-2} Z_{t+2} - \phi^{-1} Z_{t+1}$
$\;\; = \ldots$
$\;\; = -\phi^{-1} Z_{t+1} - \phi^{-2} Z_{t+2} - \phi^{-3} Z_{t+3} - \ldots = -\sum_{j=1}^{\infty} \phi^{-j} Z_{t+j}$.  (4)

Indeed, this candidate solution also satisfies (i) and (ii). The solution (3) is called "causal" and (4) is called "non-causal". When $\phi = \pm 1$, there is no stationary solution (homework problem).
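The backward iteration leading to (3) can be checked numerically. The sketch below is illustrative (the choices $\phi = 0.6$, $T = 300$, and the starting value $X_0 = Z_0$ are my own): with that start, running the AR(1) recursion forward reproduces exactly the truncated causal sum from (3).

```python
import numpy as np

rng = np.random.default_rng(0)
phi, T = 0.6, 300
z = rng.standard_normal(T)

# AR(1) recursion X_t = phi * X_{t-1} + Z_t, started at X_0 = Z_0.
x = np.empty(T)
x[0] = z[0]
for t in range(1, T):
    x[t] = phi * x[t - 1] + z[t]

# With this start, iterating the recursion gives exactly the truncated
# causal sum X_t = sum_{j=0}^{t} phi^j Z_{t-j} from (3).
t = T - 1
causal_sum = sum(phi ** j * z[t - j] for j in range(t + 1))
assert abs(x[t] - causal_sum) < 1e-8
```

Because $|\phi| < 1$, the weights $\phi^j$ decay geometrically, so the truncation to $t + 1$ terms is an excellent approximation of the infinite sum in (3).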
Finally, in modeling, we choose the causal model, because it is more natural in the sense that we regress on past observed values and not on future ones.
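One consequence of the causal representation (3), not derived in this lecture but following from the ACVF definition above: for noise with variance $\sigma^2$, matching terms gives $\gamma_X(h) = \sigma^2 \sum_{j \ge 0} \phi^j \phi^{j+h} = \sigma^2 \phi^{|h|} / (1 - \phi^2)$. A short numerical check of this geometric-series identity (the parameter values are my own):

```python
# ACVF of the causal AR(1) solution (3):
#   gamma(h) = sigma2 * sum_{j>=0} phi^j * phi^(j+h)
# which sums (geometric series) to sigma2 * phi^h / (1 - phi^2) for h >= 0.
phi, sigma2 = 0.6, 1.0

def acvf_truncated(h, nterms=200):
    """Truncate the series gamma(h) = sigma2 * sum_j phi^j phi^(j+h)."""
    return sigma2 * sum(phi ** j * phi ** (j + h) for j in range(nterms))

def acvf_closed(h):
    """Closed form gamma(h) = sigma2 * phi^|h| / (1 - phi^2)."""
    return sigma2 * phi ** abs(h) / (1 - phi ** 2)

for h in range(5):
    assert abs(acvf_truncated(h) - acvf_closed(h)) < 1e-10

print(acvf_closed(0))  # gamma(0) = 1 / 0.64 = 1.5625, the stationary variance
```

Dividing by $\gamma_X(0)$ gives $\rho_X(h) = \phi^{|h|}$: the ACF of a causal AR(1) decays geometrically in the lag.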
