Lecture Notes 7: Models with More Components
ECMT 475: Economic Forecasting (Spring 2011)
Guangyi Ma

Regression Models (Textbook Chapter 11)

- So far we have studied modeling and forecasting techniques for a univariate time series.
- A multivariate model may be more useful: a variable can be explained and forecast using the history of other variables, as well as its own history.
- In the trend and seasonal components, one or more variables other than lagged y are used, but they are deterministic.
- Consider the regression model
    y_t = β₀ + β₁ x_t + e_t,  e_t ~ WN(0, σ_e²)
- Correlation vs. causation
- Endogenous vs. exogenous variables
- Bivariate data: (y_t, x_t); multivariate data: (y_t, x_{1t}, x_{2t}, …, x_{kt})

Forecasting in Regression Models

- The h-step-ahead forecast of y, conditional on x, is
    y_{T+h,T} | x_{T+h} = β₀ + β₁ x_{T+h}
- This forecast of y_{T+h} requires knowledge of x_{T+h}, which is typically not available. We might use a forecast of x instead.
- Suppose x follows an AR(1) process:
    x_t = φ x_{t-1} + u_t,  u_t ~ WN(0, σ_u²)
- We can then combine the two models
    y_t = β₀ + β₁ x_t + e_t,  e_t ~ WN(0, σ_e²)
    x_t = φ x_{t-1} + u_t,  u_t ~ WN(0, σ_u²)
  to obtain
    y_t = β₀ + δ x_{t-1} + ε_t,  ε_t ~ WN(0, σ²)
  Here δ = β₁φ and ε_t = β₁ u_t + e_t. If e_t and u_t are independent, then σ² = ?

Distributed Lags

- If instead we assume an AR(q) model for x, then
    y_t = β₀ + β₁ x_{t-1} + β₂ x_{t-2} + … + β_q x_{t-q} + ε_t
        = β₀ + A(L) x_{t-1} + ε_t,  ε_t ~ WN(0, σ²)
- This is called a "distributed lags" model.
- The coefficients can be interpreted as the effect of x on y:
  - β₁ is the immediate impact
  - β₁ + … + β_q is the long-run impact

Distributed Lags with Lagged Dependent Variables

- We can also include past values of y to explain itself:
    y_t = β₀ + α₁ y_{t-1} + … + α_p y_{t-p} + β₁ x_{t-1} + … + β_q x_{t-q} + ε_t
        = β₀ + B(L) y_{t-1} + A(L) x_{t-1} + ε_t
- Can use the arima or regress command.
- Forecast procedures are as before.

Distributed Lags with ARMA Disturbances

- We can further allow ARMA disturbances:
    y_t = β₀ + A(L) x_{t-1} + ε_t,
    ε_t = [Θ(L)/Φ(L)] ν_t,  ν_t ~ WN(0, σ²)
- Distributed lags with AR(1) disturbances is equivalent to?
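The AR(1) substitution above can be checked by simulation. The sketch below (plain Python rather than Stata, with made-up parameter values) simulates the two equations, regresses y_t on x_{t-1}, and confirms that the slope is δ = β₁φ and that the residual variance answers the σ² = ? question as σ² = β₁²σ_u² + σ_e² when e_t and u_t are independent:

```python
import random
import statistics

random.seed(0)
b0, b1, phi = 1.0, 2.0, 0.5           # hypothetical parameter values
sig_e, sig_u = 1.0, 0.8
T = 100_000

x, y = [0.0], []
for _ in range(T):
    u = random.gauss(0.0, sig_u)
    e = random.gauss(0.0, sig_e)
    x.append(phi * x[-1] + u)         # x_t = phi * x_{t-1} + u_t
    y.append(b0 + b1 * x[-1] + e)     # y_t = b0 + b1 * x_t + e_t

# OLS of y_t on x_{t-1}: xl[t] is the lag of the x that generated y[t]
xl = x[:T]
mx, my = statistics.fmean(xl), statistics.fmean(y)
sxy = sum((a - mx) * (b - my) for a, b in zip(xl, y))
sxx = sum((a - mx) ** 2 for a in xl)
slope = sxy / sxx                     # should be near b1 * phi = 1.0
resid_var = statistics.fmean(
    (b - my - slope * (a - mx)) ** 2 for a, b in zip(xl, y)
)                                     # should be near b1^2*sig_u^2 + sig_e^2 = 3.56
print(slope, resid_var)
```

The combined-model result holds exactly in population; in the simulation the estimates are close for large T.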
  Example:
    y_t = β₀ + β₁ x_{t-1} + ε_t,
    ε_t = φ ε_{t-1} + ν_t,  ν_t ~ WN(0, σ²)

Most General Framework

- The so-called transfer function model is
    y_t = [A(L)/B(L)] x_t + [C(L)/D(L)] ε_t,  ε_t ~ WN(0, σ²)
- All of the previous models can be viewed as special cases of this framework.

Vector Autoregressions

- A VAR is a linear regression model for a set of variables: a system of equations, one for each variable.
- In each equation, a VAR(p) uses p lags of each variable to explain the current value of the left-hand-side variable.
- It allows for cross-variable dynamics.
- Consider a two-variable VAR(1):
    y_{1,t} = φ₁₁ y_{1,t-1} + φ₁₂ y_{2,t-1} + ε_{1,t}
    y_{2,t} = φ₂₁ y_{1,t-1} + φ₂₂ y_{2,t-1} + ε_{2,t}
  where ε_{1,t} ~ WN(0, σ₁²), ε_{2,t} ~ WN(0, σ₂²), and cov(ε_{1,t}, ε_{2,t}) = σ₁₂.
- Correlated disturbances: a contemporaneous shock to one variable may affect the other variable as well.
- Estimation: equation-by-equation OLS or a system estimation procedure.

Predictive Causality

- The statistical notion of causality is different from "common sense" causality.
- Predictive causality means that a variable is helpful in forecasting another variable. True causality could actually run in the reverse direction.
- We say a variable y₂ (predictively) causes y₁ if lagged values of y₂ have true nonzero coefficients in the y₁ equation.
- In the two-variable VAR(1) model, "y₂ causes y₁" if φ₁₂ ≠ 0.
- In economics this is often called "Granger causality". Clive Granger (1934-2009), winner of the 2003 Nobel Prize, was a famous time-series econometrician.
- Test of predictive causality: F test.

Impulse-Response Functions

- Question of interest: how does a one-unit innovation to a variable affect that variable, as well as the other variables, in the future?
- We need to convert the system into its MA(∞) form.
- We also need a normalization (a different MA(∞) representation) to generate IRFs from structural innovations.
- For the two-variable VAR(1) model above, the standard MA(∞) representation is
    y_{1,t} = ε_{1,t} + φ₁₁ ε_{1,t-1} + φ₁₂ ε_{2,t-1} + …
    y_{2,t} = ε_{2,t} + φ₂₁ ε_{1,t-1} + φ₂₂ ε_{2,t-1} + …
- The disturbances are contemporaneously correlated!
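Equation-by-equation OLS for the two-variable VAR(1) can be sketched as follows (Python rather than Stata, with hypothetical coefficient values). Each equation is just an OLS regression of one variable on one lag of both variables; the estimate of φ₁₂ is the coefficient a Granger-causality test would examine:

```python
import random

random.seed(0)
# True VAR(1) coefficients (hypothetical, chosen so the system is stationary)
phi11, phi12 = 0.5, 0.3
phi21, phi22 = 0.2, 0.4
T = 100_000

y1, y2 = [0.0], [0.0]
for _ in range(T):
    # contemporaneously correlated disturbances (shared component z1)
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    e1, e2 = z1, 0.5 * z1 + z2
    prev1, prev2 = y1[-1], y2[-1]
    y1.append(phi11 * prev1 + phi12 * prev2 + e1)
    y2.append(phi21 * prev1 + phi22 * prev2 + e2)

def ols2(y, x1, x2):
    """OLS of y on (x1, x2) with no intercept: solve the 2x2 normal equations."""
    s11 = sum(a * a for a in x1)
    s22 = sum(b * b for b in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    sy1 = sum(a * c for a, c in zip(x1, y))
    sy2 = sum(b * c for b, c in zip(x2, y))
    det = s11 * s22 - s12 * s12
    return (s22 * sy1 - s12 * sy2) / det, (s11 * sy2 - s12 * sy1) / det

# Equation-by-equation OLS: regress each variable on one lag of both variables
est1 = ols2(y1[1:], y1[:-1], y2[:-1])   # estimates (phi11, phi12)
est2 = ols2(y2[1:], y1[:-1], y2[:-1])   # estimates (phi21, phi22)
print(est1, est2)
```

Even though the disturbances are correlated across equations, single-equation OLS is consistent here because both equations share the same regressors.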
- Normalization by the Cholesky decomposition gives a different MA(∞) representation:
    y_{1,t} = b^0_11 ε′_{1,t} + 0 · ε′_{2,t} + b^1_11 ε′_{1,t-1} + b^1_12 ε′_{2,t-1} + …
    y_{2,t} = b^0_21 ε′_{1,t} + b^0_22 ε′_{2,t} + b^1_21 ε′_{1,t-1} + b^1_22 ε′_{2,t-1} + …
  where ε′_{1,t} ~ WN(0, 1), ε′_{2,t} ~ WN(0, 1), and cov(ε′_{1,t}, ε′_{2,t}) = 0.
- Then b^k_ij represents the response of the i-th variable at time (t + k) to the j-th structural innovation at time t.

Variance Decompositions

- Another way of characterizing the dynamics associated with VARs; closely related to impulse-response functions.
- Answers the question: "how much of the h-step-ahead forecast error variance of variable i is explained by innovations to variable j?"

VAR in STATA

- Use the "housing starts and completions" data as an example.
- Run a VAR(4) model for 1968.05-1991.12:
    var starts comps in 1/288, lags(1/4) dfk small
- Test of predictive causality:
    vargranger
- Obtain results for impulse-response functions and variance decompositions, saved in "myfile.irf":
    irf create myirf, set(myfile, replace) step(36)
- Generate graphs of the IRFs:
    irf graph irf
- Generate graphs of the variance decompositions:
    irf graph fevd, lstep(1)
- Show how the model fits the data:
    predict s_hat in 1/288, eq(#1) xb
    predict c_hat in 1/288, eq(#2) xb
- Obtain out-of-sample forecasts:
    fcast compute myfc_, step(54)
- Graph the out-of-sample forecasts:
    fcast graph myfc_starts, observed
    fcast graph myfc_comps, observed
- Plot the in-sample fit and out-of-sample forecast together:
    tsline starts s_hat myfc_starts
    tsline comps c_hat
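The computations behind irf create — the orthogonalized IRF coefficients b^k_ij and the forecast error variance decomposition — can be sketched for the two-variable VAR(1) (Python rather than Stata, with hypothetical coefficients, not estimates from the housing data). The IRFs are B_k = Φ^k P, where P is the Cholesky factor of the innovation covariance; shock j's share of variable i's h-step forecast error variance is Σ_k (b^k_ij)² divided by the sum over both shocks:

```python
import math

# Reduced-form VAR(1) coefficients and innovation covariance (hypothetical values)
phi = [[0.5, 0.3],
       [0.2, 0.4]]
sigma = [[1.0, 0.5],
         [0.5, 2.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Cholesky factor P of sigma: structural shocks are unit-variance and uncorrelated
p11 = math.sqrt(sigma[0][0])
p21 = sigma[1][0] / p11
P = [[p11, 0.0], [p21, math.sqrt(sigma[1][1] - p21 ** 2)]]

h = 8
irf = []
contrib = [[0.0, 0.0], [0.0, 0.0]]    # contrib[i][j]: shock j's piece of var i's FEV
psi = [[1.0, 0.0], [0.0, 1.0]]        # Psi_0 = identity
for k in range(h):
    B = matmul(psi, P)                # b^k_ij: response of variable i at t+k to shock j
    irf.append(B)
    for i in range(2):
        for j in range(2):
            contrib[i][j] += B[i][j] ** 2
    psi = matmul(phi, psi)            # Psi_{k+1} = phi * Psi_k

fevd = [[contrib[i][j] / sum(contrib[i]) for j in range(2)] for i in range(2)]
print(irf[0])   # equals P: variable 1 does not respond to shock 2 on impact
print(fevd)     # each row sums to 1
```

The zero in the (1, 2) position of the impact matrix is the normalization imposed by the Cholesky ordering, so the ordering of variables matters for the results.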