Lecture Notes 7: Model with More Components

ECMT 475: Economic Forecasting (Spring 2011)
Guangyi Ma

Regression Models (Textbook Chapter 11)

- So far we have studied modelling and forecasting techniques for a univariate time series.
- A multivariate model might be more useful: a variable can be explained and forecast using the history of other variables as well as its own history.
- In the trend and seasonal models, one or more variables other than lagged y are used, but they are deterministic.
- Consider the regression model

    y_t = β_0 + β_1 x_t + e_t,   where e_t ~ WN(0, σ_e^2)

- Correlation vs. causation
- Endogenous vs. exogenous variables
- Bivariate data: (y_t, x_t)
- Multivariate data: (y_t, x_{1t}, x_{2t}, ..., x_{kt})

Forecasting in Regression Models

- The h-step-ahead forecast of y, conditional on x_{T+h}, is

    y_{T+h,T} = β_0 + β_1 x_{T+h}

- This forecast of y_{T+h} requires knowledge of x_{T+h}, which is typically not feasible. We might use a forecast of x instead.
- Suppose x follows an AR(1) process

    x_t = φ x_{t-1} + u_t,   where u_t ~ WN(0, σ_u^2)

- Then we can combine the two models

    y_t = β_0 + β_1 x_t + e_t,   where e_t ~ WN(0, σ_e^2)
    x_t = φ x_{t-1} + u_t,       where u_t ~ WN(0, σ_u^2)

  to obtain

    y_t = β_0 + δ x_{t-1} + ε_t,   where ε_t ~ WN(0, σ^2)

- Here δ = β_1 φ and ε_t = β_1 u_t + e_t. If e_t and u_t are independent, then σ^2 = ?

Distributed Lags

- If instead we assume an AR(q) model for x, then

    y_t = β_0 + δ_1 x_{t-1} + δ_2 x_{t-2} + ... + δ_q x_{t-q} + ε_t
        = β_0 + A(L) x_{t-1} + ε_t,   where ε_t ~ WN(0, σ^2)

- This is called a "distributed lag" model.
- The coefficients can be interpreted as the effect of x on y:
  - δ_1 is the immediate impact
  - δ_1 + ... + δ_q is the long-run impact

Distributed Lags with Lagged Dependent Variables

- We can also include past dynamics of y to help explain itself:

    y_t = β_0 + φ_1 y_{t-1} + ... + φ_p y_{t-p} + δ_1 x_{t-1} + ... + δ_q x_{t-q} + ε_t
        = β_0 + B(L) y_{t-1} + A(L) x_{t-1} + ε_t

- Can use the arima or regress command in Stata (a minimal sketch using regress follows below).
- Forecasting proceeds as before.
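As a minimal illustration of the regress approach just mentioned (not from the notes): the data are simulated, and the variable names (y, x, t), lag lengths, and coefficient values are all assumptions chosen for the sketch.

    * Minimal sketch (assumed names and values): simulate an AR(1) regressor x and
    * a y that depends on its own lag and lagged x, then estimate the distributed-lag
    * model with a lagged dependent variable by OLS.
    clear
    set obs 200
    set seed 475
    gen t = _n
    tsset t
    gen x = rnormal()
    replace x = 0.8*x[_n-1] + rnormal() in 2/l      // x follows an AR(1)
    gen y = rnormal() in 1
    replace y = 1 + 0.5*y[_n-1] + 0.3*x[_n-1] + rnormal() in 2/l
    * Estimate y_t = b0 + phi1*y_{t-1} + d1*x_{t-1} + d2*x_{t-2} + e_t
    regress y L.y L(1/2).x
    * In-sample fitted values from the estimated equation
    predict y_hat, xb

From here, multi-step forecasts can be built recursively by feeding fitted values of y (and forecasts of x) back into the estimated equation, as in the earlier univariate case.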
Distributed Lags with ARMA Disturbances

- We can further allow for ARMA disturbances:

    y_t = β_0 + A(L) x_{t-1} + ε_t
    ε_t = [Θ(L)/Φ(L)] v_t,   where v_t ~ WN(0, σ^2)

- A distributed lag model with AR(1) disturbances is equivalent to?
- Example:

    y_t = β_0 + β_1 x_{t-1} + ε_t
    ε_t = φ ε_{t-1} + v_t,   where v_t ~ WN(0, σ^2)

Most General Framework

- This is the so-called transfer function model:

    y_t = [A(L)/B(L)] x_t + [C(L)/D(L)] ε_t,   where ε_t ~ WN(0, σ^2)

- All previous models can be viewed as special cases of this framework.

Vector Autoregressions

- A VAR is a linear regression model for a set of variables: a system of equations, one for each variable.
- In each equation, a VAR(p) uses p lags of each variable to explain the current value of the left-hand-side variable.
- It allows for cross-variable dynamics.
- Consider a two-variable VAR(1):

    y_{1,t} = φ_{11} y_{1,t-1} + φ_{12} y_{2,t-1} + ε_{1,t}
    y_{2,t} = φ_{21} y_{1,t-1} + φ_{22} y_{2,t-1} + ε_{2,t}

  where ε_{1,t} ~ WN(0, σ_1^2), ε_{2,t} ~ WN(0, σ_2^2), and cov(ε_{1,t}, ε_{2,t}) = σ_{12}.

- Correlated disturbances: a contemporaneous shock to one variable might affect another variable as well.
- Estimation: equation-by-equation OLS or a system estimation procedure.

Predictive Causality

- The statistical notion of causality is different from "common sense" causality.
- Predictive causality means that a variable is helpful in forecasting another variable. True causality could actually run in the reverse direction.
- We say a variable y_2 (predictively) causes y_1 if lagged values of y_2 have true non-zero coefficients in the y_1 equation.
- In the two-variable VAR(1) model, "y_2 causes y_1" if φ_{12} ≠ 0.
- In economics, this is often called "Granger causality." Clive Granger (1934-2009), winner of the 2003 Nobel Prize, was a famous time-series econometrician.
- Test of predictive causality: F test.

Impulse-Response Functions

- Question of interest: how does a one-unit innovation to a variable affect that variable, as well as the other variables, in the future?
- Need to convert the system into its MA(∞) form.
- Also need a normalization (a different MA(∞) representation) to generate IRFs from structural innovations.
- Consider again the two-variable VAR(1) model above, with correlated disturbances ε_{1,t} and ε_{2,t}.
- The standard MA(∞) representation is

    y_{1,t} = ε_{1,t} + φ_{11} ε_{1,t-1} + φ_{12} ε_{2,t-1} + ...
    y_{2,t} = ε_{2,t} + φ_{21} ε_{1,t-1} + φ_{22} ε_{2,t-1} + ...

- The disturbances are contemporaneously correlated!
- Normalizing by a Cholesky decomposition gives a different MA(∞) representation:

    y_{1,t} = b^0_{11} ε'_{1,t} + 0 + b^1_{11} ε'_{1,t-1} + b^1_{12} ε'_{2,t-1} + ...
    y_{2,t} = b^0_{21} ε'_{1,t} + b^0_{22} ε'_{2,t} + b^1_{21} ε'_{1,t-1} + b^1_{22} ε'_{2,t-1} + ...

  where ε'_{1,t} ~ WN(0, 1), ε'_{2,t} ~ WN(0, 1), and cov(ε'_{1,t}, ε'_{2,t}) = 0.

- Then b^k_{ij} represents the response of the i-th variable at time (t + k) to the j-th structural innovation at time t.

Variance Decompositions

- Another way of characterizing the dynamics associated with VARs; closely related to impulse-response functions.
- Answers the question: "How much of the h-period-ahead forecast error variance of variable i is explained by innovations to variable j?" (A worked expression for the two-variable case is sketched after the Stata commands below.)

VAR in Stata

- Use the "housing starts and completions" data as an example.
- Run a VAR(4) model for 1968.05 to 1991.12:

    var starts comps in 1/288, lags(1/4) dfk small

- Test of predictive causality:

    vargranger

- Obtain results for impulse-response functions and variance decompositions, saved in "myfile.irf":

    irf create myirf, set(myfile, replace) step(36)

- Generate graphs of the IRFs:

    irf graph irf

- Generate graphs of the variance decompositions:

    irf graph fevd, lstep(1)

- Show how the model fits the data:

    predict s_hat in 1/288, eq(#1) xb
    predict c_hat in 1/288, eq(#2) xb

- Obtain out-of-sample forecasts:

    fcast compute myfc_, step(54)

- Graph the out-of-sample forecasts:

    fcast graph myfc_starts, observed
    fcast graph myfc_comps, observed

- In-sample fit and out-of-sample forecast together:

    tsline starts s_hat myfc_starts
    tsline comps c_hat
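As a worked sketch of the variance-decomposition question referenced above (not part of the original notes): in the two-variable case, the h-step-ahead forecast error of y_1 under the normalized MA(∞) representation, and the share of its variance attributed to each structural innovation, can be written as

    % Sketch, derived from the normalized representation above, assuming unit-variance,
    % mutually and serially uncorrelated structural innovations eps'_1 and eps'_2.
    \begin{align*}
    y_{1,T+h} - y_{1,T+h,T}
      &= \sum_{k=0}^{h-1} \left( b^{k}_{11}\,\varepsilon'_{1,T+h-k}
           + b^{k}_{12}\,\varepsilon'_{2,T+h-k} \right), \\
    \operatorname{Var}\left( y_{1,T+h} - y_{1,T+h,T} \right)
      &= \sum_{k=0}^{h-1} \left[ (b^{k}_{11})^{2} + (b^{k}_{12})^{2} \right], \\
    \text{share explained by } \varepsilon'_{j}
      &= \frac{\sum_{k=0}^{h-1} (b^{k}_{1j})^{2}}
              {\sum_{k=0}^{h-1} \left[ (b^{k}_{11})^{2} + (b^{k}_{12})^{2} \right]},
      \qquad j = 1, 2.
    \end{align*}

These shares, computed at each horizon, are the quantities plotted by irf graph fevd.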

