
Lecture 19 - Decomposing a Time Series into its Trend and Cyclical Components

It is often assumed that many macroeconomic time series are subject to two sorts of forces: those that influence the long-run behavior of the series and those that influence the short-run behavior of the series. So, for example, growth theory focuses on the forces that influence long-run behavior whereas business cycle theory focuses on the forces that influence short-run behavior. Therefore, it is useful to have econometric methods that can be used to decompose a time series into its trend and cyclical components. [Other reasons – to transform a nonstationary series into a stationary series by removing the trend; to alleviate a spurious regression problem.]

Several Methods –
• Polynomial trend removal
• Beveridge-Nelson type decompositions
• Hodrick-Prescott Filter

I. Polynomial Trend Removal

Assume that yt is a trend stationary process. That is:

yt = τt + ct

where τt is a deterministic function of t, typically a low-order polynomial, called the trend (or secular or long-run or permanent) component of yt, and ct is a stationary process called the cyclical (or short-run or transitory) component of yt.

Assume that the trend is a polynomial in t, so that

yt = τt + ct = β0 + β1·t + … + βp·t^p + ct

This is a standard linear regression model with a serially correlated error process, ct. How to efficiently estimate the β's? OLS. (Grenander and Rosenblatt, 1957, show that OLS is asymptotically equivalent to GLS when xt = [1 t … t^p].) Then it makes sense to set

τt-hat = β0-hat + β1-hat·t + … + βp-hat·t^p and ct-hat = yt − τt-hat

where β-hat denotes the OLS estimator.

Notes:
• In practice p = 1 (linear trend) and, in the case of rapidly growing nominal variables, p = 2 (quadratic trend) tend to be the most common choices of p.
• Suppose yt and some or all of the xt's are increasing (on average) over time. A regression of y on x is likely to be statistically significant even if there is no meaningful or causal relationship between the two.
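The OLS detrending procedure described above (estimate the β's by OLS, then form τt-hat and ct-hat) can be sketched numerically as follows. This is a minimal illustration using NumPy's least-squares routine; the AR(1) cyclical component and all coefficient values are invented for the example, not taken from the lecture.

```python
import numpy as np

def poly_detrend(y, p=1):
    """OLS fit of a degree-p polynomial trend; returns (tau_hat, c_hat)."""
    t = np.arange(len(y), dtype=float)
    X = np.vander(t, p + 1, increasing=True)   # columns [1, t, ..., t^p]
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    tau_hat = X @ beta_hat                     # estimated trend component
    c_hat = y - tau_hat                        # estimated cyclical component
    return tau_hat, c_hat

# Simulated example (made-up numbers): linear trend plus a stationary AR(1) cycle
rng = np.random.default_rng(0)
n = 200
c = np.zeros(n)
for s in range(1, n):
    c[s] = 0.7 * c[s - 1] + rng.normal()
y = 2.0 + 0.5 * np.arange(n) + c
tau_hat, c_hat = poly_detrend(y, p=1)
```

Because the regression includes a constant, the estimated cyclical component has mean zero by construction, and the two estimated components add back up to yt exactly.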
A regression of a trending dependent variable on a trending regressor is likely to suggest a statistically significant relationship even when none is present. (What is such a regression called?)

In this case it makes some sense to add the explanatory variable t as a separate regressor in the regression:

(*) yt = β0 + β1·xt + β2·t + εt

An alternative: formulate the model as

(**) yt* = β1·xt* + εt

where yt* is the residual series from the regression of yt on a linear time trend and xt* is the residual series from the regression of xt on a linear time trend.

Fact: The OLS estimate of β1 obtained from regression (*) and the OLS estimate of β1 obtained from regression (**) will be exactly the same. The residual series will be exactly the same, too. (Frisch-Waugh-Lovell Theorem; see Greene or Davidson & MacKinnon.) [This result extends to regressions with higher-order terms in t.]

II. Decompositions Based on Differences

An alternative to the trend stationary assumption as a way to account for trend behavior in a time series is to assume that the series is difference stationary, i.e., that yt is stationary in differenced form.

A. Difference Stationarity

A time series yt is difference stationary of order d, d a positive integer, if
• ∆^d yt is stationary
• ∆^(d-1) yt is not stationary
• the MA form of ∆^d yt does not have a "unit root"

In practice, d = 1 and, for some rapidly growing nominal time series, d = 2 are the most commonly used values of d.

{Suppose yt is the trend stationary process yt = β0 + β1·t + εt, where εt is a stationary process. Is yt difference stationary of order one? (Why or why not?)}

• The number of differences required to make a time series stationary is also called the "order of integration" of the series.
• A stationary series is called an integrated of order zero, I(0), series.
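The bracketed exercise above can be explored by simulation. A minimal sketch, with invented coefficients and iid normal εt: differencing yt = β0 + β1·t + εt gives ∆yt = β1 + εt − εt−1, which is stationary, but its MA form has a unit root (an MA(1) coefficient of −1, "overdifferencing"). One observable symptom of that unit-root MA form is a lag-1 autocorrelation of ∆yt near −0.5, which the code checks. So, by the definition above, this trend stationary yt is not difference stationary of order one.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
beta0, beta1 = 1.0, 0.5            # made-up trend coefficients
eps = rng.normal(size=n)           # iid stationary component
t = np.arange(n)
y = beta0 + beta1 * t + eps        # trend stationary process

dy = np.diff(y)                    # dy_t = beta1 + eps_t - eps_{t-1}

# dy is stationary (mean approx. beta1), but the lag-1 sample
# autocorrelation is near -0.5, the value implied by an
# overdifferenced MA(1) with coefficient -1.
d = dy - dy.mean()
rho1 = (d[1:] * d[:-1]).sum() / (d * d).sum()
```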
A difference stationary series with d = 1 is called an integrated of order one, I(1), series, and so on.
• An I(1) process is also called a unit root process because the characteristic polynomial of the AR representation of an I(1) process will have a root equal to 1.

B. The Random Walk Process

The simplest case of an I(1) process is the random walk:

yt = yt-1 + εt ,  εt a zero-mean iid process

Note that for the rw –
• ∆yt = εt is an iid process: changes in yt are serially uncorrelated, independent, identically distributed.
• dyt+s/dεt = 1 for all s > 0: innovations have completely permanent effects on the time series!

yt+1 = yt + εt+1 = yt-1 + (εt + εt+1)
yt+2 = yt+1 + εt+2 = yt-1 + (εt + εt+1 + εt+2)
…
yt+s = yt-1 + (εt + εt+1 + εt+2 + … + εt+s)
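Both properties of the random walk can be seen in a short simulation (sample size, shock date, and shock size are arbitrary choices for illustration): differencing the simulated path recovers the iid innovations, and adding one unit to a single innovation shifts every subsequent value of the series by exactly one unit, the permanent effect noted above.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
eps = rng.normal(size=n)      # zero-mean iid innovations
y = np.cumsum(eps)            # random walk: y_t = y_{t-1} + eps_t (y_0 = eps_0)

# Differencing recovers the iid innovations
dy = np.diff(y)               # dy[t] corresponds to eps[t+1]

# Permanent effect of an innovation: perturb eps at one date
# and the entire subsequent path shifts one-for-one.
shock_date, shock = 100, 1.0
eps_shocked = eps.copy()
eps_shocked[shock_date] += shock
y_shocked = np.cumsum(eps_shocked)
```

Contrast this with a trend stationary series, where the effect of an innovation on yt+s dies out as s grows because ct is stationary.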



ISU ECON 674 - Lecture 19
