Unbiasedness of Multivariate OLS Estimators

Lecture 13


Lecture number: 13
Pages: 2
Type: Lecture Note
School: Cornell University
Course: Econ 3120 - Applied Econometrics
Edition: 1

Outline of Current Lecture
I. Goodness of Fit
II. Unbiasedness

Current Lecture
III. Unbiasedness of Multivariate OLS Estimators

Unbiasedness of Multivariate OLS Estimators

In order to show that the OLS estimators described above are unbiased, we need to make a number of assumptions, along the same lines as SLR.1-SLR.4:

- MLR.1: The model is given by $y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_k x_k + u$.
- MLR.2: Random sampling: our sample $\{(x_{i1}, x_{i2}, \dots, x_{ik}, y_i) : i = 1, \dots, n\}$ is a random sample following the population model in MLR.1.
- MLR.3: No perfect collinearity: in the sample (and in the population), (1) each independent variable has nonzero sample variation, and (2) none of the independent variables can be constructed as a linear combination of the other independent variables.
- MLR.4: Zero conditional mean: $E(u \mid x_1, x_2, \dots, x_k) = 0$.

Under these assumptions, multivariate OLS estimates are unbiased; that is,

$$E(\hat{\beta}_j \mid x_1, x_2, \dots, x_k) = \beta_j$$

The proof is somewhat involved algebraically, so we will not cover it here. See Appendix 3A in Wooldridge for the details.

6 Omitted Variable Bias

6.1 Comparison of Bivariate and Multivariate Estimates

In order to understand the effects of omitting relevant variables, let's return to regression anatomy. We'll use this to describe how the estimated $\beta$'s change when the regression is run without one of the $x$'s, i.e., when we have an omitted variable. Take the example where the true model is of the form

$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + u \quad (3)$$

but we instead run the regression omitting $x_2$:

$$y = \beta_0 + \beta_1 x_1 + u \quad (4)$$

Let $\hat{\beta}_1$ and $\hat{\beta}_2$ represent the OLS estimators of $\beta_1$ and $\beta_2$ using (3), and let $\tilde{\beta}_1$ be the OLS estimator of $\beta_1$ using (4). How can we use regression anatomy to figure out what's going on? It turns out that

$$\tilde{\beta}_1 = \hat{\beta}_1 + \hat{\beta}_2 \tilde{\delta}_1$$

where $\tilde{\delta}_1$ is the slope estimate from equation (2) above (the regression of $x_2$ on $x_1$).

Proof (sketch): Each observation satisfies the fitted multivariate regression, $y_i = \hat{\beta}_0 + \hat{\beta}_1 x_{i1} + \hat{\beta}_2 x_{i2} + \hat{u}_i$. Substituting this into the bivariate slope formula $\tilde{\beta}_1 = \sum_i (x_{i1} - \bar{x}_1) y_i / \sum_i (x_{i1} - \bar{x}_1)^2$, the residual term drops out because OLS residuals are uncorrelated with $x_1$, leaving $\tilde{\beta}_1 = \hat{\beta}_1 + \hat{\beta}_2 \sum_i (x_{i1} - \bar{x}_1) x_{i2} / \sum_i (x_{i1} - \bar{x}_1)^2 = \hat{\beta}_1 + \hat{\beta}_2 \tilde{\delta}_1$.

6.2 Omitted Variable Bias Formula

We're now in a position to show what happens to the expectation of our OLS estimator when we exclude a relevant variable from the analysis. Suppose the true model is

$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + u \quad (5)$$

but we instead estimate

$$y = \beta_0 + \beta_1 x_1 + u \quad (6)$$

We know that the estimate from (6) is related to the estimates from (5) by the formula

$$\tilde{\beta}_1 = \hat{\beta}_1 + \hat{\beta}_2 \tilde{\delta}_1$$

Taking expectations (with implicit conditioning on the $x$'s, so that $\tilde{\delta}_1$ is treated as fixed),

$$E(\tilde{\beta}_1) = E(\hat{\beta}_1 + \hat{\beta}_2 \tilde{\delta}_1) = E(\hat{\beta}_1) + E(\hat{\beta}_2 \tilde{\delta}_1) = \beta_1 + E(\hat{\beta}_2)\tilde{\delta}_1 = \beta_1 + \beta_2 \tilde{\delta}_1$$

This implies that

$$\text{Bias}(\tilde{\beta}_1) = E(\tilde{\beta}_1) - \beta_1 = \beta_2 \tilde{\delta}_1 = \beta_2 \, \frac{\widehat{\text{cov}}(x_1, x_2)}{\widehat{\text{var}}(x_1)} \quad (7)$$

If we have omitted a relevant variable, the bias of our estimator is going to depend on two relationships: (1) the effect of the excluded independent variable $x_2$ on the dependent variable, and (2) the correlation between the included independent variable ($x_1$) and the excluded variable ($x_2$).

Example. Suppose you know that log wages depend ...
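To make the decomposition in Section 6.1 concrete, here is a minimal numerical sketch (not from the notes; it uses numpy, and the data-generating values are arbitrary choices for illustration) verifying that the short-regression slope equals $\hat{\beta}_1 + \hat{\beta}_2 \tilde{\delta}_1$ exactly in any given sample:

```python
# Numerical check of the regression-anatomy identity
#   beta1_tilde = beta1_hat + beta2_hat * delta1_tilde.
# All names and parameter values below are hypothetical, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000

# Simulate from a "true" model y = 1 + 2*x1 + 3*x2 + u, with x2 correlated with x1.
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
u = rng.normal(size=n)
y = 1 + 2 * x1 + 3 * x2 + u

def ols(X, y):
    """OLS coefficients for a regression of y on X, with an intercept prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Long regression (3): y on x1 and x2 -> beta1_hat, beta2_hat
_, b1_hat, b2_hat = ols(np.column_stack([x1, x2]), y)
# Short regression (4): y on x1 alone -> beta1_tilde
_, b1_tilde = ols(x1, y)
# Auxiliary regression (2): x2 on x1 -> delta1_tilde
_, d1_tilde = ols(x1, x2)

# The two numbers agree up to floating-point error, for any sample.
print(b1_tilde, b1_hat + b2_hat * d1_tilde)
```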
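A small Monte Carlo sketch of the bias formula (7), again with hypothetical parameter values: averaging $\tilde{\beta}_1$ across many simulated samples should come out near $\beta_1 + \beta_2 \delta_1$, while the long-regression estimator $\hat{\beta}_1$ stays centered on $\beta_1$:

```python
# Monte Carlo illustration of omitted variable bias, formula (7):
#   E(beta1_tilde) = beta1 + beta2 * delta1, while the long regression is unbiased.
# The design (beta1=2, beta2=3, delta1=0.5, n, reps) is an arbitrary choice.
import numpy as np

rng = np.random.default_rng(1)
beta1, beta2, delta1 = 2.0, 3.0, 0.5
n, reps = 200, 5_000

tilde, hat = [], []
for _ in range(reps):
    x1 = rng.normal(size=n)
    x2 = delta1 * x1 + rng.normal(size=n)   # population cov(x1,x2)/var(x1) = delta1
    y = 1 + beta1 * x1 + beta2 * x2 + rng.normal(size=n)

    X_long = np.column_stack([np.ones(n), x1, x2])
    X_short = np.column_stack([np.ones(n), x1])
    hat.append(np.linalg.lstsq(X_long, y, rcond=None)[0][1])
    tilde.append(np.linalg.lstsq(X_short, y, rcond=None)[0][1])

print(np.mean(hat))    # ~ 2.0 : including x2, the estimator is centered on beta1
print(np.mean(tilde))  # ~ 3.5 : beta1 + beta2*delta1 = 2 + 3*0.5, as (7) predicts
```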

