# Unbiasedness of Multivariate OLS Estimators



Lecture 13

- Lecture number: 13
- Pages: 2
- Type: Lecture Note
- School: Cornell University
- Course: Econ 3120 - Applied Econometrics
- Edition: 1


**Outline of Current Lecture**

I. Goodness of Fit
II. Unbiasedness

**Current Lecture**

III. Unbiasedness of Multivariate OLS Estimators

### Unbiasedness of Multivariate OLS Estimators

To show that the OLS estimators described above are unbiased, we need to make a number of assumptions, along the same lines as SLR.1-SLR.4:

- **MLR.1 (Linear model):** the model is given by $y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_k x_k + u$.
- **MLR.2 (Random sampling):** our sample $\{(x_{i1}, x_{i2}, \dots, x_{ik}, y_i) : i = 1, \dots, n\}$ is a random sample following the population model in MLR.1.
- **MLR.3 (No perfect collinearity):** in the sample (and in the population), (1) each independent variable has nonzero sample variation, and (2) none of the independent variables can be constructed as a linear combination of the other independent variables.
- **MLR.4 (Zero conditional mean):** $E(u \mid x_1, x_2, \dots, x_k) = 0$.

Under these assumptions, the multivariate OLS estimators are unbiased; that is,

$$E(\hat\beta_j \mid x_1, x_2, \dots, x_k) = \beta_j$$

The proof is somewhat involved algebraically, so we will not cover it here. See Appendix 3A in Wooldridge for the details.

## 6 Omitted Variable Bias

### 6.1 Comparison of Bivariate and Multivariate Estimates

To understand the effects of omitting relevant variables, let's return to regression anatomy. We'll use it to describe how the estimated $\beta$'s change when the regression is run without one of the $x$'s, i.e., when we have an omitted variable. Take the example where the true model is of the form

$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + u \qquad (3)$$

but we instead run the regression omitting $x_2$:

$$y = \beta_0 + \beta_1 x_1 + u \qquad (4)$$

Let $\hat\beta_1$ and $\hat\beta_2$ represent the OLS estimators of $\beta_1$ and $\beta_2$ using (3), and let $\tilde\beta_1$ be the OLS estimator of $\beta_1$ using (4). How can we use regression anatomy to figure out what's going on? It turns out that

$$\tilde\beta_1 = \hat\beta_1 + \hat\beta_2 \tilde\delta_1,$$

where $\tilde\delta_1$ is the slope estimate from equation (2) above (the regression of $x_2$ on $x_1$).
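The identity relating the short and long regressions can be checked numerically. Below is a minimal sketch using NumPy on simulated data; the data-generating process and all coefficient values (2.0, 3.0, 0.5, etc.) are hypothetical choices for illustration, not values from the lecture.

```python
import numpy as np

# Hypothetical simulated data: x2 is correlated with x1, and both affect y.
rng = np.random.default_rng(0)
n = 1000
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)            # x2 correlated with x1
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

def ols(y, *regressors):
    """OLS with an intercept; returns the full coefficient vector."""
    X = np.column_stack([np.ones(len(y)), *regressors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

_, b1_hat, b2_hat = ols(y, x1, x2)   # long regression, as in (3)
_, b1_tilde = ols(y, x1)             # short regression, as in (4)
_, d1_tilde = ols(x2, x1)            # auxiliary regression of x2 on x1

# The identity beta1_tilde = beta1_hat + beta2_hat * delta1_tilde holds
# exactly (up to floating point), for any sample.
print(np.isclose(b1_tilde, b1_hat + b2_hat * d1_tilde))  # True
```

The identity is algebraic, not statistical: it holds in every sample, regardless of the true coefficients, which is why the check passes without any averaging over repeated draws.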
Proof:

### 6.2 Omitted Variable Bias Formula

We're now in a position to show what happens to the expectation of our OLS estimator when we exclude a relevant variable from the analysis. Suppose the true model is

$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + u \qquad (5)$$

but we instead estimate

$$y = \beta_0 + \beta_1 x_1 + u \qquad (6)$$

We know that the estimates from (5) are related to the estimate from (6) by the formula

$$\tilde\beta_1 = \hat\beta_1 + \hat\beta_2 \tilde\delta_1$$

Taking expectations (with implicit conditioning on the $x$'s, so that $\tilde\delta_1$ is nonrandom):

$$E(\tilde\beta_1) = E(\hat\beta_1 + \hat\beta_2 \tilde\delta_1) = E(\hat\beta_1) + E(\hat\beta_2 \tilde\delta_1) = \beta_1 + E(\hat\beta_2)\,\tilde\delta_1 = \beta_1 + \beta_2 \tilde\delta_1$$

This implies that

$$\operatorname{Bias}(\tilde\beta_1) = E(\tilde\beta_1) - \beta_1 = \beta_2 \tilde\delta_1 = \beta_2\,\frac{\widehat{\operatorname{cov}}(x_1, x_2)}{\widehat{\operatorname{var}}(x_1)} \qquad (7)$$

If we have omitted a relevant variable, the bias of our estimator is going to depend on two relationships: (1) the effect of the excluded independent variable $x_2$ on the dependent variable, and (2) the correlation between the included independent variable $x_1$ and the excluded variable $x_2$.

Example. Suppose you know that log wages ...
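The bias formula can also be seen in a small Monte Carlo sketch: repeatedly draw samples from a model where $x_2$ is omitted, and compare the average of $\tilde\beta_1$ to $\beta_1 + \beta_2\delta_1$. The data-generating process below ($\beta_2 = 3$, population slope $\delta_1 = 0.5$) is a hypothetical choice, not from the lecture.

```python
import numpy as np

# Monte Carlo check of the omitted variable bias formula (7).
rng = np.random.default_rng(1)
n, reps = 500, 1000
beta0, beta1, beta2 = 1.0, 2.0, 3.0
delta1 = 0.5                      # population slope of x2 on x1

b1_tilde = np.empty(reps)
for r in range(reps):
    x1 = rng.normal(size=n)
    x2 = delta1 * x1 + rng.normal(size=n)
    y = beta0 + beta1 * x1 + beta2 * x2 + rng.normal(size=n)
    # Short regression (6): y on an intercept and x1 only, omitting x2.
    X = np.column_stack([np.ones(n), x1])
    b1_tilde[r] = np.linalg.lstsq(X, y, rcond=None)[0][1]

bias = b1_tilde.mean() - beta1
print(bias)   # close to beta2 * delta1 = 1.5, as formula (7) predicts
```

Note the sign logic this makes concrete: since $\beta_2 > 0$ and $x_1, x_2$ are positively correlated here, the short regression overstates $\beta_1$; flipping the sign of either $\beta_2$ or $\delta_1$ would flip the direction of the bias.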
