# Omitted Variable Bias with Many Regressors


## Omitted Variable Bias with Many Regressors


Lecture 14

- Lecture number: 14
- Pages: 2
- Type: Lecture Note
- School: Cornell University
- Course: Econ 3120 - Applied Econometrics
- Edition: 1


Lecture 14

Outline of Current Lecture
I. Unbiasedness of Multivariate OLS Estimators

Current Lecture
I. Omitted Variable Bias with Many Regressors

### 6.3 Omitted Variable Bias with Many Regressors

The above discussion applies only to a model with two independent variables. When there are more than two independent variables and we leave one out, things get much more complicated. In that case, all of the estimated $\beta$'s can be biased, and the bias of $\tilde{\beta}_j$ in the short regression generally depends on the relationship between the omitted variable and all of the $x$'s, and on the relationships among the $x$'s themselves. But if we assume that $x_j$ is uncorrelated with the other included $x$'s, then we can say something about the bias of the estimated coefficient.

Suppose the true model is

$$\log(wage) = \beta_0 + \beta_1 educ + \beta_2 exper + \beta_3 abil + u$$

(where $exper$ is years of experience) and we leave out $abil$. If we assume that education and experience are uncorrelated, then the expectation of $\tilde{\beta}_1$ in the regression

$$\log(wage) = \beta_0 + \beta_1 educ + \beta_2 exper + u$$

will be

$$E(\tilde{\beta}_1) = \beta_1 + \beta_3 \, \frac{\widehat{Cov}(educ, abil)}{\widehat{Var}(educ)},$$

which is essentially the same as formula (7) above. Therefore, we would expect $\tilde{\beta}_1$ to be biased upward if $\beta_3$ is positive and education and ability are positively correlated.

Economists often try to sign the bias using this formula regardless of whether $x_j$ (the variable of interest in the short regression) is uncorrelated with the other included $x$'s. This is a reasonable shortcut, but it is important to know that it isn't exactly right, especially if the included $x$'s are highly correlated.[^2]

[^2]: The general form for the estimate of $\beta_j$ in the short regression when $x_k$ is omitted is $\tilde{\beta}_j = \hat{\beta}_j + \hat{\beta}_k \tilde{\delta}_j$, where $\tilde{\delta}_j$ is the coefficient on $x_j$ in the regression of $x_k$ on all of the included regressors. See Wooldridge Section 3A for more detail.
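As an illustrative sketch (not part of the lecture), the bias formula can be checked by simulation. All the numeric values below (coefficient values, variances, sample size) are invented for the example; the point is that the short-regression coefficient on `educ` lands near $\beta_1 + \beta_3 \, \widehat{Cov}(educ, abil)/\widehat{Var}(educ)$ when `educ` and `exper` are uncorrelated.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Invented data-generating process for illustration:
# educ is uncorrelated with exper, but abil is positively correlated with educ.
educ = rng.normal(12.0, 2.0, n)
exper = rng.normal(10.0, 3.0, n)
abil = 0.5 * educ + rng.normal(0.0, 1.0, n)

b0, b1, b2, b3 = 1.0, 0.08, 0.02, 0.10   # hypothetical true coefficients
logwage = b0 + b1 * educ + b2 * exper + b3 * abil + rng.normal(0.0, 0.1, n)

# Short regression: omit abil.
X_short = np.column_stack([np.ones(n), educ, exper])
beta_short = np.linalg.lstsq(X_short, logwage, rcond=None)[0]

# Predicted expectation from the omitted-variable-bias formula.
delta = np.cov(educ, abil)[0, 1] / np.var(educ)
predicted = b1 + b3 * delta

print(beta_short[1], predicted)  # the two should be close, and both above b1
```

With $\beta_3 > 0$ and a positive education–ability correlation, the short-regression coefficient is biased upward, exactly as the formula predicts.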
### 7 Variance of OLS Estimators

To obtain the variance of the OLS estimators, we need to make an assumption analogous to SLR.5:

- **MLR.5 (Homoskedasticity):** The error term in the OLS equation described by MLR.1 has constant variance: $Var(u \mid x_1, \ldots, x_k) = \sigma^2$.

Under assumptions MLR.1–MLR.5, the variance of an OLS estimator $\hat{\beta}_j$ is given by

$$Var(\hat{\beta}_j) = \frac{\sigma^2}{SST_j (1 - R_j^2)},$$

where $SST_j = \sum_i (x_{ij} - \bar{x}_j)^2$ and $R_j^2$ is the R-squared from the regression of $x_j$ on all of the other independent variables.

### 8 Estimating $\sigma^2$

We obtain an unbiased estimator of $\sigma^2$ in a manner similar to the bivariate case:

$$\hat{\sigma}^2 = \frac{1}{n - k - 1} \sum_i \hat{u}_i^2$$

The denominator $n - k - 1$ equals the number of observations minus the total number of parameters estimated in the model. Using this estimate, we can estimate the variance of each OLS estimator as

$$\widehat{Var}(\hat{\beta}_j) = \frac{\hat{\sigma}^2}{SST_j (1 - R_j^2)}$$

and the standard error as

$$se(\hat{\beta}_j) = \sqrt{\widehat{Var}(\hat{\beta}_j)}.$$

### The Gauss-Markov Theorem

Our OLS estimators are ...
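As a sketch of these formulas in action (on invented data, not from the lecture), the code below computes $\widehat{Var}(\hat{\beta}_1) = \hat{\sigma}^2 / \big(SST_1 (1 - R_1^2)\big)$ from its pieces and checks that it matches the corresponding diagonal entry of the usual matrix form $\hat{\sigma}^2 (X'X)^{-1}$; the two are algebraically identical.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 500, 2

# Invented regressors; x2 is correlated with x1 so that R^2_1 > 0.
x1 = rng.normal(size=n)
x2 = 0.4 * x1 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

# Fit the full model and estimate sigma^2 with the n - k - 1 denominator.
X = np.column_stack([np.ones(n), x1, x2])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
sigma2_hat = resid @ resid / (n - k - 1)

# SST_1 and R^2_1 from regressing x1 on the other regressors (constant, x2).
Z = np.column_stack([np.ones(n), x2])
g = np.linalg.lstsq(Z, x1, rcond=None)[0]
r1 = x1 - Z @ g
SST1 = np.sum((x1 - x1.mean()) ** 2)
R2_1 = 1.0 - (r1 @ r1) / SST1

var_b1_formula = sigma2_hat / (SST1 * (1.0 - R2_1))
var_b1_matrix = sigma2_hat * np.linalg.inv(X.T @ X)[1, 1]
se_b1 = np.sqrt(var_b1_formula)

print(var_b1_formula, var_b1_matrix)  # identical up to floating-point error
```

The denominator $SST_1 (1 - R_1^2)$ shows why highly correlated regressors inflate variance: as $R_1^2 \to 1$, the denominator shrinks and $\widehat{Var}(\hat{\beta}_1)$ blows up.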
