
## Motivation


Lecture 12

- Lecture number: 12
- Pages: 2
- Type: Lecture Note
- School: Cornell University
- Course: Econ 3120 - Applied Econometrics
- Edition: 1


Lecture 11

Outline of Current Lecture:

- I. Goodness of Fit
- II. Unbiasedness

Current Lecture:

- III. Motivation

Multiple regression allows us to account for more than one factor in explaining our dependent variable $y$. Consider the familiar example of the relationship between schooling and wages. Suppose we also have data on each individual's SAT score from high school. We might be interested in estimating a relationship of the form:

$$\log(wage) = \beta_0 + \beta_1 educ + \beta_2 SAT + u$$

Or, to take a simple model from macroeconomics, suppose we want to estimate the determinants of a country's growth rate. We may model the growth rate of a country (from 1980 to 2000) as a function of per capita income in 1980 and income inequality (as measured by the Gini coefficient):

$$growthrate = \beta_0 + \beta_1 inc80 + \beta_2 Gini + u$$

A multivariate model with two independent variables, $x_1$ and $x_2$, takes the form:

$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + u$$

In this case, $\beta_1$ represents the change in $y$ for a one-unit change in $x_1$, holding all other factors ($x_2$ and $u$) fixed. This is the partial derivative of $y$ with respect to $x_1$, holding $x_2$ and $u$ fixed. Our $x$'s don't have to be separate variables; they can be functions of the same variable. For example, suppose we are studying the relationship between household consumption and income, and we model the relationship as follows:

$$cons = \beta_0 + \beta_1 inc + \beta_2 inc^2 + u$$

In this case, the effect of income on consumption depends on both $\beta_1$ and $\beta_2$:

$$\frac{\partial\, cons}{\partial\, inc}\bigg|_{u\ \text{constant}} = \beta_1 + 2\beta_2\, inc$$

The general form of the multivariate model with $k$ independent variables is

$$y_i = \beta_0 + \beta_1 x_{1i} + \beta_2 x_{2i} + \dots + \beta_k x_{ki} + u_i \quad (1)$$

(Note that I use the notation $x_{ji}$ for observation $i$ and variable $x_j$, while Wooldridge uses $x_{ij}$.) Analogous to the bivariate model, the key assumption is the independence of the error term and the regressors (independent variables):

$$E(u \mid x_1, x_2, \dots, x_k) = 0$$

This implies that $u$ must be independent of (and uncorrelated with) all of the explanatory variables $x_j$.
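The quadratic consumption model above can be illustrated with a short numpy sketch. The data here is simulated and all coefficient values are made up for illustration; the point is that after fitting, the estimated marginal effect of income is $\hat\beta_1 + 2\hat\beta_2\, inc$, which varies with the income level.

```python
import numpy as np

# Simulated data for the model cons = b0 + b1*inc + b2*inc^2 + u.
# All parameter values below are illustrative, not from the lecture.
rng = np.random.default_rng(0)
n = 1000
inc = rng.uniform(10, 100, n)        # income (hypothetical units)
u = rng.normal(0, 5, n)              # error term, independent of inc
b0, b1, b2 = 2.0, 0.8, -0.002
cons = b0 + b1 * inc + b2 * inc**2 + u

# Regressors are functions of the same variable: a constant, inc, and inc^2
X = np.column_stack([np.ones(n), inc, inc**2])
beta_hat, *_ = np.linalg.lstsq(X, cons, rcond=None)

# Marginal effect of income at inc = 50: b1_hat + 2 * b2_hat * 50
marginal_effect = beta_hat[1] + 2 * beta_hat[2] * 50.0
print(beta_hat)         # estimates near (2.0, 0.8, -0.002)
print(marginal_effect)  # near 0.8 + 2*(-0.002)*50 = 0.6
```

Note that `inc` and `inc**2` enter as two separate columns of the design matrix, exactly as the lecture describes: two regressors built from one underlying variable.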
If $u$ is correlated with any of these variables, the assumption does not hold and our estimates will be biased (more on this later on).

## Estimating Multivariate Regression Parameters

Estimation of the $\beta$'s in a multivariate model follows a procedure similar to bivariate estimation. We first start with the independence assumption

$$E(u \mid x_1, x_2, \dots, x_k) = 0 \;\Rightarrow\; Cov(x_j, u) = 0$$

and impose the sample analog on our estimates, using equation (1). This implies

$$\frac{1}{n}\sum x_{1i}(y_i - \hat\beta_0 - \hat\beta_1 x_{1i} - \dots - \hat\beta_k x_{ki}) = 0$$

$$\frac{1}{n}\sum x_{2i}(y_i - \hat\beta_0 - \hat\beta_1 x_{1i} - \dots - \hat\beta_k x_{ki}) = 0$$

$$\vdots$$

$$\frac{1}{n}\sum x_{ki}(y_i - \hat\beta_0 - \hat\beta_1 x_{1i} - \dots - \hat\beta_k x_{ki}) = 0$$

Note that these equations are the same as the first-order conditions from the minimization of the sum of squared residuals:

$$\min_{\hat\beta_0, \dots, \hat\beta_k} \sum \hat u_i^2 = \min_{\hat\beta_0, \dots, \hat\beta_k} \sum (y_i - \hat\beta_0 - \hat\beta_1 x_{1i} - \dots - \hat\beta_k x_{ki})^2$$

The actual computation of these estimates is very involved and ...
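The system of first-order conditions above can be checked numerically. The sketch below (simulated data, illustrative names) solves the normal equations $X'X\hat\beta = X'y$ and then verifies that the residuals satisfy the sample moment conditions $\frac{1}{n}\sum x_{ji}\hat u_i = 0$ for every regressor.

```python
import numpy as np

# Simulated data; the coefficient values are illustrative only.
rng = np.random.default_rng(1)
n, k = 500, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])  # constant + k regressors
beta_true = np.array([1.0, 0.5, -2.0, 3.0])
y = X @ beta_true + rng.normal(size=n)

# Solve the normal equations (X'X) beta_hat = X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# First-order conditions: (1/n) * sum of x_ji * residual_i, one per regressor.
# These are the sample analogs of Cov(x_j, u) = 0 (plus E(u) = 0 for the constant).
resid = y - X @ beta_hat
foc = X.T @ resid / n
print(foc)  # each entry is numerically zero
```

The first entry of `foc` corresponds to the constant (the sample mean of the residuals is zero); the remaining entries are the conditions for $x_1, \dots, x_k$ written out in the lecture.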
