# One Sided Tests, Confidence Intervals (2 pages)



II. One Sided Tests
III. Confidence Intervals

- Lecture number: 9
- Pages: 2
- Type: Lecture Note
- School: Cornell University
- Course: Econ 3120 - Applied Econometrics
- Edition: 1
Econ 3120, 1st Edition, Lecture 9

Outline of Last Lecture: I. Confidence Intervals and t-Tests

Outline of Current Lecture: II. Regression

Over the next several lectures we'll start by examining two variables. Examples: the relationship between education and wages, firm output and production inputs, state cigarette taxes and cigarette consumption. In these relationships we define the following:

- dependent variable: the variable to be explained (e.g., wages)
- independent variable: the variable that gives us information on the dependent variable (e.g., education); also called an explanatory variable

We typically denote a dependent variable as y and an independent variable as x. To describe the relationship between the variables, we define some function y = f(x). For our purposes we'll assume that this relationship is linear:

y = β₀ + β₁x + u

where β₀ and β₁ are parameters which describe this relationship and u is an error term. Clearly x is not the only factor that affects y; the variable u represents all of the other factors in determining y. For example, in the schooling-earnings relationship, all of the other factors that influence wages (e.g., experience, ability, location, etc.) go into u.

## 2 Assumptions

In order to analyze the relationship between x and y and to derive estimates for β₀ and β₁, we need to make a few assumptions. We first assume that the unconditional expectation of u is 0:

E(u) = 0

We can always make this assumption because we can simply re-scale the intercept β₀ such that it is true. A key assumption in regression modeling is that x and u are independent, that is,

E(u|x) = E(u) = 0

We call this the zero conditional mean assumption. Under this second assumption we can write the conditional expectation function (CEF) of y as

E(y|x) = E(β₀ + β₁x + u | x) = β₀ + β₁x + E(u|x) = β₀ + β₁x

From this equation we have the primary interpretation of the parameter β₁: it represents the slope of the CEF with respect to x. In other words, it represents the change in the expected value of y, conditional on x, with respect to x.

## 3 Estimating β₀ and β₁ Using Ordinary Least …
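The preview cuts off at the estimation section, but the heading points to ordinary least squares. As a minimal sketch (not from the lecture itself), the bivariate OLS formulas β̂₁ = Cov(x, y)/Var(x) and β̂₀ = ȳ − β̂₁x̄ can be checked on simulated data that satisfies the zero conditional mean assumption by construction. The coefficient values and variable interpretations below are illustrative assumptions, not figures from the course.

```python
import numpy as np

# Simulate data from y = b0 + b1*x + u with E[u|x] = 0.
# The true coefficients (2.0 and 0.5) are made up for illustration.
rng = np.random.default_rng(0)

n = 10_000
x = rng.normal(12, 2, n)   # independent variable, e.g. years of schooling
u = rng.normal(0, 1, n)    # error term, independent of x by construction
y = 2.0 + 0.5 * x + u      # dependent variable, e.g. (log) wage

# Bivariate OLS estimates:
#   b1_hat = Cov(x, y) / Var(x)
#   b0_hat = ybar - b1_hat * xbar
b1_hat = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0_hat = y.mean() - b1_hat * x.mean()

print(b0_hat, b1_hat)  # close to the true values 2.0 and 0.5
```

With a large sample and the zero conditional mean assumption holding exactly in the simulation, the estimates land near the true parameters, which is the consistency property the next section of the notes presumably develops.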
