
MASON PSYC 611 - Lecture 9: Multiple Regression and Correlation


PSYC 611, FALL 2011
Lecture 9: Multiple Regression and Correlation (cont.)
Lecture Date: 11/1/2011

Contents

0.1 Preliminary Questions
1 Part I: Multiple Regression and Correlation (MRC) (75 min; 5 min break)
1.1 Regression Assumptions - Redux
1.2 Brief Review of the Reading
1.3 The purpose of multiple regression
1.4 Parameter estimation in multiple regression
1.5 Hypothesis testing in multiple regression
1.6 Complexities of MRC
1.6.1 Standardized (β) versus unstandardized parameters (b)
1.6.2 Multicollinearity
1.6.3 Partitioning variance
1.6.4 Mixed variables
1.7 Regression in Practice: An Example
1.7.1 Residual Plots
2 Part II: More Regression Details (50 minutes; 10 minute break)
2.1 Shared variance
2.2 Sums of Squares (SS) Types
2.2.1 Type I: Hierarchical partitioning - ordered or sequential
2.2.2 Type II: Partial hierarchical - non-ordered but hierarchical
2.2.3 Type III: Simultaneous - non-ordered
2.2.4 Others
3 Part III: Advanced Material (10 minutes or time permitting)

0.1 Preliminary Questions

• Have you read all the assigned reading for today?
• Do you have any lingering questions about last week's lecture?

1 Part I: Multiple Regression and Correlation (MRC) (75 min; 5 min break)

1.1 Regression Assumptions - Redux

1. The relationship between X and Y is linear (i.e., rectilinear).

[Figure: Anscombe's four regression data sets]

2. The predictor is sensible for the outcome (no errors of commission).

[Figure: scatterplot of x versus y]

3. The prediction of Y does not require predictors other than X (no errors of omission).

This will be more relevant when we start to consider other predictors. For now, be concerned with selecting the best predictor.

4. No measurement error.

Measurement error is a complex concept for many of us to grasp. We will cover this material in greater depth when we discuss psychometrics in two weeks. For now, consider the basic psychometric equation:

Y_obs = Y_true + error

Error - from the standpoint of psychometrics - is random. Consider a dart board for a clear illustration of measurement error. If every throw toward the "bulls eye" were off by some random amount and in some random direction, then the throws would conform to random error of measurement. If, however, every throw were off by the same amount and in roughly the same direction (e.g., one inch below the target on every throw), then that would constitute a bias. Multiple regression assumes that there is no measurement error, although this assumption is not taken seriously by anyone who uses MRC. We all acknowledge that measurement contains random error or noise.

5. Mean error (residual) equals zero.

For this to make sense, you need to have a grasp of what takes place in regression. Recall your lessons on ANOVA: predictions were made at the group level, and every person within a group was expected to be represented by the group mean.
Any deviations from the group mean by an individual were pooled into the MS_within or "noise" term. In MRC, each individual gets a ŷ and an ǫ (error) term. If we compute the arithmetic mean of all ǫ values in our data set, we ought to get a value approximately equal to zero. Why is this so important? If the mean residual is not zero, then the prediction will likely drift, overshooting or undershooting the observed value more often than we can tolerate.

Σ ǫ_i = 0

6. The variance is constant for all values of X (homoskedasticity).

[Figure: scatterplot of x versus y]

7. Errors are random and uncorrelated (no autocorrelation or serial correlation).

8. The predictor (x) is uncorrelated with the error term.

9. Errors are normally distributed.

[Figure: histogram of residual values]

1.2 Brief Review of the Reading

MRC revolves around measures of association - particularly the correlation. To fully understand MRC, you need to know the correlation coefficient as if it were second nature to you. I assigned the Rosenthal and Rosnow chapter (Chapter 11) because they not only covered the correlation coefficient but also provided alternative measures of association. Furthermore, they provided a somewhat different perspective from what I have already discussed in class. Below is a brief review of …
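The dart-board analogy for assumption 4 can be simulated. This is a small illustrative sketch (not from the lecture; it assumes NumPy, and the target value, noise level, and bias are invented numbers): random error averages out across many "throws", while a constant bias shifts the mean away from the true score.

```python
# Toy simulation of the psychometric equation Y_obs = Y_true + error,
# using the dart-board analogy: random error scatters around the target,
# a bias shifts every throw the same way.
import numpy as np

rng = np.random.default_rng(42)
y_true = 100.0                               # the "bulls eye" (true score)
random_error = rng.normal(0.0, 5.0, 10_000)  # random measurement noise
bias = -1.0                                  # e.g., one unit below target, every throw

y_random = y_true + random_error             # unbiased but noisy throws
y_biased = y_true + random_error + bias      # systematically off-target throws

print(abs(y_random.mean() - y_true) < 0.5)          # True: random error averages out
print(abs(y_biased.mean() - (y_true + bias)) < 0.5) # True: the bias persists in the mean
```

The point of the sketch is that averaging many observations cancels random error but never removes bias, which is why bias is the more damaging kind of measurement error.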
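The Σ ǫ_i = 0 property of assumption 5 can be verified directly. Below is a minimal sketch (assuming NumPy; the data are simulated, not from the lecture) that fits a simple least-squares line and checks that the residuals average to zero.

```python
# Fit a simple OLS line and verify that the mean residual equals zero.
import numpy as np

rng = np.random.default_rng(611)            # seed chosen for reproducibility
x = np.linspace(0, 10, 50)
y = 2.0 + 1.5 * x + rng.normal(0, 1, 50)    # linear signal plus random error

b1, b0 = np.polyfit(x, y, deg=1)            # least-squares slope and intercept
residuals = y - (b0 + b1 * x)               # e_i = y_i - yhat_i

print(abs(residuals.mean()) < 1e-8)         # True: OLS forces the mean residual to ~0
```

Note that this is an algebraic consequence of including an intercept in the model, which is why a nonzero mean residual in practice signals a computational or specification problem rather than bad luck.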
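Since section 1.2 stresses knowing the correlation coefficient "as if it were second nature," the following sketch (my own illustration; the five data points are made up) computes Pearson's r from its definition and checks a fact that matters later in section 1.6.1: with a single predictor, the standardized regression slope equals r exactly.

```python
# Pearson's r from its definition, and its identity with the
# standardized slope in simple (one-predictor) regression.
import numpy as np

x = np.array([2.0, 4.0, 5.0, 7.0, 9.0])
y = np.array([1.0, 3.0, 4.0, 4.0, 8.0])

dx, dy = x - x.mean(), y - y.mean()

# r = covariance(x, y) / (sd_x * sd_y), in sum-of-products form
r = np.sum(dx * dy) / np.sqrt(np.sum(dx**2) * np.sum(dy**2))

b = np.sum(dx * dy) / np.sum(dx**2)  # unstandardized slope
b_std = b * x.std() / y.std()        # standardized slope (beta)

print(np.isclose(r, b_std))          # True: beta = r when there is one predictor
```

This identity breaks down once a second predictor enters the model, which is exactly where the β-versus-b distinction in section 1.6.1 becomes important.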

