MASON PSYC 612 - Lecture 8: GLM - Multiple Regression and Correlation


PSYC 612, FALL 2010
Lecture 8: GLM - Multiple Regression and Correlation
Lecture Date: 10/20/2010

Contents
0.1 Preliminary Questions
1 Part I: Introducing the General Linear Model (GLM) and Multiple Regression and Correlation (MRC) (75 min; 5 min break)
1.1 Ladies and gentlemen, the GLM
2 Relevant background material
2.1 Purpose
2.2 Objectives
2.3 The axes, the point(s), and the line
2.4 The Pearson correlation (re-introduction)
2.5 The line
2.6 MRC: Parameter Estimation
2.6.1 An introduction to Least Squares
2.6.2 Computing Bivariate Regression Parameters by Hand
2.6.3 Calculate F and R² from the regression
2.6.4 MRC: Hypothesis Testing
2.7 Bivariate Regression Assumptions
3 Part II: Bivariate Regression Details (50 minutes; 10 minute break)
3.1 Important statistical concepts
3.2 Observed, expected and residual
3.3 Explaining Variance
4 Part III: Advanced Material (10 minutes or time permitting)
4.1 Differentiating ANOVA and MRC
4.2 Platonic Essentialism vs. Aristotelian Nominalism
4.3 Classes (Factors) vs. Predictors
4.4 The General Linear Model

0.1 Preliminary Questions
- Have you read all the assigned reading for today?
- Have you ever run a multiple regression?
- Are you familiar with the purpose of MRC?

1 Part I: Introducing the General Linear Model (GLM) and Multiple Regression and Correlation (MRC) (75 min; 5 min break)

1.1 Ladies and gentlemen, the GLM
The two most prominent methods you will learn in your graduate statistics courses are ANOVA and Multiple Regression and Correlation (MRC). Both procedures come from the same underlying model: the General Linear Model. The equation of the model looks like this:

Y = b0 + b * X + e

Where,

Y
- is the dependent variable
- is predicted by X in the general equation above
- is imperfectly predicted (see the explanation of e below)

b0
- represents the intercept
- is the "best guess" when no predictor is present
- often serves as a representative of other important predictors
- is only present when the intercept is meaningful

X
- is a matrix of predictors
- may represent a single variable (i.e., a vector)
- or multiple variables (i.e., a matrix)
- and can be easily distinguished from other basic predictors by the boldface and UPPERCASE type

b
- is the weight assigned to the predictors
- when b0 is present, b is always represented as a raw weight (i.e., unstandardized)
- when b0 is absent, b is represented as a standardized weight
- represents the weight that provides the best fit between the observed and the predicted values of Y

e
- is the residual
- expressed in terms of Y units
- recall that I said this is often termed "error," but we shall broaden our use of the concept and call it "residual"
- the term often does not appear in the equation, but I include it so we fully appreciate all the aspects of statistical modeling

The
reason why it is called the "GLM" is because it
1. generalizes to multiple, more specific models
2. restricts the model to an additive "+" and thus can be considered linear
3. and is model based

Throughout the semester, I will use this general equation and expand it to fit each model we discuss. The equation above is based upon the general equation of a line. The general GLM equation will be the basis for all model comparisons and model introductions - it might be desirable for you to learn it (hint...hint...nudge...nudge...you know what I mean?...you know what I mean?).

2 Relevant background material

2.1 Purpose:
To introduce you to basic bivariate regression through numerical and graphical examples

2.2 Objectives:
1. Introduce the basics of regression beginning with a line
2. Expand the line concept to data fitting
3. Explain the nature of regression through least squares
4. Discuss the assumptions

2.3 The axes, the point(s), and the line
[Figures: X-Y axes running from 0 to 10 — first an empty plane, then "One Point" plotting (3,4), then "Two Points" plotting (3,4) and (7,6). The preview is truncated here.]
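The two ideas above — the line through the points (3,4) and (7,6) from section 2.3, and the least-squares fit of Y = b0 + b*X + e — can be sketched numerically. A minimal Python illustration (the five-point data set and all variable names below are made up for illustration, not taken from the lecture):

```python
# Line through the two points from section 2.3: (3, 4) and (7, 6).
x1, y1 = 3, 4
x2, y2 = 7, 6
b1 = (y2 - y1) / (x2 - x1)   # slope: rise over run = 2/4 = 0.5
b0 = y1 - b1 * x1            # intercept: 4 - 0.5*3 = 2.5
print(f"Line through two points: Y = {b0} + {b1}*X")

# Bivariate least-squares fit of Y = b0 + b1*X + e on a small
# made-up data set, using the textbook formulas:
#   b1 = sum((x - mean_x)(y - mean_y)) / sum((x - mean_x)^2)
#   b0 = mean_y - b1 * mean_x
xs = [1, 2, 3, 4, 5]
ys = [2.1, 2.9, 4.2, 4.8, 6.0]
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
sxx = sum((x - mean_x) ** 2 for x in xs)
b1_hat = sxy / sxx
b0_hat = mean_y - b1_hat * mean_x
print(f"Fitted: Y = {b0_hat:.3f} + {b1_hat:.3f}*X")

# e = observed - expected, expressed in Y units; with an intercept
# in the model, the residuals sum to zero (up to rounding).
residuals = [y - (b0_hat + b1_hat * x) for x, y in zip(xs, ys)]
print(f"Sum of residuals: {sum(residuals):.1e}")
```

Note that the least-squares slope formula here is the "by hand" computation previewed in section 2.6.2: a ratio of the cross-product of deviations to the sum of squared X deviations.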

