STAT 333 Discussion 7
Multiple Linear Regression

> set.seed(123)
> library(MASS)
> Sigma <- matrix(c(1, 0.7, 0.7, 1), 2, 2)
> X <- mvrnorm(n = 20, rep(0, 2), Sigma)
> x1 <- X[, 1]
> x2 <- X[, 2]
> print(cor(x1, x2))
[1] 0.773609
> y <- 3 * x1 + 2 + rnorm(20, 0, 1)
> # fit the linear model of y versus x1 and x2
> out1 <- lm(y ~ x1 + x2)
> summary(out1)

Call:
lm(formula = y ~ x1 + x2)

Residuals:
    Min      1Q  Median      3Q     Max
-1.6749 -0.6103 -0.1499  0.3488  2.1223

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)  2.12797    0.22643   9.398 3.82e-08 ***
x1           2.80783    0.36955   7.598 7.31e-07 ***
x2           0.06707    0.39175   0.171    0.866
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1 on 17 degrees of freedom
Multiple R-squared:  0.8975, Adjusted R-squared:  0.8854
F-statistic: 74.43 on 2 and 17 DF,  p-value: 3.9e-09

> # we find that x2 is not significant and delete it; refit the model
> out2 <- lm(y ~ x1)
> summary(out2)

Call:
lm(formula = y ~ x1)

Residuals:
    Min      1Q  Median      3Q     Max
-1.6418 -0.5799 -0.1302  0.3535  2.0907

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)   2.1280     0.2202   9.662 1.51e-08 ***
x1            2.8568     0.2278  12.542 2.47e-10 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 0.9729 on 18 degrees of freedom
Multiple R-squared:  0.8973, Adjusted R-squared:  0.8916
F-statistic: 157.3 on 1 and 18 DF,  p-value: 2.469e-10

> # generate the plots of y versus x1, y versus x2, and x1 versus x2
> par(mfrow = c(2, 2))
> plot(x1, y)
> plot(x2, y)
> plot(x1, x2)

[Figure: scatterplots of y versus x1, y versus x2, and x1 versus x2.]

Projection matrix in multiple linear regression

1. If $A \in \mathbb{R}^{d \times n}$ is a matrix, $Z \in \mathbb{R}^n$ is a random vector, and $c \in \mathbb{R}^d$ is a constant vector, then
   $E(AZ + c) = A\,E(Z) + c$,
   $\mathrm{Var}(AZ + c) = A\,\mathrm{Var}(Z)\,A^\top$.

2. Model setting: suppose the linear model $Y = X\beta + \varepsilon$, where $Y, \varepsilon \in \mathbb{R}^n$, $X \in \mathbb{R}^{n \times p}$, and $\beta \in \mathbb{R}^p$. Furthermore, we assume that $\varepsilon \sim N(0, \sigma^2 I_n)$. Let $\hat{\beta}$ be the least squares estimator of $\beta$. Try to show the following parts:
   - $E(Y) = X\beta$.
   - $E(\hat{\beta}) = \beta$.
   - What about $E(\hat{Y})$?
   - Define the residual vector $r = Y - \hat{Y}$. What is $E[r]$?
   - What is $\mathrm{Var}(\hat{\beta})$?
   - What is $\mathrm{Var}(r)$?
   - Define by $H_X = X(X^\top X)^{-1}X^\top$ the projection matrix. Show that $H_X H_X = H_X$.
   - Show that $H_X(I_n - H_X) = 0$.
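A quick numerical check of the projection-matrix exercises, reusing out1 and y from the R session above. This is a sketch, not part of the original handout; the names X.mat, H, and r are introduced here, and model.matrix() returns the 20 x 3 design matrix including the intercept column:

> X.mat <- model.matrix(out1)              # design matrix X: columns (Intercept), x1, x2
> H <- X.mat %*% solve(t(X.mat) %*% X.mat) %*% t(X.mat)  # H_X = X (X^T X)^{-1} X^T
> max(abs(H %*% H - H))                    # H_X H_X = H_X: difference should be ~ 0
> max(abs(H %*% (diag(20) - H)))           # H_X (I_n - H_X) = 0: also ~ 0
> r <- y - H %*% y                         # residual vector r = (I_n - H_X) Y
> max(abs(r - residuals(out1)))            # agrees with lm's residuals up to rounding

The same comparison works for $\mathrm{Var}(\hat{\beta}) = \sigma^2 (X^\top X)^{-1}$: with $\sigma^2$ replaced by the squared residual standard error, the formula should reproduce vcov(out1).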

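The expectation and variance rules in item 1 can also be checked by simulation. A minimal sketch in base R; the values of A and cc are arbitrary choices made here, and with $Z \sim N(0, I_3)$ the targets simplify to $A\,E(Z) + c = c$ and $A\,\mathrm{Var}(Z)\,A^\top = AA^\top$:

> set.seed(1)
> A  <- matrix(rnorm(6), 2, 3)             # hypothetical A in R^{2 x 3} (d = 2, n = 3)
> cc <- c(1, -1)                           # hypothetical constant vector c in R^2
> Z  <- matrix(rnorm(3 * 1e5), ncol = 3)   # 10^5 independent draws of Z ~ N(0, I_3)
> W  <- t(A %*% t(Z) + cc)                 # each row of W is one draw of A Z + c
> colMeans(W)                              # ~ A E(Z) + c = c
> cov(W)                                   # ~ A Var(Z) A^T
> A %*% t(A)                               # exact target A A^T, for comparison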

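One last remark on the R session at the top: x1 and x2 are strongly correlated (0.773609), so the two predictors carry overlapping information. This inflates the coefficient standard errors in out1, which is why dropping x2 shrinks the standard error of x1 from 0.36955 to 0.2278. The variance inflation factor (VIF) quantifies this effect; a short sketch in base R (r2 is a name introduced here; with two predictors the VIF is the same for x1 and x2):

> r2 <- summary(lm(x2 ~ x1))$r.squared   # fraction of x2's variance explained by x1
> 1 / (1 - r2)                           # VIF; values well above 1 signal collinearity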