Boyang Dai
STAT 6120
HW #3

#7.22
The significance of a predictor depends on the other variables in the model. When we add four additional variables, if they are correlated with the three variables already in the model, it is possible for none of the new variables to be significant and for variables that were previously significant to become non-significant. This does not by itself mean the model is poor. (An illustrative simulation is sketched at the end of this write-up.)

#7.24
a)
data = read.table("http://www.stat.lsu.edu/exstweb/statlab/datasets/KNNLData/CH06PR05.txt")
y = data[,1]
x1 = data[,2]
x2 = data[,3]
fo = lm(y ~ x1)
fo

Call:
lm(formula = y ~ x1)

Coefficients:
(Intercept)           x1
     50.775        4.425

b)
The coefficient of X1 when both X1 and X2 are used to predict Y is also 4.425, so the estimated effect of X1 is the same whether or not X2 is in the model. (A verification sketch appears at the end of this write-up.)

c)
fo2 = lm(y ~ x2)
fit = lm(y ~ x1 + x2)
SSR_x1x2 <- deviance(fo2) - deviance(fit)
SSR_x1 <- sum(anova(fo)$`Sum Sq`) - deviance(fo)
print(paste0("SSR(X1|X2) is ", SSR_x1x2, " and SSR(X1) is ", SSR_x1))
[1] "SSR(X1|X2) is 1566.45 and SSR(X1) is 1566.45"
They are equal.

d)
From the correlation matrix, the correlation between X1 and X2 is 0. Since the two predictor variables are uncorrelated, the regression sum of squares for each is the same whether or not we adjust for the presence of the other.

#8.6
a)
data = read.table("http://www.stat.lsu.edu/exstweb/statlab/datasets/KNNLData/CH08PR06.txt")
y = data[,1]
x = data[,2]
lm1 = lm(y ~ x)
xsquare = x^2
lm2 = lm(y ~ xsquare + x)
summary(lm2)
plot(lm2, which = c(1, 2), pch = 16)

From the residuals-vs-fitted and normal Q-Q plots, the quadratic regression function appears to be a good fit here.
summary(lm2)$r.squared
[1] 0.8143372

b)
Hypotheses: H0: β1 = β11 = 0 vs Ha: β1 ≠ 0 or β11 ≠ 0.
F test: F* = MSR/MSE = 52.63; F(0.99; 2, 24) = 5.61359; p-value < 0.0001.
Conclusion: since F* > F(0.99; 2, 24), we reject H0, which implies that there is a regression relationship between Y and X.

c)
The simultaneous prediction intervals take the form Ŷh ± S·s{pred} (Scheffé) or Ŷh ± B·s{pred} (Bonferroni).
Here S² = g·F(1 − α; g, n − 3) = 3·F(0.99; 3, 24) = 14.1542, so S = 3.7622, and B = t(1 − α/6; 24) = 3.25838. Since B < S, the Bonferroni intervals are used (an R sketch appears at the end of this write-up).
For Xh = 10, 15, 20 we obtain the following:
Xh = 10: fitted value Ŷh = 10.52, s²{pred} = 11.3342, s{pred} = 3.3666; the prediction interval is (12.6103, 34.5497).
Xh = 15: fitted value Ŷh = 20.02, s²{pred} = 0.5668, s{pred} = 0.7529; the prediction interval is (21.1268, 26.0332).
Xh = 20: fitted value Ŷh = 23.58, s²{pred} = 6.2241, s{pred} = 2.4948; the prediction interval is (15.4509, 31.7091).

d)
Xh = 10: fitted value Ŷh = 20.02, B = t(1 − α/6; 24) = 3.25838, s²{pred} = 10.5058, s{pred} = 3.2431; the prediction interval is (9.4587, 30.5813).

e)
Hypotheses: H0: β11 = 0 vs Ha: β11 ≠ 0.
F test: F* = [SSR(X²|X)/1] ÷ [SSE(X, X²)/(n − 3)] = 25.4538; F(0.99; 1, 24) = 7.82287; p-value < 0.0001.
Conclusion: since F* > F(0.99; 1, 24), we reject H0, which implies that the quadratic term cannot be dropped. (The corresponding R test is sketched at the end of this write-up.)

f)
Ŷ = −26.3254 + 4.8736x − 0.1184x² = −26.3254 + 4.8736(X − X̄) − 0.1184(X − X̄)² = −132.7134 + 8.61X − 0.1184X²

#8.10
a)
E{Y} = 14 + 7X1 + 5X2 − 4X1X2
If X1 = 1, then E{Y} = 14 + 7 + 5X2 − 4X2 = 21 + X2.
If X1 = 4, then E{Y} = 14 + 28 + 5X2 − 16X2 = 42 − 11X2.
If the effects of X1 and X2 were additive, the two response curves would be parallel lines. From the graph, the two lines have very different slopes, so the effects are not additive. (A plotting sketch appears at the end of this write-up.)
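Sketch for #7.22. This is a minimal illustration using simulated data that is not part of the assignment: a hypothetical new predictor x4 is constructed to be nearly collinear with x1, so x1 can be strongly significant on its own yet lose significance once x4 is added.

# Illustrative simulation for #7.22 (hypothetical data, not the assigned dataset):
# a new predictor that is nearly collinear with x1 inflates the standard errors,
# so both predictors can be individually non-significant in the larger model.
set.seed(420)
n  <- 50
x1 <- rnorm(n)
x4 <- x1 + rnorm(n, sd = 0.1)           # added variable, highly correlated with x1
y  <- 2 * x1 + rnorm(n)
summary(lm(y ~ x1))$coefficients        # x1 alone: small p-value
summary(lm(y ~ x1 + x4))$coefficients   # x1 and x4 together: inflated standard errors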
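Sketch for #7.24 (b) and (d). The following check refits the models from part (a) and prints the correlation matrix of the predictors; it assumes the same CH06PR05 data layout used above (column 1 = Y, columns 2 and 3 = X1, X2).

# Check for #7.24 (b) and (d), using the same data layout as in part (a).
data <- read.table("http://www.stat.lsu.edu/exstweb/statlab/datasets/KNNLData/CH06PR05.txt")
y  <- data[, 1]
x1 <- data[, 2]
x2 <- data[, 3]
coef(lm(y ~ x1))       # coefficient of x1 when x1 is the only predictor
coef(lm(y ~ x1 + x2))  # coefficient of x1 is unchanged when x2 is added
cor(cbind(x1, x2))     # off-diagonal entry 0: the predictors are uncorrelated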
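Sketch for #8.6 (b) and (e). A short outline of how the two F statistics could be obtained in R, assuming the objects y, x, xsquare, and the fitted model lm2 from part (a) are still in the workspace.

# F tests for #8.6 (b) and (e), assuming y, x, xsquare, lm2 from part (a).
summary(lm2)$fstatistic     # overall F* with numerator df 2 and denominator df 24
qf(0.99, 2, 24)             # critical value F(0.99; 2, 24) for part (b)
anova(lm(y ~ x), lm2)       # partial F test of H0: beta11 = 0 for part (e)
qf(0.99, 1, 24)             # critical value F(0.99; 1, 24) for part (e)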
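Sketch for #8.6 (c). One way the simultaneous 99% prediction intervals could be computed from predict(); it assumes lm2 and y from part (a), g = 3 predictions at Xh = 10, 15, 20, and s²{pred} = MSE + s²{Ŷh}.

# Simultaneous 99% prediction intervals for #8.6 (c), assuming lm2 and y from part (a).
g      <- 3                                         # number of simultaneous predictions
alpha  <- 0.01
n      <- length(y)
newdat <- data.frame(x = c(10, 15, 20), xsquare = c(10, 15, 20)^2)
pr     <- predict(lm2, newdata = newdat, se.fit = TRUE)
s_pred <- sqrt(pr$residual.scale^2 + pr$se.fit^2)   # s{pred} at each Xh
S      <- sqrt(g * qf(1 - alpha, g, n - 3))         # Scheffe multiplier
B      <- qt(1 - alpha / (2 * g), n - 3)            # Bonferroni multiplier
m      <- min(S, B)                                 # use the tighter multiplier
cbind(fit = pr$fit, lower = pr$fit - m * s_pred, upper = pr$fit + m * s_pred)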
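Sketch for #8.10 (a). A small plot of the two response lines; the X2 range 0 to 5 is an arbitrary choice for illustration.

# Plot for #8.10 (a): E{Y} against X2 at X1 = 1 and X1 = 4.
# The two lines have slopes 1 and -11, so they are clearly not parallel.
Ey <- function(x1, x2) 14 + 7 * x1 + 5 * x2 - 4 * x1 * x2
x2grid <- seq(0, 5, by = 0.1)                       # illustrative X2 range
plot(x2grid, Ey(1, x2grid), type = "l",
     ylim = range(Ey(1, x2grid), Ey(4, x2grid)),
     xlab = "X2", ylab = "E{Y}")
lines(x2grid, Ey(4, x2grid), lty = 2)
legend("bottomleft", legend = c("X1 = 1", "X1 = 4"), lty = c(1, 2))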

