EXST7015: Statistical Techniques II                                        Geaghan
Simple Linear Regression with equations & calculations    (03d-Slr-TreeCalc&Equations.doc)

Freund & Wilson (1997): Prediction of weight of wood from trees (Table 8.24)

Obs   Dbh  Weight  Dbh*Dbh     Wt*Wt   Dbh*Wt  Predicted  Residual
  1   5.7     174    32.49     30276    991.8     288.42   -114.42
  2   8.1     745    65.61    555025   6034.5     716.97     28.03
  3   8.3     814    68.89    662596   6756.2     752.68     61.32
  4   7.0     408    49.00    166464   2856.0     520.55   -112.55
  5   6.2     226    38.44     51076   1401.2     377.70   -151.70
  6  11.4    1675   129.96   2805625  19095.0    1306.23    368.77
  7  11.6    1491   134.56   2223081  17295.6    1341.94    149.06
  8   4.5     121    20.25     14641    544.5      74.14     46.86
  9   3.5      58    12.25      3364    203.0    -104.42    162.42
 10   6.2     278    38.44     77284   1723.6     377.70    -99.70
 11   5.7     220    32.49     48400   1254.0     288.42    -68.42
 12   6.0     342    36.00    116964   2052.0     341.99      0.01
 13   5.6     209    31.36     43681   1170.4     270.56    -61.56
 14   4.0      84    16.00      7056    336.0     -15.14     99.14
 15   6.7     313    44.89     97969   2097.1     466.98   -153.98
 16   4.0      60    16.00      3600    240.0     -15.14     75.14
 17  12.1    1692   146.41   2862864  20473.2    1431.22    260.78
 18   4.5      74    20.25      5476    333.0      74.14     -0.14
 19   8.6     515    73.96    265225   4429.0     806.25   -291.25
 20   9.3     766    86.49    586756   7123.8     931.25   -165.25
 21   6.5     345    42.25    119025   2242.5     431.27    -86.27
 22   5.6     210    31.36     44100   1176.0     270.56    -60.56
 23   4.3     100    18.49     10000    430.0      38.43     61.57
 24   4.5     122    20.25     14884    549.0      74.14     47.86
 25   7.7     539    59.29    290521   4150.3     645.54   -106.54
 26   8.8     815    77.44    664225   7172.0     841.96    -26.96
 27   5.0     194    25.00     37636    970.0     163.42     30.58
 28   5.4     280    29.16     78400   1512.0     234.85     45.15
 29   6.0     296    36.00     87616   1776.0     341.99    -45.99
 30   7.4     462    54.76    213444   3418.8     591.98   -129.98
 31   5.6     200    31.36     40000   1120.0     270.56    -70.56
 32   5.5     229    30.25     52441   1259.5     252.70    -23.70
 33   4.3     125    18.49     15625    537.5      38.43     86.57
 34   4.2      84    17.64      7056    352.8      20.57     63.43
 35   3.7      70    13.69      4900    259.0     -68.71    138.71
 36   6.1     224    37.21     50176   1366.4     359.84   -135.84
 37   3.9      99    15.21      9801    386.1     -33.00    132.00
 38   5.2     200    27.04     40000   1040.0     199.14      0.86
 39   5.6     214    31.36     45796   1198.4     270.56    -56.56
 40   7.8     712    60.84    506944   5553.6     663.40     48.60
 41   6.1     297    37.21     88209   1811.7     359.84    -62.84
 42   6.1     238    37.21     56644   1451.8     359.84   -121.84
 43   4.0      89    16.00      7921    356.0     -15.14    104.14
 44   4.0      76    16.00      5776    304.0     -15.14     91.14
 45   8.0     614    64.00    376996   4912.0     699.11    -85.11
 46   5.2     194    27.04     37636   1008.8     199.14     -5.14
 47   3.7      66    13.69      4356    244.2     -68.71    134.71
Sum 289.2   17359  1981.98  13537551 142968.3              Sum = 0
Mean 6.15  369.34    42.17    288033   3041.9     SS = 670190.732
n      47      47       47        47       47
(The residuals sum to 0; their sum of squares, 670190.732, is the SSError obtained below.)

Intermediate Calculations
Sum X  = 289.2            Sum Y  = 17359
Sum X2 = 1981.98          Sum Y2 = 13537551
Mean X = 6.153191489      Mean Y = 369.3404255
Sum XY = 142968.3         n = 47

Correction factors and corrected values (sums of squares and cross-products)
CF for X  = Cxx = 1779.502979     Corrected SS X  = Sxx = 202.4770213
CF for Y  = Cyy = 6411380.447     Corrected SS Y  = Syy = 7126170.553
CF for XY = Cxy = 106813.2511     Corrected SS XY = Sxy = 36155.04894

Model Parameter Estimates
Slope     = b1 = Sxy / Sxx = 36155.04894 / 202.4770213 = 178.5637141
Intercept = b0 = Mean Y - b1 * Mean X = 369.3404255 - 178.5637141 * 6.153191489 = -729.3963003

Regression Line
Yi = b0 + b1*Xi + ei
Yi = -729.3963003 + 178.5637141*Xi + ei

ANOVA Table
SSTotal      = Syy = 7126170.553
SSRegression = Sxy^2 / Sxx = 36155.04894^2 / 202.4770213 = 6455979.821
SSError      = SSTotal - SSRegression = 7126170.553 - 6455979.821 = 670190.7322

Source        df            SS            MS             F
Regression     1   6455979.821   6455979.821   433.4871821
Error         45   670190.7322   14893.12738
Total         46   7126170.553
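The slope, intercept, and ANOVA partition above can be reproduced directly from the column totals. The following Python sketch is not part of the original handout; it simply redoes the arithmetic under the values reported above, and the variable names are illustrative only.

```python
# Minimal sketch (not from the handout): simple linear regression
# calculations from the column totals of the tree-weight data.

n = 47
sum_x, sum_y = 289.2, 17359.0
sum_x2, sum_y2 = 1981.98, 13537551.0
sum_xy = 142968.3

mean_x, mean_y = sum_x / n, sum_y / n

# Corrected sums of squares and cross-products
sxx = sum_x2 - sum_x**2 / n          # ~202.477
syy = sum_y2 - sum_y**2 / n          # ~7126170.553
sxy = sum_xy - sum_x * sum_y / n     # ~36155.049

# Parameter estimates
b1 = sxy / sxx                       # slope, ~178.564
b0 = mean_y - b1 * mean_x            # intercept, ~-729.396

# ANOVA partition
ss_total = syy
ss_reg = sxy**2 / sxx                # ~6455979.82
ss_error = ss_total - ss_reg         # ~670190.73
mse = ss_error / (n - 2)             # ~14893.13
f_stat = (ss_reg / 1) / mse          # ~433.49

print(f"b1 = {b1:.4f}, b0 = {b0:.4f}")
print(f"SSReg = {ss_reg:.3f}, SSE = {ss_error:.3f}, MSE = {mse:.3f}, F = {f_stat:.3f}")
```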
Standard error of b1, where t(0.05/2, 45 df) = 2.014103:
S(b1) = SQRT(MSE / Sxx) = SQRT(14893.12738 / 202.4770213) = 8.576401034

Confidence interval on b1:
P(178.5637 - 2.0141*8.5764 ≤ β1 ≤ 178.5637 + 2.0141*8.5764) = 0.95
P(161.289956 ≤ β1 ≤ 195.8375) = 0.95

Testing b1 against a specified value: H0: β1 = 200 versus H1: β1 ≠ 200
t = (b1 - β1 under H0) / S(b1) = (178.5637141 - 200) / 8.576401034 = -2.49945
Note that t^2 = F = 6.247251; in SAS this test would be carried out as an F test.

The variance of a linear combination is given by the sum of the variances plus twice the covariances. For example, for A = aX + bY + cZ,
Var(A) = a^2*σX^2 + b^2*σY^2 + c^2*σZ^2 + 2*(ab*σXY + ac*σXZ + bc*σYZ),
where the covariances are equal to zero if the variables are independent.

For the linear combination Ŷi = b0 + b1*Xi, the standard error of Ŷi is as follows.

Standard error of the regression line (Ŷi):
S(Ŷi) = SQRT(MSE * (1/n + (Xi - Mean X)^2 / Sxx))

This calculation DOES NOT assume that the regression coefficients are independent; the term 1/n + (Xi - Mean X)^2/Sxx already combines Var(b0), Xi^2*Var(b1), and 2*Xi*Cov(b0, b1). However, for the variance of individual points the linear combination is Yi = b0 + b1*Xi + ei = Ŷi + ei. For this linear combination the predicted value and the residual are assumed independent (i.e., Ŷi is independent of ei).

Standard error of an individual observation (Yi):
S(Yi) = SQRT(MSE*(1/n + (Xi - Mean X)^2/Sxx) + MSE) = SQRT(MSE * (1 + 1/n + (Xi - Mean X)^2/Sxx))

The standard error of b0 is the same as the standard error of the regression line where Xi = 0:
S(b0) = SQRT(14893.12738 * (0.021276596 + (0 - 6.153191489)^2 / 202.4770213)) = 55.69366336
(0.021276596 = 1/n = 1/47)

Confidence interval on b0, where b0 = -729.3963003 and t(0.05/2, 45 df) = 2.014103:
P(-729.3963 - 2.0141*55.6937 ≤ β0 ≤ -729.3963 + 2.0141*55.6937) = 0.95
P(-841.5690916 ≤ β0 ≤ -617.223509) = 0.95

Estimate and standard error of an individual observation (e.g., the weight of wood for a ten-inch-diameter tree):
Ŷ = -729.3963003 + 178.5637141*X = -729.3963003 + 178.5637141*10 = 1056.240841
S(Y at X=10) = SQRT(14893.1274 * (1 + 0.02128 + (10 - 6.1532)^2 / 202.4770)) = 127.6654
P(1056.2408 - 2.0141*127.6654 ≤ Y(x=10) ≤ 1056.2408 + 2.0141*127.6654) = 0.95
P(799.1094964 ≤ Y(x=10) ≤ 1313.372185) = 0.95

Coefficient of determination and correlation
R^2 = SSRegression / SSTotal = 0.905953594, or 90.59535936%
r = SQRT(R^2) = 0.9518 (positive, since the slope is positive)
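As a further check, the standard errors and 95% intervals on this page can be reproduced with the quantities from the sketch above. This continuation is again not part of the original handout; the critical value t(0.05/2, 45 df) = 2.014103 is taken from the handout rather than recomputed.

```python
import math

# Quantities carried over from the handout / previous sketch
n = 47
mean_x = 6.153191489
sxx = 202.4770213
mse = 14893.12738
b0, b1 = -729.3963003, 178.5637141
t_crit = 2.014103              # t(0.05/2, 45 df), as given in the handout

# Standard error and 95% confidence interval for the slope b1
se_b1 = math.sqrt(mse / sxx)                        # ~8.5764
ci_b1 = (b1 - t_crit * se_b1, b1 + t_crit * se_b1)  # ~(161.29, 195.84)

# t test of H0: beta1 = 200
t_stat = (b1 - 200) / se_b1                         # ~-2.4995; t^2 ~ 6.247

# Standard error of b0 = SE of the regression line at X = 0
se_b0 = math.sqrt(mse * (1/n + (0 - mean_x)**2 / sxx))  # ~55.694
ci_b0 = (b0 - t_crit * se_b0, b0 + t_crit * se_b0)      # ~(-841.57, -617.22)

# Prediction for an individual observation at X = 10 inches
x_new = 10.0
y_hat = b0 + b1 * x_new                                          # ~1056.24
se_ind = math.sqrt(mse * (1 + 1/n + (x_new - mean_x)**2 / sxx))  # ~127.67
pi_new = (y_hat - t_crit * se_ind, y_hat + t_crit * se_ind)      # ~(799.11, 1313.37)

print(f"SE(b1) = {se_b1:.4f}, 95% CI for beta1 = {ci_b1}")
print(f"t for H0: beta1 = 200: {t_stat:.4f}")
print(f"SE(b0) = {se_b0:.4f}, 95% CI for beta0 = {ci_b0}")
print(f"Y-hat(10) = {y_hat:.2f}, SE = {se_ind:.4f}, 95% PI = {pi_new}")
```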

