LSU EXST 7034 - Coefficient of Partial Determination (15 pages)

Coefficient of Partial Determination

School: Louisiana State University
Course: EXST 7034 - Regression Analysis

Coefficient of Partial Determination

Just as R2 provides information about SSR(X1, X2, X3), there are also coefficients of PARTIAL determination. A coefficient of partial determination measures how much variation a variable accounts for out of the variation available to that variable when it enters. This gives a proportional measure of the contribution of each variable after all other variables are in the model, e.g. for

    Yi = b0 + b1 Xi1 + b2 Xi2 + b3 Xi3 + ei

Take X2. What is its coefficient of partial determination?

1) How much did the variable account for after the other variables were in the model? The partial sum of squares: SSR(X2 | X1, X3) = 230.62548.

2) What SS was available to it when it entered the model?

    SSE(X1, X3) = SSE(X1, X2, X3) + SSR(X2 | X1, X3) = 61.443 + 230.62548 = 292.06848

The partial R2 for X2 is then

    r2 = SSR(X2 | X1, X3) / SSE(X1, X3) = 230.62548 / 292.06848 = 0.7896281, or 78.96281%

These calculations are available from SAS PROC REG with the PCORR2 option on the MODEL statement. SAS will also produce a partial correlation of the TYPE I SS.

Output from PROC REG

Parameter Estimates

    Variable    DF   Parameter    Standard    t Value   Pr > |t|
                     Estimate     Error
    Intercept    1   17.84693     2.00188     8.92      <.0001
    X1           1    1.10313     0.32957     3.35      0.0032
    X2           1    0.32152     0.03711     8.66      <.0001
    X3           1    1.28894     0.29848     4.32      0.0003

    Variable    DF   Type I SS    Type II SS    Standardized   Tolerance
                                                Estimate
    INTERCEP     1   37446        244.171679    0.00000000     .
    X1           1   306.732328   34.418508     0.26023468     0.73735836
    X2           1   263.794445   230.625476    0.65915439     0.77010493
    X3           1   57.290222    57.290222     0.30693999     0.88224762

    Variable    DF   Squared Partial   Squared Partial   Squared Semi-partial   Squared Semi-partial
                     Corr Type I       Corr Type II      Corr Type I            Corr Type II
    X1           1   0.44501687        0.35904408        0.44501687             0.04993545
    X2           1   0.68960879        0.78962809        0.38272125             0.33459866
    X3           1   0.48251214        0.48251214        0.08311845             0.08311845
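
As a quick numerical check (not part of the original notes), the Type II squared partial and semi-partial correlations for X2 can be reproduced in a few lines of Python from the sums of squares above; the variable names are mine, and SSTO is obtained as the sum of the Type I SS for X1-X3 plus SSE:

```python
# Reproduce the squared partial and semi-partial correlations (Type II)
# for X2 from the sums of squares in the PROC REG output above.

sse_full = 61.443           # SSE(X1, X2, X3): unexplained after all three
ssr_x2_last = 230.62548     # Type II SS for X2 = SSR(X2 | X1, X3)
ss_total = 306.732328 + 263.794445 + 57.290222 + sse_full   # SSTO ~ 689.26

# SS available to X2 when it entered last: SSE(X1, X3)
sse_x1_x3 = sse_full + ssr_x2_last            # 292.06848

# Squared partial correlation: share of the *available* variation
partial_r2 = ssr_x2_last / sse_x1_x3          # ~ 0.78963

# Squared semi-partial correlation: share of the *total* variation
semipartial_r2 = ssr_x2_last / ss_total       # ~ 0.33460

print(partial_r2, semipartial_r2)
```

Both values match the Squared Partial Corr Type II (0.78962809) and Squared Semi-partial Corr Type II (0.33459866) columns for X2.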

Standardized Regression Coefficients

This technique addresses two aspects of estimating b values.

1) There is some potential difficulty with rounding errors in the calculations, particularly for the X'X matrix calculations. These roundoff errors are aggravated by (1) more variables in the model, (2) multicollinearity, and (3) b values of very different magnitudes. Standardized regression coefficients can help with the last problem.

2) The magnitudes of the regression coefficients cannot be compared directly. Since the regression coefficients have units which vary with the units of X and Y (Y units per X unit), they will vary from study to study. E.g., if different people do the same study and the various investigators take measurements on X in (1) inches, (2) feet, (3) meters, or (4) mm, then the same study will give very different values for b. The same is true if a dependent variable Y is measured in (1) dollars, (2) thousands of dollars, or (3) median-family-income units (multiples of about 18 thousand). As a result of these scaling factors, the regression coefficients have an interpretation in terms of the units of X and Y, but they will differ for different units and must be examined within the context of those units.

Standardized regression coefficients, however, have no units, but their size can be interpreted as a measure of the impact or importance of each variable in the calculation of the predicted value. There are several ways to calculate standardized regression coefficients.

1) The variables can be standardized prior to doing the regression:

    Yi* = (Yi - Ybar) / (sqrt(n-1) sY)        Xik* = (Xik - Xbar_k) / (sqrt(n-1) s_k)

where sY and s_k are the ordinary standard deviations. Regression on these variables gives the standardized regression model

    Yi* = b1* Xi1* + b2* Xi2* + b3* Xi3* + ei*,   where b0* = 0

2) If the matrix calculations are done with the standardized values of X and Y, then the X'X and X'Y matrices are correlation matrices:

                | 1    r12  r13 |               | rY1 |
    X*'X*  =    | r21  1    r23 |     X*'Y* =   | rY2 |
                | r31  r32  1   |               | rY3 |

Note that there is no row for the intercept, so another way to get the standardized regression coefficients is to evaluate the matrix formula B = (X'X)^-1 X'Y using the correlation matrices:

    B* = Rxx^-1 Rxy

3) There is also a relationship between the standardized regression coefficient and the ordinary least squares regression coefficient:

    bk* = bk (s_k / sY)
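
The equivalence of routes (2) and (3) can be checked numerically. The sketch below uses a small made-up data set with two predictors (the data and variable names are illustrative, not from the notes): it solves B* = Rxx^-1 Rxy in closed form for the two-predictor case, fits ordinary least squares from the normal equations, and confirms that bk (s_k / sY) gives the same standardized coefficients.

```python
# Two routes to standardized regression coefficients give the same answer:
#   (a) solve B* = Rxx^-1 Rxy built from correlations
#   (b) rescale ordinary OLS coefficients: bk* = bk * (s_k / sY)
from statistics import mean, stdev

x1 = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]   # made-up predictor 1
x2 = [2.0, 1.0, 4.0, 3.0, 6.0, 5.0]   # made-up predictor 2 (correlated with x1)
y  = [3.1, 3.9, 7.2, 7.8, 11.0, 11.9] # made-up response

def corr(a, b):
    """Pearson correlation via centered cross-products."""
    num = sum((ai - mean(a)) * (bi - mean(b)) for ai, bi in zip(a, b))
    return num / ((len(a) - 1) * stdev(a) * stdev(b))

# (a) correlation-matrix route: solve the 2x2 system [1 r12; r12 1] B* = [ry1; ry2]
r12, ry1, ry2 = corr(x1, x2), corr(y, x1), corr(y, x2)
det = 1 - r12 * r12
b1_star = (ry1 - r12 * ry2) / det
b2_star = (ry2 - r12 * ry1) / det

# (b) ordinary OLS from the centered normal equations, then rescale by s_k / sY
s11 = sum((a - mean(x1)) ** 2 for a in x1)
s22 = sum((a - mean(x2)) ** 2 for a in x2)
s12 = sum((a - mean(x1)) * (b - mean(x2)) for a, b in zip(x1, x2))
s1y = sum((a - mean(x1)) * (b - mean(y)) for a, b in zip(x1, y))
s2y = sum((a - mean(x2)) * (b - mean(y)) for a, b in zip(x2, y))
d = s11 * s22 - s12 ** 2
b1 = (s22 * s1y - s12 * s2y) / d
b2 = (s11 * s2y - s12 * s1y) / d
b1_scaled = b1 * stdev(x1) / stdev(y)
b2_scaled = b2 * stdev(x2) / stdev(y)

print(b1_star, b1_scaled)   # the two routes agree
print(b2_star, b2_scaled)
```

The agreement is an algebraic identity, not a coincidence of this data set: standardizing the variables turns the normal equations into the correlation-matrix system.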

The interpretation of the standardized regression coefficient is as a measure of relative impact on the calculations, or as the relative importance of the variable to the model. The size of the coefficient is no longer influenced by units, and standardized regression coefficients are unitless. The SIGN of the regression coefficient is retained, so negative and positive effects can still be interpreted.

Example: For the mathematician example, the standard deviation of Y is

    sY = sqrt[ (SUM(Y^2) - (SUM Y)^2 / n) / (n-1) ] = sqrt[ (38135.26 - 948^2/24) / 23 ] = 5.47429

and the analogous calculation s_k = sqrt[ (SUM(Xk^2) - (SUM Xk)^2 / n) / (n-1) ] gives s3 = 1.303. Then

    b3* = b3 (s3 / sY) = 1.2889 (1.303 / 5.472) = 0.30694

All values are available from the X'X, X'Y, and Y'Y matrices.

Interpretation:

1) The size of the value (magnitude, regardless of sign) is important. It is an indicator of importance or impact in the calculation of the predicted value. This would generally agree with the observations and evaluations made with Pr > |t|, the Type II SS, and the partial R2, but not always.

2) The SIGN is important, and will match the sign on the ordinary regression coefficient.

Effect of correlation among the X variables

1) If the X variables are uncorrelated, then they will describe a certain variation whether alone or in concert with other variables, and each variable describes the same variation no matter which other variables it is adjusted for. The regression coefficients will also stay the same (they will be stable). There are several ways of creating this type of design.

a) Orthogonal variables result from transformations which extract the attributes of a variable while retaining a 0 correlation with the other variables; orthogonal polynomial multipliers (available from any table) are a good example.

    Orthogonal polynomial multipliers, equally spaced X variable with 5 levels

    X            1    2    3    4    5
    Linear      -2   -1    0    1    2
    Quadratic    2   -1   -2   -1    2
    Cubic       -1    2    0   -2    1
    Quartic      1   -4    6   -4    1

Note that all cross-products sum to zero.

b) Some multivariate analyses, such as PCA, will create orthogonal variables; these can be used as independent variables.

c) Many designed experiments are orthogonal; factorials are a good example, e.g. a 2x2x2 factorial.

    Test     A    B   AB    C   AC   BC  ABC
    abc     -1   -1    1   -1    1    1   -1
    Abc      1   -1   -1   -1   -1    1    1
    aBc     -1    1   -1   -1    1   -1    1
    ABc      1    1    1   -1   -1   -1   -1
    abC     -1   -1    1    1   -1   -1    1
    AbC      1   -1   -1    1    1   -1   -1
    aBC     -1    1   -1    1   -1    1   -1
    ABC      1    1    1    1    1    1    1

What happens to the EXTRA SS? If X1 and X2 are uncorrelated, then SSR(X1) = SSR(X1 | X2) and SSR(X2) = SSR …
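
Both orthogonality claims above are easy to verify by machine. The short Python check below (mine, not from the notes) confirms that every pair of polynomial contrast columns, and every pair of effect columns in a 2x2x2 factorial, has cross-products summing to zero; the row order of the design points does not matter for this.

```python
# Verify that the contrast columns are pairwise orthogonal
# (all cross-products sum to zero), as claimed in the notes.
from itertools import combinations, product

# Orthogonal polynomial multipliers for 5 equally spaced levels
poly = {
    "linear":    [-2, -1,  0,  1,  2],
    "quadratic": [ 2, -1, -2, -1,  2],
    "cubic":     [-1,  2,  0, -2,  1],
    "quartic":   [ 1, -4,  6, -4,  1],
}
for (na, a), (nb, b) in combinations(poly.items(), 2):
    assert sum(x * y for x, y in zip(a, b)) == 0, (na, nb)

# 2x2x2 factorial: main-effect columns are the +/-1 design points,
# interaction columns are their elementwise products
runs = list(product([-1, 1], repeat=3))      # 8 treatment combinations
cols = {"A": [r[0] for r in runs],
        "B": [r[1] for r in runs],
        "C": [r[2] for r in runs]}
cols["AB"]  = [a * b for a, b in zip(cols["A"], cols["B"])]
cols["AC"]  = [a * c for a, c in zip(cols["A"], cols["C"])]
cols["BC"]  = [b * c for b, c in zip(cols["B"], cols["C"])]
cols["ABC"] = [ab * c for ab, c in zip(cols["AB"], cols["C"])]
for (na, a), (nb, b) in combinations(cols.items(), 2):
    assert sum(x * y for x, y in zip(a, b)) == 0, (na, nb)

print("all contrast pairs orthogonal")
```

This pairwise orthogonality is exactly why the Type I and Type II SS coincide for such designs: each column claims the same variation no matter which others are in the model.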
