# LSU EXST 7037 - Dimension Reduction and Extraction of Meaningful Factors (4 pages)

Lecture Notes

- Pages: 4
- School: Louisiana State University
- Course: EXST 7037 - Multivariate Stat

## Chapter 5, Section 5.1: Dimension Reduction and Extraction of Meaningful Factors (Principal Components Analysis)

### Objectives

- Explain the basic concepts of principal components analysis.
- Identify several strategies for selecting the number of components.
- Perform principal components analysis using the PRINCOMP procedure.

### Too Many Variables

The slide lists seven related health variables: diastolic blood pressure, systolic blood pressure, LDL cholesterol, HDL cholesterol, medication, diet, and exercise.

### Solutions: An Easy Choice?

- Eliminate some redundant variables. Drawback: you may lose important information that was uniquely reflected in the eliminated variables.
- Create composite scores from the variables (sums or averages). Drawbacks: variability among the variables is lost, and multiple scale scores may still be collinear.
- Create weighted linear combinations of the variables while retaining most of the variability in the data. Advantages: fewer variables, little or no lost variation, and no collinear scales.

To retain most of the information in the data while reducing the number of variables you must deal with, try principal components analysis. Most of the variability in the original data can be retained, but the components may not be directly interpretable.

### Principal Components Analysis

Principal components analysis (PCA) is a dimension-reduction method that creates new variables, called principal components, and creates as many components as there are input variables. Principal components:

- are weighted linear combinations of the input variables;
- are orthogonal to, and independent of, the other components;
- are generated so that the first component accounts for the most variation in the x's, followed by the second component, and so on.

### Geometric Properties

(Figures: first and second principal components.) Least squares regression minimizes the sum of squared vertical distances to the fitted line (distances perpendicular to the x-axis); the first principal component instead minimizes the sum of squared distances perpendicular to the component line itself.

### Details of Principal Components

The first j principal components provide a least-squares solution to the model

Y = XB

where

- Y = n x p matrix of centered observed variables;
- X = n x j matrix of scores on the first j principal components;
- B = j x p matrix of eigenvector weights.
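The definition above (weighted linear combinations, mutually orthogonal, ordered by variance explained) can be sketched in plain NumPy rather than the SAS PRINCOMP procedure the notes use. The data here are simulated stand-ins for the correlated health measures on the slide, not values from the course.

```python
import numpy as np

# Simulated stand-in data: 100 subjects, 4 correlated measures
# (hypothetical; the notes' example uses health variables in SAS).
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 1))
X = base + 0.3 * rng.normal(size=(100, 4))  # four collinear columns

# Center the variables; PCA operates on centered data.
Xc = X - X.mean(axis=0)

# Eigendecomposition of the covariance matrix.
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

# eigh returns ascending order; reverse so component 1 explains the most.
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Scores: weighted linear combinations of the input variables.
scores = Xc @ eigvecs

# The components are orthogonal, so their covariance matrix is diagonal,
# with the eigenvalues (component variances) on the diagonal.
print(np.round(np.cov(scores, rowvar=False), 6))
```

Note that the total variance is preserved: the eigenvalues sum to the trace of the original covariance matrix, which is why dropping trailing components loses little variation.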
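The objectives mention strategies for selecting the number of components without listing them in this preview. Two common ones, the cumulative proportion of variance explained and a Kaiser-style eigenvalue cutoff, can be sketched as follows; the 0.90 threshold and the eigenvalue-greater-than-1 rule are conventional choices assumed here, not values taken from the notes.

```python
import numpy as np

# Simulated stand-in data (hypothetical): 200 subjects, 6 correlated measures.
rng = np.random.default_rng(0)
base = rng.normal(size=(200, 1))
X = base + 0.5 * rng.normal(size=(200, 6))
Xc = X - X.mean(axis=0)

# Eigenvalues of the covariance matrix, largest first.
eigvals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]

# Strategy 1: keep enough components to explain, say, 90% of the variance.
prop = eigvals / eigvals.sum()
cum = np.cumsum(prop)
k_prop = int(np.searchsorted(cum, 0.90) + 1)

# Strategy 2 (Kaiser-style, on the correlation matrix): keep components
# whose eigenvalue exceeds 1.0, the average eigenvalue of a correlation matrix.
eig_r = np.linalg.eigvalsh(np.corrcoef(Xc, rowvar=False))[::-1]
k_kaiser = int(np.sum(eig_r > 1.0))

print(k_prop, k_kaiser)
```

A scree plot (eigenvalue versus component number) is the usual visual companion to both rules.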
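The least-squares claim in the model Y = XB can be checked numerically: truncating a singular value decomposition of the centered data to the first j components gives the best rank-j approximation, and the squared error equals the variance carried by the discarded components. This is a NumPy sketch with simulated data, not part of the course material.

```python
import numpy as np

# Centered observed variables Y (n x p), simulated for illustration.
rng = np.random.default_rng(1)
Y = rng.normal(size=(50, 5))
Y = Y - Y.mean(axis=0)

# SVD gives the principal components directly: Y = U S Vt.
U, s, Vt = np.linalg.svd(Y, full_matrices=False)

j = 2
Xj = U[:, :j] * s[:j]  # n x j matrix of scores on the first j components
Bj = Vt[:j, :]         # j x p matrix of eigenvector weights

# Rank-j least-squares approximation of Y.
Y_hat = Xj @ Bj
err = np.linalg.norm(Y - Y_hat) ** 2

# The squared error equals the sum of the discarded squared singular values.
print(np.isclose(err, np.sum(s[j:] ** 2)))
```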
