Eigen Decomposition and Singular Value Decomposition
Mani Thomas
CISC 489/689

Introduction
- Eigenvalue decomposition
- Spectral decomposition theorem
- Physical interpretation of eigenvalues/eigenvectors
- Singular Value Decomposition
- Importance of SVD
  - Matrix inversion
  - Solution to a linear system of equations
  - Solution to a homogeneous system of equations
- SVD application

What are eigenvalues?
- Given a matrix A, x is an eigenvector and $\lambda$ is the corresponding eigenvalue if $Ax = \lambda x$
- A must be square, and the determinant of $A - \lambda I$ must be equal to zero:
  $Ax - \lambda x = 0 \;\Rightarrow\; (A - \lambda I)x = 0$
  The trivial solution is x = 0; the non-trivial solution occurs when $\det(A - \lambda I) = 0$
- Are eigenvectors unique? No: if x is an eigenvector, then $\alpha x$ is also an eigenvector with the same eigenvalue $\lambda$, since
  $A(\alpha x) = \alpha(Ax) = \alpha(\lambda x) = \lambda(\alpha x)$

Calculating the eigenvectors/eigenvalues
- Expand $\det(A - \lambda I) = 0$; for a 2 × 2 matrix:
  $\det\begin{pmatrix} a_{11} - \lambda & a_{12} \\ a_{21} & a_{22} - \lambda \end{pmatrix} = (a_{11} - \lambda)(a_{22} - \lambda) - a_{12} a_{21} = \lambda^2 - (a_{11} + a_{22})\lambda + (a_{11} a_{22} - a_{12} a_{21}) = 0$
- For a 2 × 2 matrix, this "characteristic equation" is a simple quadratic with two solutions (possibly complex):
  $\lambda = \frac{(a_{11} + a_{22}) \pm \sqrt{(a_{11} + a_{22})^2 - 4(a_{11} a_{22} - a_{12} a_{21})}}{2}$
- The characteristic equation is solved for $\lambda$; the eigenvectors x then follow from $(A - \lambda I)x = 0$

Eigenvalue example
- Consider
  $A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}, \quad \det(A - \lambda I) = \lambda^2 - (1 + 4)\lambda + (1 \cdot 4 - 2 \cdot 2) = \lambda^2 - 5\lambda = 0 \;\Rightarrow\; \lambda = 0,\ 5$
- The corresponding eigenvectors can be computed from $(A - \lambda I)x = 0$:
  - For $\lambda = 0$, one possible solution is x = (2, -1)
  - For $\lambda = 5$, one possible solution is x = (1, 2)
- For more information: demos in linear algebra by G. Strang, http://web.mit.edu/18.06/www/

Physical interpretation
- Consider a correlation matrix
  $A = \begin{pmatrix} 1 & 0.75 \\ 0.75 & 1 \end{pmatrix}, \quad \lambda_1 = 1.75, \quad \lambda_2 = 0.25$
- Error ellipse with the major axis given by the larger eigenvalue and the minor axis by the smaller eigenvalue

Physical interpretation
- Orthogonal directions of greatest variance in the data
- Projections along PC 1 (the first Principal Component) discriminate the data most along any one axis
[Figure: data plotted against Original Variable A and Original Variable B, with the PC 1 and PC 2 axes overlaid]

Physical interpretation
- The first principal component is the direction of greatest variability (covariance) in the data
- The second is the next orthogonal (uncorrelated) direction of greatest variability
  - So first remove all the variability along the first component, and then find the next direction of greatest variability
  - And so on ...
- Thus the eigenvectors provide the directions of data variance in decreasing order of their eigenvalues
- For more information: see Gram-Schmidt orthogonalization in G. Strang's lectures

Spectral Decomposition theorem
- If A is a symmetric and positive definite k × k matrix ($x^T A x > 0$) with $(\lambda_i, e_i)$, $\lambda_i > 0$, i = 1, ..., k, being its k eigenvalue-eigenvector pairs, then A can be expanded in terms of those pairs (see below)
- This is also called the eigen decomposition theorem
- Any symmetric matrix can be reconstructed using its eigenvalues and eigenvectors
- Any similarity to what has been discussed before?
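The eigenvalue claims above can be checked numerically. A minimal sketch using the correlation-matrix example from the slides (NumPy is my choice of tool, not the slides'):

```python
import numpy as np

# Correlation matrix from the physical-interpretation slide
A = np.array([[1.0, 0.75],
              [0.75, 1.0]])

# eigh is the right routine for symmetric matrices;
# it returns eigenvalues in ascending order
vals, vecs = np.linalg.eigh(A)
print(vals)  # [0.25 1.75]

# Spectral decomposition: A = sum_i lambda_i * e_i e_i^T
A_rebuilt = sum(vals[i] * np.outer(vecs[:, i], vecs[:, i])
                for i in range(2))
print(np.allclose(A, A_rebuilt))  # True
```

The eigenvector paired with the larger eigenvalue (1.75) is the PC 1 direction; projecting the data onto it captures the most variance.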
  $A = \lambda_1 e_1 e_1^T + \lambda_2 e_2 e_2^T + \cdots + \lambda_k e_k e_k^T = P \Lambda P^T$
  where
  $P = (e_1 \;\; e_2 \;\; \cdots \;\; e_k), \quad \Lambda = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_k \end{pmatrix}, \quad \lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_k \ge 0$

Example for spectral decomposition
- Let A be a symmetric, positive definite matrix:
  $A = \begin{pmatrix} 2.2 & 0.4 \\ 0.4 & 2.8 \end{pmatrix}, \quad \det(A - \lambda I) = \lambda^2 - 5\lambda + (6.16 - 0.16) = (\lambda - 3)(\lambda - 2) = 0 \;\Rightarrow\; \lambda_1 = 3,\ \lambda_2 = 2$
- The eigenvectors for the corresponding eigenvalues are
  $e_1 = \begin{pmatrix} \tfrac{1}{\sqrt{5}} \\ \tfrac{2}{\sqrt{5}} \end{pmatrix}, \quad e_2 = \begin{pmatrix} \tfrac{2}{\sqrt{5}} \\ -\tfrac{1}{\sqrt{5}} \end{pmatrix}$
- Consequently,
  $A = 3 \begin{pmatrix} \tfrac{1}{5} & \tfrac{2}{5} \\ \tfrac{2}{5} & \tfrac{4}{5} \end{pmatrix} + 2 \begin{pmatrix} \tfrac{4}{5} & -\tfrac{2}{5} \\ -\tfrac{2}{5} & \tfrac{1}{5} \end{pmatrix} = \begin{pmatrix} 0.6 & 1.2 \\ 1.2 & 2.4 \end{pmatrix} + \begin{pmatrix} 1.6 & -0.8 \\ -0.8 & 0.4 \end{pmatrix} = \begin{pmatrix} 2.2 & 0.4 \\ 0.4 & 2.8 \end{pmatrix}$

Singular Value Decomposition
- If A is a rectangular m × k matrix of real numbers, then there exist an m × m orthogonal matrix U and a k × k orthogonal matrix V such that
  $A = U \Lambda V^T, \quad U^T U = I_m, \quad V^T V = I_k$
  where $\Lambda$ is an m × k matrix whose (i, i)-th entry is $\lambda_i \ge 0$, i = 1, ..., min(m, k), and whose other entries are all zero
- The positive constants $\lambda_i$ are the singular values of A
- If A has rank r, then there exist r positive constants $\lambda_1, \lambda_2, \ldots, \lambda_r$, r orthogonal m × 1 unit vectors $u_1, u_2, \ldots, u_r$, and r orthogonal k × 1 unit vectors $v_1, v_2, \ldots, v_r$ such that
  $A = \sum_{i=1}^{r} \lambda_i u_i v_i^T$
- Similar to the spectral decomposition theorem

Singular Value Decomposition (contd.)
- If A is symmetric and positive definite, then the SVD coincides with the eigen decomposition; in general, each eigenvalue of $AA^T$ is a squared singular value of A
- Here $AA^T$ has the eigenvalue-eigenvector pairs $(\lambda_i^2, u_i)$, since
  $AA^T = U \Lambda V^T \left( U \Lambda V^T \right)^T = U \Lambda V^T V \Lambda^T U^T = U \Lambda \Lambda^T U^T$
- Alternatively, the $v_i$ are the eigenvectors of $A^T A$ with the same non-zero eigenvalues $\lambda_i^2$:
  $A^T A = V \Lambda^T \Lambda V^T$

Example for SVD
- Consider the rectangular matrix
  $A = \begin{pmatrix} 3 & 1 & 1 \\ -1 & 3 & 1 \end{pmatrix}$
- U can be computed from the eigenvectors of $AA^T$:
  $AA^T = \begin{pmatrix} 11 & 1 \\ 1 & 11 \end{pmatrix}, \quad \det(AA^T - \lambda I) = (11 - \lambda)^2 - 1 = 0 \;\Rightarrow\; \lambda_1 = 12,\ \lambda_2 = 10$
  $u_1 = \begin{pmatrix} \tfrac{1}{\sqrt{2}} \\ \tfrac{1}{\sqrt{2}} \end{pmatrix}, \quad u_2 = \begin{pmatrix} \tfrac{1}{\sqrt{2}} \\ -\tfrac{1}{\sqrt{2}} \end{pmatrix}$
- V can be computed as the eigenvectors of $A^T A$ for the same non-zero eigenvalues
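The SVD example above, and the relation between squared singular values and the eigenvalues of $AA^T$, can be verified numerically. A small sketch, again assuming NumPy as the tool:

```python
import numpy as np

# Rectangular matrix from the SVD example slide
A = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])

# Full SVD: A = U @ diag(s) @ Vt (NumPy returns V already transposed,
# with singular values in descending order)
U, s, Vt = np.linalg.svd(A)
print(s**2)  # [12. 10.]

# Squared singular values equal the eigenvalues of A A^T
evals = np.linalg.eigvalsh(A @ A.T)  # ascending: [10. 12.]
print(np.allclose(np.sort(s**2), evals))  # True

# Rank-r outer-product form: A = sum_i s_i * u_i v_i^T (r = 2 here)
A_rebuilt = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(2))
print(np.allclose(A, A_rebuilt))  # True
```

Note that NumPy normalizes signs of the singular vectors internally, so individual columns of U may differ from the hand-computed $u_i$ by a factor of -1; the reconstruction is unaffected.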