UD CISC 689 - Eigen Decomposition and Singular Value Decomposition


Contents: Introduction · What are eigenvalues? · Calculating the eigenvectors/values · Eigenvalue example · Physical interpretation · Spectral decomposition theorem · Example for spectral decomposition · Singular value decomposition · Example for SVD · Applications of SVD in linear algebra · What is the use of SVD? · Image compression using SVD

Eigen Decomposition and Singular Value Decomposition
Mani Thomas
CISC 489/689

Introduction
- Eigenvalue decomposition
- Spectral decomposition theorem
- Physical interpretation of eigenvalues/eigenvectors
- Singular value decomposition
- Importance of SVD: matrix inversion, solution to a linear system of equations, solution to a homogeneous system of equations
- SVD application

What are eigenvalues?
- Given a matrix A, x is an eigenvector and λ is the corresponding eigenvalue if Ax = λx. A must be square.
- Ax − λx = 0, i.e. (A − λI)x = 0. The trivial solution is x = 0; a non-trivial solution exists only when det(A − λI) = 0.
- Are eigenvectors unique? No: if x is an eigenvector with eigenvalue λ, then αx is also an eigenvector with the same eigenvalue λ, since A(αx) = α(Ax) = α(λx) = λ(αx).

Calculating the eigenvectors/values
- Expand det(A − λI) = 0. For a 2 × 2 matrix:

  det(A − λI) = det [ a11−λ  a12 ; a21  a22−λ ]
              = (a11 − λ)(a22 − λ) − a12·a21
              = λ² − (a11 + a22)λ + (a11·a22 − a12·a21) = 0

- For a 2 × 2 matrix this "characteristic equation" is a simple quadratic with two (possibly complex) solutions:

  λ = [ (a11 + a22) ± √( (a11 + a22)² − 4(a11·a22 − a12·a21) ) ] / 2

- Substituting each λ back into (A − λI)x = 0 then gives the corresponding eigenvectors x.

Eigenvalue example
- Consider

  A = [ 1  2 ; 2  4 ]

  det(A − λI) = (1 − λ)(4 − λ) − 2·2 = λ² − 5λ = 0  ⇒  λ = 0, 5

- The corresponding eigenvectors can be computed from (A − λI)x = 0:
  For λ = 0: x + 2y = 0, so one possible solution is x = (2, −1).
  For λ = 5: −4x + 2y = 0, so one possible solution is x = (1, 2).
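As a quick numerical check of the eigenvalue example above, the eigenpairs of A can be computed with NumPy (a sketch assuming NumPy is installed; the slides themselves contain no code):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# np.linalg.eig returns eigenvalues in no particular order, so sort them
vals, vecs = np.linalg.eig(A)
order = np.argsort(vals)
vals, vecs = vals[order], vecs[:, order]

print(np.round(vals, 6))              # eigenvalues: approximately 0 and 5

# Each column of vecs is a unit eigenvector; verify A x = lambda x
for lam, x in zip(vals, vecs.T):
    assert np.allclose(A @ x, lam * x)
```

Because det A = 0, one eigenvalue is 0, matching λ² − 5λ = 0; the reported eigenvectors (2, −1) and (1, 2) differ from the computed columns only by scale, since eigenvectors are unique only up to a scalar multiple.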
5,0)41(02241)41(04221222112221122112aaaaaaA0012241224050054221500422142210000042210yxyxyxyxyxyxyxyxFor more information: Demos in Linear algebra by G. Strang, http://web.mit.edu/18.06/www/Physical interpretationConsider a correlation matrix, AError ellipse with the major axis as the larger eigenvalue and the minor axis as the smaller eigenvalue25.0,75.1175.75.121APhysical interpretationOrthogonal directions of greatest variance in dataProjections along PC1 (Principal Component) discriminate the data most along any one axisOriginal Variable AOriginal Variable BPC 1PC 2Physical interpretationFirst principal component is the direction of greatest variability (covariance) in the dataSecond is the next orthogonal (uncorrelated) direction of greatest variabilitySo first remove all the variability along the first component, and then find the next direction of greatest variabilityAnd so on …Thus each eigenvectors provides the directions of data variances in decreasing order of eigenvaluesFor more information: See Gram-Schmidt Orthogonalization in G. 
Strang’s lecturesSpectral Decomposition theoremIf A is a symmetric and positive definite k £ k matrix (xTAx > 0) with i (i > 0) and ei, i = 1  k being the k eigenvector and eigenvalue pairs, thenThis is also called the eigen decomposition theoremAny symmetric matrix can be reconstructed using its eigenvalues and eigenvectorsAny similarity to what has been discussed before?               TkikTikiikkkTkkkkkTkkTkkkPPeeAeeeeeeA 111111212211111   kkkkkk000000,,2121eeePExample for spectral decompositionLet A be a symmetric, positive definite matrixThe eigenvectors for the corresponding eigenvalues areConsequently,     02316.016.650det8.24.04.02.22IAA51,52,52,5121TTee4.08.08.06.14.22.12.16.05152515225251525138.24.04.02.2ASingular Value DecompositionIf A is a rectangular m £ k matrix of real numbers, then there exists an m £ m orthogonal matrix U and a k £ k orthogonal matrix V such that  is an m £ k matrix where the (i, j)th entry i ¸ 0, i = 1  min(m, k) and the other entries are zeroThe positive constants i are the singular values of AIf A has rank r, then there exists r positive constants 1, 2,r, r orthogonal m £ 1 unit vectors u1,u2,,ur and r orthogonal k £ 1 unit vectors v1,v2,,vr such thatSimilar to the spectral decomposition theorem    IVVUUVUA TTkkTkmmmkmriTiii1vuASingular Value Decomposition (contd.)If A is a symmetric and positive definite then SVD = Eigen decompositionEIG(i) = SVD(i2)Here AAT has an eigenvalue-eigenvector pair (i2,ui)Alternatively, the vi are the eigenvectors of ATA with the same non zero eigenvalue i2TTVVAA2  TTTTTTTUUUVVUVUVUAA2Example for SVDLet A be a symmetric, positive definite matrixU can be computed asV can be computed as  
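The relationship between singular values and the eigenvalues of AAᵀ and AᵀA can be checked numerically. A minimal sketch assuming NumPy is available; the random matrix here is purely illustrative, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 3))               # a generic rectangular m x k matrix

U, s, Vt = np.linalg.svd(A)               # s holds min(m, k) = 3 singular values

# Eigenvalues of A A^T (m x m): the nonzero ones are sigma_i^2
eig_AAt = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]
assert np.allclose(eig_AAt[:3], s**2)     # top three match sigma_i^2
assert abs(eig_AAt[3]) < 1e-10            # the remaining eigenvalue is ~0

# Eigenvalues of A^T A (k x k) are exactly the same sigma_i^2
eig_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
assert np.allclose(eig_AtA, s**2)
```

`eigvalsh` is used rather than `eig` because AAᵀ and AᵀA are symmetric, which guarantees real eigenvalues and orthogonal eigenvectors.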
21,21,21,2110,120det1111111131131311131311132121TTTTuuIAAAAA

