Literature Survey: Non-negative Matrix Factorization

Joel A. Tropp <[email protected]>
Texas Institute for Computational and Applied Mathematics
The University of Texas at Austin
Austin, TX 78712

Multi-DSP, EE 381K
5 March 2003


INTRODUCTION

❧ Suppose we have a set of N non-negative, d-dimensional data vectors. Form the d × N matrix X whose columns are the data points.

❧ We would like to find a set of K ≪ N representatives. We hope that linear combinations of the representatives will approximate the data well.

❧ The problem can be expressed as factoring X ≈ V H, with V of size d × K and H of size K × N.

❧ Applications: dimensionality reduction, feature extraction, matrix approximation, compression.


L'ANCIEN RÉGIME

Principal Component Analysis (PCA)

❧ The goal of PCA is to find the best K-dimensional projection of the data in the least-squares sense.

❧ How? First, form the (scaled) covariance matrix

    S = \sum_{n=1}^{N} (x_n - \mu)(x_n - \mu)^t.

  Then select the top K eigenvectors of S as the basis for the projection subspace.

❧ Pros: straightforward to implement; provably optimal in the least-squares sense.

❧ Cons: the eigenvectors lack intuitive meanings; optimality is achieved through cancellation effects.

Source: R. O. Duda, P. E. Hart, and D. G. Stork. Pattern Classification. 2nd ed. New York: John Wiley and Sons, 2001.


AN ALTERNATIVE

Non-negative Matrix Factorization (NNMF)

❧ To avoid cancellation effects, constrain the factors V and H to have non-negative entries.

❧ Lee and Seung have shown empirically that NNMF yields a parts-based representation for images.

❧ For text, NNMF seems able to distill semantic groups.

Source: D. D. Lee and H. S. Seung. "Learning the parts of objects by non-negative matrix factorization." Nature, vol. 401, pp. 788–791, 21 October 1999.


IMPLEMENTATIONS

Least Squares

❧ Measures the quality of the approximation with \| X - V H \|_F.

❧ Update rules:

    H_{jk} \leftarrow H_{jk} \frac{(V^t X)_{jk}}{(V^t V H)_{jk}},
    \qquad
    V_{ij} \leftarrow V_{ij} \frac{(X H^t)_{ij}}{(V H H^t)_{ij}}.

Kullback-Leibler Divergence

❧ Measures the quality of the approximation with

    \sum_{i,k} \left[ X_{ik} \log \frac{X_{ik}}{(V H)_{ik}} - X_{ik} + (V H)_{ik} \right].

❧ Update rules:

    H_{jk} \leftarrow H_{jk} \frac{\sum_i V_{ij} X_{ik} / (V H)_{ik}}{\sum_i V_{ij}},
    \qquad
    V_{ij} \leftarrow V_{ij} \frac{\sum_k H_{jk} X_{ik} / (V H)_{ik}}{\sum_k H_{jk}}.

Source: D. D. Lee and H. S. Seung. "Algorithms for non-negative matrix factorization." Advances in Neural Information Processing Systems (NIPS), pp. 556–562.
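
The sketch below is a minimal NumPy implementation of the multiplicative update rules listed above (both the Frobenius-norm and the KL-divergence variants); it is not part of the original survey. The function names nnmf_frobenius and nnmf_kl, the random initialization, and the small constant eps added to denominators to avoid division by zero are illustrative choices, not prescribed by Lee and Seung.

# Minimal sketch of the Lee-Seung multiplicative updates for X ~= V H,
# with X (d x N) non-negative, V (d x K), H (K x N). Illustrative only.
import numpy as np

def nnmf_frobenius(X, K, n_iter=200, eps=1e-10, seed=0):
    # Drives down ||X - V H||_F via the updates on the Least Squares slide.
    rng = np.random.default_rng(seed)
    d, N = X.shape
    V = rng.random((d, K))
    H = rng.random((K, N))
    for _ in range(n_iter):
        # H_{jk} <- H_{jk} (V^t X)_{jk} / (V^t V H)_{jk}
        H *= (V.T @ X) / (V.T @ V @ H + eps)
        # V_{ij} <- V_{ij} (X H^t)_{ij} / (V H H^t)_{ij}
        V *= (X @ H.T) / (V @ H @ H.T + eps)
    return V, H

def nnmf_kl(X, K, n_iter=200, eps=1e-10, seed=0):
    # Drives down the generalized KL divergence
    # sum_{ik} [ X_{ik} log(X_{ik}/(VH)_{ik}) - X_{ik} + (VH)_{ik} ].
    rng = np.random.default_rng(seed)
    d, N = X.shape
    V = rng.random((d, K))
    H = rng.random((K, N))
    for _ in range(n_iter):
        R = X / (V @ H + eps)                                       # X_{ik} / (V H)_{ik}
        H *= (V.T @ R) / (V.sum(axis=0, keepdims=True).T + eps)     # denominator: sum_i V_{ij}
        R = X / (V @ H + eps)
        V *= (R @ H.T) / (H.sum(axis=1, keepdims=True).T + eps)     # denominator: sum_k H_{jk}
    return V, H

As a quick check, one can run V, H = nnmf_frobenius(np.abs(np.random.rand(64, 500)), K=10) and track np.linalg.norm(X - V @ H, 'fro') across iterations; the objective should not increase, which is the property Lee and Seung establish for these updates.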