Dimension Reduction (PCA, ICA, CCA, FLD, Topic Models)
Yi Zhang
10-701, Machine Learning, Spring 2011
April 6th, 2011
Parts of the PCA slides are from previous 10-701 lectures.

Outline
• Dimension reduction
• Principal Components Analysis
• Independent Component Analysis
• Canonical Correlation Analysis
• Fisher’s Linear Discriminant
• Topic Models and Latent Dirichlet Allocation

Dimension reduction
• Feature selection: select a subset of the original features
• More generally, feature extraction
  ◦ Not limited to the original features
  ◦ “Dimension reduction” usually refers to this case
• Assumption: the data (approximately) lies on a lower-dimensional space

Principal Components Analysis
• Assume the data is centered
• For a projection direction v (with ||v|| = 1)
  ◦ Variance of the projected data: (1/n) Σᵢ (vᵀxᵢ)² = vᵀSv, where S is the sample covariance matrix
  ◦ Maximize the variance of the projected data
• PCA formulation: maximize vᵀSv subject to ||v|| = 1
  ◦ How to solve this? The stationarity condition of the Lagrangian is Sv = λv
  ◦ As a result, the optimal v is the eigenvector of S with the largest eigenvalue; further principal directions are the subsequent eigenvectors
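As a concrete illustration of this eigenvector solution, here is a minimal NumPy sketch (the function name pca, the choice of k, and the toy data are illustrative assumptions, not part of the slides):

```python
import numpy as np

def pca(X, k):
    """Project an n x d data matrix X onto its top-k principal directions."""
    Xc = X - X.mean(axis=0)                # center the data, as the slides assume
    S = Xc.T @ Xc / (Xc.shape[0] - 1)      # d x d sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(S)   # eigenvalues returned in ascending order
    V = eigvecs[:, ::-1][:, :k]            # k eigenvectors with the largest eigenvalues
    return Xc @ V, V                       # n x k projected data, d x k directions

# Toy usage: reduce 5-dimensional data to 2 dimensions
X = np.random.randn(100, 5)
Z, V = pca(X, 2)
```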
Independent Component Analysis
• Source separation: the classical “cocktail party” problem
  ◦ Separate the mixed signal into its sources
  ◦ Assumption: the different sources are independent
• Let v₁, v₂, …, v_d denote the projection directions of the independent components
• ICA: find these directions such that the data projected onto them has maximum statistical independence
• How to actually maximize independence?
  ◦ Minimize the mutual information
  ◦ Or maximize the non-Gaussianity
  ◦ The actual formulation is quite complicated!

Canonical Correlation Analysis
• Recall PCA: find the projection direction v that maximizes the variance of the projected data, i.e., the intrinsic subspace of the original feature space (in terms of retaining the data variability)
• Now consider two sets of variables x and y
  ◦ x is a vector of p variables, y is a vector of q variables
  ◦ Essentially, two feature spaces
• How do we find the connection between the two sets of variables (the two feature spaces)?
  ◦ CCA: find a projection direction u in the space of x and a projection direction v in the space of y, so that the data projected onto u and v have maximum correlation
  ◦ Note: CCA simultaneously finds a dimension reduction for both feature spaces
• CCA formulation
  ◦ X is n × p: n samples in a p-dimensional space
  ◦ Y is n × q: n samples in a q-dimensional space
  ◦ The n samples are paired across X and Y
  ◦ Maximize the correlation between Xu and Yv over u and v
• How to solve? It looks complicated, but it reduces to a generalized eigenproblem!

Fisher’s Linear Discriminant
• Now come back to a single feature space
• In addition to the features, we also have labels
  ◦ Find the dimension reduction that helps separate the different classes of examples
  ◦ Consider the 2-class case first
• Idea: maximize the ratio of the “between-class variance” to the “within-class variance” of the projected data
  ◦ For a direction w, this is the Fisher criterion J(w) = (wᵀS_B w) / (wᵀS_W w), where S_B and S_W are the between-class and within-class scatter matrices
  ◦ For two classes, the maximizing direction is w ∝ S_W⁻¹(m₁ − m₀), where m₀ and m₁ are the class means (see the sketch below)
• Generalizing to the multi-class case: still maximize the ratio of between-class to within-class variance of the projected data; the solution is again a generalized eigenproblem
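A minimal NumPy sketch of the 2-class solution w ∝ S_W⁻¹(m₁ − m₀); the function name, toy data, and 0/1 labels are illustrative assumptions, not from the slides:

```python
import numpy as np

def fisher_direction(X, y):
    """Fisher's discriminant direction for a 2-class problem.

    X: n x d data matrix; y: length-n array of 0/1 labels.
    Returns the unit direction w maximizing between-class over
    within-class variance of the projected data X @ w.
    """
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter: sum of the two per-class scatter matrices
    Sw = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
    # The maximizer of the Fisher criterion is proportional to Sw^{-1} (m1 - m0)
    w = np.linalg.solve(Sw, m1 - m0)
    return w / np.linalg.norm(w)

# Toy usage: two Gaussian blobs, projected onto a single discriminative direction
X = np.vstack([np.random.randn(50, 3), np.random.randn(50, 3) + 2.0])
y = np.array([0] * 50 + [1] * 50)
z = X @ fisher_direction(X, y)   # 1-D projection that separates the classes
```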
Topic Models and Latent Dirichlet Allocation
• Topic models: a class of dimension reduction models for text (from words to topics)
• Documents use the bag-of-words representation
• Topic models represent each document in terms of topics rather than raw word counts
• Latent Dirichlet Allocation (LDA): a fully Bayesian specification of topic models
  ◦ Data: the words in each document
  ◦ Estimation: maximizing the data likelihood – difficult!
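Because exact maximum-likelihood estimation in LDA is intractable, practical implementations rely on approximate inference (e.g., variational Bayes or Gibbs sampling). Below is a small sketch using scikit-learn’s variational implementation; the toy corpus and the choice of 2 topics are made-up assumptions for illustration:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Tiny illustrative corpus (two rough themes: finance and sports)
docs = [
    "stock market trading prices fall",
    "team player score season game",
    "market prices rise as trading volume grows",
    "the team won the season opener game",
]

# Bag-of-words representation: documents x vocabulary word counts
vectorizer = CountVectorizer()
counts = vectorizer.fit_transform(docs)

# Fit LDA with 2 topics; doc_topic is the reduced representation of each
# document: its mixture weights over the topics
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topic = lda.fit_transform(counts)

# lda.components_ holds topic-word weights (one row per topic); the top-weighted
# words in each row give a human-readable summary of that topic
vocab = vectorizer.get_feature_names_out()
top_words = [[vocab[i] for i in row.argsort()[-3:][::-1]] for row in lda.components_]
print(doc_topic)
print(top_words)
```

The doc_topic matrix plays the same role as the projected data in PCA: each document is described by a handful of topic weights instead of a vocabulary-sized count vector.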