CORNELL CS 6670 - Study Notes (36 pages)


CS6670 Computer Vision (Noah Snavely), Lecture 15: Eigenfaces

Announcements
The final project page is up at http://www.cs.cornell.edu/courses/cs6670/2009fa/projects/p4. One person from each team should submit a proposal to CMS by tomorrow at 11:59pm. Project 3: Eigenfaces.

Skin detection results and general classification
The same classification procedure applies in more general circumstances: more than two classes, and more than one dimension. Example: face detection. Here x is an image region whose dimension is the number of pixels, so each face can be thought of as a point in a high-dimensional space.

Reference: H. Schneiderman and T. Kanade, "A Statistical Method for 3D Object Detection Applied to Faces and Cars," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2000. http://www-2.cs.cmu.edu/afs/cs.cmu.edu/user/hws/www/CVPR00.pdf

Linear subspaces
Classification can be expensive: we must either search (e.g., nearest neighbors) or store large PDFs. Suppose the data points are arranged along a line, as above. Idea: fit a line, and let the classifier measure distance to that line.

Convert x into (v1, v2) coordinates. What does the v2 coordinate measure? Distance to the line; use it for classification (it is near 0 for the orange points). What does the v1 coordinate measure? Position along the line; use it to specify which orange point it is.

Dimensionality reduction
We can represent the orange points with only their v1 coordinates, since their v2 coordinates are all essentially 0. This makes it much cheaper to store and compare points, which is a bigger deal for higher-dimensional problems.

How do we find v1 and v2? Consider the variation along a direction v among all of the orange points:

    var(v) = Σ_x ((x - x̄) · v)^2

What unit vector v minimizes var(v)? What unit vector v maximizes it? Solution: v1 is the eigenvector of A with the largest eigenvalue, and v2 is the eigenvector of A with the smallest eigenvalue, where A = Σ_x (x - x̄)(x - x̄)^T is the covariance (scatter) matrix of the points.

Principal component analysis
Suppose each data point is N-dimensional; the same procedure applies. The eigenvectors of A define a new coordinate system, and the eigenvector with the largest eigenvalue captures the most variation among the data points.
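The v1/v2 construction above can be sketched numerically. This is an illustrative example, not code from the lecture: the toy data, seed, and variable names are assumptions. It builds the scatter matrix A of centered 2-D points lying near a line, takes its eigenvectors, and checks that the v2 coordinate (distance to the line) is near 0 while the v1 coordinate spreads along the line.

```python
import numpy as np

# Toy 2-D data scattered along a line (the "orange points" in the notes).
# Data and parameters are illustrative assumptions.
rng = np.random.default_rng(0)
t = rng.uniform(-5, 5, 100)
points = np.column_stack([t, 0.5 * t]) + rng.normal(0, 0.05, (100, 2))

# Center the data and form the covariance (scatter) matrix A.
mean = points.mean(axis=0)
X = points - mean
A = X.T @ X / len(X)

# Eigenvectors of A define the new coordinate system.
eigvals, eigvecs = np.linalg.eigh(A)   # eigenvalues in ascending order
v1 = eigvecs[:, -1]  # largest eigenvalue: direction of maximum variance
v2 = eigvecs[:, 0]   # smallest eigenvalue: direction of minimum variance

# Project: the v1 coordinate locates a point along the line;
# the v2 coordinate measures distance to the line (near 0 for inliers).
coords = X @ eigvecs[:, ::-1]   # columns: [v1 coordinate, v2 coordinate]
```

A simple classifier in this setting thresholds |v2 coordinate|: points far from the line are rejected, which is much cheaper than nearest-neighbor search over all stored points.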


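For the N-dimensional case described under "Principal component analysis," the same idea gives dimensionality reduction: keep only the coordinates along the top-K eigenvectors of A. A minimal sketch, with synthetic data standing in for face images (the dimensions and noise level are assumptions, not from the lecture):

```python
import numpy as np

# 50 samples in 256 dimensions, with most variance in a K-dim subspace
# (a stand-in for face images flattened to vectors).
rng = np.random.default_rng(1)
N, D, K = 50, 256, 8
basis = np.linalg.qr(rng.normal(size=(D, K)))[0]          # random K-dim subspace
data = rng.normal(size=(N, K)) @ basis.T + 0.01 * rng.normal(size=(N, D))

mean = data.mean(axis=0)
X = data - mean

# Eigenvectors of the covariance matrix, largest eigenvalues first.
eigvals, eigvecs = np.linalg.eigh(X.T @ X / N)
top = eigvecs[:, ::-1][:, :K]

# Store only K coordinates per point instead of D values.
codes = X @ top                        # shape (N, K)
reconstruction = codes @ top.T + mean  # approximate original points
err = np.abs(reconstruction - data).max()
```

Because the discarded directions carry almost no variance, the K-number code reconstructs each D-dimensional point closely, which is the storage and comparison saving the notes describe.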
