ICS 273A Fall 2011
Homework CS 273A Intro ML
Due Monday, Oct. 17, 2011, in the EEE dropbox (pdf).

Read sections 9.1 and 12.1 of Bishop's book.

1) Imagine you have a dataset X that you wish to send to a friend. You decide to discover the regularities in the data by fitting a model. You send the model specification (say, the cluster means), a code vector for each data case (say, the assignment of each data case to a cluster), and the errors in predicting each data case from its code and the model (say, the residual vector x_n - mu_{k_n}, where k_n is the cluster assigned to x_n). Your objective is to send as few bits as possible without losing any information (assuming you are only interested in knowing the data up to finite precision, i.e. a fixed quantization level). Argue why for small datasets you would expect simple models to be optimal for this purpose, while for large datasets you would expect more complex models to be optimal.

2) Derive K-means from the cost function C (see slides) by showing that the two steps correspond to coordinate descent on C with respect to the cluster assignments r_nk and the cluster means mu_k. (A sketch of the two alternating steps appears after this list.)

3) Remove the labels from the Iris data set and run k-means clustering on it. Can you recover the three original clusters if you set k = 3? Do you get different results if you initialize differently? Why?

4) Is it possible for a probability density, such as a normal density, to have p(x) > 1?

5) Derive a linear transformation y = Ax, based on a principal components analysis, such that the covariance Cov(y) = Identity. (One such transformation is sketched after this list.)

6) Imagine you have a collection of gray-level images of faces. Provide pseudo-code to compute the "eigenfaces" of this collection (see below for 4 examples of eigenfaces; a numpy sketch also follows this list).

7) Perform a PCA on the Iris data. Make sure you first center the data. Extract the first two eigenvalues and eigenvectors. Project the data down to a two-dimensional subspace and produce a scatter plot of the data. Make sure you plot each of the three classes differently, using color or different symbols. (A sketch of this pipeline appears below.)
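For problems 2 and 3, a minimal sketch of the two k-means steps, written against the usual cost C = sum_n sum_k r_nk ||x_n - mu_k||^2. It assumes scikit-learn is available for loading the Iris data; the variable names (r, mu) are illustrative rather than the notation from the slides, and empty clusters are not handled.

import numpy as np
from sklearn.datasets import load_iris

X = load_iris().data                           # 150 x 4 feature matrix, labels discarded
K = 3
rng = np.random.default_rng(0)
mu = X[rng.choice(len(X), K, replace=False)]   # initialize means at random data points

for _ in range(100):
    # Step 1: minimize C over the assignments r_nk with the means fixed
    # (each point is assigned to its nearest mean).
    d = np.linalg.norm(X[:, None, :] - mu[None, :, :], axis=2)  # N x K distances
    r = d.argmin(axis=1)
    # Step 2: minimize C over the means mu_k with the assignments fixed
    # (each mean becomes the average of the points assigned to it).
    new_mu = np.array([X[r == k].mean(axis=0) for k in range(K)])
    if np.allclose(new_mu, mu):
        break
    mu = new_mu

print(np.bincount(r))   # cluster sizes; compare against the three Iris species

Re-running with a different seed changes the initialization and can change the final clustering, since each run only reaches a local minimum of C.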
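For problem 5, a worked sketch of one transformation that whitens the data; the notation (Sigma, U, Lambda) is introduced here and is not taken from the slides.

Let \mathrm{Cov}(x) = \Sigma = U \Lambda U^\top be the eigendecomposition of the (centered) data covariance, with U the orthonormal eigenvectors and \Lambda the diagonal matrix of eigenvalues. Taking

    A = \Lambda^{-1/2} U^\top, \qquad y = A x,

gives

    \mathrm{Cov}(y) = A \Sigma A^\top
                    = \Lambda^{-1/2} U^\top (U \Lambda U^\top) U \Lambda^{-1/2}
                    = \Lambda^{-1/2} \Lambda \Lambda^{-1/2}
                    = I.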
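For problem 6, a minimal numpy sketch of one way to compute eigenfaces via the SVD of the centered data matrix. The input array `images` is a hypothetical stack of equally sized gray-level face images, and the function name is made up for illustration.

import numpy as np

def eigenfaces(images, n_components=4):
    """Return the top principal components ("eigenfaces") and the mean face
    of a stack of gray-level images given as an array of shape (n, h, w)."""
    n, h, w = images.shape
    X = images.reshape(n, h * w).astype(float)    # one flattened image per row
    mean_face = X.mean(axis=0)
    Xc = X - mean_face                            # center the data
    # The rows of Vt are the eigenvectors of the pixel covariance matrix,
    # ordered by decreasing singular value; reshape them back into images.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:n_components].reshape(n_components, h, w), mean_face.reshape(h, w)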
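For problem 7, a minimal sketch of the center / eigendecompose / project / scatter pipeline, assuming scikit-learn and matplotlib are available; it uses the covariance eigendecomposition directly rather than a library PCA routine.

import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris

iris = load_iris()
X, y = iris.data, iris.target

Xc = X - X.mean(axis=0)                    # center the data first
cov = np.cov(Xc, rowvar=False)             # 4 x 4 sample covariance
evals, evecs = np.linalg.eigh(cov)         # eigenvalues in ascending order
order = np.argsort(evals)[::-1][:2]        # indices of the two largest eigenvalues
W = evecs[:, order]                        # 4 x 2 projection matrix
Z = Xc @ W                                 # project onto the 2-D subspace

for k, name in enumerate(iris.target_names):
    plt.scatter(Z[y == k, 0], Z[y == k, 1], label=name)  # one color per class
plt.xlabel("PC 1"); plt.ylabel("PC 2"); plt.legend(); plt.show()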

