UCLA STAT 231 - Syllabus

Stat 231 / CS 276A: Pattern Recognition and Machine Learning
TR 3:30-4:45 PM, Fall 2014, Geology 4660
www.stat.ucla.edu/~sczhu/Courses/UCLA/Stat_231/Stat_231.html

Course Description
This course introduces fundamental concepts, theories, and algorithms for pattern recognition and machine learning, which are used in computer vision, speech recognition, data mining, statistics, information retrieval, and bioinformatics. Topics include: Bayesian decision theory, parametric and non-parametric learning, data clustering, component analysis, boosting techniques, kernel methods, and support vector machines.

Prerequisites
Math 33A: Linear Algebra and Its Applications, Matrix Analysis
Stat 100B: Introduction to Mathematical Statistics
CS 180: Introduction to Algorithms and Complexity
Programming skills in Matlab or R.

Textbooks
R. Duda, P. Hart, and D. Stork, "Pattern Classification", 2nd edition, 2000. [Good for CS students]
T. Hastie, R. Tibshirani, and J. H. Friedman, "The Elements of Statistical Learning: Data Mining, Inference, and Prediction", Springer Series in Statistics, 2nd edition, 2009. [Good for Statistics students]

Instructors
Prof. Song-Chun Zhu, [email protected], 310-206-8693, office: Boelter Hall 9404. Office hours: Tuesday 1:00-3:00 pm
Reader: Jungseock Joo, [email protected], office: Boelter Hall 9410. Office hours: Thursday 1:00-3:00 pm

Grading Plan: 4 units, letter grades
Two homework assignments: 20%
Three projects:
1. Face modeling by AAM (how many bits do you need to represent a face?): 15%
2. Face detection by AdaBoost (how many features do you need to detect a face?): 15%
3. Face social attribute scoring by SVM (how important is the face in political elections and social networks?): 15%
Midterm exam: none. 0%
Final exam: Dec 15, Monday, 3:00-5:00 pm (closed-book exam): 35%

Grading Policy
Homework policy: Homework must be finished independently. Do not discuss it with classmates or others.
Project policy: You are encouraged to work and discuss in a group, but each person must finish his/her own project. Hand in (i) a brief description of the experiment in hard copy, (ii) results and plots in hard copy, and (iii) your code in e-copy to the reader.
Late policy: You have a total of three late days (not including weekends) for the entire class; after using the three late days, no credit will be given for late homework or projects.

Tentative Schedule for 2014
Lecture | Date  | Topics | Reading Materials | Handouts
1  | 10-02 | Introduction to Pattern Recognition [problems, applications, examples, and project introduction] | Ch 1 | syllabus.pdf, Lect1.pdf
2  | 10-07 | Bayesian Decision Theory I [Bayes rule, discriminant functions] | Ch 2.1-2.6 | Lect2.pdf
3  | 10-09 | Bayesian Decision Theory II [loss functions and Bayesian error analysis] | Ch 2.1-2.6 | Lect3.pdf
4  | 10-14 | Component Analysis and Dimension Reduction I [principal component analysis (PCA), face modeling; explanation of Project 1: code and data format] | Ch 3.8.1, Ch 10.13.1 | Project 1, HW1, Lect4-5.pdf
5  | 10-16 | Component Analysis and Dimension Reduction II [Fisher linear discriminant, multi-dimensional scaling (MDS)] | Ch 3.8.2, Ch 10.14 | FisherFace.pdf, Lect5-6.pdf
6  | 10-21 | Component Analysis and Dimension Reduction III [local linear embedding (LLE), intrinsic dimension] | paper | LLE paper
7  | 10-23 | Boosting Techniques I [perceptron, backpropagation, and AdaBoost] | Ch 9.5 | Lect7-9.pdf
8  | 10-28 | Boosting Techniques II [RealBoost and example on face detection; explanation of Project 2] | Tutorial | Handout 1, Handout 2
9  | 10-30 | Boosting Techniques III [probabilistic analysis, LogitBoost, cascade and decision policy] | |
10 | 11-04 | Non-metric Methods I [tree-structured classification: principle and example] | Ch 8.1-8.3 | Lect10.pdf
11 | 11-06 | Non-metric Methods II [syntactic pattern recognition and example on human parsing] | Ch 8.5-8.8 | Lect11.pdf
-- | 11-11 | Veterans Day holiday | |
12 | 11-13 | Support Vector Machines I [kernel-induced feature space] | Tutorial paper | Lect12-15.pdf
13 | 11-18 | Support Vector Machines II [support vector classifier; explanation of Project 3] | Ch 5.11 |
14 | 11-20 | Support Vector Machines III [loss functions, latent SVM, neural networks and DeepNet] | |
15 | 11-25 | Parametric Learning [maximum likelihood estimation (MLE); sufficient statistics and maximum entropy] | Ch 3.1-3.6 | Lect16.pdf
-- | 11-27 | Thanksgiving holiday | |
16 | 12-02 | Non-parametric Learning I [Parzen window and K-NN classifier] | Ch 4.1-4.5 | Lect17.pdf
17 | 12-04 | Non-parametric Learning II [K-NN classifier and error analysis] | Ch 4.6, handout | Lect18.pdf
18 | 12-09 | Non-parametric Learning III [K-NN fast approximate computing: KD-tree and hashing] | paper1, paper2 | Lect19.pdf
19 | 12-11 | Data Clustering and Bi-clustering [K-means clustering, EM clustering by MLE, provable 2-step EM, mean-shift and landscape] | Ch 10.1-10.4, Handout | Lect
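Project 1 asks how many bits (principal components) are needed to represent a face. The following is a minimal sketch of that PCA idea, not the course's actual project code: it uses synthetic vectors in place of real face images, and is written in Python for illustration even though the course expects Matlab or R.

```python
import numpy as np

# Sketch of the PCA question behind Project 1: how few numbers can
# describe each "face"? Here faces are synthetic 50-dim vectors that
# lie mostly in a 5-dim subspace plus small noise.
rng = np.random.default_rng(0)
basis = rng.normal(size=(5, 50))
coeffs = rng.normal(size=(100, 5))
X = coeffs @ basis + 0.01 * rng.normal(size=(100, 50))

# PCA via SVD of the mean-centered data matrix; rows of Vt are the
# principal directions, ordered by decreasing singular value.
mean = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)

k = 5                             # number of components kept
Z = (X - mean) @ Vt[:k].T         # each image coded by k coefficients
X_hat = Z @ Vt[:k] + mean         # reconstruction from those k numbers

# The residual measures how faithful the k-number code is; sweeping k
# and plotting this error is one way to answer "how many bits?".
err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
print(f"relative reconstruction error with k={k}: {err:.4f}")
```

Because the synthetic data is nearly 5-dimensional, the error is tiny at k=5 and grows sharply if fewer components are kept; with real face images the same sweep produces the eigenface spectrum discussed in lectures 4-5.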

