UCLA STAT 231 - Lecture 14

Lecture notes for Stat 231: Pattern Recognition and Machine Learning

Contents:
1. Stat 231. A.L. Yuille. Fall 2004
2. Basic AdaBoost Review
3. Basic AdaBoost Algorithm
4. Basic AdaBoost Algorithm (continued)
5. AdaBoost Variant 1
6. AdaBoost Variant 1 (continued)
7. AdaBoost Variant 2
8. AdaBoost Variant 2 (continued)
9. AdaBoost Extensions
10. AdaBoost Summary

1. Stat 231. A.L. Yuille. Fall 2004

AdaBoost: summary and extensions. Read the Viola and Jones handout.

2. Basic AdaBoost Review

Data set $\{(x_i, y_i) : i = 1, \dots, N\}$ with labels $y_i \in \{-1, +1\}$; a set of weak classifiers $\{h_m(x)\}$ with $h_m(x) \in \{-1, +1\}$; weights $w_i$ on the data; parameters $\alpha_m$ on the weak classifiers. Strong classifier:
$H(x) = \mathrm{sign}\big(\sum_m \alpha_m h_m(x)\big).$

3. Basic AdaBoost Algorithm

Initialize $w_i^{(1)} = 1/N$. Update rule:
$w_i^{(t+1)} = \frac{1}{Z_t}\, w_i^{(t)} \exp\big(-\alpha_t y_i h_t(x_i)\big),$
where $Z_t$ is the normalization constant. Let $\epsilon_t = \sum_i w_i^{(t)} \mathbf{1}[h_t(x_i) \neq y_i]$ be the weighted error. Pick the classifier $h_t$ to minimize $Z_t$, set
$\alpha_t = \tfrac{1}{2} \log\frac{1 - \epsilon_t}{\epsilon_t},$
and repeat. (A runnable sketch of this loop appears after the summary.)

4. Basic AdaBoost Algorithm (continued)

Errors: the training error $\frac{1}{N} \sum_i \mathbf{1}[H(x_i) \neq y_i]$ is bounded by $\frac{1}{N} \sum_i \exp\big(-y_i \sum_t \alpha_t h_t(x_i)\big)$, which equals $\prod_t Z_t$. AdaBoost is a greedy algorithm: it tries to minimize this bound by minimizing the $Z_t$'s in order, with respect to $(h_t, \alpha_t)$.

5. AdaBoost Variant 1

In preparation for Viola and Jones: introduce a new parameter into the strong classifier and modify the update rule accordingly. Let $W_{pq}$ be the sum of the weights on examples where the weak class is $p$ and the true class is $q$. Pick the weak classifier to minimize $Z$, then set $\alpha$ from the $W_{pq}$ (sketched in code after the summary).

6. AdaBoost Variant 1 (continued)

As before, the error is bounded by $\prod_t Z_t$. The same "trick" applies: if the weak classifier is right, the example's weight is multiplied by $e^{-\alpha_t}/Z_t$; if it is wrong, by $e^{+\alpha_t}/Z_t$.

7. AdaBoost Variant 2

We have assumed a loss function that pays equal penalties for false positives and false negatives, but we may want false negatives to cost more (Viola and Jones). Use an asymmetric loss function that charges a larger penalty when a positive example is misclassified (sketched in code after the summary).

8. AdaBoost Variant 2 (continued)

Modify the update rule to carry the asymmetric costs, and verify that the loss is still bounded by the product of the normalizers. The update rule is the same as for Variant 1, except that the weights absorb the asymmetric costs.

9. AdaBoost Extensions

AdaBoost can be extended to multiple classes (Schapire and Singer), and the weak classifiers can take multiple values. The conditional-probability interpretation applies to these extensions as well.

10. AdaBoost Summary

Basic AdaBoost combines weak classifiers into a strong classifier. It dynamically reweights the data so that misclassified examples weigh more (like SVMs, it pays the most attention to hard-to-classify data). The empirical risk decreases exponentially fast, under weak conditions. AdaBoost is useful for combining weak cues in visual detection tasks, and it admits probabilistic, multiclass, and multivalued extensions.
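The following is a minimal, self-contained sketch of the basic loop from slides 2-4, using decision stumps as the weak-classifier pool. The helper names (fit_stump, adaboost, predict) and the stump pool are illustrative choices, not part of the lecture; the alpha and weight updates follow the standard formulas above.

```python
# Minimal AdaBoost sketch: weak classifiers are decision stumps
# h(x) = s * sign(x_j > thr), labels y_i in {-1, +1}.
import numpy as np

def fit_stump(X, y, w):
    """Exhaustively pick the (feature, threshold, polarity) stump with the
    smallest weighted error eps = sum_i w_i * 1[h(x_i) != y_i]."""
    n, d = X.shape
    best = None
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for s in (+1, -1):
                pred = s * np.where(X[:, j] > thr, 1, -1)
                eps = w[pred != y].sum()
                if best is None or eps < best[0]:
                    best = (eps, j, thr, s)
    return best  # (eps_t, feature, threshold, polarity)

def adaboost(X, y, T):
    n = len(y)
    w = np.full(n, 1.0 / n)          # initialize w_i^(1) = 1/N
    ensemble, bound = [], 1.0
    for t in range(T):
        eps, j, thr, s = fit_stump(X, y, w)   # assumes 0 < eps < 1/2
        alpha = 0.5 * np.log((1 - eps) / eps) # alpha_t = (1/2) log((1-eps)/eps)
        pred = s * np.where(X[:, j] > thr, 1, -1)
        w = w * np.exp(-alpha * y * pred)     # update rule
        Z = w.sum()                           # normalization constant Z_t
        w /= Z
        bound *= Z                            # training error <= prod_t Z_t
        ensemble.append((alpha, j, thr, s))
    return ensemble, bound

def predict(ensemble, X):
    F = sum(a * s * np.where(X[:, j] > thr, 1, -1)
            for a, j, thr, s in ensemble)
    return np.sign(F)                         # strong classifier H(x)
```

Because the weights are renormalized every round, the running product of the $Z_t$'s directly tracks the exponential bound on the training error from slide 4.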

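The preview strips the Variant 1 equations, so the following uses the standard reconstruction of the $W_{pq}$ bookkeeping (an assumption, labeled as such): with $W_c = W_{+1,+1} + W_{-1,-1}$ the total weight the candidate weak classifier gets right and $W_w$ the weight it gets wrong, $Z(\alpha) = W_c e^{-\alpha} + W_w e^{\alpha}$ is minimized at $\alpha = \tfrac{1}{2}\log(W_c/W_w)$, giving $Z = 2\sqrt{W_c W_w}$.

```python
# Variant-1 bookkeeping (hedged reconstruction; the slide's own formulas
# are not in the preview). W_c / W_w are the weight sums on examples the
# candidate weak classifier classifies correctly / incorrectly.
import numpy as np

def variant1_alpha_Z(w, y, pred):
    correct = (pred == y)
    W_c = w[correct].sum()    # weight where weak class == true class
    W_w = w[~correct].sum()   # weight where weak class != true class
    alpha = 0.5 * np.log(W_c / W_w)     # minimizes Z(alpha)
    Z = 2.0 * np.sqrt(W_c * W_w)        # value of Z at the minimum
    return alpha, Z
```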

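For Variant 2 the slide's asymmetric loss is also stripped; one standard way to realize it (an assumption here, in the spirit of Viola and Jones) is to weight the exponential loss by a per-class cost $c(y)$ with $c(+1) = k > 1$ and $c(-1) = 1$. This only changes the initial weights; the updates then proceed exactly as in Variant 1, matching the slide's remark that the update rule is the same.

```python
# Asymmetric initialization for Variant 2 (a sketch, not the slide's exact
# formula): false negatives, i.e. misclassified positives, cost k times more.
import numpy as np

def asymmetric_init_weights(y, k=5.0):
    """Initial weights proportional to c(y_i): c(+1) = k, c(-1) = 1."""
    c = np.where(y == 1, k, 1.0)
    return c / c.sum()
```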
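Slide 9 points to Schapire and Singer's multiclass extension, which the preview does not reproduce. As a far simpler stand-in (not their algorithm), this sketch reduces $K$ classes to $K$ one-vs-rest binary AdaBoost problems, reusing the hypothetical adaboost() helper from the first sketch.

```python
# One-vs-rest multiclass reduction (illustrative only; NOT the
# Schapire-Singer multiclass/multivalued algorithm from slide 9).
import numpy as np

def one_vs_rest_train(X, y, K, T):
    # One binary ensemble per class: class k vs. everything else.
    return [adaboost(X, np.where(y == k, 1, -1), T)[0] for k in range(K)]

def one_vs_rest_predict(models, X):
    # Score each class by its ensemble margin, then take the argmax.
    scores = np.stack([
        sum(a * s * np.where(X[:, j] > thr, 1, -1) for a, j, thr, s in m)
        for m in models
    ])
    return scores.argmax(axis=0)
```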