Lecture notes for Stat 231: Pattern Recognition and Machine Learning
A.L. Yuille. Fall 2004

1. AdaBoost: Summary and Extensions

- Read the Viola and Jones handout.

2. Basic AdaBoost Review

- Data set $\{(x_i, y_i) : i = 1, \dots, N\}$, with labels $y_i \in \{-1, +1\}$.
- A set of weak classifiers $\{h_a(x)\}$, each outputting $\pm 1$.
- Weights $D(i)$ on the data, and parameters $\lambda_a \ge 0$ on the weak classifiers.
- Strong classifier: $H(x) = \mathrm{sign}\big(\sum_a \lambda_a h_a(x)\big)$.

3. Basic AdaBoost Algorithm

- Initialize $D_1(i) = 1/N$.
- Update rule: $D_{t+1}(i) = D_t(i)\, e^{-\lambda_t y_i h_t(x_i)} / Z_t$, where $Z_t$ is the normalization constant.
- Let $\epsilon_t = \sum_i D_t(i)\, I[h_t(x_i) \neq y_i]$ be the weighted error.
- Pick the weak classifier $h_t$ to minimize $\epsilon_t$ (equivalently, $Z_t$).
- Set $\lambda_t = \tfrac{1}{2}\log\frac{1 - \epsilon_t}{\epsilon_t}$.
- Repeat.

4. Basic AdaBoost Algorithm (continued)

- Errors: the training error $\frac{1}{N}\sum_i I[H(x_i) \neq y_i]$ is bounded by $\prod_t Z_t$, which equals $\prod_t 2\sqrt{\epsilon_t(1 - \epsilon_t)}$.
- AdaBoost is a greedy algorithm: it tries to minimize this bound by minimizing the $Z_t$'s in order, w.r.t. $\lambda_t$ and the choice of weak classifier. (A minimal code sketch of this loop appears at the end of these notes.)

5. AdaBoost Variant 1

- In preparation for Viola and Jones. New parameter: $\beta_t = e^{\lambda_t}$.
- Strong classifier: $H(x) = \mathrm{sign}\big(\sum_t (\log \beta_t)\, h_t(x)\big)$.
- Modified update rule: $D_{t+1}(i) = D_t(i)\, \beta_t^{-y_i h_t(x_i)} / Z_t$.
- Let $W_{pq}$ be the sum of the weights of the data for which the weak classifier outputs $p$ and the true class is $q$, for $p, q \in \{-1, +1\}$.
- Then $Z_t = (W_{++} + W_{--})/\beta_t + (W_{+-} + W_{-+})\,\beta_t$.
- Pick the weak classifier to minimize $Z_t$; set $\beta_t = \sqrt{\frac{W_{++} + W_{--}}{W_{+-} + W_{-+}}}$.

6. AdaBoost Variant 1 (continued)

- As before, the error is bounded by $\prod_t Z_t$.
- Same "trick": $I[H(x) \neq y] \le e^{-y F(x)}$, with $F(x) = \sum_t (\log \beta_t)\, h_t(x)$.
- If the weak classifier is right, then the data point's weight is multiplied by $1/\beta_t$ (it shrinks); if the weak classifier is wrong, then it is multiplied by $\beta_t$ (it grows).

7. AdaBoost Variant 2

- So far we have assumed a loss function which pays equal penalties for false positives and false negatives.
- But we may want false negatives to cost more (Viola and Jones: missing a face is worse than a spurious detection).
- Use an asymmetric loss function, e.g. $L = \sum_i C(y_i)\, e^{-y_i F(x_i)}$ with $C(+1) = k > 1$ and $C(-1) = 1$, so that errors on positive examples (false negatives) cost $k$ times more.

8. AdaBoost Variant 2 (continued)

- Modify the update rule: carry the asymmetric factors $C(y_i)$ in the weights (equivalently, initialize $D_1(i) \propto C(y_i)$) and update as before.
- Verify that the loss still factorizes: $L = \big(\sum_i C(y_i)\big) \prod_t Z_t$, so the same greedy minimization of the $Z_t$'s applies.
- Same update rule as for Variant 1, except the $W_{pq}$ are now computed from the asymmetrically weighted data, which shifts $\beta_t$ toward detecting positives. (See the asymmetric sketch at the end of these notes.)

9. AdaBoost Extensions

- AdaBoost can be extended to multiple classes (Schapire and Singer).
- The weak classifiers can take multiple (e.g. real, confidence-rated) values.
- The conditional probability interpretation applies to these extensions as well.

10. AdaBoost Summary

- Basic AdaBoost: combine weak classifiers to make a strong classifier.
- Dynamically re-weight the data, so that misclassified data weighs more (like SVMs, AdaBoost pays most attention to the hard-to-classify data).
- The empirical risk converges to zero exponentially fast, under weak conditions: each weak classifier need only do slightly better than chance.
- Useful for combining weak cues in visual detection tasks.
- Probabilistic interpretation / multiclass / multivalued extensions.
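To make the loop on slide 3 concrete, here is a minimal sketch in Python/NumPy. The decision-stump weak-classifier pool, the exhaustive threshold search, and all names (`stump_predict`, `adaboost_train`, `adaboost_predict`) are illustrative assumptions, not part of the notes:

```python
import numpy as np

def stump_predict(X, feature, threshold, polarity):
    """Decision-stump weak classifier: threshold one feature, output +/-1."""
    return polarity * np.where(X[:, feature] >= threshold, 1, -1)

def adaboost_train(X, y, T):
    """Basic AdaBoost (slide 3). X is (N, d); y has entries in {-1, +1}."""
    N, d = X.shape
    D = np.full(N, 1.0 / N)                    # initialize D_1(i) = 1/N
    ensemble = []
    for t in range(T):
        # Exhaustively pick the stump minimizing the weighted error eps_t.
        best = None
        for f in range(d):
            for thr in np.unique(X[:, f]):
                for pol in (+1, -1):
                    pred = stump_predict(X, f, thr, pol)
                    eps = float(np.sum(D * (pred != y)))
                    if best is None or eps < best[0]:
                        best = (eps, f, thr, pol, pred)
        eps, f, thr, pol, pred = best
        eps = min(max(eps, 1e-12), 1 - 1e-12)  # guard the log below
        lam = 0.5 * np.log((1 - eps) / eps)    # lambda_t = (1/2) log((1-eps)/eps)
        D *= np.exp(-lam * y * pred)           # update rule (slide 3)
        D /= D.sum()                           # dividing by Z_t, the normalizer
        ensemble.append((lam, f, thr, pol))
    return ensemble

def adaboost_predict(ensemble, X):
    """Strong classifier (slide 2): H(x) = sign(sum_t lambda_t h_t(x))."""
    F = sum(lam * stump_predict(X, f, thr, pol) for lam, f, thr, pol in ensemble)
    return np.where(F >= 0, 1, -1)
```

A quick check on synthetic data (again purely illustrative):

```python
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)
ens = adaboost_train(X, y, T=20)
print((adaboost_predict(ens, X) == y).mean())  # training accuracy
```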
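Variant 2 (slides 7-8) then amounts to a one-line change in the sketch above. The cost ratio k and the initialization-based implementation are my assumptions about how to realize the asymmetric loss; Viola and Jones's own asymmetric AdaBoost spreads the asymmetry across boosting rounds, which this simple version does not:

```python
def asymmetric_init_weights(y, k=5.0):
    """Variant 2 sketch (slides 7-8): give positive examples k times more
    initial weight, D_1(i) proportional to C(y_i) with C(+1)=k, C(-1)=1,
    so false negatives dominate the weighted error eps_t. The value of k
    is an illustrative assumption, not from the notes."""
    C = np.where(y == +1, k, 1.0)
    return C / C.sum()

# Usage: in adaboost_train above, replace
#     D = np.full(N, 1.0 / N)
# with
#     D = asymmetric_init_weights(y, k=5.0)
# and the rest of the algorithm runs unchanged.
```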