Machine Learning Basics: 3. Ensemble Learning

Outline
- Ensemble Methods in Machine Learning
- Boosting

Ensemble Methods in Machine Learning

Different Classifiers (1)
- Different classifiers
  - Conduct classification over the same set of class labels
  - May use different inputs or have different parameters
  - May produce different outputs for a given example
- Learning different classifiers
  - Use different training examples
  - Use different features

Different Classifiers (2)
- Performance
  - None of the classifiers is perfect
- Complementarity
  - Examples that are misclassified by one classifier may be correctly classified by other classifiers
- Potential improvement?
  - Exploit this complementary property

Ensembles of Classifiers
- Idea
  - Combine the classifiers to improve performance
- Ensembles of classifiers
  - Combine the classification results from different classifiers to produce the final output
    - Unweighted voting
    - Weighted voting

Example: Weather Forecast
[Figure: reality versus forecasters 1-5; each forecaster is wrong on some days (marked X), but combining their predictions by voting matches reality on every day.]

Ensemble Learning
- Ensemble learning
  - Relatively new field in machine learning
  - Achieves state-of-the-art performance
- Central issues in ensemble learning
  - How to create classifiers with complementary performance
  - How to conduct the voting

Strong and Weak Learners
- Strong learner
  - Takes labeled data for training
  - Produces a classifier that can be arbitrarily accurate
  - The objective of machine learning
- Weak learner
  - Takes labeled data for training
  - Produces a classifier that is merely more accurate than random guessing

Boosting
- Learners
  - Strong learners are very difficult to construct
  - Constructing weak learners is relatively easy
- Strategy
  - Derive a strong learner from weak learners
  - "Boost" weak classifiers into a strong classifier
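The unweighted-voting idea can be sketched in a few lines of Python. The five forecasters mirror the weather-forecast example above; their particular error patterns are invented for illustration, not taken from the slides:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine the predictions of several classifiers by unweighted voting."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical example: truth is rain ('R') on all 5 days; each of the
# 5 forecasters is wrong ('S' = sun) on one or two days.
truth = ['R', 'R', 'R', 'R', 'R']
forecasters = [
    ['R', 'R', 'S', 'R', 'R'],  # wrong on day 3
    ['S', 'R', 'R', 'R', 'R'],  # wrong on day 1
    ['R', 'S', 'R', 'R', 'S'],  # wrong on days 2 and 5
    ['R', 'R', 'R', 'S', 'R'],  # wrong on day 4
    ['R', 'R', 'R', 'R', 'S'],  # wrong on day 5
]

# Vote day by day: at most 2 of 5 forecasters are wrong on any day,
# so the majority prediction matches reality on every day.
combined = [majority_vote(day) for day in zip(*forecasters)]
# combined == ['R', 'R', 'R', 'R', 'R']
```

Every individual forecaster makes mistakes, yet the combined classifier is perfect here, which is exactly the complementarity the slides describe: the voters must err on *different* examples for voting to help.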
Construct Weak Classifiers
- Use different data distributions
  - Start with uniform weights
  - During each step of learning:
    - Increase the weights of examples that the weak learner misclassifies
    - Decrease the weights of examples that the weak learner classifies correctly
- Idea
  - Focus on the difficult examples that were misclassified in the previous steps

Combine Weak Classifiers
- Weighted voting
  - Construct the strong classifier by weighted voting of the weak classifiers
- Idea
  - A better weak classifier gets a larger weight
  - Iteratively add weak classifiers
  - Increase the accuracy of the combined classifier by minimizing a cost function

Example
[Figure: successive training rounds of boosting and the resulting combined classifier.]

Performance
- Data sets
  - 27 data sets from the UCI ML Repository
- Methods for comparison
  - Decision tree classifier: C4.5
  - Boosting: AdaBoost using C4.5 as the weak learner

Results (Freund and Schapire 1996)
[Figure: scatter plot comparing the error rate of boosting C4.5 against the error rate of C4.5 alone on the 27 data sets.]
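The weight-update and weighted-voting steps above can be sketched as a minimal AdaBoost using one-dimensional decision stumps as the weak learner. This is a toy illustration under assumed details (the data set, the stump learner, and the `1e-10` guard against zero error are all choices made here, not content from the slides):

```python
import math

def train_stump(X, y, w):
    """Weak learner: pick the threshold/polarity stump with least weighted error."""
    best = None
    for thr in sorted(set(X)):
        for pol in (1, -1):
            preds = [pol if x >= thr else -pol for x in X]
            err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
            if best is None or err < best[0]:
                best = (err, thr, pol)
    return best

def adaboost(X, y, rounds):
    n = len(X)
    w = [1.0 / n] * n                 # start with uniform weights
    ensemble = []
    for _ in range(rounds):
        err, thr, pol = train_stump(X, y, w)
        err = max(err, 1e-10)         # guard against log(0) on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)   # better stump -> larger vote
        preds = [pol if x >= thr else -pol for x in X]
        # Increase weights of misclassified examples, decrease correct ones,
        # then renormalize so the weights form a distribution.
        w = [wi * math.exp(-alpha * yi * p) for wi, yi, p in zip(w, y, preds)]
        total = sum(w)
        w = [wi / total for wi in w]
        ensemble.append((alpha, thr, pol))
    return ensemble

def predict(ensemble, x):
    """Strong classifier: weighted vote of the weak stumps."""
    score = sum(a * (p if x >= t else -p) for a, t, p in ensemble)
    return 1 if score >= 0 else -1

# Toy 1-D data: no single stump can separate +,+,+,-,-,-,+,+
# (the best stump misclassifies 2 of 8), but 3 boosted stumps fit it exactly.
X = [0, 1, 2, 3, 4, 5, 6, 7]
y = [1, 1, 1, -1, -1, -1, 1, 1]
ensemble = adaboost(X, y, rounds=3)
# [predict(ensemble, x) for x in X] == y
```

The example shows both central ideas from the slides: each round reweights the data so the next stump focuses on the previously misclassified examples, and the final decision is a weighted vote in which low-error stumps count for more.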