Machine Learning 10-701/15-781, Spring 2008
Decision Trees
Eric Xing
Lecture 6, February 4, 2008
Reading: Chap. 1.6, CB; Chap. 3, TM

Learning Non-Linear Functions f: X -> Y
- X: vector of continuous and/or discrete variables
- Y: discrete variable
- A linear separator is not enough in general: f might be a non-linear function
- Examples: the XOR gate, speech recognition

A Hypothesis for TaxFraud
- Input: a vector of attributes, X = [Refund, MarSt, TaxInc]
- Output: Y = Cheating or Not
- H as a procedure:
  - Each internal node tests one attribute Xi
  - Each branch from a node selects one value for Xi
  - Each leaf node predicts Y

  Refund?
  |-- Yes -> NO
  |-- No  -> MarSt?
             |-- Married          -> NO
             |-- Single, Divorced -> TaxInc?
                                     |-- < 80K  -> NO
                                     |-- >= 80K -> YES

Apply Model to Query Data
- Query: Refund = No, Marital Status = Married, Taxable Income = 80K, Cheat = ?
- Start from the root of the tree and follow the branch matching each attribute value:
  Refund = No -> MarSt = Married -> assign Cheat = "No"

A Tree to Predict C-Section Risk
- Learned from medical records of 1000 women
- Negative examples are C-sections

Expressiveness
- Decision trees can express any function of the input attributes
- E.g., for Boolean functions: truth table row = path to leaf
- Trivially, there is a consistent
  decision tree for any training set, with one path to leaf for each
  example (unless f is nondeterministic in x), but it probably won't
  generalize to new examples
- Prefer to find more compact decision trees

Hypothesis Spaces
- How many distinct decision trees with n Boolean attributes?
  = number of Boolean functions
  = number of distinct truth tables with 2^n rows
  = 2^(2^n)
- E.g., with 6 Boolean attributes there are
  2^(2^6) = 18,446,744,073,709,551,616 trees
- How many purely conjunctive hypotheses (e.g., Hungry AND NOT Rain)?
  - Each attribute can be in (positive), in (negative), or out,
    giving 3^n distinct conjunctive hypotheses
- A more expressive hypothesis space:
  - increases the chance that the target function can be expressed
  - increases the number of hypotheses consistent with the training set,
    so it may give worse predictions

Decision Tree Learning
- Learn a model (a decision tree) from the labeled training set, then apply the model to the unlabeled test set:

  Training set (Learn Model)            Test set (Apply Model)
  Tid Attrib1 Attrib2 Attrib3 Class     Tid Attrib1 Attrib2 Attrib3 Class
   1  Yes     Large   125K    No        11  No      Small    55K    ?
   2  No      Medium  100K    No        12  Yes     Medium   80K    ?
   3  No      Small    70K    No        13  Yes     Large   110K    ?
   4  Yes     Medium  120K    No        14  No      Small    95K    ?
   5  No      Large    95K    Yes       15  No      Large    67K    ?
   6  No      Medium   60K    No
   7  Yes     Large   220K    No
   8  No      Small    85K    Yes
   9  No      Medium   75K    No
  10  No      Small    90K    Yes

Example of a Decision Tree
- Training data (Refund and Marital Status are categorical, Taxable Income
  is continuous, Cheat is the class label):

  Tid Refund Marital Status Taxable Income Cheat
   1  Yes    Single         125K           No
   2  No     Married        100K           No
   3  No     Single          70K           No
   4  Yes    Married        120K           No
   5  No     Divorced        95K           Yes
   6  No     Married         60K           No
   7  Yes    Divorced       220K           No
   8  No     Single          85K           Yes
   9  No     Married         75K           No
  10  No     Single          90K           Yes

- Model: a decision tree with splitting attributes Refund, MarSt, TaxInc:

  Refund?
  |-- Yes -> NO
  |-- No  -> MarSt?
             |-- Married          -> NO
             |-- Single, Divorced -> TaxInc?
                                     |-- < 80K  -> NO
                                     |-- >= 80K -> YES
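The tree above can also be read as a small program. A minimal Python sketch of applying the model to a query record (the function name `predict_cheat` is my own; the attribute tests and the 80K threshold are the ones on the slides):

```python
def predict_cheat(refund, marital_status, taxable_income):
    """Walk the TaxFraud decision tree: each `if` is an internal
    node testing one attribute, each `return` is a leaf."""
    if refund == "Yes":
        return "No"                    # Refund = Yes -> NO
    if marital_status == "Married":
        return "No"                    # MarSt = Married -> NO
    # MarSt = Single or Divorced: split on taxable income at 80K
    return "No" if taxable_income < 80_000 else "Yes"

# The query from the "Apply Model to Query Data" slide:
print(predict_cheat("No", "Married", 80_000))  # -> No
```

On the ten training records above this procedure reproduces every Cheat label, which is exactly the "consistent tree" property mentioned under Expressiveness.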
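The counts on the Hypothesis Spaces slide are easy to verify, since Python integers are arbitrary-precision:

```python
n = 6  # number of Boolean attributes, as in the slide's example

# Distinct Boolean functions = distinct truth tables with 2^n rows
num_trees = 2 ** (2 ** n)
print(num_trees)         # 18446744073709551616

# Purely conjunctive hypotheses: each attribute is positive, negated, or absent
num_conjunctions = 3 ** n
print(num_conjunctions)  # 729
```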
Another Example of a Decision Tree
- The same training data fits a different tree:

  MarSt?
  |-- Married          -> NO
  |-- Single, Divorced -> Refund?
                          |-- Yes -> NO
                          |-- No  -> TaxInc?
                                     |-- < 80K  -> NO
                                     |-- >= 80K -> YES

- There could be more than one tree that fits the same data!

Top-Down Induction of Decision Trees

Tree Induction
- Greedy strategy:
  - Split the records based on an attribute test that optimizes a certain criterion
- Issues:
  - Determine how to split the records
    - How to specify the attribute test condition?
    - How to determine the best split?
  - Determine when to stop splitting

How to Specify the Test Condition
- Depends on the attribute type:
  - Nominal
  - Ordinal
  - Continuous
- Depends on the number of ways to split:
  - 2-way split
  - Multi-way split

Splitting Based on Nominal Attributes
- Multi-way split: use as many partitions as there are distinct values
  - e.g., CarType -> {Family} / {Sports} / {Luxury}
- Binary split: divide the values into two subsets; need to find the optimal partitioning
  - e.g., CarType -> {Sports, Luxury} vs. {Family}, or {Family, Luxury} vs. {Sports}

Splitting Based on Ordinal Attributes
- Multi-way split: use as many partitions as there are distinct values
  - e.g., Size -> {Small} / {Medium} / {Large}
- Binary split: divide the values into two subsets; need to find the optimal partitioning
  - e.g., Size -> {Small, Medium} vs. {Large}, or {Small} vs. {Medium, Large}
- What about the split {Small, Large} vs. {Medium}? (It does not respect the ordering.)

Splitting Based on Continuous Attributes
- Different ways of handling:
  - Discretization to form an ordinal categorical attribute
    - Static: discretize once at the beginning
    - Dynamic: ranges can be found by equal-interval bucketing, equal-frequency bucketing, percentiles, or clustering
  - Binary decision: (A < v) or (A >= v)
    - Consider all possible splits and find the best cut
    - Can be more compute-intensive
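Finding the optimal binary partitioning of a nominal attribute means searching over all two-subset splits; for k distinct values there are (2^k - 2) / 2 of them. A sketch of that enumeration (the helper name `binary_splits` is hypothetical):

```python
from itertools import combinations

def binary_splits(values):
    """Enumerate each two-subset partition of a nominal attribute's
    values exactly once (a split and its mirror image are the same)."""
    values = sorted(values)
    splits = []
    for r in range(1, len(values)):
        for left in combinations(values, r):
            right = tuple(v for v in values if v not in left)
            if left < right:  # keep only the lexicographically smaller side
                splits.append((left, right))
    return splits

# CarType from the slide: (2^3 - 2) / 2 = 3 candidate binary splits
for left, right in binary_splits(["Family", "Sports", "Luxury"]):
    print(set(left), "vs", set(right))
```

For an ordinal attribute like Size, the same search would additionally be restricted to splits that respect the value ordering, which is why {Small, Large} vs. {Medium} is disallowed.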
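The "consider all possible splits and find the best cut" step for a continuous attribute can be sketched by scanning the midpoints between consecutive sorted values. The slides have not yet introduced a splitting criterion at this point, so this sketch assumes plain misclassification error as the score; the function name `best_cut` is my own:

```python
def best_cut(xs, ys):
    """Try every candidate threshold (midpoints between consecutive
    distinct sorted values) and return (cut, error) for the split
    A < v vs. A >= v minimizing total misclassification error."""
    pairs = sorted(zip(xs, ys))
    vals = [x for x, _ in pairs]
    cuts = [(a + b) / 2 for a, b in zip(vals, vals[1:]) if a != b]

    def leaf_errors(labels):
        # A leaf predicts its majority class; the minority are errors.
        return len(labels) - max(labels.count(c) for c in set(labels))

    best = None
    for v in cuts:
        left = [y for x, y in pairs if x < v]
        right = [y for x, y in pairs if x >= v]
        err = leaf_errors(left) + leaf_errors(right)
        if best is None or err < best[1]:
            best = (v, err)
    return best

# Taxable Income (in K) vs. Cheat from the training table on the slides
income = [125, 100, 70, 120, 95, 60, 220, 85, 75, 90]
cheat = ["No", "No", "No", "No", "Yes", "No", "No", "Yes", "No", "Yes"]
print(best_cut(income, cheat))
```

On this column alone every candidate cut leaves 3 errors (the Yes records at 85K-95K sit between No records on both sides), which is consistent with the learned trees above splitting on Refund or MarSt before TaxInc.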