U of M CS 5541 - Decision Trees (32 pages)

Pages: 32
School: University of Minnesota - Twin Cities
Course: CS 5541 - Artificial Intelligence

Text preview:

Decision Trees

Outline: decision tree representation, the ID3 learning algorithm, entropy, information gain, overfitting.

Another Example Problem
(Figure: a set of positive examples and a set of negative examples of the target concept.)

A Decision Tree
(Figure: an example tree. The root tests Type, with branches for Car, SUV, and Minivan; lower nodes test Doors, with values 2 and 4, and Tires, with values Blackwall and Whitewall.)

Decision Trees
Decision tree representation:
- Each internal node tests an attribute.
- Each branch corresponds to an attribute value.
- Each leaf node assigns a classification.
How would you represent XOR? (A AND B) OR (C AND NOT D AND E)? M of N?

When to Consider Decision Trees
- Instances are describable by attribute-value pairs.
- The target function is discrete-valued.
- A disjunctive hypothesis may be required.
- The training data may be noisy.
Examples: equipment or medical diagnosis, credit risk analysis, modeling calendar scheduling preferences.

Top-Down Induction of Decision Trees
Main loop:
1. A <- the best decision attribute for the next node.
2. Assign A as the decision attribute for the node.
3. For each value of A, create a new descendant of the node.
4. Divide the training examples among the child nodes.
5. If the training examples are perfectly classified, STOP; else iterate over the new leaf nodes.
(A code sketch of this loop, and of the entropy and information-gain calculations it relies on, follows these notes.)

Which attribute is best?
(Figure: two candidate splits of a collection containing [29+, 35-] examples. Splitting on A1 gives children [21+, 5-] and [8+, 30-]; splitting on A2 gives children [18+, 33-] and [11+, 2-].)

Entropy
S is a sample of training examples; p+ is the proportion of positive examples in S, and p- is the proportion of negative examples in S. Entropy measures the impurity of S:

    Entropy(S) = -p+ log2(p+) - p- log2(p-)

(Figure: plot of Entropy(S) against the probability of a positive example, rising from 0 to a maximum of 1.0 bit at 0.5 and returning to 0 at 1.0.)

Entropy (continued)
Entropy(S) is the expected number of bits needed to encode the class (+ or -) of a randomly drawn member of S, using an optimal, shortest-length code. Why? Information theory: optimal ...
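To make the entropy and information-gain definitions concrete, here is a small Python sketch (not part of the original slides). The helper names entropy and information_gain, and the representation of an example as a dictionary mapping attribute names to values, are illustrative assumptions; the gain computation follows the usual definition Gain(S, A) = Entropy(S) - sum over values v of (|S_v|/|S|) * Entropy(S_v).

import math
from collections import Counter

def entropy(labels):
    """Entropy(S) = -p+ log2(p+) - p- log2(p-), generalized to any number of classes."""
    total = len(labels)
    result = 0.0
    for count in Counter(labels).values():
        p = count / total
        result -= p * math.log2(p)
    return result

def information_gain(examples, labels, attribute):
    """Gain(S, A): entropy of S minus the weighted entropy of the subsets S_v produced by splitting on A."""
    subsets = {}
    for example, label in zip(examples, labels):
        subsets.setdefault(example[attribute], []).append(label)
    gain = entropy(labels)
    for subset in subsets.values():
        gain -= (len(subset) / len(labels)) * entropy(subset)
    return gain

# The [29+, 35-] collection from the "which attribute is best?" figure:
labels = ["+"] * 29 + ["-"] * 35
print(entropy(labels))  # ~0.994 bits

Applied to the figure's numbers, the A1 split yields a gain of roughly 0.27 bits against roughly 0.12 bits for A2, so ID3 would pick A1.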
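The main loop itself can be sketched as a short recursive function. This is a minimal illustration rather than the slides' pseudocode verbatim: it reuses the entropy/information_gain helpers above, assumes examples are dictionaries of attribute values, and falls back to the majority class when no attributes remain (a detail the preview does not cover).

def id3(examples, labels, attributes):
    """Grow a tree top-down: pick the best attribute, split, and recurse."""
    # Step 5: if the training examples at this node are perfectly
    # classified, stop and return the class as a leaf.
    if len(set(labels)) == 1:
        return labels[0]
    # No attributes left to test: return the majority class
    # (an assumption, not spelled out in the preview's main loop).
    if not attributes:
        return Counter(labels).most_common(1)[0][0]
    # Steps 1-2: choose the best decision attribute by information gain.
    best = max(attributes, key=lambda a: information_gain(examples, labels, a))
    # Steps 3-4: create a descendant for each value of the chosen attribute
    # and divide the training examples among the children.
    partitions = {}
    for example, label in zip(examples, labels):
        bucket = partitions.setdefault(example[best], ([], []))
        bucket[0].append(example)
        bucket[1].append(label)
    remaining = [a for a in attributes if a != best]
    return {best: {value: id3(sub_ex, sub_lab, remaining)
                   for value, (sub_ex, sub_lab) in partitions.items()}}

The returned tree is a nested dictionary, e.g. {"Type": {"Minivan": "+", "SUV": "-", ...}}, with one branch per value of each tested attribute.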