Computational Learning Theory
CS446, Spring 06

Computational Learning Theory
• What general laws constrain inductive learning?
• What learning problems can be solved?
• When can we trust the output of a learning algorithm?

Computational Learning Theory
• What general laws constrain inductive learning? We seek a theory to relate:
• Probability of successful learning
• Number of training examples
• Complexity of the hypothesis space
• Accuracy to which the target concept is approximated
• Manner in which training examples are presented

Learning Conjunctions
• Protocol I: The learner proposes instances as queries to the teacher.
• Suppose we are after a monotone conjunction over x1, …, x100.
• Shall we ask about (1,1,1,…,1,1)?
• Is x100 in f? <(1,1,1,…,1,0), ?> f(x) = 0. (Conclusion: yes)
• Is x99 in f? <(1,1,…,1,0,1), ?> f(x) = 1. (Conclusion: no)
• …
• Is x1 in f? <(0,1,…,1,1,1), ?> f(x) = 1. (Conclusion: no)
• A straightforward algorithm requires n = 100 queries and will produce the hidden conjunction (exactly):
  h = x2 ∧ x3 ∧ x4 ∧ x5 ∧ x100

Learning Conjunctions
• Protocol II: The teacher (who knows f = x2 ∧ x3 ∧ x4 ∧ x5 ∧ x100) provides training examples.
• <(0,1,1,1,1,0,…,0,1), 1> (We learned a superset of the good variables.)
• To show that all these variables are required:
• <(0,0,1,1,1,0,…,0,1), 0> need x2
• <(0,1,0,1,1,0,…,0,1), 0> need x3
• …
• <(0,1,1,1,1,0,…,0,0), 0> need x100
• A straightforward algorithm requires k = 6 examples to produce the hidden conjunction (exactly).

Learning Conjunctions
• Protocol III: Some random source (e.g., Nature) provides training examples; the teacher (Nature) provides the labels f(x).
• <(1,1,1,1,1,1,…,1,1), 1>
• <(1,1,1,0,0,0,…,0,0), 0>
• <(1,1,1,1,1,0,…,0,1,1), 1>
• <(1,0,1,1,1,0,…,0,1,1), 0>
• <(1,1,1,1,1,0,…,0,0,1), 1>
• <(1,0,1,0,0,0,…,0,1,1), 0>
• <(1,1,1,1,1,1,…,0,1), 1>
• <(0,1,0,1,0,0,…,0,1,1), 0>

Learning Conjunctions
• Protocol III (continued). Algorithm: Elimination
• Start with the set of all literals as candidates:
  h = x1 ∧ x2 ∧ x3 ∧ x4 ∧ x5 ∧ … ∧ x100
• Eliminate a literal that is not active (0) in a positive example:
• <(1,1,1,1,1,1,…,1,1), 1> (every literal survives)
• <(1,1,1,0,0,0,…,0,0), 0> learned nothing
• <(1,1,1,1,1,0,…,0,1,1), 1> h = x1 ∧ x2 ∧ x3 ∧ x4 ∧ x5 ∧ x99 ∧ x100
• <(1,0,1,1,0,0,…,0,0,1), 0> learned nothing
• <(1,1,1,1,1,0,…,0,0,1), 1> h = x1 ∧ x2 ∧ x3 ∧ x4 ∧ x5 ∧ x100
• <(1,0,1,0,0,0,…,0,1,1), 0>
• <(1,1,1,1,1,1,…,0,1), 1>
• <(0,1,0,1,0,0,…,0,1,1), 0>
• Final hypothesis: h = x1 ∧ x2 ∧ x3 ∧ x4 ∧ x5 ∧ x100

Prototypical Concept Learning
• Instance Space: X (animals, described by attributes such as Barks (Y/N), has_4_legs (Y/N), …)
• Concept Space: C
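The Elimination algorithm from the Protocol III slides can be sketched in Python. This is a minimal sketch, not the course's code: the function names and the bit-tuple encoding of examples are my own, but the logic is exactly the slides' rule of dropping any candidate literal that is 0 in a positive example.

```python
# Elimination algorithm for learning a monotone conjunction (Protocol III).
# A hypothesis is the set of variable indices still conjoined; each example
# is a (bit-tuple, label) pair where label = f(x).

def eliminate(examples, n):
    """Return the surviving candidate set after one pass over the data."""
    h = set(range(n))  # start with all n literals as candidates
    for x, label in examples:
        if label == 1:
            # drop every literal that is not active (0) in a positive example
            h = {i for i in h if x[i] == 1}
        # negative examples eliminate nothing under this algorithm
    return h

def predict(h, x):
    """The learned conjunction: 1 iff every surviving literal is active."""
    return int(all(x[i] == 1 for i in h))

# A tiny 6-variable illustration (hidden target: f = x0 AND x2)
examples = [
    ((1, 1, 1, 1, 1, 1), 1),
    ((1, 0, 1, 0, 0, 1), 1),
    ((0, 1, 1, 1, 0, 0), 0),
    ((1, 1, 1, 0, 1, 0), 1),
]
h = eliminate(examples, 6)
print(sorted(h))  # → [0, 2]
```

Note that, as on the slides, the learned h can only err on one side: it contains a superset of f's variables, so it never labels a true negative as positive.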
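Protocol I above (the learner proposes queries) can be sketched the same way; the `oracle` callable standing in for the teacher is my own framing, not from the slides. For each variable the learner queries the all-ones vector with that single bit flipped to 0: if f drops to 0, the variable must be in the conjunction — n variables, n queries.

```python
# Protocol I sketch: learn a monotone conjunction over n variables with
# exactly n membership queries. The "teacher" answers f(x) for any x the
# learner proposes; here it is simulated with a known hidden target.

def learn_by_queries(oracle, n):
    """Query (1,...,1,0,1,...,1) for each bit position in turn."""
    in_f = []
    for i in range(n):
        x = [1] * n
        x[i] = 0  # flip only bit i
        if oracle(tuple(x)) == 0:
            in_f.append(i)  # conclusion: x_i is in the conjunction
    return in_f

# Simulated teacher for a hidden target f = x1 AND x2 AND x4 (0-indexed)
hidden = {1, 2, 4}
def oracle(x):
    return int(all(x[i] == 1 for i in hidden))

print(learn_by_queries(oracle, 6))  # → [1, 2, 4]
```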