UW-Madison CS 760 - Theoretical Approaches to Machine Learning


Theoretical Approaches to Machine Learning

Early work (e.g., Gold) ignored efficiency
• Only considers computability
• "Learning in the limit"
Later work considers tractable inductive learning
• With high probability, approximately learn
• Polynomial runtime, polynomial number of examples needed
• Results (usually) independent of the probability distribution over the examples

© Jude Shavlik 2006, David Page 2010, CS 760 - Machine Learning (UW-Madison)

Identification in the Limit

Definition
After some finite number of examples, the learner will have learned the correct concept (though it might not even know it!).
Correct means: the hypothesis agrees with the target concept on the labels of all data.

Example
Consider noise-free learning from the class {f | f(n) = a*n mod b}, where a and b are natural numbers.

General Technique: "Innocent Until Proven Guilty"
Enumerate all possible answers. Search for the simplest answer consistent with the training examples seen so far; sooner or later the search will hit the solution.

Some Results (Gold)
• Computable languages (Turing machines) can be learned in the limit using inference by enumeration.
• If the data set is limited to positive examples only, then only finite languages can be learned in the limit.

Solution for {f | f(n) = a*n mod b}
[Slide figure: a worked table of n and f(n) values (n = 1-4), together with the values 9 and 17; the table layout does not survive text extraction.]

The Mistake-Bound Model (Littlestone)

Framework
• Teacher shows input I
• ML algorithm guesses output O
• Teacher shows the correct answer
• Can we upper-bound the number of errors the learner will make?
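The "innocent until proven guilty" enumeration technique can be made concrete for the class {f | f(n) = a*n mod b}. The sketch below is not from the slides; it is a minimal Python illustration assuming a fixed enumeration of (a, b) pairs by increasing a + b, so that every pair is eventually examined and the first consistent (hence simplest, in enumeration order) hypothesis is returned.

```python
from itertools import count

def consistent(a, b, examples):
    """Check whether the hypothesis f(n) = a*n mod b agrees with
    every labeled example (n, f(n)) seen so far."""
    return all(a * n % b == fn for n, fn in examples)

def identify_in_limit(examples):
    """Return the first (a, b) pair, in a fixed enumeration order,
    that is consistent with the examples: "innocent until proven guilty"."""
    # Enumerate pairs (a, b) by increasing a + b, so every pair of
    # natural numbers is reached after finitely many steps.
    for s in count(2):
        for a in range(1, s):
            b = s - a
            if consistent(a, b, examples):
                return a, b
```

As more examples of the target arrive, wrong guesses are eventually "proven guilty" by some example and discarded, so after finitely many examples the returned hypothesis labels all data correctly, exactly as the identification-in-the-limit definition requires.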
The Mistake-Bound Model

Example
Learn a conjunct over N predicates and their negations
• Initial h = p1 ∧ ¬p1 ∧ … ∧ pN ∧ ¬pN
• For each positive example, remove the remaining terms that do not match

Worst-case number of mistakes? 1 + N
• The first positive example will remove N terms from the initial h
• Each subsequent error on a positive example will remove at least one more term (the learner never makes a mistake on negative examples)

Equivalence Query Model (Angluin)

Framework
• The ML algorithm guesses a concept: is the target equivalent to this guess?
• The teacher either says "yes" or returns a counterexample (an example labeled differently by the target and the guess)
• Can we upper-bound the number of errors the learner will make?
• Time to compute the next guess is bounded by Poly(|data seen so far|)
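The conjunct-elimination algorithm above can be sketched in a few lines of Python. This is an illustrative implementation, not code from the slides: literals are represented as (index, polarity) pairs, the hypothesis starts as the conjunction of all 2N literals, and literals contradicted by a positive example are dropped, which is what yields the 1 + N mistake bound.

```python
def learn_conjunct(n_preds, stream):
    """Mistake-bound learner for conjunctions over N predicates and their
    negations.  h starts as p1 AND not-p1 AND ... AND pN AND not-pN; literals
    are dropped only when a positive example contradicts them, so the
    learner makes at most 1 + N mistakes."""
    # Represent h as a set of literals: (i, True) = p_i, (i, False) = not p_i.
    h = {(i, v) for i in range(n_preds) for v in (True, False)}
    mistakes = 0
    for x, label in stream:                        # x: tuple of N booleans
        guess = all(x[i] == v for i, v in h)       # h(x)
        if guess != label:
            mistakes += 1
        if label:                                  # on every positive example,
            h = {(i, v) for i, v in h if x[i] == v}  # drop contradicted literals
    return h, mistakes
```

Since h only ever shrinks toward the target conjunction, it never wrongly accepts a negative example, matching the slide's claim that all mistakes occur on positives.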
Probably Approximately Correct (PAC) Learning

PAC learning (Valiant '84)
Given
• X: domain of possible examples
• C: class of possible concepts to label X
• c ∈ C: target concept
• δ, ε: correctness bounds

[The preview ends here.]
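The preview cuts off before the PAC definition is used. To make the δ and ε parameters concrete, here is the standard sample-complexity bound for a consistent learner over a finite hypothesis class; this formula is not shown in the surviving slides and is added as a hedged illustration of how "polynomial number of examples" is quantified.

```python
from math import ceil, log

def pac_sample_bound(h_size, epsilon, delta):
    """Standard bound for a consistent learner over a finite class H:
    with m >= (1/epsilon) * (ln|H| + ln(1/delta)) examples, any hypothesis
    consistent with the data has true error <= epsilon with probability
    >= 1 - delta over the random sample."""
    return ceil((log(h_size) + log(1 / delta)) / epsilon)
```

Note the bound is logarithmic in |H| and 1/δ but linear in 1/ε, which is what makes "probably approximately correct" learning tractable for large finite classes.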

