Outline
(Read Chapter 2; suggested exercises: 2.2, 2.3, 2.4, 2.6)

- Learning from examples
- General-to-specific ordering over hypotheses
- Version spaces and the Candidate-Elimination algorithm
- Picking new examples
- The need for inductive bias

Note: this is a simple approach that assumes no noise, but it illustrates the key concepts.

(Lecture slides for the textbook Machine Learning, T. Mitchell, McGraw Hill, 1997.)

Training Examples for EnjoySport

Sky   | AirTemp | Humidity | Wind   | Water | Forecast | EnjoySport
------|---------|----------|--------|-------|----------|-----------
Sunny | Warm    | Normal   | Strong | Warm  | Same     | Yes
Sunny | Warm    | High     | Strong | Warm  | Same     | Yes
Rainy | Cold    | High     | Strong | Warm  | Change   | No
Sunny | Warm    | High     | Strong | Cool  | Change   | Yes

What is the general concept?

Representing Hypotheses

Many representations are possible. Here, h is a conjunction of constraints on attributes. Each constraint can be:
- a specific value (e.g., Water = Warm)
- don't care (e.g., "Water = ?")
- no value allowed (e.g., "Water = ∅")

For example:
     Sky     AirTemp  Humid  Wind    Water  Forecast
   <Sunny,   ?,       ?,     Strong, ?,     Same>

Prototypical Concept Learning Task

Given:
- Instances X: possible days, each described by the attributes Sky, AirTemp, Humidity, Wind, Water, Forecast
- Target function c: EnjoySport : X → {0, 1}
- Hypotheses H: conjunctions of literals, e.g. <?, Cold, High, ?, ?, ?>
- Training examples D: positive and negative examples of the target function, <x1, c(x1)>, ..., <xm, c(xm)>

Determine: a hypothesis h in H such that h(x) = c(x) for all x in D.

The Inductive Learning Hypothesis

Any hypothesis found to approximate the target function well over a sufficiently large set of training examples will also approximate the target function well over other, unobserved examples.
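The constraint representation above can be sketched in a few lines of Python. This is not part of the original slides; the tuple encoding and the `matches` helper are our own choices for illustration.

```python
# A hypothesis is a tuple of six per-attribute constraints, each one of:
#   a specific value (e.g. "Warm"), "?" (don't care), or None ("no value", ∅).
# This tuple encoding is an assumption, not prescribed by the slides.

def matches(h, x):
    """True iff instance x satisfies every constraint in hypothesis h.
    A None constraint is satisfied by nothing, so it rejects every x."""
    return all(c == "?" or c == v for c, v in zip(h, x))

# The example hypothesis from the slide: <Sunny, ?, ?, Strong, ?, Same>
h = ("Sunny", "?", "?", "Strong", "?", "Same")
day = ("Sunny", "Warm", "Normal", "Strong", "Warm", "Same")
```

With this encoding, `matches(h, day)` is True, while any day with Sky = Rainy is rejected.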
Instances, Hypotheses, and More-General-Than

(Figure: instances X on the left, hypotheses H on the right, ordered from specific to general.)

Instances:
  x1 = <Sunny, Warm, High, Strong, Cool, Same>
  x2 = <Sunny, Warm, High, Light, Warm, Same>

Hypotheses:
  h1 = <Sunny, ?, ?, Strong, ?, ?>
  h2 = <Sunny, ?, ?, ?, ?, ?>
  h3 = <Sunny, ?, ?, ?, Cool, ?>

h2 is more general than both h1 and h3.

Find-S Algorithm

1. Initialize h to the most specific hypothesis in H.
2. For each positive training instance x:
   For each attribute constraint a_i in h:
     If the constraint a_i in h is satisfied by x, then do nothing;
     else replace a_i in h by the next more general constraint that is satisfied by x.
3. Output hypothesis h.

Hypothesis Space Search by Find-S

Training examples:
  x1 = <Sunny, Warm, Normal, Strong, Warm, Same>, +
  x2 = <Sunny, Warm, High, Strong, Warm, Same>, +
  x3 = <Rainy, Cold, High, Strong, Warm, Change>, -
  x4 = <Sunny, Warm, High, Strong, Cool, Change>, +

Trace of h:
  h0 = <∅, ∅, ∅, ∅, ∅, ∅>
  h1 = <Sunny, Warm, Normal, Strong, Warm, Same>
  h2 = h3 = <Sunny, Warm, ?, Strong, Warm, Same>
  h4 = <Sunny, Warm, ?, Strong, ?, ?>

Complaints about Find-S

- Can't tell whether it has learned the concept.
- Can't tell when the training data are inconsistent.
- Picks a maximally specific h (why?).
- Depending on H, there might be several maximally specific hypotheses!

Version Spaces

A hypothesis h is consistent with a set of training examples D of target concept c if and only if h(x) = c(x) for each training example <x, c(x)> in D:

  Consistent(h, D) ≡ (∀<x, c(x)> ∈ D) h(x) = c(x)

The version space, VS_{H,D}, with respect to hypothesis space H and training examples D, is the subset of hypotheses from H consistent with all training examples in D:

  VS_{H,D} ≡ {h ∈ H | Consistent(h, D)}
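Find-S, as stated above, can be sketched directly in Python and run on the EnjoySport training data from the slides. The data encoding (tuples of strings, `None` for the empty constraint) is an assumption; the algorithm itself follows the three steps verbatim.

```python
def find_s(examples, n_attrs=6):
    # 1. Start with the most specific hypothesis: all constraints empty (None).
    h = [None] * n_attrs
    # 2. Generalize h just enough to cover each positive example.
    for x, label in examples:
        if not label:              # Find-S ignores negative examples
            continue
        for i in range(n_attrs):
            if h[i] is None:       # empty constraint: take the observed value
                h[i] = x[i]
            elif h[i] != "?" and h[i] != x[i]:
                h[i] = "?"         # conflicting values: generalize to don't-care
    # 3. Output the hypothesis.
    return tuple(h)

# EnjoySport training data from the slides
train = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"),   True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]
```

Running `find_s(train)` reproduces the trace above, ending at h4 = <Sunny, Warm, ?, Strong, ?, ?>.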
The List-Then-Eliminate Algorithm

1. VersionSpace ← a list containing every hypothesis in H.
2. For each training example <x, c(x)>: remove from VersionSpace any hypothesis h for which h(x) ≠ c(x).
3. Output the list of hypotheses in VersionSpace.

Example Version Space

S: { <Sunny, Warm, ?, Strong, ?, ?> }

   <Sunny, ?, ?, Strong, ?, ?>   <Sunny, Warm, ?, ?, ?, ?>   <?, Warm, ?, Strong, ?, ?>

G: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }

Representing Version Spaces

The general boundary, G, of version space VS_{H,D} is the set of its maximally general members.
The specific boundary, S, of version space VS_{H,D} is the set of its maximally specific members.
Every member of the version space lies between these boundaries:

  VS_{H,D} = {h ∈ H | (∃s ∈ S)(∃g ∈ G)(g ≥ h ≥ s)}

where x ≥ y means x is more general than or equal to y.

Candidate-Elimination Algorithm

G ← maximally general hypotheses in H
S ← maximally specific hypotheses in H
For each training example d, do:

If d is a positive example:
- Remove from G any hypothesis inconsistent with d.
- For each hypothesis s in S that is not consistent with d:
  - Remove s from S.
  - Add to S all minimal generalizations h of s such that
    1. h is consistent with d, and
    2. some member of G is more general than h.
  - Remove from S any hypothesis that is more general than another hypothesis in S.

If d is a negative example:
- Remove from S any hypothesis inconsistent with d.
- For each hypothesis g in G that is not consistent with d:
  - Remove g from G.
  - Add to G all minimal specializations h of g such that
    1. h is consistent with d, and
    2. some member of S is more specific than h.
  - Remove from G any hypothesis that is less general than another hypothesis in G.
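For a hypothesis space this small, List-Then-Eliminate can be sketched by brute force. One assumption here: each attribute's domain is taken to be the set of values observed in the data (the textbook fixes domains in advance). With that, enumerating all conjunctive hypotheses and filtering for consistency recovers exactly the six-hypothesis version space shown above.

```python
from itertools import product

# EnjoySport training data from the slides
TRAIN = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"),   True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]

def matches(h, x):
    # "?" is don't-care; any other constraint must equal the attribute value
    return all(c == "?" or c == v for c, v in zip(h, x))

def consistent(h, examples):
    # Consistent(h, D): h(x) = c(x) for every <x, c(x)> in D
    return all(matches(h, x) == label for x, label in examples)

def list_then_eliminate(examples):
    n = len(examples[0][0])
    # Assumption: domains are the values seen in the data. The all-empty
    # hypothesis is omitted: it can never cover a positive example.
    domains = [sorted({x[i] for x, _ in examples}) for i in range(n)]
    all_h = product(*[d + ["?"] for d in domains])
    return [h for h in all_h if consistent(h, examples)]

version_space = list_then_eliminate(TRAIN)
```

The result contains the S boundary <Sunny, Warm, ?, Strong, ?, ?>, both G-boundary hypotheses, and the three hypotheses in between.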
Example Trace

S0: { <∅, ∅, ∅, ∅, ∅, ∅> }
G0: { <?, ?, ?, ?, ?, ?> }

What Next Training Example?

S: { <Sunny, Warm, ?, Strong, ?, ?> }

   <Sunny, ?, ?, Strong, ?, ?>   <Sunny, Warm, ?, ?, ?, ?>   <?, Warm, ?, Strong, ?, ?>

G: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }

How Should These Be Classified?

S: { <Sunny, Warm, ?, Strong, ?, ?> }

   <Sunny, ?, ?, Strong, ?, ?>   <Sunny, Warm, ?, ?, ?, ?>   <?, Warm, ?, Strong, ?, ?>

G: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }
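One standard way to answer "how should these be classified?" is to let every hypothesis in the version space vote: an instance is positive only if all members accept it, negative only if all reject it, and undecidable otherwise. The slide's own test instances are not present in this copy, so the instances used below are our own illustrations.

```python
# The six hypotheses of the EnjoySport version space from the slides.
VERSION_SPACE = [
    ("Sunny", "Warm", "?", "Strong", "?", "?"),   # S boundary
    ("Sunny", "?",    "?", "Strong", "?", "?"),
    ("Sunny", "Warm", "?", "?",      "?", "?"),
    ("?",     "Warm", "?", "Strong", "?", "?"),
    ("Sunny", "?",    "?", "?",      "?", "?"),   # G boundary
    ("?",     "Warm", "?", "?",      "?", "?"),   # G boundary
]

def matches(h, x):
    return all(c == "?" or c == v for c, v in zip(h, x))

def classify(version_space, x):
    """Unanimous vote: '+' if every consistent hypothesis accepts x,
    '-' if every one rejects it, otherwise 'unknown'."""
    votes = [matches(h, x) for h in version_space]
    if all(votes):
        return "+"
    if not any(votes):
        return "-"
    return "unknown"
```

An instance that satisfies the S boundary is unanimously positive; one rejected by both G-boundary hypotheses is unanimously negative; anything in between splits the vote and stays undecided.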