UTD CS 6375 - Chapter 2

Lecture slides for the textbook Machine Learning, T. Mitchell, McGraw Hill, 1997.

Outline

[Read Chapter 2. Suggested exercises: 2.2, 2.3, 2.4, 2.6.]

- Learning from examples
- General-to-specific ordering over hypotheses
- Version spaces and the candidate elimination algorithm
- Picking new examples
- The need for inductive bias

Note: this is a simple approach that assumes no noise; it illustrates the key concepts.

Training Examples for EnjoySport

  Sky    AirTemp  Humidity  Wind    Water  Forecast  EnjoySport
  Sunny  Warm     Normal    Strong  Warm   Same      Yes
  Sunny  Warm     High      Strong  Warm   Same      Yes
  Rainy  Cold     High      Strong  Warm   Change    No
  Sunny  Warm     High      Strong  Cool   Change    Yes

What is the general concept?

Representing Hypotheses

Many representations are possible. Here, a hypothesis h is a conjunction of constraints on the attributes. Each constraint can be:

- a specific value (e.g., Water = Warm)
- don't care (e.g., Water = ?)
- no value allowed (e.g., Water = ∅)

For example:

  Sky    AirTemp  Humidity  Wind    Water  Forecast
  <Sunny,   ?,       ?,     Strong,   ?,     Same>

Prototypical Concept Learning Task

Given:
- Instances X: possible days, each described by the attributes Sky, AirTemp, Humidity, Wind, Water, Forecast
- Target function c: EnjoySport : X → {0, 1}
- Hypotheses H: conjunctions of literals, e.g. <?, Cold, High, ?, ?, ?>
- Training examples D: positive and negative examples of the target function, <x1, c(x1)>, ..., <xm, c(xm)>

Determine: a hypothesis h in H such that h(x) = c(x) for all x in D.

The inductive learning hypothesis: any hypothesis found to approximate the target function well over a sufficiently large set of training examples will also approximate the target function well over other unobserved examples.

Instances, Hypotheses, and More-General-Than

[Figure: instances in X and hypotheses in H, with H ordered from specific to general.]

  x1 = <Sunny, Warm, High, Strong, Cool, Same>
  x2 = <Sunny, Warm, High, Light, Warm, Same>

  h1 = <Sunny, ?, ?, Strong, ?, ?>
  h2 = <Sunny, ?, ?, ?, ?, ?>
  h3 = <Sunny, ?, ?, ?, Cool, ?>

Here h2 is more general than both h1 and h3.

Find-S Algorithm

1. Initialize h to the most specific hypothesis in H.
2. For each positive training instance x:
   For each attribute constraint a_i in h:
     If the constraint a_i in h is satisfied by x, do nothing;
     else replace a_i in h by the next more general constraint that is satisfied by x.
3. Output hypothesis h.

Hypothesis Space Search by Find-S

[Figure: Find-S climbing from the most specific hypothesis toward more general ones.]

  h0 = <∅, ∅, ∅, ∅, ∅, ∅>
  x1 = <Sunny, Warm, Normal, Strong, Warm, Same>, +  →  h1 = <Sunny, Warm, Normal, Strong, Warm, Same>
  x2 = <Sunny, Warm, High, Strong, Warm, Same>, +    →  h2 = <Sunny, Warm, ?, Strong, Warm, Same>
  x3 = <Rainy, Cold, High, Strong, Warm, Change>, -  →  h3 = h2 (negative examples are ignored)
  x4 = <Sunny, Warm, High, Strong, Cool, Change>, +  →  h4 = <Sunny, Warm, ?, Strong, ?, ?>

Complaints about Find-S

- It can't tell whether it has learned the concept.
- It can't tell when the training data are inconsistent.
- It picks a maximally specific h (why?).
- Depending on H, there might be several!
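The general-to-specific ordering is easy to make concrete. Below is a minimal Python sketch (an illustration, not code from the slides): it encodes a hypothesis as a tuple of six constraints, with "?" for don't-care and None for the empty constraint ∅, and tests the more-general-than-or-equal relation. The helper names satisfies and more_general_or_equal are my own.

```python
# A hypothesis is a tuple of six constraints, one per attribute:
#   "?"  -> any value is acceptable (don't care)
#   None -> no value is acceptable (the empty constraint, written ∅ above)
#   else -> exactly that attribute value is required

def satisfies(x, h):
    """True if instance x satisfies every constraint of hypothesis h."""
    return all(c == "?" or c == v for v, c in zip(x, h))

def more_general_or_equal(h1, h2):
    """True if h1 >= h2: every instance satisfying h2 also satisfies h1."""
    if any(c is None for c in h2):
        return True  # h2 matches no instance at all, so h1 >= h2 vacuously
    return all(c1 == "?" or c1 == c2 for c1, c2 in zip(h1, h2))

# The hypotheses from the figure above; h2 is more general than h1 and h3.
h1 = ("Sunny", "?", "?", "Strong", "?", "?")
h2 = ("Sunny", "?", "?", "?", "?", "?")
h3 = ("Sunny", "?", "?", "?", "Cool", "?")
x1 = ("Sunny", "Warm", "High", "Strong", "Cool", "Same")

print(more_general_or_equal(h2, h1))         # True
print(more_general_or_equal(h1, h3))         # False (neither covers the other)
print(satisfies(x1, h1), satisfies(x1, h3))  # True True
```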
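Under that representation, Find-S itself is only a few lines. The sketch below, again a hedged illustration rather than the slides' own code, runs Find-S on the four EnjoySport examples from the table; printing the intermediate hypotheses reproduces the trace in the search figure (the negative example x3 leaves h unchanged, so nothing is printed for it).

```python
# Find-S: start from the most specific hypothesis and minimally
# generalize it on each positive example; negative examples are skipped.

def find_s(examples, n_attrs=6):
    h = [None] * n_attrs                 # h0 = <∅, ∅, ∅, ∅, ∅, ∅>
    for x, positive in examples:
        if not positive:
            continue                     # Find-S ignores negative examples
        for i, v in enumerate(x):
            if h[i] is None:
                h[i] = v                 # ∅ generalizes to the observed value
            elif h[i] != v:
                h[i] = "?"               # a mismatch generalizes to don't-care
        print(tuple(h))                  # show the intermediate hypothesis
    return tuple(h)

# Training examples for EnjoySport: (Sky, AirTemp, Humidity, Wind, Water, Forecast).
D = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High", "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High", "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High", "Strong", "Cool", "Change"), True),
]

# Final result: ('Sunny', 'Warm', '?', 'Strong', '?', '?'), i.e. h4 above.
find_s(D)
```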
Version Spaces

A hypothesis h is consistent with a set of training examples D of target concept c if and only if h(x) = c(x) for each training example <x, c(x)> in D:

  Consistent(h, D) ≡ (∀<x, c(x)> ∈ D) h(x) = c(x)

The version space VS_{H,D}, with respect to hypothesis space H and training examples D, is the subset of hypotheses from H consistent with all training examples in D:

  VS_{H,D} ≡ {h ∈ H | Consistent(h, D)}

The List-Then-Eliminate Algorithm

1. VersionSpace ← a list containing every hypothesis in H.
2. For each training example <x, c(x)>: remove from VersionSpace any hypothesis h for which h(x) ≠ c(x).
3. Output the list of hypotheses in VersionSpace.

Example Version Space

  S: { <Sunny, Warm, ?, Strong, ?, ?> }

     <Sunny, ?, ?, Strong, ?, ?>   <Sunny, Warm, ?, ?, ?, ?>   <?, Warm, ?, Strong, ?, ?>

  G: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }

Representing Version Spaces

The general boundary G of version space VS_{H,D} is the set of its maximally general members. The specific boundary S of version space VS_{H,D} is the set of its maximally specific members. Every member of the version space lies between these boundaries:

  VS_{H,D} = {h ∈ H | (∃s ∈ S)(∃g ∈ G)(g ≥ h ≥ s)}

where x ≥ y means x is more general than or equal to y.

Candidate Elimination Algorithm

G ← maximally general hypotheses in H
S ← maximally specific hypotheses in H

For each training example d, do:

If d is a positive example:
- Remove from G any hypothesis inconsistent with d.
- For each hypothesis s in S that is not consistent with d:
  - Remove s from S.
  - Add to S all minimal generalizations h of s such that (1) h is consistent with d, and (2) some member of G is more general than h.
  - Remove from S any hypothesis that is more general than another hypothesis in S.

If d is a negative example:
- Remove from S any hypothesis inconsistent with d.
- For each hypothesis g in G that is not consistent with d:
  - Remove g from G.
  - Add to G all minimal specializations h of g such that (1) h is consistent with d, and (2) some member of S is more specific than h.
  - Remove from G any hypothesis that is less general than another hypothesis in G.

Example Trace

  S0: { <∅, ∅, ∅, ∅, ∅, ∅> }
  G0: { <?, ?, ?, ?, ?, ?> }

What Next Training Example?

[Figure: the example version space shown above, repeated.]

  S: { <Sunny, Warm, ?, Strong, ?, ?> }
  G: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }

How Should These Be Classified?

[Figure: the example version space shown above, repeated.]

  S: { <Sunny, Warm, ?, Strong, ?, ?> }
  G: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }
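List-Then-Eliminate can be written directly, because the EnjoySport hypothesis space is small enough to enumerate. The sketch below is an illustration under assumptions: it uses the attribute domains of the textbook's EnjoySport task (Sky also allows Cloudy) and enumerates only the hypotheses whose constraints are "?" or one specific value, leaving out the empty hypothesis, which is consistent only when D has no positive example.

```python
# List-Then-Eliminate: enumerate all of H, then drop every hypothesis
# that misclassifies some training example. Feasible only because the
# EnjoySport hypothesis space is tiny.
from itertools import product

def satisfies(x, h):
    return all(c == "?" or c == v for v, c in zip(x, h))

def list_then_eliminate(examples, attr_values):
    # H here: every conjunction whose constraints are "?" or one
    # specific value per attribute (the empty hypothesis is omitted).
    version_space = list(product(*[vals + ["?"] for vals in attr_values]))
    for x, positive in examples:
        version_space = [h for h in version_space
                         if satisfies(x, h) == positive]
    return version_space

# Attribute domains of the EnjoySport task.
attr_values = [
    ["Sunny", "Rainy", "Cloudy"], ["Warm", "Cold"], ["Normal", "High"],
    ["Strong", "Weak"], ["Warm", "Cool"], ["Same", "Change"],
]

D = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High", "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High", "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High", "Strong", "Cool", "Change"), True),
]

# Prints the six hypotheses of the example version space above.
for h in list_then_eliminate(D, attr_values):
    print(h)
```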
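The boundary-set version can also be sketched compactly. The following is a minimal illustration of Candidate Elimination specialized to this conjunctive hypothesis language, not a general-purpose implementation: for conjunctions there is exactly one minimal generalization of s covering a positive example, and the minimal specializations of g against a negative example replace one "?" with each differing attribute value. The helper names and the tuple encoding are carried over from the earlier sketches.

```python
# Candidate Elimination: maintain the specific boundary S and the
# general boundary G, tightening them on every training example.

def satisfies(x, h):
    return all(c == "?" or c == v for v, c in zip(x, h))

def more_general_or_equal(h1, h2):
    if any(c is None for c in h2):
        return True                      # h2 is empty, so h1 >= h2 vacuously
    return all(c1 == "?" or c1 == c2 for c1, c2 in zip(h1, h2))

def min_generalization(s, x):
    # Unique minimal generalization of s covering x: fill ∅ slots with
    # x's values and relax mismatched constraints to "?".
    return tuple(v if c is None else (c if c == v else "?")
                 for c, v in zip(s, x))

def min_specializations(g, x, attr_values):
    # Minimal specializations of g excluding x: replace one "?" by each
    # attribute value that differs from x's value at that position.
    return [g[:i] + (value,) + g[i + 1:]
            for i, (c, v) in enumerate(zip(g, x)) if c == "?"
            for value in attr_values[i] if value != v]

def candidate_elimination(examples, attr_values):
    n = len(attr_values)
    S = {(None,) * n}                    # S0 = { <∅, ..., ∅> }
    G = {("?",) * n}                     # G0 = { <?, ..., ?> }
    for x, positive in examples:
        if positive:
            G = {g for g in G if satisfies(x, g)}
            for s in [s for s in S if not satisfies(x, s)]:
                S.remove(s)
                h = min_generalization(s, x)
                if any(more_general_or_equal(g, h) for g in G):
                    S.add(h)
            S = {s for s in S if not any(
                s != t and more_general_or_equal(s, t) for t in S)}
        else:
            S = {s for s in S if not satisfies(x, s)}
            for g in [g for g in G if satisfies(x, g)]:
                G.remove(g)
                for h in min_specializations(g, x, attr_values):
                    if any(more_general_or_equal(h, s) for s in S):
                        G.add(h)
            G = {g for g in G if not any(
                g != t and more_general_or_equal(t, g) for t in G)}
    return S, G

attr_values = [
    ["Sunny", "Rainy", "Cloudy"], ["Warm", "Cold"], ["Normal", "High"],
    ["Strong", "Weak"], ["Warm", "Cool"], ["Same", "Change"],
]
D = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High", "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High", "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High", "Strong", "Cool", "Change"), True),
]

S, G = candidate_elimination(D, attr_values)
print("S:", S)  # {('Sunny', 'Warm', '?', 'Strong', '?', '?')}
print("G:", G)  # {('Sunny','?','?','?','?','?'), ('?','Warm','?','?','?','?')}
```

Run on the four EnjoySport examples, this recovers exactly the S and G boundaries of the example version space above.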

