UMD CMSC 421 - Learning from Observations

Last update: May 17, 2010

Learning from Observations
CMSC 421: Chapter 18: Sections 1–3

Outline
♦ Learning agents
♦ Inductive learning
♦ Decision tree learning
♦ Measuring learning performance

Learning
Learning is essential for unknown environments, i.e., when the designer lacks omniscience.
Learning is useful as a system construction method, i.e., expose the agent to reality rather than trying to write it down.
Learning modifies the agent's decision mechanisms to improve performance.

Learning agents
[Figure: learning-agent architecture. A critic compares sensor input against a performance standard and gives feedback to the learning element; the learning element sets learning goals for a problem generator (which proposes experiments) and makes changes to the performance element's knowledge; the performance element drives the effectors in the environment.]

Learning element
Design of the learning element is dictated by
♦ what type of performance element is used
♦ which functional component is to be learned
♦ how that functional component is represented
♦ what kind of feedback is available

Example scenarios:

  Performance element | Component         | Representation           | Feedback
  Alpha-beta search   | Eval. fn.         | Weighted linear function | Win/loss
  Logical agent       | Transition model  | Successor-state axioms   | Outcome
  Utility-based agent | Transition model  | Dynamic Bayes net        | Outcome
  Simple reflex agent | Percept-action fn | Neural net               | Correct action

Supervised learning: correct answers for each instance
Reinforcement learning: occasional rewards

Inductive learning
Simplest form: learn a function from examples
f is the target function
An example is a pair x, f(x): e.g., a tic-tac-toe position paired with the label +1
Problem: find a hypothesis h such that h ≈ f, given a training set of examples
Highly simplified model of real learning:
– Ignores prior knowledge
– Assumes f is deterministic
– Assumes f and its arguments are observable
– Assumes examples are given
– Assumes that the agent wants to learn f

Inductive learning method
Construct/adjust h to agree with f on the training set
(h is consistent if it agrees with f on all examples)
E.g., curve fitting:
[Figure, repeated across five slides: the same data points, f(x) vs. x, fitted by hypotheses of increasing complexity, from a straight line to higher-degree curves that pass through more of the points.]

Ockham's razor
William of Ockham, fourteenth century:
"Pluralitas non est ponenda sine necessitate"
This translates as "entities should not be multiplied unnecessarily"
Maximize a combination of consistency and simplicity
[Figure: the curve-fitting data, f(x) vs. x.]
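The consistency-vs-simplicity tradeoff from the curve-fitting slides can be reproduced numerically. A minimal sketch, with invented data points chosen only for illustration: a degree-4 polynomial through five points is consistent (it agrees with f on every training example), while a straight line is simpler but inconsistent.

```python
import numpy as np

# Five training examples (x, f(x)); these values are made up
# to stand in for the points on the curve-fitting slides.
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([0.0, 1.2, 0.8, 2.5, 2.0])

def fit(degree):
    """Return hypothesis h: a least-squares polynomial of the given degree."""
    return np.poly1d(np.polyfit(xs, ys, degree))

# A degree-4 polynomial through 5 distinct points interpolates exactly,
# so it is consistent with the training set.
h_complex = fit(4)
print(np.max(np.abs(h_complex(xs) - ys)))  # ~0 (numerical noise only)

# A straight line is the simpler hypothesis, but it is not consistent.
h_simple = fit(1)
print(np.max(np.abs(h_simple(xs) - ys)))   # clearly nonzero
```

Ockham's razor says to trade a little consistency for a lot of simplicity: on fresh data the line often predicts better than the wiggly exact fit.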
Attribute-based representations
Examples described by attribute values (Boolean, discrete, continuous, etc.)
E.g., examples of situations where a friend will/won't wait for a table:

  Example | Alt | Bar | Fri | Hun | Pat  | Price | Rain | Res | Type    | Est   | WillWait
  X1      | T   | F   | F   | T   | Some | $$$   | F    | T   | French  | 0–10  | T
  X2      | T   | F   | F   | T   | Full | $     | F    | F   | Thai    | 30–60 | F
  X3      | F   | T   | F   | F   | Some | $     | F    | F   | Burger  | 0–10  | T
  X4      | T   | F   | T   | T   | Full | $     | F    | F   | Thai    | 10–30 | T
  X5      | T   | F   | T   | F   | Full | $$$   | F    | T   | French  | >60   | F
  X6      | F   | T   | F   | T   | Some | $$    | T    | T   | Italian | 0–10  | T
  X7      | F   | T   | F   | F   | None | $     | T    | F   | Burger  | 0–10  | F
  X8      | F   | F   | F   | T   | Some | $$    | T    | T   | Thai    | 0–10  | T
  X9      | F   | T   | T   | F   | Full | $     | T    | F   | Burger  | >60   | F
  X10     | T   | T   | T   | T   | Full | $$$   | F    | T   | Italian | 10–30 | F
  X11     | F   | F   | F   | F   | None | $     | F    | F   | Thai    | 0–10  | F
  X12     | T   | T   | T   | T   | Full | $     | F    | F   | Burger  | 30–60 | T

Classification of examples is positive (T) or negative (F)

Decision trees
One possible representation for hypotheses
E.g., here is your friend's "true" tree for deciding whether to wait:
[Figure: a decision tree rooted at Patrons? (None/Some/Full); the Full branch tests WaitEstimate? (>60, 30–60, 10–30, 0–10), with further tests on Alternate?, Hungry?, Reservation?, Bar?, Fri/Sat?, and Raining? before reaching Yes/No leaves.]

Expressiveness
Decision trees can express any function of the input attributes.
E.g., for Boolean functions, truth table row → path to leaf:

  A | B | A xor B
  F | F | F
  F | T | T
  T | F | T
  T | T | F

[Figure: the corresponding tree tests A at the root and B on each branch, with T/F leaves.]
Trivially, there is a consistent decision tree for any training set that has a different path to a leaf for each example (unless f is nondeterministic in x), but it probably won't generalize to new examples
Ockham's razor ⇒ prefer to find more compact decision trees

Hypothesis spaces
How many distinct decision trees with n Boolean attributes?
= number of Boolean functions
= number of distinct truth tables with 2^n rows
= 2^(2^n)
E.g., with 6 Boolean attributes, there are 18,446,744,073,709,551,616 trees
How many purely conjunctive hypotheses (e.g., Hungry ∧ ¬Rain)?
Each attribute can be in (positive), in (negative), or out ⇒ 3^n distinct conjunctive hypotheses
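These counts can be checked directly. A small sketch (the function names are mine): compute 2^(2^n) and 3^n, and verify the first count by brute-force enumeration of all Boolean functions for small n.

```python
from itertools import product

def num_decision_trees(n):
    """Distinct Boolean functions of n attributes: one per truth table
    with 2^n rows, each row labeled T or F, giving 2^(2^n)."""
    return 2 ** (2 ** n)

def num_conjunctive(n):
    """Purely conjunctive hypotheses: each attribute appears positive,
    negative, or not at all, giving 3^n."""
    return 3 ** n

def enumerate_boolean_functions(n):
    """Brute force: a Boolean function is any assignment of an output
    bit to each of the 2^n input rows."""
    rows = list(product([False, True], repeat=n))
    return [dict(zip(rows, outs))
            for outs in product([False, True], repeat=len(rows))]

print(num_decision_trees(2), len(enumerate_boolean_functions(2)))  # 16 16
print(num_decision_trees(6))  # 18446744073709551616
print(num_conjunctive(6))     # 729
```

The enumeration confirms the closed form for n = 2, and n = 6 reproduces the 18-quintillion figure on the slide; conjunctive hypotheses form a far smaller (and far less expressive) space.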

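Returning to the Expressiveness slide: the truth-table-row-to-path idea for A xor B can be written as nested tests, one root-to-leaf path per row (a hand-coded sketch, not an algorithm from the chapter).

```python
def xor_tree(a: bool, b: bool) -> bool:
    """Decision tree for A xor B: test A at the root, then test B
    on each branch; each truth-table row is one root-to-leaf path."""
    if a:
        return not b  # rows (T,F) -> T and (T,T) -> F
    return b          # rows (F,F) -> F and (F,T) -> T
```

Note that xor needs tests on both attributes along every path, which is why it has no single-attribute (or purely conjunctive) representation.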
