
UCI ICS 171 - Learning from Observations


Learning from Observations
Chapter 18, Sections 1–4

Outline
• Learning agents
• Inductive learning
• Nearest neighbors

Learning agents
Sometimes we want to invest time and effort in observing the feedback from our environment to our actions, in order to improve those actions so that we can more effectively optimize our utility in the future.

Learning element
• The design of a learning element is affected by:
– which components of the performance element are to be learned (e.g. learning to stop at a traffic light);
– what feedback is available to learn these components (e.g. visual feedback from a camera);
– what representation is used for the components (e.g. logic, probabilistic descriptions, attributes, ...).
• Types of feedback:
– Supervised learning: the correct answer (label) is given for each example.
– Unsupervised learning: correct answers are not given.
– Reinforcement learning: occasional rewards.

Two Examples of Learning Object Categories
[Figure: a training set containing two classes of images and a test set; does the test image belong to one of the two classes? S. Savarese, 2003; copied from P. Perona's talk slides.]

Learning from 1 Example
[Figure: painting by P. Bruegel, 1562.]

Inductive learning
• Simplest form: learn a function from examples.
– f is the target function.
– An example is a pair (x, f(x)).
– Problem: find a hypothesis h such that h ≈ f, given a training set of examples.

Inductive learning method
• Construct/adjust h to agree with f on the training set.
• (h is consistent if it agrees with f on all examples.)
• E.g., curve fitting: several curves of increasing complexity can be made to fit the same training points. Which curve is best?
• Ockham's razor: prefer the simplest hypothesis consistent with the data.

Supervised Learning I
Example: imagine you want to classify monkeys versus humans.
Data: 100 monkey images and 200 human images, with labels saying which is which:
{x_i, y_i = 0}, i = 1, ..., 100
{x_j, y_j = 1}, j = 1, ..., 200
where x represents the greyscale values of the image pixels, and y = 0 means "monkey" while y = 1 means "human".
Task: here is a new image: monkey or human?

1 Nearest Neighbor (your first ML algorithm!)
Idea:
1. Find the picture in the database which is closest to your query image.
2. Check its label.
3. Declare the class of your query image to be the same as that of the closest picture.

1NN Decision Surface
[Figure: query image, its closest stored image, and the resulting decision curve.]

Distance Metric
• How do we measure what it means to be "close"?
• Depending on the problem, we should choose an appropriate "distance" metric (or, more generally, a (dis)similarity measure).
• Hamming distance: D(x_n, x_m) = |x_n − x_m| (x discrete).
• Scaled Euclidean distance: D(x_n, x_m) = (x_n − x_m)^T A (x_n − x_m) (x continuous).
• Demo: http://www.comp.lancs.ac.uk/~kristof/research/notes/nearb/cluster.html
• Matlab demo.

Remarks on NN methods
• We only need to construct a classifier that works locally for each query, so we don't need to construct a classifier everywhere in space.
• Classification is done at query time.
This can be computationally taxing at a time when you might want to be fast.
• Memory inefficient.
• Curse of dimensionality: if many features are irrelevant or noisy, distances are always large.
• Very flexible; not many prior assumptions.
• k-NN variants are robust against "bad" examples.

Summary
• Learning agent = performance element + learning element.
• For supervised learning, the aim is to find a simple hypothesis that is approximately consistent with the training examples.
• Learning performance = prediction accuracy measured on a test set.
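The "Inductive learning method" slides can be made concrete with a small curve-fitting experiment. This is a minimal sketch, not from the slides: the linear target f(x) = 2x + 1, the noise level, and the chosen polynomial degrees are all hypothetical. Higher-degree hypotheses agree with the training data ever more closely, yet Ockham's razor tells us to prefer the simplest hypothesis that is approximately consistent with it.

```python
import numpy as np

# Hypothetical noisy samples of a simple linear target f(x) = 2x + 1.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 8)
y = 2.0 * x + 1.0 + 0.05 * rng.standard_normal(8)

# Fit hypotheses h of increasing complexity (polynomial degree) and
# measure how well each agrees with f on the training set.
for degree in (1, 3, 7):
    coeffs = np.polyfit(x, y, degree)
    train_err = np.mean((np.polyval(coeffs, x) - y) ** 2)
    print(degree, round(train_err, 6))

# The degree-7 polynomial interpolates all 8 points (near-zero training
# error), but the degree-1 hypothesis is the simplest one that is
# approximately consistent with the data, so Ockham's razor prefers it.
```

Note that training error alone cannot choose between these hypotheses; that is exactly why the summary slide insists on measuring learning performance on a separate test set.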
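The 1-NN idea and the two distance metrics from the slides can be sketched in a few lines. This is an illustrative implementation under assumptions not in the slides: the toy 2-D dataset, the function names, and the choice A = I (plain Euclidean) are all hypothetical.

```python
import numpy as np

def hamming_distance(x_n, x_m):
    """D(x_n, x_m) for discrete vectors: count of positions that differ."""
    return int(np.sum(np.asarray(x_n) != np.asarray(x_m)))

def scaled_euclidean(x_n, x_m, A):
    """D(x_n, x_m) = (x_n - x_m)^T A (x_n - x_m) for continuous features.
    A rescales the axes (e.g. inverse feature variances on the diagonal)."""
    d = np.asarray(x_n) - np.asarray(x_m)
    return float(d @ A @ d)

def one_nn(query, train_X, train_y, A):
    """The three steps from the slide: find the stored example closest to
    the query, check its label, and return that label for the query."""
    dists = [scaled_euclidean(query, x, A) for x in train_X]
    return train_y[int(np.argmin(dists))]

# Hypothetical toy data: class 0 clustered near the origin, class 1 near (5, 5).
train_X = np.array([[0.0, 0.0], [1.0, 0.5], [5.0, 5.0], [4.5, 5.5]])
train_y = np.array([0, 0, 1, 1])
A = np.eye(2)  # identity A reduces scaled Euclidean to plain Euclidean

print(one_nn(np.array([0.3, 0.2]), train_X, train_y, A))  # → 0
print(one_nn(np.array([4.9, 5.1]), train_X, train_y, A))  # → 1
print(hamming_distance([1, 0, 1, 1], [1, 1, 1, 0]))       # → 2
```

Because all work happens inside `one_nn`, there is no training phase at all, which is exactly the trade-off the "Remarks on NN methods" slide points out: classification is cheap to set up but is paid for at query time and in memory.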

