CORNELL CS 664 - Lecture #3: Density estimation in vision

CS664 Lecture #3: Density estimation in vision
Some slides taken from: David Lowe et al.
www.cs.ubc.ca/~lowe/425/slides/11-Classifiers.ppt

Announcements
- Lecture notes are on the web
- First quiz will be on Thursday
  - Coverage through today's lecture
- We will use CMS, the Course Management System
  - We'll be setting this up soon
- Guest lecture a week from Tuesday by Bryan Kressler
  - We'll have a few of these over the semester

Last lecture we saw:
- Trigrams are an elegant way to generate text
  - Density estimation and sampling
- The same basic idea is used by Efros & Leung to perform texture synthesis
  - Some surprisingly good results
- Parametric density estimation
  - i.e., fitting the data with a Gaussian
- Non-parametric density estimation
  - i.e., using a histogram

Density estimation
[Figure: true density of CD rates (percent), roughly 7.4 to 9.0]

Histogram representation
[Figure]

Histogram-based estimates
- You can use a variety of fitting techniques to produce a curve from a histogram
  - Lines, polynomials, splines, etc.
  - Also called regression / function approximation
  - Normalize to make this a density
- If you know quite a bit about the underlying density you can compute a good bin size
  - But that's rarely realistic in vision
  - And it defeats the whole purpose of the non-parametric approach

Nearest-neighbor estimate
- To estimate the density, count the number of nearby data points
  - Like histogramming with sliding bins
  - Avoids bin-placement artifacts
  - We can fix the radius ε and compute this quantity, or we can fix the quantity and compute ε

Parzen estimation
- Each observed data point increases our estimate of the probability nearby
  - Simplest case: raise the probability uniformly within a fixed radius
    - Place a fixed-height "box" at each data point; add them up to get the density estimate
  - This is nearest neighbor with fixed ε
- More generally, you can use some slowly decreasing function (such as a Gaussian)
  - Called the kernel

Parzen example
[Figure from Hastie et al.]

Importance of scale
[Figure from Duda et al.]

Relationship to Efros/Leung
- Can we store histograms of 11-by-11 patches?
  - How many such patches are there?
  - 256^121 is a lot (> 10^240)
- They don't quite do density estimation
  - The method is procedural
    - And slightly ad hoc
  - But the effect is close to Parzen estimation
    - With some kind of unusual kernel
- This would be a natural follow-up paper

Computing local modes
- Often we don't need the entire density
  - Vision often involves a very high number of dimensions
- Suppose that we could find the nearest local maximum (mode)
  - Doing this repeatedly gives a simple clustering scheme
  - There is an elegant way to do this
  - One of the more successful methods in vision

Mean shift algorithm
- Non-parametric method to compute the nearest mode of a distribution
  - Density increases as we get near the "center"

Image and histogram
[Figure]

Local modes
[Figure]

Mean shift segmentations
http://www.caip.rutgers.edu/~comanici/MSPAMI/msPamiResults.html

Back to density estimation
- Many density estimates for the same data
  - E.g., different Parzen windows, or means
  - Is there a natural sense in which one estimate might be "optimal"?
- Maximum likelihood principle
  - If a particular hypothesized density were correct, it would have some probability of resulting in the data we observed
  - Pick the hypothesis with the largest likelihood

ML estimate of the mean
- Consider parametric density estimation with a Gaussian
  - What choice of mean μ and width σ maximizes the likelihood?
- Need to be a little more precise
  - What does it mean to say that a particular density actually generated the data we saw?

What is a density?
- Consider an arbitrary function p such that p(x) ≥ 0 and ∫ p(x) dx = 1
  - Can view it as a probability density function
    - The PDF for a real-valued random variable
    - If we asked ∞ banks, what frequency of CD rates would we get?

Interpreting densities
- The value of the PDF p at x is not the probability that we would get rate x
  - Which is always zero (think about it!)
  - Instead, p gives the probability of getting a rate in a given interval

Discrete case is easier
- If the values of the random variable are discrete, things are simpler
  - Instead of a PDF you have a probability mass function (PMF)
  - I.e., a histogram whose entries sum to 1
    - No bucket has a value greater than 1
- This is the true relative frequencies (i.e., what we would get in the limit)
  - What is the bin size of the PMF histogram?

Sampling from a PDF
- Suppose we call up a number of banks and get their CD rates
  - This generates our sample (data set)
  - How does this relate to the true PDF?
- It simplifies life considerably to assume:
  - All the banks generate their rates from the same PDF (identical distributions)
  - There is no effect between the rate you get from one bank and another (independence)

Sample likelihood
- Under the i.i.d. assumption, the probability of the sample x1, …, xn is the product p(x1) · p(x2) · … · p(xn)

Definition of likelihood
- Intuition: the true PDF should not make the sample (data) you saw a "fluke"
  - It's possible that the coin is fair even though you saw 10^6 heads in a row…
- The likelihood of a hypothesis is the probability that it would have resulted in the data you saw
  - Think of the data as fixed, and try to choose among the possible hypotheses
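The Parzen estimate described above (one Gaussian "bump" per data point, averaged) can be sketched in a few lines of NumPy. This is a minimal illustration, not the lecture's own code; the bandwidth h and the CD-rate numbers are made up for the example:

```python
import numpy as np

def parzen_estimate(x, data, h):
    """Parzen (kernel) density estimate at point x.

    Each data point contributes a normalized Gaussian of width h
    centered on it; averaging the contributions gives a valid
    density (it is non-negative and integrates to 1).
    """
    data = np.asarray(data, dtype=float)
    kernels = np.exp(-0.5 * ((x - data) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    return kernels.mean()

# Illustrative CD rates in percent (not real data)
rates = [7.6, 7.8, 7.9, 8.0, 8.0, 8.1, 8.3, 8.6]
density_at_8 = parzen_estimate(8.0, rates, h=0.2)
```

Replacing the Gaussian with an indicator function of a fixed radius gives exactly the fixed-ε "box" estimate from the nearest-neighbor slide.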
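The mean shift step itself is just "replace the current point by the average of the data points near it, and repeat": each step moves uphill on the density estimate until it reaches a mode. A one-dimensional sketch with a flat kernel; the function name, radius, and synthetic two-cluster data are all illustrative assumptions, not from the lecture:

```python
import numpy as np

def mean_shift_mode(start, data, radius=0.3, max_iter=100, tol=1e-6):
    """Follow the mean shift vector from `start` to the nearest mode.

    At each step the current point is replaced by the mean of all
    data points within `radius` (a flat kernel); iteration stops
    when the point barely moves.
    """
    data = np.asarray(data, dtype=float)
    x = float(start)
    for _ in range(max_iter):
        near = data[np.abs(data - x) <= radius]
        if near.size == 0:
            break  # no neighbors: nothing to average
        new_x = near.mean()
        if abs(new_x - x) < tol:
            break
        x = new_x
    return x

# Synthetic data with two clusters (modes near 2.0 and 5.0)
rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(2.0, 0.1, 200),
                          rng.normal(5.0, 0.1, 200)])
mode_left = mean_shift_mode(1.8, samples)
mode_right = mean_shift_mode(5.3, samples)
```

Running this from every data point and grouping points that land on the same mode is the simple clustering scheme the slides mention.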
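For the ML-estimate-of-the-mean slide, the claim can be checked numerically: under an i.i.d. Gaussian model with fixed σ, the sample likelihood (product of densities, or sum of log densities) is maximized when μ equals the sample mean. A small sketch with illustrative numbers:

```python
import numpy as np

def gaussian_log_likelihood(mu, sigma, data):
    """Log-likelihood of i.i.d. data under a Gaussian N(mu, sigma^2).

    The likelihood is a product of densities; taking logs turns it
    into a sum, which is numerically better behaved.
    """
    data = np.asarray(data, dtype=float)
    return np.sum(-0.5 * ((data - mu) / sigma) ** 2
                  - np.log(sigma * np.sqrt(2 * np.pi)))

# Illustrative CD rates; the ML estimate of mu is their sample mean
rates = [7.6, 7.8, 7.9, 8.0, 8.0, 8.1, 8.3, 8.6]
mus = np.linspace(7.0, 9.0, 201)
lls = [gaussian_log_likelihood(m, 0.3, rates) for m in mus]
best_mu = mus[int(np.argmax(lls))]
```

The grid search here is only for verification; differentiating the log-likelihood with respect to μ and setting it to zero gives the sample mean in closed form.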

