TAMU CSCE 643 - Presentation 1


Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography
Martin A. Fischler, Robert C. Bolles
Artificial Intelligence Center, SRI International, CA
CSCE 643, Presentation 1
Communications of the ACM (Graphics and Image Processing), Volume 24, Number 6, June 1981.

Martin A. Fischler
Research focus: Artificial Intelligence, Machine Vision, Switching Theory, Computer Organization, Information Theory
• B.E.E. degree – City College of New York, NY
• M.S. and Ph.D. – Stanford University, CA
• Computer Scientist – SRI International, since 1977
• Published the RANSAC paper first in a 1980 SRI International technical report
• Published the RANSAC paper in Graphics and Image Processing in 1981
• Currently working on Visual Odometry and Visual SLAM

Computer Vision in 1981
• Focused on classification and recognition
• Science-based (had not yet reached applications)
• Initially focused largely on artificial worlds
• Images were hard to come by
• 3-D range sensing was almost viewed as cheating
• Research was driven by sponsors' interests

Back to 1981
• IBM's first PC, 1981: 4.77 MHz
• Apple II Plus, 1981: max of 64K RAM
Adapted from http://cmp.felk.cvut.cz/ransac-cvpr2006/

Motivation: Least Squares Algorithm
Optimize the fit of a functional description to ALL of the presented data:

    \min_{\beta} \sum_{i=1}^{m} \Bigl( y_i - \sum_{j} X_{ij} \beta_j \Bigr)^2

Least squares is an averaging technique that considers all of the presented data, and is therefore sensitive to outliers.
Adapted from http://en.wikipedia.org/wiki/Linear_least_squares and http://www.cs.unc.edu/~lazebnik/spring09/

Motivation: Robust Estimator
• The robust function ρ behaves like squared distance for small residuals r_i but saturates for large r_i, where r_i is the residual of point i w.r.t. the model parameters θ and σ is a scale parameter:

    \min_{\theta} \sum_{i} \rho\bigl( r_i(x_i; \theta); \sigma \bigr)

• This is a nonlinear optimization that must be solved iteratively.
• The least squares solution can be used for initialization.
Adapted from http://www.cs.unc.edu/~lazebnik/spring09/

Motivation: Two Types of Error
• Measurement error – inliers
• Classification error – outliers
Existing problems:
• Least squares and robust estimators (at initialization) treat inliers and outliers equally, as a whole.
• A robust estimator tries to reject outliers only in later iterations, while fitting the inliers and rejecting the outliers should happen in the same process.
• Why not randomly choose data subsets to fit? – RANSAC.

RANSAC: Notation
• U = {x_i}: set of data points, |U| = N
• p: model parameters
• f: function that computes the model parameters p from a sample S drawn from U
• ρ: cost function for a single data point x
• k: number of iterations

RANSAC: Algorithm
• Select a random sample S from U
• Compute the model parameters p = f(S)
• Compute the cost C_k
• Stop if C_k < C* or k > k*

RANSAC: Example
• Select a data subset
• Calculate the model parameters p
• Calculate the cost for each data point
• Select the data that fit the current model
• Repeat the sampling
• Stop when C_k < C* or k > k*
Adapted from http://cmp.felk.cvut.cz/~matas/papers/presentations/

RANSAC: How Many Iterations?
• The average number of samples k needed is a function of the sample size m and the fraction of outliers e:

    E(k) = \frac{1}{(1 - e)^m}

• Choose k so that, with probability p, at least one random sample is free from outliers:

    \bigl( 1 - (1 - e)^m \bigr)^k = 1 - p
    k = \frac{\log(1 - p)}{\log\bigl( 1 - (1 - e)^m \bigr)}

Samples k required as a function of sample size m and proportion of outliers e (p = 0.99):

m \ e |  5% | 10% | 20% | 25% | 30% | 40% |  50%
  2   |  2  |  3  |  5  |  6  |  7  |  11 |   17
  3   |  3  |  4  |  7  |  9  |  11 |  19 |   35
  4   |  3  |  5  |  9  |  13 |  17 |  34 |   72
  5   |  4  |  6  |  12 |  17 |  26 |  57 |  146
  6   |  4  |  7  |  16 |  24 |  37 |  97 |  293
  7   |  4  |  8  |  20 |  33 |  54 | 163 |  588
  8   |  5  |  9  |  26 |  44 |  78 | 272 | 1177

RANSAC Application: the Location Determination Problem
• Existence proofs of multiple solutions for the P3P, P4P, and P5P problems.
• An algorithm for solving the general P3P problem.
• An algorithm for solving the planar P4P problem.
• An automatic gross-error filtering technique (RANSAC).
Adapted from http://cmp.felk.cvut.cz/ransac-cvpr2006/

RANSAC Results: Location Determination Problem
Final result (deviations):
X: 0.1 ft    Heading: 0.01°
Y: 0.1 ft    Pitch: 0.10°
Z: 0.1 ft    Roll: 0.12°
Adapted from http://www.ai.sri.com/people/fischler/

RANSAC: Other Applications
Adapted from http://graphics.cs.cmu.edu/courses/15-463/2006_fall/www/463.html and http://cmp.felk.cvut.cz/ransac-cvpr2006/

RANSAC: Pros and Cons
Pros
• Simple and general.
• Applicable to many different problems.
• Often works well in practice.
Cons
• Sometimes too many iterations are required.
• Can fail for extremely low inlier ratios.
• Lots of parameters to tune.
• Can't always get a good initialization of the model.
• We can often do better than brute-force sampling.

[Image: 1981 Ford F150]
For more information, please visit the website of the 25 Years of RANSAC Workshop: http://cmp.felk.cvut.cz/ransac-cvpr2006/
END
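The least squares motivation can be made concrete with a tiny numeric sketch (hypothetical data, not from the slides): fitting a line to points on y = 2x + 1 via the normal equations, once on clean data and once with a single gross outlier.

```python
import numpy as np

# Hypothetical data: points on y = 2x + 1, then the same data with one
# gross outlier in the last measurement.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 * x + 1.0
y_outlier = y.copy()
y_outlier[-1] = 100.0

# Design matrix for the model y = b0 + b1*x; lstsq solves
# min_beta ||X beta - y||^2, the objective on the slide.
X = np.column_stack([np.ones_like(x), x])
beta_clean, *_ = np.linalg.lstsq(X, y, rcond=None)
beta_dirty, *_ = np.linalg.lstsq(X, y_outlier, rcond=None)

print(beta_clean)  # recovers [1, 2] (up to rounding)
print(beta_dirty)  # one outlier drags the slope far from 2
```

Because every residual enters the sum of squares, the single corrupted point shifts the fitted slope from 2 to roughly 15, illustrating the averaging behavior described above.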
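One standard way to solve the robust objective iteratively, starting from the least squares solution as the slide suggests, is iteratively reweighted least squares (IRLS). The sketch below uses Huber-style weights w(r) = min(1, σ/|r|), which are quadratic near zero and saturate for large residuals; the weighting choice, function name, and data are illustrative assumptions, not from the deck.

```python
import numpy as np

def irls_line(x, y, sigma=1.0, iters=50):
    # Robust line fit by iteratively reweighted least squares with
    # Huber-style weights w(r) = min(1, sigma/|r|): behaves like squared
    # distance for small residuals, saturates for large ones.
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least squares init
    for _ in range(iters):
        r = y - X @ beta
        w = np.minimum(1.0, sigma / np.maximum(np.abs(r), 1e-12))
        sw = np.sqrt(w)                            # weighted LS solve
        beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 * x + 1.0
y[-1] = 100.0                       # one gross outlier
print(irls_line(x, y))              # slope stays near 2, not ~15
```

Note the outlier still exerts bounded (not zero) influence, which is exactly the limitation the slides raise before introducing RANSAC.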
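The select/fit/score/repeat loop from the example slides can be sketched for 2-D line fitting (sample size m = 2). The function names, the inlier tolerance, and the stopping rule (a fixed sample budget rather than a cost threshold C*) are illustrative choices, not the paper's.

```python
import random

def fit_line(p1, p2):
    # Line through two points as (a, b, c) with a*x + b*y + c = 0 and
    # a^2 + b^2 = 1, so a*x + b*y + c is the signed point-line distance.
    (x1, y1), (x2, y2) = p1, p2
    a, b = y2 - y1, x1 - x2
    norm = (a * a + b * b) ** 0.5
    if norm == 0.0:                  # degenerate sample; caller skips it
        return None
    return a / norm, b / norm, -(a * x1 + b * y1) / norm

def ransac_line(points, k_max=200, tol=0.5, seed=0):
    # Minimal RANSAC loop: select a random subset S, compute the model
    # parameters, score by inlier count, and keep the best consensus set.
    rng = random.Random(seed)
    best = []
    for _ in range(k_max):
        model = fit_line(*rng.sample(points, 2))
        if model is None:
            continue
        a, b, c = model
        inliers = [q for q in points if abs(a * q[0] + b * q[1] + c) < tol]
        if len(inliers) > len(best):
            best = inliers
    return best

pts = [(float(i), 2.0 * i + 1.0) for i in range(10)]   # collinear inliers
pts += [(0.0, 50.0), (5.0, -30.0), (9.0, 90.0)]        # gross outliers
print(len(ransac_line(pts)))   # the 10 collinear points form the consensus set
```

Any sample containing an outlier yields a line supported by few points, so the all-inlier sample wins, which is the key contrast with the averaging behavior of least squares.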
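The iteration-count formula k = log(1 − p) / log(1 − (1 − e)^m) gives the table entries directly; a minimal helper, assuming a success probability of p = 0.99 (the helper name is made up for this sketch):

```python
from math import ceil, log

def ransac_trials(m, e, p=0.99):
    # k = log(1 - p) / log(1 - (1 - e)^m): number of random samples of
    # size m needed so that, with probability p, at least one sample is
    # free of outliers, given an outlier fraction e.
    return ceil(log(1.0 - p) / log(1.0 - (1.0 - e) ** m))

print(ransac_trials(2, 0.50))   # 17
print(ransac_trials(8, 0.50))   # 1177
```

The count grows geometrically in the sample size m, which is why minimal samples (two points for a line) are preferred and why very low inlier ratios become impractical, as the Cons slide notes.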

