UMD CMSC 828 - Embedding Gestalt Laws in Markov Random Fields



Embedding Gestalt Laws in Markov Random Fields
by Song-Chun Zhu

Purpose of the Paper
- Proposes functions to measure Gestalt features of shapes.
- Adapts the FRAME method of [Zhu, Wu, Mumford] to shapes.
- Exhibits the effect of the MRF model obtained by putting these together.

Recall Gestalt Features (à la [Lowe] and others)
- Colinearity, cocircularity, proximity, parallelism, symmetry, continuity, closure, familiarity.

FRAME [Zhu, Wu, Mumford]
- Filters, RAndom fields, And Maximum Entropy.
- A general procedure for constructing MRF models.

Three Main Parts
- Data.
- Learn MRF models from the data.
- Test the generative power of the learned model.

Elements of Data
- A set of images representative of the chosen application domain.
- An adequate collection of feature measures or filters.
- The (marginal) statistics of applying the feature measures or filters to the set of images.

Data: Images
- Zhu considers 22 animal shapes and their horizontal flips, so the resulting histograms are symmetric.
- More data can be obtained, but are there other effects?

Sample Animate Images
- (Figure: sample animal shapes.)

Contour-based Feature Measures
- The goal is to be generic, but generic shape features are hard to find.
- φ1 = κ(s), the curvature; κ(s) = 0 implies the linelets on either side of Γ(s) are colinear.
- φ2 = κ'(s), its derivative; κ'(s) = 0 implies three sequential linelets are cocircular.
- "Other contour-based shape filters can be defined in the same way."

Zhu's Symmetry Function
- Ψ(s) pairs linelets across medial axes.
- It is defined and computed by minimizing an energy functional constructed so that paired linelets are as close, parallel, and symmetric as possible, and there are as few discontinuities as possible.
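The two contour features above can be illustrated numerically. This is a minimal sketch of my own (not Zhu's implementation), treating the edges of a closed polygon as the linelets; the function name and interface are illustrative assumptions.

```python
import numpy as np

def contour_features(points):
    """Discrete analogues of the contour features: phi1 ~ kappa(s) (curvature)
    and phi2 ~ kappa'(s), on a closed polygon whose edges play the role of
    linelets. `points` is an (N, 2) array of vertices."""
    pts = np.asarray(points, dtype=float)
    edges = np.roll(pts, -1, axis=0) - pts          # linelet vectors
    ds = np.linalg.norm(edges, axis=1)              # linelet lengths (arc length)
    headings = np.arctan2(edges[:, 1], edges[:, 0])
    # Turning angle between successive linelets, wrapped to (-pi, pi].
    turn = np.angle(np.exp(1j * (np.roll(headings, -1) - headings)))
    kappa = turn / ds                               # phi1 = 0 => colinear linelets
    dkappa = (np.roll(kappa, -1) - kappa) / ds      # phi2 = 0 => cocircular linelets
    return kappa, dkappa

# Sanity check: a regular polygon (a discrete unit circle) has constant
# curvature, so phi1 is constant (close to 1) and phi2 vanishes.
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
kappa, dkappa = contour_features(np.c_[np.cos(theta), np.sin(theta)])
```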
Region-based Feature Measures
- φ3(s) = dist(s, Ψ(s)); measures the proximity of paired linelets across a region.
- φ4(s) = φ3'(s), the derivative; φ4(s) = 0 implies paired linelets are parallel.
- φ5(s) = φ4'(s) = φ3''(s); φ5(s) = 0 implies paired linelets are symmetric.

Another Possible Shape Feature
- φ6(s) = 1 where Ψ(s) is discontinuous, 0 otherwise.
- Counts the number of "parts" a shape has.
- Can Gestalt "familiarity" be (statistically?) measured?

The Statistic
- The histogram of feature φk over curve Γ is H(z; φk, Γ) = ∫ δ(z − φk(s)) ds, where δ is the Dirac function: mass 1 at 0, and 0 otherwise.
- μ(z; φk) denotes the average over all images.
- Zhu claims μ is a close estimate of the marginal of the "true distribution" over shape space, assuming the total number of linelets is small.

Statistical Observations
- (Figures: histograms of φ1 at scales 0, 1, 2, and of φ3, φ4, φ5, over the 22 images and their flips.)

Construct a Model
- Ω is the space of shapes; Φ is a finite subset of the feature filters.
- We seek a probability distribution p on Ω with
  ∫Ω p(Γ) dΓ = 1    (1)
  that reproduces the statistics for all φ in Φ:
  ∫Ω p(Γ) H(z; φ, Γ) dΓ = μ(z; φ)    (2)

Construct a Model, 2
- Idea: choose the p with maximal entropy. Seems reasonable and fair, but is it really the best target/energy function?
- Lagrange multipliers and the calculus of variations lead to
  p(Γ; Φ, Λ) = exp(−Σ_{φ∈Φ} ∫ λφ(z) H(z; φ, Γ) dz) / Z,
  where Z is the usual normalizing factor and Λ = { λφ | φ ∈ Φ }.

It's a Gibbs Distribution
- In other words, p has the form of a Gibbs distribution, and therefore determines a Markov Random Field (MRF) model.

Markov Chain Monte Carlo
- It is too hard to compute the λ's and p analytically.
- Idea: sample Ω according to the distribution p, stochastically update Λ to update p, and repeat until p reproduces all μ(z; φ) for φ ∈ Φ.
- "Monte Carlo" because of the random walk; "Markov Chain" in the nature of the loop.
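The histogram statistic and the stochastic multiplier update can be sketched as follows, assuming z has been discretized into bins. The function names are hypothetical, and the update rule is the standard stochastic-gradient form for maximum-entropy fitting rather than a detail taken from the slides.

```python
import numpy as np

def histogram_statistic(values, ds, bins):
    """Discrete stand-in for H(z; phi, Gamma) = integral of delta(z - phi(s)) ds:
    an arc-length-weighted histogram of feature values along the curve,
    normalized to a distribution over the bins z."""
    h, _ = np.histogram(values, bins=bins, weights=ds)
    return h / h.sum()

def update_multipliers(lam, mu_sample, mu_data, eta=0.1):
    """One stochastic update of a (binned) Lagrange multiplier lambda_phi(z).
    With p(Gamma) proportional to exp(-sum_phi integral lambda_phi(z) H dz),
    gradient ascent on the log-likelihood raises lambda_phi wherever the
    sampled statistic mu' exceeds the observed mu, penalizing that value."""
    return lam + eta * (mu_sample - mu_data)

# Toy usage: two "linelets" with feature values 0.1 and 0.9 and lengths 1 and 3.
bins = np.linspace(0.0, 1.0, 6)
H = histogram_statistic(np.array([0.1, 0.9]), np.array([1.0, 3.0]), bins)
lam = update_multipliers(np.zeros(5), H, np.array([0.2, 0.0, 0.0, 0.0, 0.8]))
```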
Markov Chain Monte Carlo, 2
- From the sampling, produce μ'(z; φ): the same as μ(z; φ), except based on a random sample of shape space. (For the purposes of today's discussion, the details are not important.)
- At convergence, μ'(z; φ) = μ(z; φ) for φ ∈ Φ.
- Zhu et al. assume there exists a "true underlying distribution."

The Nonaccidental Statistic
- For φ' not in the set Φ we expect μ'(z; φ') ≠ μ(z; φ').
- μ'(z; φ') is the accidental statistic for φ': a measure of the correlation between φ' and Φ.
- The "distance" (L1, L2, or other) between μ'(z; φ') and μ(z; φ') is the nonaccidental statistic for φ': a measure of how much additional information φ' carries beyond what is already in Φ.

The Algorithm (simplified)
- Enter your set Γ = { γ } of shapes.
- Enter a (large) set { φ } of candidate feature measures.
- Compute μ(z; φ) for every candidate φ.
- Compute μ'(z; φ) relative to a uniform distribution on Ω.
- Until the nonaccidental statistic of every unused feature is small enough, repeat:

Algorithm, 2
- Of the remaining φ, add to Φ one with maximal nonaccidental statistic.
- Update the set of Lagrange multipliers Λ = { λφ }, the probability model p(Γ; Φ, Λ), and the μ'(z; φ) for the remaining candidate features φ.

Experiments and Discussion
- Let my description of these experiments stimulate your thoughts on issues such as:
  - Are there better Gestalt feature measures?
  - What is the best possible outcome of a generative model of shape?
  - What feature measures should be added to the Gestalt ones?
  - How useful were these experiments, and what others might be worth doing?

Experiment 1
- When the only feature used is the curvature κ, the model generated the shapes shown. (Figure.)

Experiment 1, continued
- A Gaussian model (with the same κ-variance) produced the shapes shown. (Figure.)

Experiment 2
- Experiment 2 uses both κ and κ'. The nonaccidental statistic of κ' with respect to the model based on κ is shown. (Figure.)

Experiment 2, continued
- This time the model generated these shapes, purported to be smoother and more scale invariant. (Figure.)
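The nonaccidental statistic that drives Experiments 2 and 3 is just a distance between two histograms; the slides leave the metric open (L1, L2, or other). A minimal sketch, with a hypothetical function name:

```python
import numpy as np

def nonaccidental_statistic(mu_model, mu_data, metric="L1"):
    """Distance between the model's accidental statistic mu'(z; phi') and the
    observed mu(z; phi') for a feature phi' not yet in Phi. Larger values mean
    phi' carries more information beyond what Phi already captures."""
    diff = np.asarray(mu_model, dtype=float) - np.asarray(mu_data, dtype=float)
    if metric == "L1":
        return np.abs(diff).sum()
    return np.sqrt((diff ** 2).sum())   # L2

# Toy usage: a model histogram vs. an observed histogram over two bins.
d1 = nonaccidental_statistic([0.5, 0.5], [0.25, 0.75])
d2 = nonaccidental_statistic([0.5, 0.5], [0.25, 0.75], metric="L2")
```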
Experiment 3
- The nonaccidental statistics of the three region-based shape features, relative to the model produced in Experiment 2. (Figure.)

Experiment 3, …
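The simplified algorithm above can be sketched as a greedy loop. This skeleton holds the model statistics fixed for clarity, whereas the real algorithm re-samples p and re-fits Λ after every addition (marked in the comments); names and interfaces are illustrative assumptions.

```python
import numpy as np

def pursue_features(mu_data, mu_model, threshold):
    """Greedy skeleton of the simplified feature-pursuit algorithm: repeatedly
    move the candidate feature with the largest nonaccidental statistic into
    Phi, stopping once every unused feature is reproduced well enough.
    `mu_data` and `mu_model` map feature names to binned histograms."""
    chosen, remaining = [], set(mu_data)
    while remaining:
        # L1 nonaccidental statistic of each unused candidate feature.
        scores = {phi: np.abs(mu_model[phi] - mu_data[phi]).sum()
                  for phi in remaining}
        best = max(scores, key=scores.get)
        if scores[best] <= threshold:
            break                  # all remaining features look accidental
        chosen.append(best)
        remaining.remove(best)
        # Real algorithm: update Lambda, re-sample p, refresh mu_model here.
    return chosen

# Toy usage: against a uniform model, "kappa" is informative, "dkappa" is not.
features = {
    "kappa":  np.array([0.2, 0.8]),
    "dkappa": np.array([0.5, 0.5]),
}
uniform = {phi: np.array([0.5, 0.5]) for phi in features}
picked = pursue_features(features, uniform, threshold=0.1)
```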

