UMD CMSC 828 - Markov Random Fields

Outline: Announcements · Markov Random Fields · Example: Image Restoration · Slide 4 · Example: Stereo · Definitions · Neighborhoods · Using MRFs · Gibbs Distribution · Gibbs Distribution (2) · MRF=GRF · Example: Piecewise Constant Image Restoration · Example, cont'd · Optimization · Optimization (2) · Optimization (3) · Optimization (4) · Skeletons · Slide 19 · Grassfire Transform (Blum) · Alternate Definition · Sensitive to noise · Skeletons of 3D Objects · Generalized Cylinders · Problem

Announcements
• Readings for today:
  – Markov Random Field Modeling in Computer Vision. Li. First two chapters on reserve.
  – "Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images," Geman and Geman. On reserve.

Markov Random Fields
• Markov chains and HMMs have 1D structure.
  – At every time, there is one state.
  – This enabled the use of dynamic programming.
• Markov Random Fields break this 1D structure.
  – A field of sites, each of which has a label, simultaneously.
  – The label at one site depends on others, with no 1D structure to the dependencies.
  – This means no optimal, efficient algorithms.
• Why MRFs? Objects have parts with complex dependencies, and we need to model these. MRFs (and belief nets) model complex dependencies.

Example: Image Restoration
• Every pixel is a site.
• The label is the intensity, uncorrupted by noise.
• The label depends on the observation: the pixel value corrupted by noise.
• It also depends on the other labels.
  – If you see an image with one pixel missing, you can guess the value of the missing pixel pretty well.

Example: Stereo
• Every pixel is a site.
• The label of a pixel is its disparity.
• A disparity implies that two pixels match; its probability depends on the similarity of those pixels.
• The disparity at one pixel is related to the others, since nearby pixels have similar disparities.

Definitions
• S indexes a discrete set of sites.
  – S = {1, …, m}
  – S = {(i, j) | 1 ≤ i, j ≤ n} for an n×n grid.
• L_d is a discrete set of labels, e.g. {1, …, M}.
  – Labels could be continuous, but we skip that.
• A labeling assigns a label to every site: f = {f_1, …, f_m}, where f_i is the label of site i.

Neighborhoods
• A neighborhood system specifies the dependencies.
  – N = {N_i | i ∈ S}
  – N_i is the neighborhood of i; j ∈ N_i means i and j are neighbors.
• A site is not its own neighbor.
• The neighbor relation is symmetric.
• Neighborhood → conditional independence.
  – F is an MRF on S w.r.t. N iff:
    • P(f) > 0
    • P(f_i | f_{S - \{i\}}) = P(f_i | f_{N_i})

Using MRFs
• We need to define sites and labels.
• Define a neighborhood structure capturing the conditional probability structure.
• Assign probabilities that capture the problem.
• Find the most probable labeling.
• The Gibbs distribution is a useful conceptualization.

Gibbs Distribution
• Cliques capture the dependencies of neighborhoods.
  – {i} is a clique for all i.
  – {i_1, i_2, …, i_n} is a clique if i_k ∈ N_{i_l} for all k ≠ l, i.e. every pair of its sites are neighbors.

Gibbs Distribution (2)
  P(f) = \frac{1}{Z} e^{-U(f)/T}, \quad U(f) = \sum_{c \in C} V_c(f), \quad Z = \sum_{f \in F} e^{-U(f)/T}
• U(f) is the energy function.
• V_c(f) is a clique potential.
• Z is the normalizing value; the sum is over all labelings.
• T is the temperature.

MRF=GRF
• Given any MRF, we can define an equivalent GRF.
  – That means finding an appropriate energy U(f).
• To find the f that maximizes P(f), it suffices to minimize
  U(f) = \sum_{c \in C} V_c(f).
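To make the energy decomposition concrete, here is a minimal sketch (not from the slides; `gibbs_energy`, `potts`, and the NumPy array layout are my own illustrative choices) of U(f) = Σ_c V_c(f) and P(f) ∝ e^{-U(f)/T} on a 4-connected grid, using only the pairwise cliques:

```python
import numpy as np

def gibbs_energy(f, pair_potential):
    """U(f) = sum of clique potentials V_c(f) over the pairwise cliques
    of a 4-connected grid (single-site cliques can be added the same way)."""
    # Horizontal neighbor cliques {(i, j), (i, j+1)}
    U = sum(pair_potential(a, b) for a, b in zip(f[:, :-1].ravel(), f[:, 1:].ravel()))
    # Vertical neighbor cliques {(i, j), (i+1, j)}
    U += sum(pair_potential(a, b) for a, b in zip(f[:-1, :].ravel(), f[1:, :].ravel()))
    return U

def gibbs_prob(f, pair_potential, T=1.0, Z=1.0):
    """P(f) = (1/Z) exp(-U(f)/T); computing Z exactly means summing over
    all labelings, which is intractable, so it is left as a parameter."""
    return np.exp(-gibbs_energy(f, pair_potential) / T) / Z

# A Potts-style potential: cost 1 for every pair of neighbors that disagree.
potts = lambda a, b: 0.0 if a == b else 1.0
f = np.array([[0, 0, 1],
              [0, 1, 1]])
print(gibbs_energy(f, potts))  # 3.0: three neighboring pairs disagree
```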
Example: Piecewise Constant Image Restoration
• Every pixel is a site.
• Four-connected neighborhoods:
  – Each site is a clique.
  – All pairs of neighbors are cliques.
• Observation d_i of the intensity at site i.

Example, cont'd
  P(f|d) = P(d|f) P(f) / P(d)
• Suppose i.i.d. Gaussian noise: d_i = f_i + e_i, with e_i ~ N(0, \sigma^2). Then:
  P(d|f) = \prod_i P(d_i | f_i), \quad P(d_i | f_i) = \frac{1}{\sqrt{2\pi\sigma^2}} e^{-(f_i - d_i)^2 / 2\sigma^2}
  U(d|f) = \sum_i (f_i - d_i)^2 / 2\sigma^2
• Clique potentials:
  – Prior on labels: V_c(f) for single-site cliques c = {i}.
  – Prior on discontinuities: for pair cliques c = {i, j}, V_c(f) = k if f_i ≠ f_j, 0 else.
• Minimize the energy:
  U(f|d) = U(d|f) + \sum_{c \in C} V_c(f)
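A sketch of this restoration energy in code (the Gaussian data term and the 4-connected discontinuity penalty k follow the slide; the function name and NumPy array layout are illustrative assumptions):

```python
import numpy as np

def restoration_energy(f, d, sigma, k):
    """U(f|d) for piecewise-constant restoration.

    Data term: sum_i (f_i - d_i)^2 / (2 sigma^2), from the i.i.d.
    Gaussian noise model d_i = f_i + e_i.
    Prior term: cost k for every 4-neighbor pair {i, j} with f_i != f_j
    (the prior on discontinuities)."""
    data = np.sum((f - d) ** 2) / (2.0 * sigma ** 2)
    discontinuities = (np.count_nonzero(f[:, :-1] != f[:, 1:]) +
                       np.count_nonzero(f[:-1, :] != f[1:, :]))
    return data + k * discontinuities
```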


Optimization
• Our problem is to choose f to minimize this energy.
• Usually this is NP-hard, so we use heuristics or exponential algorithms.
  – Greedy: loop through the sites, changing each label so as to reduce the energy (a sketch follows below).
  – It takes constant time to make each such decision.
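A sketch of the greedy scheme (essentially iterated conditional modes; the restoration energy from the previous section is hard-coded, and all names are illustrative). Changing one label only affects the terms touching that site, which is why each decision is constant time:

```python
import numpy as np

def greedy_relabel(f, d, labels, sigma, k, sweeps=5):
    """Sweep the sites repeatedly; at each site keep the label that
    minimizes the local energy, holding all other labels fixed."""
    f = f.copy()
    H, W = f.shape
    for _ in range(sweeps):
        for i in range(H):
            for j in range(W):
                def local_energy(l):
                    # data term at this site plus its four pair cliques
                    e = (l - d[i, j]) ** 2 / (2.0 * sigma ** 2)
                    for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
                        if 0 <= ni < H and 0 <= nj < W and f[ni, nj] != l:
                            e += k
                    return e
                f[i, j] = min(labels, key=local_energy)
    return f
```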

Optimization (2)
• Simulated annealing.
  – Pick a site i at random. Let f′ be the old labeling, and let f be f′ with f_i randomly changed.
  – p = min(1, P(f)/P(f′)), where P(f) = \frac{1}{Z} e^{-U(f)/T}.
  – Replace f′ with f with probability p.
  – As T → 0 the method becomes deterministic. By slowly lowering T, the states f form a Markov chain guaranteed to converge to the global optimum.
  – This takes exponential time.
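A sketch of the annealing loop (the energy callback, cooling schedule, and step count are illustrative assumptions; a practical version would compute the energy change locally rather than re-evaluating U from scratch):

```python
import numpy as np

def anneal(f, labels, energy, T0=10.0, cooling=0.999, steps=100_000, seed=0):
    """Simulated annealing: propose changing one randomly chosen site's
    label, and accept with probability
    p = min(1, P(f_new)/P(f_old)) = min(1, exp(-(U_new - U_old)/T))."""
    rng = np.random.default_rng(seed)
    f = f.copy()
    T, U = T0, energy(f)
    for _ in range(steps):
        site = tuple(rng.integers(s) for s in f.shape)  # pick a site at random
        old_label = f[site]
        f[site] = rng.choice(labels)                    # random new label
        dU = energy(f) - U
        if dU <= 0 or rng.random() < np.exp(-dU / T):
            U += dU                                     # accept the move
        else:
            f[site] = old_label                         # reject, restore
        T *= cooling                                    # slowly lower T
    return f
```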

Optimization (3)
• Belief propagation.
  – At each step, a site collects information from its neighbors about their probable labelings, then passes information to each neighbor based on what the other neighbors sent (this avoids repeating to a neighbor what that neighbor has already said).
  – In a graph with no loops this is like dynamic programming: the forward-backward method.
  – In a general MRF it is a heuristic, though one that has been analyzed (e.g., Yedidia, Freeman and Weiss).

Optimization (4)
• Graph cuts (e.g., Boykov, Veksler, and Zabih).
  – Find all sites labeled α or β; relabel a subset of these so that the energy is minimized over all possible such relabelings.
  – This can be posed as a graph cut problem and solved optimally in polynomial time.
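For the loop-free case the slide mentions, belief propagation really is dynamic programming. Here is a min-sum (Viterbi-style) sketch on a chain of sites; the cost-array conventions are my own assumptions, not from the slides:

```python
import numpy as np

def chain_map(unary, pair):
    """Min-sum message passing (belief propagation on a chain, which
    reduces to dynamic programming) for n sites and m labels.

    unary: (n, m) array; unary[i, l] = data cost of label l at site i.
    pair:  (m, m) array; pair[l, l2] = cost of neighboring labels l, l2.
    Returns the labeling minimizing the total energy."""
    n, m = unary.shape
    msg = np.zeros((n, m))                  # best cost of a prefix ending in each label
    back = np.zeros((n, m), dtype=int)      # backpointers for the readout pass
    msg[0] = unary[0]
    for i in range(1, n):
        # forward pass: each site collects the best extension from its neighbor
        total = msg[i - 1][:, None] + pair  # (m, m): previous label x current label
        back[i] = np.argmin(total, axis=0)
        msg[i] = unary[i] + np.min(total, axis=0)
    # backward pass: read off the optimal labels
    f = np.zeros(n, dtype=int)
    f[-1] = np.argmin(msg[-1])
    for i in range(n - 2, -1, -1):
        f[i] = back[i + 1][f[i + 1]]
    return f
```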
Skeletons
• Intuition: abstract a 2D shape into a stick figure.
• Why is this good? It captures part structure, and represents 2D structure in 1D. (Sebastian and Kimia)
• Similarity of structure is more apparent in the skeleton than in the boundary.

Grassfire Transform (Blum)

Alternate Definition
• A point is on the skeleton of a closed curve if and only if:
  – It is inside the closed curve.
  – A circle centered on the point touches the curve in at least two places, but contains no points outside the curve.
• The shape is the union of these circles.

Sensitive to noise

Skeletons of 3D Objects
• Grassfire produces a 2D surface.
• Intuitively, skeletons seem 1D.
• It is harder to compare 2D surfaces, extract parts, etc.

Generalized Cylinders
• A generalized cylinder contains an axis, and a cross-section that sweeps along that axis, changing shape.

Problem
• How do you define a 1D skeleton of a 3D shape?
• And how do you relate this to the 1D skeleton of a 2D image of that shape?
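A rough sketch of the grassfire idea in 2D (assuming SciPy is available and the shape is a boolean NumPy mask; real skeletonization needs a more careful ridge-detection or thinning step, so this only illustrates the "time of arrival" picture):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def grassfire_skeleton(mask):
    """The distance transform gives each interior point the arrival time of
    a fire lit on the boundary; skeleton points lie on the ridge of this
    function. As a crude test, keep points that are >= all 8 neighbors."""
    dist = distance_transform_edt(mask)
    pad = np.pad(dist, 1)                   # zero border so edges compare safely
    H, W = dist.shape
    neighbors = np.stack([pad[di:di + H, dj:dj + W]
                          for di in range(3) for dj in range(3)
                          if (di, dj) != (1, 1)])
    return (dist > 0) & (dist >= neighbors.max(axis=0))
```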