How to Solve the Optimization Problems

Marc Niethammer

Formulating an optimization problem needs to go hand in hand with a solution method. Many solution methods have been proposed in the literature. Giving an exhaustive list is next to impossible, but most of the problems fall into a small set of subcategories, which are described in the next sections with a brief discussion of their relation to computer vision and image analysis. For details, see the references at the end of this document and the references therein.

1 Optimization Methods, Computer Vision, and Image Analysis

1.1 Overview

The material in this section is mainly based on the optimization tree of NEOS (Network Enabled Optimization System) [1]. See [2, 4, 7] for more background, in particular with respect to relations to computer vision and image analysis. Figure 1 shows a hierarchy of optimization problems.

Optimization problems (and their respective solution methods) may be subdivided into continuous and discrete optimization problems.

• Continuous problems: Involve continuous variables, e.g., x ∈ R^n.

• Discrete problems: Involve discrete variables, e.g., x ∈ Z, which may be interpreted as group labels.

1.2 Continuous problems

Continuous problems frequently arise in computer vision and image analysis. Many parameter estimation problems (e.g., fitting a model to image information), as well as image reconstruction and segmentation methods, are formulated continuously.

1.2.1 Unconstrained continuous problems

• Global optimization: min{f(x)}. Aims at finding a globally optimal solution. This may be simple for convex functions or very hard for non-convex problems. General methods are simulated annealing, genetic algorithms, and graduated non-convexity. Useful to avoid local minima.

• Nonlinear equations: min{‖f(x)‖ : x ∈ R^n}. Very general problem setting for a given norm ‖·‖. No special structure that may readily be exploited by an optimization algorithm (other than that the function is continuous and differentiable).
Newton's method or direct gradient descent forms the basis of most general nonlinear equation solvers. The partial differential equations generated in the variational formulations of image analysis and computer vision mostly fall into this class (with n very large).

• Nonlinear least-squares: min{r(x) : x ∈ R^n}, with r(x) = (1/2)‖f(x)‖₂². The nice structure of the problem allows (for sufficiently smooth f) easy computation of the energy gradient and its Hessian, which may be exploited by the optimizer. This problem frequently occurs in simple parameter-estimation problems.

Figure 1: Optimization hierarchy. Image from [1].

• Nondifferentiable optimization: Most iterative solution methods require smoothness of the objective function to be able to take derivatives to determine the best solution direction. However, this precludes their application to nondifferentiable objective functions (for example, in the case of minimax problems).

1.2.2 Constrained continuous problems

• Linear programming: min{c^T x : Ax = b, x ≥ 0}. Used, for example, in an approximation of integer programming problems [6] (see Section 1.3) for image segmentation.

• Nonlinearly constrained/bound constrained: Allows for the incorporation of range, equality, and inequality constraints.

• Network programming: At the core of graph-cut algorithms [5, 3], which have recently been used extensively for image segmentation. Minimum cut problems, maximum flow problems, shortest path problems, etc.

• Stochastic programming: For optimization problems where data is not known accurately, e.g., due to measurement error or dependency on future data. A solution is sought that is optimal for a set of scenarios. The Kalman filter is an example: it is an optimal estimator for linear systems with Gaussian noise.

1.3 Discrete problems

Segmentation problems may also be regarded as discrete optimization problems.
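This labeling view can be made concrete with a minimal sketch. It assumes a Potts-style pairwise energy on a one-dimensional chain of sites (a common illustrative choice, not one taken from this document): each site pays a data cost for its assigned label, and adjacent sites with differing labels pay a smoothness penalty.

```python
# Minimal sketch of a discrete labeling energy (hypothetical Potts-style model,
# assumed here for illustration; not a formulation specific to this document).

def labeling_energy(labels, data_cost, lam=1.0):
    """E(l) = sum_p D_p(l_p) + lam * sum over adjacent (p, q) of [l_p != l_q].

    labels    : list of integer labels, one per site of a 1-D "image"
    data_cost : data_cost[p][k] is the cost of assigning label k to site p
    lam       : weight of the pairwise (label-disagreement) smoothness term
    """
    unary = sum(data_cost[p][l] for p, l in enumerate(labels))
    pairwise = sum(int(labels[p] != labels[p + 1])
                   for p in range(len(labels) - 1))
    return unary + lam * pairwise

# Example: three sites, two labels; the data terms prefer labels (0, 0, 1).
costs = [[0.0, 2.0], [0.0, 2.0], [2.0, 0.0]]
print(labeling_energy([0, 0, 1], costs))  # one label change along the chain -> 1.0
```

Minimizing such an energy over all label assignments is an integer program; as discussed in this section, it is NP-hard in general, but for binary labels with submodular energies it can be solved exactly via graph cuts [5].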
Each segmentation class is assigned a discrete label, and the energy to be optimized is based on the labelings and the inter-relations (weightings) between them as induced through image information. Integer programming problems are in general NP-hard. However, approximate solutions may be computed; see, for example, the recent paper by Komodakis and Tziritas [6]. For certain forms of energies the globally optimal solutions may be found efficiently (e.g., for binary labelings with submodular energies, reformulated as a network problem and solved through graph cuts; see the discussion in Section 1.2).

References

[1] NEOS guide. http://www-fp.mcs.anl.gov/otc/Guide.

[2] ICCV tutorial on discrete optimization in computer vision. http://www.csd.uoc.gr/~komod/ICCV07tutorial/, 2007.

[3] Y. Boykov, O. Veksler, and R. Zabih. Fast approximate energy minimization via graph cuts. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23(11):1222–1239, 2001.

[4] J. W. Chinneck. Practical optimization: A gentle introduction. http://www.sce.carleton.ca/faculty/chinneck/po.html.

[5] V. Kolmogorov and R. Zabih. What energy functions can be minimized via graph cuts? IEEE Transactions on Pattern Analysis and Machine Intelligence, 26(2):147–159, 2004.

[6] N. Komodakis and G. Tziritas. Approximate labeling via graph cuts based on linear programming. IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(8):1436–1453, 2007.

[7] N. A. Thacker and T. F. Cootes. Vision through optimization. http://homepages.inf.ed.ac.uk/rbf/CVonline/LOCAL
UNC-Chapel Hill COMP 875 - LECTURE NOTES