UCF EGN 3420 - Lecture Notes

Engineering Analysis ENG 3420, Fall 2009
Dan C. Marinescu
Office: HEC 439 B
Office hours: Tu-Th 11:00-12:00

Lecture 11: Optimization

Last time: Newton-Raphson; the secant method.

Today:
- Optimization: one- versus multi-dimensional, global versus local.
- The golden ratio, which makes one-dimensional optimization efficient.
- Parabolic interpolation, which locates the optimum of a single-variable function.
- The fminbnd built-in function, which determines the minimum of a one-dimensional function.
- The fminsearch built-in function, which determines the minimum of a multidimensional function.
- Heuristics for global optimization: simulated annealing, genetic algorithms, neural networks; Nelder-Mead optimization.
- Contour and surface/mesh plots to visualize functions of two variables.

Next time: linear algebra.

Optimization
- Critical for solving engineering and scientific problems.
- One-dimensional versus multi-dimensional optimization.
- Global versus local optima. A maximization problem can be solved with a minimization algorithm: to maximize f, minimize -f.
- Optimization is a hard problem when the search space for the optimal solution is very large; heuristics such as simulated annealing, genetic algorithms, and neural networks are then used.

Algorithms:
- The golden ratio makes one-dimensional optimization efficient.
Parabolic interpolation locate the optimum of a single-variable function.fminbnd function  determine the minimum of a one-dimensional function.fminsearch function  determine the minimum of a multidimensional function.How to develop contours and surface plots to visualize two-dimensional functions.3OptimizationFind the most effective solution to a problem subject to a certain criteria.Find the maxima and/or minima of a function of one or more variables.One- versus multi-dimensional optimizationOne-dimensional problems  involve functions that depend on a single dependent variable -for example, f(x).Multidimensional problems involve functions that depend on two or more dependent variables - for example, f(x,y)Global versus local optimizationGlobal optimum  the very best solution. Local optimum  solution better than its immediate neighbors. Cases that include local optima are called multimodal.Generally we wish to find the global optimum.One- versus Multi-dimensional OptimizationOne-dimensional problems  involve functions that depend on a single dependent variable -for example, f(x).Multidimensional problems involve functions that depend on two or more dependent variables - for example, f(x,y)Euclid’s golden numberGiven a segment of length the golden number is determined from the condition: The solution of the last equation is 821yy 21yy012221221yyyyy680133.1251Golden-Section SearchAlgorithm for finding a minimum on an interval [xl xu] with a single minimum (unimodal interval); uses the golden ratio =1.6180 to determine location of two interior points x1 and x2; One of the interior points can be re-used in the next iteration.f(x1)<f(x2)  x2 will be the new lower limit and x1 the new x2.f(x2)<f(x1)  x1 will be the new upper limit and x2 the new x1.d ( 1)(xu xl)x1xl dx2xu df(x1)<f(x2)  x2 is the new lower limit and x1 the new x2 .f(x2)<f(x1)  x1 is the new upper limit and x2 the new x1.Golden section 
versus bisection11Parabolic interpolationParabolic interpolation requires three points to estimate optimum location.The location of the maximum/minimum of a parabola defined as the interpolation of three points (x1, x2, and x3) is:The new point x4 and the two surrounding it (either x1 and x2 or x2 and x3) are used for the next iteration of the algorithm.x4x212x2 x1 2f x2  f x3   x2 x3 2f x2  f x1  x2 x1 f x2  f x3   x2 x3 f x2  f x1  fminbnd built-in functionfminbnd  combines the golden-section search and the parabolic interpolation.Example[xmin, fval] = fminbnd(function, x1, x2)Options may be passed through a fourth argument using optimset, similar to fzero.fminsearch built-in functionfminsearch determine the minimum of a multidimensional function.[xmin, fval] = fminsearch(function, x0)xmin  a row vector containing the location of the minimum x0  an initial guess; must contain as many entries as the function expects.The function must be written in terms of a single variable, where different dimensions are represented by different indices of that variable.Example: minimize f(x,y)=2+x-y+2x2+2xy+y2Step 1: rewrite as: f(x1, x2)=2+x1-x2+2(x1)2+2x1x2+(x2)2Step 2: define the function f using Matlab syntax: f=@(x) 2+x(1)-x(2)+2*x(1)^2+2*x(1)*x(2)+x(2)^2Step 3: invoke fminsearch [x, fval] = fminsearch(f, [-0.5, 0.5])x0 has two entries - f is expecting it to contain two values.the minimum value is 0.7500 at a location of [-1.000 1.5000]Heuristics for global optimizationGlobal optimization is a very hard problem when the search space for the solution is very large.Heurisitic adjective for experience-based techniques that help in problem solving, learning and discovery. A heuristic method is particularly used to rapidly come to a solution that is hoped to be close to the best possible answer, or 'optimal solution'. 
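The golden-section search described above can be sketched in code. The following is an illustrative Python translation (an assumption of this rewrite; the notes themselves use MATLAB, and function names here are not from the course):

```python
import math

def golden_section_min(f, xl, xu, tol=1e-8, max_iter=200):
    """Minimize a unimodal f on [xl, xu] by golden-section search.

    Each iteration shrinks the bracket by the factor phi - 1 = 0.618...,
    reusing one interior point so only one new f-evaluation is needed.
    """
    phi = (1 + math.sqrt(5)) / 2           # golden ratio, ~1.6180
    d = (phi - 1) * (xu - xl)
    x1, x2 = xl + d, xu - d                # interior points, x2 < x1
    f1, f2 = f(x1), f(x2)
    for _ in range(max_iter):
        if abs(xu - xl) < tol:
            break
        if f1 < f2:                        # minimum lies in [x2, xu]
            xl, x2, f2 = x2, x1, f1        # x1 becomes the new x2
            x1 = xl + (phi - 1) * (xu - xl)
            f1 = f(x1)
        else:                              # minimum lies in [xl, x1]
            xu, x1, f1 = x1, x2, f2        # x2 becomes the new x1
            x2 = xu - (phi - 1) * (xu - xl)
            f2 = f(x2)
    return (xl + xu) / 2

# Example: f(x) = x^2 - 4x + 5 has its single minimum at x = 2.
xmin = golden_section_min(lambda x: x**2 - 4*x + 5, 0.0, 5.0)
```

Note how the branch that discards part of the bracket keeps one old interior point, which is exactly the reuse property that makes golden section cheaper than naively placing two fresh points each iteration.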
- Heuristics (noun): "rules of thumb", educated guesses, intuitive judgments, or simply common sense.

Heuristics for global optimization include:
- Simulated annealing
- Genetic algorithms
- Neural networks

Simulated annealing (SA)
Inspired by metallurgy: annealing is a technique involving the heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. The heat causes the atoms to become unstuck from their initial positions (a local minimum of the internal energy) and wander randomly through states of higher energy; the slow cooling gives them more chances of finding configurations with lower internal energy.
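The annealing analogy maps directly to code: "energy" is the objective value, "temperature" controls how often uphill moves are accepted, and "cooling" gradually turns the random walk into a greedy descent. A minimal Python sketch (an illustration added here, not the course's code; the test function and parameters are arbitrary choices):

```python
import math
import random

def simulated_annealing(f, x0, temp=5.0, cooling=0.995, steps=20000, seed=42):
    """Minimize f by simulated annealing.

    A random neighbor is always accepted if it lowers f; an uphill move is
    accepted with probability exp(-delta / T), letting the search escape
    local minima while T is high. T is lowered geometrically (the cooling).
    """
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    T = temp
    for _ in range(steps):
        x_new = x + rng.uniform(-1.0, 1.0)          # random neighbor
        f_new = f(x_new)
        delta = f_new - fx
        if delta < 0 or rng.random() < math.exp(-delta / T):
            x, fx = x_new, f_new                    # accept the move
            if fx < best_f:
                best_x, best_f = x, fx              # track best state seen
        T *= cooling                                # slow cooling
    return best_x, best_f

# A multimodal test function: global minimum at x = 0, with many local
# minima created by the cosine ripple.
g = lambda x: x**2 + 10 * (1 - math.cos(2 * math.pi * x))
x_best, f_best = simulated_annealing(g, x0=4.3)
```

Because SA is stochastic, the result is only near-optimal: the best state found should land close to the global minimum at x = 0, far below the starting value g(4.3), but the exact value depends on the random moves.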

