
Project 4.6: Hybrid Optimization Schemes for Parameter Estimation Problems
TA4: Enabling Technologies and Advanced Algorithms

Principal Investigators: Miguel Argaez and Leticia Velazquez, Dept. of Mathematical Sciences
Collaborator: Pat Teller (6/07-5/08), Dept. of Computer Science

Student Support:
- Carlos Quintero, Ph.D. Computational Science (Argaez, 75%)
- Yipkei Kwok, Ph.D. Computer Science (Teller, 100%)
- Cristiano Mendes, Ph.D. Electrical Engineering (Fall, 15%)
- Christian Potes, Ph.D. Electrical Engineering (Fall, 15%)
- Lorenzo Ruiz, B.S. Computer Science (Spring, 30%)

OBJECTIVES PER APP
Objective #1: Provide a scalable, high-performance hybrid optimization code with application to problems of interest to ARL.
Objective #2: Enhance ARL computational predictive capabilities by developing high-resolution parameter estimation solvers with sensitivity analysis information.

RELEVANCE TO ARMY & RESEARCH PORTFOLIO
Item #1: Reliable parameter estimation is a cornerstone of improving predictions and reducing the risk of decisions in several ARL applications.
Item #2: Such applications require substantial computational resources and effort to solve nonlinear functions subject to constraints for different sets of parameters. Therefore, developing efficient and novel numerical solutions to key ARL nonlinear parameter estimation problems through the use of HPC is of high importance.

RELEVANCE TO HPC
Item #1: We are implementing a hybrid optimization scheme that is scalable for high-performance computing.
Item #2: The HPC optimization procedure combines the capabilities of stochastic and deterministic methods via surrogate models in searching for a global solution.

BACKGROUND OF PROBLEM
Finding a global optimal solution is a challenging task because data and models are usually nonlinear, non-smooth, subject to different sources of error, and involve multiple objective functions.
One major challenge in computational science and engineering is finding a global optimal solution for large-scale nonlinear parameter estimation problems.

DELIVERABLES PER APP
Deliverable #1: Formulate and document efficient and portable deterministic and stochastic optimization algorithms (the SPSA and interior-point methods).
Deliverable #2: Develop a software package based on combining SPSA with NKIP, and perform preliminary testing on benchmark problems provided by ARL users.

TECHNICAL APPROACH & RESULTS
We present a hybrid optimization approach for solving global optimization problems, in particular automated parameter estimation models. The hybrid approach is based on coupling the Simultaneous Perturbation Stochastic Approximation (SPSA) method with a Newton-Krylov Interior-Point (NKIP) method via a surrogate model. We implemented the hybrid approach on several test cases.

PROBLEM FORMULATION
We consider the global optimization problem in the form

    minimize f(x),    f: R^n -> R,    (1)

where the global solution x* is such that f(x*) <= f(x) for all x in R^n. We are interested in problems of form (1) that have many local minima.

HYBRID APPROACH
Global Method (SPSA) -> Surrogate Model -> Local Method (NKIP)

GLOBAL METHOD: SPSA
A stochastic steepest-descent direction algorithm (James Spall, 1998).
Advantage:
- SPSA gives region(s) where the function value is low, which allows us to conjecture in which region(s) the global solution lies.
Disadvantages:
- Slow method.
- Does not take equality/inequality constraints into account.

SPSA: PSEUDOCODE

    function [x] = SPSA(x, a, A, c, alpha, gamma, iter_spsa)
        n = length(x);
        for k = 1:iter_spsa
            a_k = a/(k + A)^alpha;
            c_k = c/k^gamma;
            Delta = 2*round(rand(n, 1)) - 1;
            y_plus  = f(x + c_k*Delta);
            y_minus = f(x - c_k*Delta);
            g_k = (y_plus - y_minus)./(2*c_k*Delta);
            x = x - a_k*g_k;
        end

Note: a in (10, 15), c in (10^-6, 10^-3), A = iter_spsa/10, alpha = 0.602, gamma = 0.101.

[Figure: SPSA iterations plotted in the (x1, x2) plane over [-5, 5] x [-5, 5].]

SPSA gives region(s) where the function value is low, which allows us to conjecture in which region(s) the global solution lies.
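The SPSA recursion above can be written in runnable form. The following is a minimal sketch, not the project's code: the objective `f`, the gain values, and the quadratic usage example are illustrative assumptions; only the update rule, the gain decay, and the Bernoulli +/-1 perturbation follow the pseudocode.

```python
import numpy as np

def spsa(f, x, a=1.0, A=10.0, c=1e-3, alpha=0.602, gamma=0.101,
         iters=2000, seed=None):
    """Basic SPSA: two function evaluations per iteration give a
    simultaneous-perturbation gradient estimate; gains decay with k."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float).copy()
    for k in range(1, iters + 1):
        a_k = a / (k + A) ** alpha                        # step-size gain
        c_k = c / k ** gamma                              # perturbation gain
        delta = 2 * rng.integers(0, 2, size=x.size) - 1   # Bernoulli +/-1
        y_plus = f(x + c_k * delta)
        y_minus = f(x - c_k * delta)
        g_k = (y_plus - y_minus) / (2 * c_k * delta)      # gradient estimate
        x -= a_k * g_k
    return x

# Usage: minimize a simple quadratic with its minimum at (1, -2).
x_hat = spsa(lambda x: float(np.sum((x - np.array([1.0, -2.0])) ** 2)),
             x=np.zeros(2), seed=0)
```

On smooth problems SPSA behaves like noisy steepest descent; in the hybrid scheme its role is only to locate low-function-value regions, so modest accuracy is enough.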
OBSERVATIONS
This gives us the motivation to apply a local method in the region(s) found by SPSA.

LOCAL METHOD: NEWTON
Advantages:
- Fast rate of convergence (Newton-type methods).
- Interior-point methods allow adding equality/inequality constraints.
Disadvantage:
- Needs first/second-order derivative information.
Solution: construct a surrogate model using the SPSA function values inside the conjectured region(s).

SURROGATE MODEL
A surrogate model f_s(x) is created by applying an interpolation method to the data (x_k, f(x_k)), k = 1, ..., p, provided by SPSA. This can be done in different ways, e.g., with radial basis functions, kriging, regression analysis, or artificial neural networks.

WHY SURROGATE MODELS?
Most real problems require thousands or millions of objective and constraint function evaluations, and the associated cost and time requirements often render this infeasible.

RADIAL BASIS FUNCTIONS (RBF)
An RBF is typically parameterized by two sets of parameters: the centers c, which define its position, and the shape parameters r, which determine its width or form. An RBF interpolation algorithm (Orr, 1996) characterizes the uncertainty parameters c_ij, r_ij, w_j for i = 1, ..., n and j = 1, ..., m.

SURROGATE MODEL
Our goal is to optimize the surrogate function

    f_s(x) = sum_{j=1}^{m} w_j h_j(x),

where each basis function h_j can be defined as

    h_j(x) = ( 1 + sum_{i=1}^{n} (x_i - c_ij)^2 / r_ij^2 )^(1/2)    (multiquadric)
    h_j(x) = exp( - sum_{i=1}^{n} (x_i - c_ij)^2 / r_ij^2 )         (Gaussian)

TEST CASE
Rastrigin's problem:

    min f(x_1, ..., x_n) = 10 n + sum_{i=1}^{n} ( x_i^2 - 10 cos(2 pi x_i) )
    s.t. -5.12 <= x_i <= 5.12, i = 1, 2, ..., n.

For n = 2 the global solution is x* = (0, 0) with f(x*) = 0.

[Figures: the Rastrigin function, the sampling data from SPSA, and plots of the original model function and the surrogate function.]

LOCAL METHOD: NKIP
NKIP is a Newton-Krylov Interior-Point method introduced by Miguel Argaez and Richard Tapia in 2002. The method calculates the search directions using the conjugate gradient algorithm, and a linesearch is implemented to guarantee a sufficient decrease of the objective function. It was developed to obtain optimal solutions for large-scale and degenerate problems. We consider the optimization problem in the form [equation omitted in the preview], where a and b are ...
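To make the surrogate construction concrete, here is a sketch under stated assumptions: it uses the Gaussian basis from the formulas above with a single fixed shape parameter r, takes every sample point as a center, and solves for the weights w_j by least squares. `fit_rbf_surrogate` and `rastrigin` are hypothetical helper names; a full RBF algorithm such as Orr's would also select the centers and shape parameters.

```python
import numpy as np

def rastrigin(X):
    """Rastrigin test function: global minimum f = 0 at x = 0."""
    X = np.atleast_2d(X)
    return 10 * X.shape[1] + np.sum(X ** 2 - 10 * np.cos(2 * np.pi * X), axis=1)

def fit_rbf_surrogate(X, y, r=1.0):
    """Fit weights w of f_s(x) = sum_j w_j exp(-||x - c_j||^2 / r^2).
    Simplifying assumptions: every sample is a center, one shared shape r."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    H = np.exp(-d2 / r ** 2)                    # p x p design matrix
    w, *_ = np.linalg.lstsq(H, y, rcond=None)   # least-squares weights
    centers = X

    def f_s(x):
        x = np.atleast_2d(x)
        h = np.exp(-np.sum((x[:, None, :] - centers[None, :, :]) ** 2,
                           axis=2) / r ** 2)
        return h @ w

    return f_s

# Usage: fit to random samples (playing the role of SPSA data) and
# check that the surrogate interpolates them.
rng = np.random.default_rng(0)
X = rng.uniform(-5.12, 5.12, size=(60, 2))
y = rastrigin(X)
f_s = fit_rbf_surrogate(X, y)
err = float(np.max(np.abs(f_s(X) - y)))   # small at the sample points
```

In the hybrid scheme, the fitted f_s (smooth, with cheap derivatives) is what the local method minimizes; here any derivative-based optimizer could stand in for NKIP.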



Source: Stanford CME 334 - Hybrid Optimization Schemes
