1.204 Lecture 21
Nonlinear unconstrained optimization:
First order conditions; Newton's method; estimating a logit demand model

Nonlinear unconstrained optimization
• Network equilibrium was a constrained nonlinear optimization problem
  – Nonnegativity constraints on flows
  – Equality constraints on O-D flows
  – Other variations (transit, variable demand) have inequality constraints
• In these two lectures we examine unconstrained nonlinear optimization problems
  – No constraints of any sort on the problem; we just find the global minimum or maximum of the function
  – Lagrangians can be used to write constrained problems as unconstrained problems, but it is usually best to handle the constraints explicitly for computation

Solving systems of nonlinear equations
• One way to solve for max z(x), where x is a vector, is to find the first derivatives, set them equal to zero, and solve the resulting system of nonlinear equations
• This is the simplest approach and, if the problem is convex (any line between two points on the boundary of the feasible space stays entirely in the feasible space), it is 'good enough'
• We will estimate a binary logit demand model with this approach in this lecture
  – We'll use a true nonlinear unconstrained minimization algorithm in the next lecture, which is a better way

Solving nonlinear systems of equations is hard
• Press, Numerical Recipes: "There are no good, general methods for solving systems of more than one nonlinear equation. Furthermore, it is not hard to see why (very likely) there never will be any good, general methods."
• "Consider the case of two dimensions, where we want to solve simultaneously"
  – f(x, y) = 0
  – g(x, y) = 0

Example of nonlinear system (from Press)
[Figure: solution of two nonlinear equations in two unknowns. The zero contours f = 0 and g = 0 divide the x-y plane into regions where f and g are positive or negative; their intersections, such as the point M, are the roots. Some regions contain no root; one contains two roots. Figure by MIT OpenCourseWare.]

Example, continued
• f and g are two functions
  – Zero contour lines divide the plane into regions where the functions are positive or negative
  – Solutions to f(x,y) = 0 and g(x,y) = 0 are the points the two contours have in common
• f and g have no relation to each other, in general
  – To find all common points, which are the solutions to the nonlinear equations, we must map the full zero contours of both functions
• Zero contours in general consist of an unknown number of disjoint closed curves
  – For problems in more than two dimensions, we need to find points common to n unrelated zero contour hypersurfaces, each of dimension n-1
  – Root finding is impossible without insight
    • We must know the approximate number and location of solutions a priori
(From Press)

Nonlinear minimization is easier
• There are efficient general techniques for finding a minimum of a function of many variables
  – Minimization is not the same as finding roots of the n first order equations (∂z/∂x_i = 0 for all n variables)
  – The components of the gradient vector (∂z/∂x) are not independent, arbitrary functions
    • They obey 'integrability conditions': you can always find a minimum by going downhill on a single surface
  – There is no analogous concept for finding the root of n nonlinear equations
• We will cover unconstrained minimization methods in the next lecture
  – A nonlinear root finder has easier code but is less capable
  – A nonlinear minimizer has harder code but works well
(From Press)
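The 'integrability conditions' on the slide above can be stated concretely. A minimal LaTeX statement, added here for reference (the slide itself does not spell it out): if the n functions f_i are the components of a gradient, their Jacobian must be symmetric, because mixed second derivatives of a smooth z commute.

\[
f_i(\mathbf{x}) = \frac{\partial z}{\partial x_i}
\;\Longrightarrow\;
\frac{\partial f_i}{\partial x_j}
= \frac{\partial^2 z}{\partial x_j \, \partial x_i}
= \frac{\partial^2 z}{\partial x_i \, \partial x_j}
= \frac{\partial f_j}{\partial x_i}.
\]

An arbitrary system of n nonlinear equations need not satisfy this symmetry, so there is no single surface whose downhill direction leads to a root; that is the sense in which minimization is the easier problem.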
Newton-Raphson for nonlinear system
• We have n equations in n variables x_i:
    f_i(x_0, x_1, x_2, …, x_{n-1}) = 0,   i = 0, 1, …, n-1
• Near each x value, we can expand f_i using a Taylor series:
    f_i(x + δx) = f_i(x) + Σ_{j=0}^{n-1} (∂f_i/∂x_j) δx_j + O(δx²)
• The matrix of partial derivatives is the Jacobian J:
    J_ij ≡ ∂f_i/∂x_j
• In matrix notation our expansion is:
    f(x + δx) = f(x) + J·δx + O(δx²)

Newton-Raphson
[Figure: Newton-Raphson in one dimension. From an initial guess of the root, the slope ∂f/∂x of f(x) is followed down to the axis to get the next guess. In multiple dimensions, all components are corrected simultaneously.]

Newton-Raphson, p. 2
• Ignore the O(δx²) terms and set f(x + δx) = 0 to get a set of linear equations for the corrections δx that move each function in f closer to zero simultaneously:
    J·δx = −f
• We solve this system using Gaussian elimination or LU decomposition
• We add the corrections to the previous solution and iterate until we converge:
    x' = x + δx
• If higher order derivatives are large or the first derivative is small, Newton can fail miserably
  – It converges quickly if its assumptions are met

Newton class, method

public class Newton {
    public static double[] mnewt(int nTrial, double[] x, MathFunction2 func) {
        final double TOLERANCE= 1E-13;
        int n= x.length;
        double[] p= new double[n];          // Correction δx
        double[] fvec= new double[n];       // Function values f(x)
        double[][] fjac= new double[n][n];  // Jacobian J
        for (int k= 0; k < nTrial; k++) {
            fvec= func.func(x);
            fjac= func.jacobian(x);
            double errf= 0.0;
            for (int i= 0; i < n; i++)      // Close enough to 0?
                errf += Math.abs(fvec[i]);
            if (errf < TOLERANCE)
                return x;
            // Not close enough: solve J·δx = -f for the correction p = δx
            for (int i= 0; i < n; i++)
                p[i]= -fvec[i];
            p= Gauss.gaussian(fjac, p);
            double errx= 0.0;
            for (int i= 0; i < n; i++) {    // Apply correction: x' = x + δx
                errx += Math.abs(p[i]);
                x[i] += p[i];
            }
            if (errx <= TOLERANCE)
                return x;
        }
        return x;
    }
}

public interface MathFunction2 {
    double[] func(double[] x);
    double[][] jacobian(double[] x);
}

SimpleModel

// Solve x^2 + xy = 10 and y + 3xy^2 = 57
public class SimpleModel implements MathFunction2 {
    public double[] func(double[] x) {
        double[] f= new double[x.length];
        f[0]= x[0]*x[0] + x[0]*x[1] - 10;
        f[1]= x[1] + 3*x[0]*x[1]*x[1] - 57;
        return f;
    }
    public double[][] jacobian(double[] x) {
        int n= x.length;
        double[][] j= new double[n][n];
        j[0][0]= 2*x[0] + x[1];    // ∂f0/∂x
        j[0][1]= x[0];             // ∂f0/∂y
        j[1][0]= 3*x[1]*x[1];      // ∂f1/∂x
        j[1][1]= 1 + 6*x[0]*x[1];  // ∂f1/∂y
        return j;
    }
}

SimpleModelTest

public class SimpleModelTest {
    public static void main(String[] args) {
        SimpleModel s= new SimpleModel();
        int nTrial= 20;
        double[] x= {1.5, 3.5};    // Initial guess
        x= Newton.mnewt(nTrial, x, s);
        for (double d : x)
            System.out.println(d);
    }
}
// Finds solution {2, 3}
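The Newton class above delegates the linear solve J·δx = −f to Gauss.gaussian, a routine from elsewhere in the course code that does not appear in this preview. As a sketch of what it is assumed to do, here is a minimal Gaussian elimination with partial pivoting matching the call signature used above (a hypothetical stand-in; the actual 1.204 Gauss class may differ in details):

public class Gauss {
    // Hypothetical stand-in for the course's Gauss class: solves A·x = b
    // by Gaussian elimination with partial pivoting. Overwrites a and b.
    public static double[] gaussian(double[][] a, double[] b) {
        int n = b.length;
        for (int col = 0; col < n; col++) {
            // Partial pivoting: bring the largest remaining entry in this
            // column to the diagonal for numerical stability
            int pivot = col;
            for (int row = col + 1; row < n; row++)
                if (Math.abs(a[row][col]) > Math.abs(a[pivot][col]))
                    pivot = row;
            double[] tmpRow = a[col]; a[col] = a[pivot]; a[pivot] = tmpRow;
            double tmp = b[col]; b[col] = b[pivot]; b[pivot] = tmp;
            // Eliminate the entries below the diagonal in this column
            for (int row = col + 1; row < n; row++) {
                double factor = a[row][col] / a[col][col];
                for (int j = col; j < n; j++)
                    a[row][j] -= factor * a[col][j];
                b[row] -= factor * b[col];
            }
        }
        // Back substitution on the resulting upper triangular system
        double[] x = new double[n];
        for (int i = n - 1; i >= 0; i--) {
            double sum = b[i];
            for (int j = i + 1; j < n; j++)
                sum -= a[i][j] * x[j];
            x[i] = sum / a[i][i];
        }
        return x;
    }
}

Because mnewt rebuilds the Jacobian and right-hand side on every iteration, it is harmless that this sketch overwrites its arguments.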

Logit demand models
• Mode choice example for work trip
  – Individual has choice between transit and auto
  – We assume the …
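The slide above is cut off before the model is specified. As general background, and not taken from the missing pages: a binary logit model of this kind assigns mode choice probabilities of the standard logistic form, where V_auto and V_transit are placeholder names for the two modes' systematic utilities (the lecture's own specification is not shown here):

\[
P(\text{auto})
= \frac{e^{V_{\text{auto}}}}{e^{V_{\text{auto}}} + e^{V_{\text{transit}}}}
= \frac{1}{1 + e^{-(V_{\text{auto}} - V_{\text{transit}})}},
\qquad
P(\text{transit}) = 1 - P(\text{auto}).
\]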
