SJSU ISE 230 - Nonlinear Programming

Contents
11.1 Review of Differential Calculus
11.2 Introductory Concepts (Example 11: Tire Production)
11.3 Convex and Concave Functions
11.4 Solving NLPs with One Variable (Example 21: Profit Maximization by a Monopolist)
11.5 Golden Section Search
11.6 Unconstrained Maximization and Minimization with Several Variables
11.7 The Method of Steepest Ascent
11.8 Lagrange Multipliers
11.9 The Kuhn-Tucker Conditions
11.10 Quadratic Programming
11.11 Separable Programming
11.12 The Method of Feasible Directions
11.13 Pareto Optimality and Trade-Off Curves

Chapter 11 Nonlinear Programming
to accompany Operations Research: Applications and Algorithms, 4th edition, by Wayne L. Winston
Copyright (c) 2004 Brooks/Cole, a division of Thomson Learning, Inc.

11.1 Review of Differential Calculus

The equation lim_{x->a} f(x) = c means that as x gets closer to a (but not equal to a), the value of f(x) gets arbitrarily close to c.

A function f(x) is continuous at a point a if lim_{x->a} f(x) = f(a). If f(x) is not continuous at x = a, we say that f(x) is discontinuous (or has a discontinuity) at a.

The derivative of a function f(x) at x = a, written f'(a), is defined to be

  f'(a) = lim_{Δx->0} [f(a + Δx) - f(a)] / Δx

The nth-order Taylor series expansion of f about a is

  f(a + h) = f(a) + Σ_{i=1}^{n} [f^(i)(a) / i!] h^i + [f^(n+1)(p) / (n+1)!] h^{n+1}

for some point p between a and a + h.

The partial derivative of f(x1, x2, ..., xn) with respect to the variable xi is written ∂f/∂xi, where

  ∂f/∂xi = lim_{Δxi->0} [f(x1, ..., xi + Δxi, ..., xn) - f(x1, ..., xi, ..., xn)] / Δxi

11.2 Introductory Concepts

A general nonlinear programming problem (NLP) can be expressed as follows: find the values of the decision variables x1, x2, ..., xn that

  max (or min) z = f(x1, x2, ..., xn)
  s.t.  g1(x1, x2, ..., xn) (≤, =, or ≥) b1
        g2(x1, x2, ..., xn) (≤, =, or ≥) b2
        ...
        gm(x1, x2, ..., xn) (≤, =, or ≥) bm

As in linear programming, f(x1, x2, ..., xn) is the NLP's objective function, and g1(x1, x2, ..., xn) (≤, =, or ≥) b1, ..., gm(x1, x2, ..., xn) (≤, =, or ≥) bm are the NLP's constraints. An NLP with no constraints is an unconstrained NLP.

The feasible region for the NLP above is the set of points (x1, x2, ..., xn) that satisfy the m constraints of the NLP. A point in the feasible region is a feasible point, and a point that is not in the feasible region is an infeasible point.

If the NLP is a maximization problem, then any feasible point x̄ = (x̄1, x̄2, ..., x̄n) for which f(x̄) ≥ f(x) holds for every feasible point x is an optimal solution to the NLP. NLPs can be solved with LINGO.

Even if the feasible region for an NLP is a convex set, the optimal solution need not be an extreme point of the NLP's feasible region.

For any NLP (maximization), a feasible point x = (x1, x2, ..., xn) is a local maximum if, for sufficiently small ε, every feasible point x' = (x'1, x'2, ..., x'n) with |xi - x'i| < ε (i = 1, 2, ..., n) satisfies f(x) ≥ f(x').

Example 11: Tire Production

Firerock produces rubber used for tires by combining three ingredients: rubber, oil, and carbon black. The costs for each are given. The rubber used in automobile tires must have
- a hardness of between 25 and 35,
- an elasticity of at least 16, and
- a tensile strength of at least 12.
To manufacture a set of four automobile tires, 100 pounds of product is needed. The rubber used to make a set of tires must contain between 25 and 60 pounds of rubber and at least 50 pounds of carbon black.

Ex. 11 - continued

Define:
R = pounds of rubber in the mixture used to produce four tires
O = pounds of oil in the mixture used to produce four tires
C = pounds of carbon black used to produce four tires

Statistical analysis has shown that the hardness, elasticity, and tensile strength of a 100-pound mixture of rubber, oil, and carbon black are

  Tensile strength = 12.5 - .10(O) - .001(O)²
  Elasticity = 17 + .35(R) - .04(O) - .002(O)²
  Hardness = 34 + .10(R) + .06(O) - .3(C) + .001(R)(O) + .005(O)² + .001(C)²

Formulate the NLP whose solution will tell Firerock how to minimize the cost of producing the rubber product needed to manufacture a set of automobile tires.

Example 11: Solution

After defining
TS = tensile strength
E = elasticity
H = hardness of the mixture
the LINGO program gives the correct formulation.

It is easy to use the Excel Solver to solve NLPs; the process is similar to that for a linear model. For NLPs having multiple local optimal solutions, however, the Solver may fail to find the optimal solution, because it may pick a local extremum that is not a global extremum.
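The preview shows neither the LINGO model nor the ingredient cost table, so the sketch below is only one way the Example 11 formulation could be checked numerically in Python: SciPy's SLSQP solver stands in for the LINGO/Excel Solver models the slides use, and the per-pound costs COST_R, COST_O, and COST_C are hypothetical placeholders rather than values from the slides.

```python
# Hedged sketch of the Example 11 (Firerock) NLP. The per-pound costs are
# hypothetical placeholders -- the preview does not show the cost table --
# and SciPy's SLSQP solver stands in for the LINGO / Excel Solver models
# that the slides actually use.
from scipy.optimize import minimize

COST_R, COST_O, COST_C = 1.0, 1.0, 1.0  # hypothetical per-pound costs

def cost(x):
    R, O, C = x
    return COST_R * R + COST_O * O + COST_C * C

def tensile_strength(x):
    R, O, C = x
    return 12.5 - 0.10 * O - 0.001 * O**2

def elasticity(x):
    R, O, C = x
    return 17 + 0.35 * R - 0.04 * O - 0.002 * O**2

def hardness(x):
    R, O, C = x
    return (34 + 0.10 * R + 0.06 * O - 0.3 * C
            + 0.001 * R * O + 0.005 * O**2 + 0.001 * C**2)

constraints = [
    {"type": "eq",   "fun": lambda x: x[0] + x[1] + x[2] - 100},  # 100-lb batch
    {"type": "ineq", "fun": lambda x: tensile_strength(x) - 12},  # TS >= 12
    {"type": "ineq", "fun": lambda x: elasticity(x) - 16},        # E  >= 16
    {"type": "ineq", "fun": lambda x: hardness(x) - 25},          # H  >= 25
    {"type": "ineq", "fun": lambda x: 35 - hardness(x)},          # H  <= 35
]
bounds = [(25, 60), (0, None), (50, None)]  # 25 <= R <= 60, O >= 0, C >= 50

result = minimize(cost, x0=[40.0, 10.0, 50.0], bounds=bounds,
                  constraints=constraints, method="SLSQP")
print(result.x, result.fun)
```

Because an NLP like this can have several local optima (the point made in the Excel Solver remark above), a different starting point x0 may steer SLSQP to a different local solution.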
11.3 Convex and Concave Functions

A function f(x1, x2, ..., xn) is a convex function on a convex set S if for any x' ∈ S and x'' ∈ S

  f[cx' + (1 - c)x''] ≤ cf(x') + (1 - c)f(x'')

holds for 0 ≤ c ≤ 1.

A function f(x1, x2, ..., xn) is a concave function on a convex set S if for any x' ∈ S and x'' ∈ S

  f[cx' + (1 - c)x''] ≥ cf(x') + (1 - c)f(x'')

holds for 0 ≤ c ≤ 1.

Theorems (for a general NLP):
- Suppose the feasible region S for the NLP is a convex set. If f(x) is concave on S, then any local maximum for the (maximization) NLP is an optimal solution; likewise, if f(x) is convex on S, then any local minimum for the (minimization) NLP is an optimal solution.
- Suppose f''(x) exists for all x in a convex set S. Then f(x) is a convex (concave) function on S if and only if f''(x) ≥ 0 [f''(x) ≤ 0] for all x in S.
- Suppose f(x1, x2, ..., xn) has continuous second-order partial derivatives at each point x = (x1, x2, ..., xn) ∈ S. Then f(x1, x2, ..., xn) is a convex function on S if and only if, for each x ∈ S, all principal minors of the Hessian H are nonnegative.
- Suppose f(x1, x2, ..., xn) has continuous second-order partial derivatives at each point x = (x1, x2, ..., xn) ∈ S. Then f(x1, x2, ..., xn) is a concave function on S if and only if, for each x ∈ S and k = 1, 2, ..., n, all nonzero principal minors of order k have the same sign as (-1)^k.

The Hessian of f(x1, x2, ..., xn) is the n x n matrix whose ijth entry is ∂²f / ∂xi∂xj.

An ith principal minor of an n x n matrix is the determinant of any i x i matrix obtained by deleting n - i rows and the corresponding n - i columns of the matrix.

The kth leading principal minor of an n x n matrix is the determinant of the k x k matrix obtained by deleting the last n - k rows and columns of the matrix. (A small numeric check of the principal-minor test is sketched below.)

11.4 Solving NLPs with One Variable

Solving the NLP: to find the optimal solution for the NLP, find
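To make the principal-minor test of Section 11.3 concrete, the following sketch checks convexity of the illustrative function f(x1, x2) = x1² + x1·x2 + x2², which is not taken from the slides; its Hessian is the constant matrix [[2, 1], [1, 2]], so evaluating the test at one point settles convexity on all of R².

```python
# Small numeric check of the principal-minor test from Section 11.3.
# The function f(x1, x2) = x1^2 + x1*x2 + x2^2 is an illustrative choice,
# not an example from the slides; its Hessian is constant, so checking
# one point settles convexity everywhere.
from itertools import combinations
import numpy as np

# Hessian of f: entry (i, j) is the second partial d2f / dxi dxj.
H = np.array([[2.0, 1.0],
              [1.0, 2.0]])

def principal_minors(H, k):
    """All kth principal minors: keep the same k rows and columns."""
    n = H.shape[0]
    return [float(np.linalg.det(H[np.ix_(rows, rows)]))
            for rows in combinations(range(n), k)]

for k in range(1, H.shape[0] + 1):
    print(f"order-{k} principal minors:", principal_minors(H, k))

# Convexity test: every principal minor of every order must be >= 0.
# (For concavity, the nonzero order-k minors would all need sign (-1)^k.)
is_convex = all(m >= 0
                for k in range(1, H.shape[0] + 1)
                for m in principal_minors(H, k))
print("convex on R^2:", is_convex)
```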

