## Methods of optimization

Nonlinear programming. Kuhn-Tucker necessary conditions. Saddle-point property of the Lagrangian function. The constraint qualification. Search methods for unconstrained optimization. Grid search. Hooke and Jeeves' method. Spendley, Hext and Himsworth's method. Nelder and Mead's method. Gradient methods for unconstrained optimization. Method of steepest descent. The Newton-Raphson method. The Davidon-Fletcher-Powell method. Constrained optimization. Hemstitching. The gradient projection method. Penalty functions. Dynamic programming. The allocation problem. Oriented networks. The farmer's problem.
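Among the gradient methods listed, the method of steepest descent is the simplest: step against the gradient until it vanishes. A minimal sketch follows; the fixed step length and the example quadratic are illustrative assumptions, not taken from the book, which also treats line-search variants.

```python
import numpy as np

def steepest_descent(grad, x0, step=0.1, tol=1e-8, max_iter=10_000):
    """Minimize a differentiable function by repeatedly stepping
    against its gradient with a fixed step length."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # gradient ~ 0: stationary point
            break
        x = x - step * g
    return x

# Illustrative problem: minimize f(x, y) = (x - 1)^2 + 2(y + 2)^2,
# whose gradient is (2(x - 1), 4(y + 2)) and whose minimum is (1, -2).
x_min = steepest_descent(lambda v: np.array([2 * (v[0] - 1), 4 * (v[1] + 2)]),
                         [0.0, 0.0])
```

On this quadratic the iterates contract toward the minimizer geometrically; in general a fixed step length must be small enough relative to the curvature, which is why the book's gradient methods pair the descent direction with a linear search.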



### Contents

| Section | Page |
| --- | --- |
| Introduction | 1 |
| Nonlinear Programming | 35 |
| Search Methods for Unconstrained Optimization | 74 |
| Copyright | |

5 other sections not shown

### Common terms and phrases

assumed, classical optimization, complementary, DFP, concave function, constrained local maximum, constraint boundary, constraint qualification, convergence, convex function, convex set, current point, defined, derivatives, DFP method, direction of search, dynamic programming, equality constraints, equation, evaluations, Example, function f, function value, given, global maximum, Golden Section search, gradient methods, Hence, Hessian matrix, inequality constraints, initial point, interval of uncertainty, iteration, Kuhn-Tucker necessary conditions, Lagrange multipliers, Lagrangian function, linear programming problem, linear searches, maximize, maximum value, minimal path, minimizes f(x, minimizing problem, minimum, mutually conjugate directions, node, non-negativity restrictions, nonlinear programming, nonlinear programming problem, objective function, obtain, optimal point, optimal solution, optimization problem, optimization technique, positive definite, Powell's method, problem 2.1, proof, quadratic function, quadratic programming problem, replaced, saddle-point, satisfies the constraints, search direction, Section, sequence, solve, step lengths, sufficient, Suppose, surplus variables, Theorem, theory, unconstrained optimization, unimodal function, unrestricted in sign, vector, x'Dx
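Several of the recurring terms above (Golden Section search, unimodal function, interval of uncertainty) belong to one classical linear-search technique: shrink a bracketing interval around the minimum of a unimodal function by the golden ratio at each step, reusing one function evaluation per iteration. A minimal sketch, with an illustrative test function not taken from the book:

```python
import math

def golden_section_search(f, a, b, tol=1e-6):
    """Minimize a unimodal f on [a, b] by shrinking the interval of
    uncertainty by the golden ratio, reusing one evaluation per step."""
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi ~ 0.618
    x1 = b - invphi * (b - a)
    x2 = a + invphi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                  # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - invphi * (b - a)
            f1 = f(x1)
        else:                        # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + invphi * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# Illustrative: the minimum of (x - 2)^2 on [0, 5] is at x = 2.
x_star = golden_section_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
```

Because the interior points are placed at golden-ratio fractions of the interval, one of them is always reusable after the interval shrinks, so each iteration costs a single new function evaluation.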