## Introduction to Optimization Techniques: Fundamentals and Applications of Nonlinear Programming

The purpose of the book is to introduce the basic techniques for locating extrema (minima or maxima) of a function of several variables. Such a need arises naturally in various design optimization and planning problems. A standard set of techniques for unconstrained function extremization is presented: small-step and large-step gradient methods; methods involving second partial derivatives of the function, such as the Newton-Raphson method and the Davidon-Fletcher-Powell method; and several other direct search methods. Elementary aspects of function extremization subject to linear or nonlinear constraints are also discussed, such as the concept of constraint qualification, the Fritz John and Kuhn-Tucker theorems, and the penalty function method, assuming differentiability and convexity of the objective functions and constraint equations. In addition to presenting various standard algorithms for function extremization, the book also contains simplified accounts of optimization problems drawn from various branches of engineering and operations research. (Author)
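To illustrate the flavor of two of the unconstrained techniques mentioned above, here is a minimal sketch contrasting a small-step (fixed step-size) gradient method with the Newton-Raphson method, which uses second partial derivatives. The test function and all names are hypothetical, not drawn from the book.

```python
import numpy as np

# Hypothetical test problem: minimize f(x) = (x0 - 1)^2 + 2*(x1 + 0.5)^2,
# a convex quadratic whose unique minimum is at (1, -0.5).
def grad(x):
    # Gradient vector of f.
    return np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 0.5)])

def hess(x):
    # Hessian matrix of f (constant, since f is quadratic).
    return np.array([[2.0, 0.0], [0.0, 4.0]])

def small_step_gradient(x0, step=0.1, iters=200):
    """Small-step gradient method: move a fixed fraction of -grad each iteration."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

def newton_raphson(x0, iters=10):
    """Newton-Raphson: solve H d = g and step along -d, using second derivatives."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - np.linalg.solve(hess(x), grad(x))
    return x

print(small_step_gradient([0.0, 0.0]))  # approaches (1, -0.5)
print(newton_raphson([0.0, 0.0]))       # exact after one step on a quadratic
```

Note the trade-off the sketch exposes: the gradient method needs many cheap iterations, while Newton-Raphson converges in one step on a quadratic criterion function at the cost of forming and solving with the Hessian.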


### Contents

Problems | 55
Criterion Function Representation | 61
Problems | 83
Copyright

15 other sections not shown

### Common terms and phrases

active constraint, algorithm, Appendix, approximation, assumed, basic solution, called, Chapter, column vectors, components, conjugate gradient, conjugate vectors, Consider, constraint equations, constraint qualification condition, convergence, convex functions, convex set, criterion function, defined, Denote, direct methods, direction of search, discussed, distance function, dual problem, eigenvalues, eigenvectors, equality constraints, evaluated, example, exists, extrema, Farkas lemma, feasible direction, Figure, formulated, function values, g-conjugate, given, gradient vector, Hessian matrix, inequality constraints, input, iteration, Kuhn-Tucker, Lagrange multipliers, Lagrangian, linear programming, linear programming problem, linearly independent, locating, maximization, methods in category, minimum point, n x n, n-dimensional, Newton-Raphson method, nonlinear programming, norm, objective function, obtain, optimal gradient method, optimal solution, optimization problems, orthogonal, partial derivatives, penalty function method, positive definite, quadratic form, quadratic function, satisfy, scalar, search method, Section, sequence, solved, step, Subroutine, subspace, Suppose, symmetric, symmetric matrix, tangent, unconstrained, variable metric method, zero