This reprint of the 1969 book of the same name is a concise, rigorous, yet accessible account of the fundamentals of constrained optimization theory. Many problems arising in diverse fields such as machine learning, medicine, chemical engineering, structural design, and airline scheduling can be reduced to a constrained optimization problem. This book provides readers with the fundamentals needed to study and solve such problems. It begins with a chapter on linear inequalities and theorems of the alternative; from these theorems the basics of convex sets and separation theorems are derived. A chapter on convex functions follows, including theorems of the alternative for such functions. These results are then used to obtain the saddlepoint optimality conditions of nonlinear programming without differentiability assumptions.
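As a minimal illustration of the saddlepoint optimality conditions mentioned above (this toy problem is not from the book), consider minimizing f(x) = x1² + x2² subject to x1 + x2 ≥ 1. The solution x* = (0.5, 0.5) with multiplier u* = 1 satisfies L(x*, u) ≤ L(x*, u*) ≤ L(x, u*) for the Lagrangian L, which we can check numerically:

```python
# Saddlepoint condition for: minimize f(x) = x1^2 + x2^2
# subject to g(x) = 1 - x1 - x2 <= 0.
# Known solution: x* = (0.5, 0.5), multiplier u* = 1.

def f(x):
    return x[0] ** 2 + x[1] ** 2

def g(x):
    return 1.0 - x[0] - x[1]          # feasible iff g(x) <= 0

def L(x, u):
    return f(x) + u * g(x)            # Lagrangian

x_star, u_star = (0.5, 0.5), 1.0

# Right inequality: x* minimizes L(., u*) (checked on a grid of trial points).
grid = [(i / 10.0, j / 10.0) for i in range(-20, 21) for j in range(-20, 21)]
assert all(L(x_star, u_star) <= L(x, u_star) + 1e-12 for x in grid)

# Left inequality: u* maximizes L(x*, .) over u >= 0.
# Here g(x*) = 0, so L(x*, u) is constant in u and the inequality is tight.
assert all(L(x_star, u) <= L(x_star, u_star) + 1e-12 for u in (0.0, 0.5, 2.0, 10.0))

print("saddlepoint conditions hold at x* = (0.5, 0.5), u* = 1")
```

Note that no derivatives are used in the check, mirroring the book's derivation of saddlepoint conditions without differentiability assumptions.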