Springer Science & Business Media, Apr 28, 2000 - Mathematics - 636 pages
This is a book for people interested in solving optimization problems. Because of the wide (and growing) use of optimization in science, engineering, economics, and industry, it is essential for students and practitioners alike to develop an understanding of optimization algorithms. Knowledge of the capabilities and limitations of these algorithms leads to a better understanding of their impact on various applications, and points the way to future research on improving and extending optimization algorithms and software. Our goal in this book is to give a comprehensive description of the most powerful, state-of-the-art techniques for solving continuous optimization problems. By presenting the motivating ideas for each algorithm, we try to stimulate the reader's intuition and make the technical details easier to follow. Formal mathematical requirements are kept to a minimum. Because of our focus on continuous problems, we have omitted discussion of important optimization topics such as discrete and stochastic optimization.
Line Search Methods
Nonlinear Least-Squares Problems
The Simplex Method
Penalty, Barrier, and Augmented Lagrangian Methods