## Iterative Methods for Optimization

This book presents a carefully selected group of methods for unconstrained and bound constrained optimization problems and analyzes them in depth, both theoretically and algorithmically. It focuses on clarity of algorithmic description and analysis rather than on generality; while it provides pointers to the literature for the most general theoretical results and robust software, the author considers it more important that readers gain a complete understanding of the special cases that convey the essential ideas. A companion to Kelley's book Iterative Methods for Linear and Nonlinear Equations (SIAM, 1995), this book contains many exercises and examples and can be used as a text, a tutorial for self-study, or a reference. Iterative Methods for Optimization does more than cover traditional gradient-based optimization: it is the first book to treat sampling methods, including the Hooke-Jeeves, implicit filtering, MDS, and Nelder-Mead schemes, in a unified way.
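The sampling methods highlighted above (Hooke-Jeeves, Nelder-Mead, and the like) minimize a function using only evaluations of f at trial points on a pattern, with no gradients. As a rough illustration of that idea only, here is a minimal coordinate search with stencil halving; this is a hypothetical sketch, not the book's MATLAB implementations, and the names `coordinate_search` and the quadratic test objective are invented for the example:

```python
def coordinate_search(f, x0, h=1.0, tol=1e-6, max_iter=10000):
    """Sample f along +/- each coordinate direction with stencil size h;
    accept the first improving trial point, and halve h when no trial
    point improves on the current iterate."""
    x = list(x0)
    for _ in range(max_iter):
        if h < tol:
            break  # stencil has shrunk below the requested resolution
        improved = False
        for i in range(len(x)):
            for step in (h, -h):
                trial = x[:]
                trial[i] += step
                if f(trial) < f(x):
                    x, improved = trial, True
                    break
        if not improved:
            h *= 0.5  # no descent on the current stencil: refine it
    return x

# Hypothetical smooth test problem with minimizer (1, -2).
quad = lambda z: (z[0] - 1.0) ** 2 + (z[1] + 2.0) ** 2
xstar = coordinate_search(quad, [0.0, 0.0])
```

The stencil-refinement step (halving h only after every trial point fails to improve) is the feature that convergence analyses of pattern-search methods typically hinge on; the more sophisticated schemes treated in the book refine and reuse sample information far more carefully than this sketch does.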


### Contents

Chapter 1 | 3
Chapter 2 | 13
Chapter 3 | 39
Chapter 4 | 71
Chapter 5 | 87
Chapter 6 | 109
Chapter 7 | 123
Chapter 8 | 135
Back matter | 161

