## Introduction to Non-Linear Optimization

In this textbook the author concentrates on presenting the core methods of non-linear optimization that have evolved over the past two decades. It is intended primarily for actual or potential practising optimizers who need to know how different methods work, how to select a method for the job in hand, and how to use the chosen method. While the level of mathematical rigour is not very high, the book necessarily contains a considerable amount of mathematical argument and presupposes knowledge such as would be attained by someone reaching the end of the second year of an undergraduate course in physical science, engineering or computational mathematics. The main emphasis is on linear algebra, and more advanced topics are discussed briefly where relevant in the text. The book will appeal to a range of students and research workers working on optimization problems in such fields as applied mathematics, computer science, engineering, business studies, economics and operations research.

### From inside the book

17 pages matching **Levenberg-Marquardt method** in this book.


### Contents

| Section | Page |
| --- | --- |
| INTRODUCTION | 7 |
| UNIVARIATE MINIMIZATION | 26 |
| LINEARLY CONSTRAINED MINIMIZATION | 152 |
| Copyright | |

6 other sections not shown


### Common terms and phrases

active set, αkpk, algorithm, approximation, BFGS formula, Bk+1, bk+1, Broyden's family, compute, condition number, conjugacy, conjugate gradient methods, curvature, descent direction, eigenvalues, elements, exact linear search, feasible, figure, finite difference, first-order, Fk+1, function value, Gauss-Newton method, Gill and Murray, Gill-Murray, Golden Section search, gradient evaluations, gradient vector, gtol, Hessian matrix, Hk Δgk Hk+1, inequality constraints, interpolation, interval of uncertainty, interval reduction, Jk, Lagrange multiplier, Lagrangian methods, Levenberg-Marquardt method, modified Newton method, non-linear least squares, number of iterations, number of variables, objective function, obtained, optimization, orthogonal, positive definite, possible, problem, quadratic function, quadratic termination, quasi-Newton methods, rank-one, rate of convergence, reduction in function, saddle point, satisfy, scalar, search vector, second derivatives, second-order, singular, solve, stationary point, strong minimum, subspace, sufficiently, superlinear, symmetric, Taylor series, techniques, term, vector pk, xk+1, zero
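Among the univariate techniques the terms above mention is Golden Section search, an interval-reduction method for minimizing a unimodal function of one variable. The following is a minimal illustrative sketch of the standard algorithm, not code from the book; the function name and tolerance parameter are our own choices.

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on [a, b] by golden-section
    interval reduction; returns the midpoint of the final interval."""
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618
    c = b - invphi * (b - a)         # interior point nearer a
    d = a + invphi * (b - a)         # interior point nearer b
    while (b - a) > tol:
        if f(c) < f(d):              # minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                        # minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# Minimize (x - 2)^2 on [0, 5]; the minimizer is x = 2.
x_min = golden_section_search(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

Each iteration shrinks the interval of uncertainty by the constant factor 1/φ while reusing one interior point, so only one new function evaluation is needed per step.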