SCALES: Introduction to Non-Linear Optimization

In this textbook the author concentrates on presenting the main core of methods in non-linear optimization that have evolved over the past two decades. It is intended primarily for actual or potential practising optimizers who need to know how different methods work, how to select a method for the job in hand and how to use the chosen method. While the level of mathematical rigour is not very high, the book necessarily contains a considerable amount of mathematical argument and presupposes knowledge such as would be attained by someone reaching the end of the second year of an undergraduate course in physical science, engineering or computational mathematics. The main emphasis is on linear algebra, and more advanced topics are discussed briefly where relevant in the text. The book will appeal to a range of students and research workers on optimization problems in such fields as applied mathematics, computer science, engineering, business studies, economics and operations research.
Contents
MULTIVARIATE MINIMIZATION  56
NONLINEAR LEAST SQUARES  110
FUNDAMENTALS OF CONSTRAINED OPTIMIZATION  139
Copyright