Algorithms for Continuous Optimization: The State of the Art

Emilio Giuseppe Spedicato

The NATO Advanced Study Institute on "Algorithms for continuous optimization: the state of the art" was held September 5-18, 1993, at Il Ciocco, Barga, Italy. It was attended by 75 students (among them many well-known specialists in optimization) from the following countries: Belgium, Brazil, Canada, China, Czech Republic, France, Germany, Greece, Hungary, Italy, Poland, Portugal, Romania, Spain, Turkey, UK, USA, Venezuela. The lectures were given by 17 well-known specialists in the field, from Brazil, China, Germany, Italy, Portugal, Russia, Sweden, UK, USA.

Solving continuous optimization problems is a fundamental task in computational mathematics, with applications in engineering, economics, chemistry, biology, and other areas. Most real problems are nonlinear and can be of quite large size. Developing efficient algorithms for continuous optimization has been an important field of research over the last 30 years, with much additional impetus provided in the last decade by the availability of very fast and parallel computers. Techniques, like the simplex method, that were already considered fully developed thirty years ago have been thoroughly revised and enormously improved.

The aim of this ASI was to present the state of the art in this field. While not all important aspects could be covered in the fifty hours of lectures (for instance, multiobjective optimization had to be skipped), we believe that the most important topics were presented, many of them by scientists who contributed greatly to their development.
Contents
General Optimality Conditions via a Separation Scheme | 1 |
Linear Equations in Optimisation | 25 |
Generalized and Sparse Least Squares Problems | 37 |
Algorithms for Solving Nonlinear Systems of Equations | 81 |
An Overview of Unconstrained Optimization | 109 |
Nonquadratic Model Methods in Unconstrained Optimization | 145 |
Algorithms for General Constrained Nonlinear Optimization | 169 |
Exact Penalty Methods | 209 |
Stable Barrier-Projection and Barrier-Newton Methods for Linear and Nonlinear Programming | 255 |
a Current Survey | 287 |
ABS Methods for Nonlinear Optimization | 333 |
A Condensed Introduction to Bundle Methods in Nonsmooth Optimization | 357 |
Computational Methods for Linear Programming | 383 |
Infeasible Interior Point Methods for Solving Linear Programs | 415 |
Algorithms for Linear Complementarity Problems | 435 |
A Homework Exercise: The Big-M Problem | 475 |
Deterministic Global Optimization | 481 |
On Automatic Differentiation and Continuous Optimization | 501 |
Neural Networks and Unconstrained Optimization | 513 |
Limitations, Challenges and Opportunities | 531 |