Numerical Optimization with Computational Errors

Springer, Apr 22, 2016 - Mathematics - 304 pages

This book studies approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space, where each algorithm is analyzed taking computational errors into account. The author shows that these algorithms generate good approximate solutions provided the computational errors are bounded from above by a small positive constant. The case in which the computational errors are known is also examined, with the aim of determining an approximate solution. Researchers and students interested in optimization theory and its applications will find this book instructive and informative.
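The central claim above, that an algorithm still reaches a good approximate solution when its computational errors are bounded above by a small constant, can be illustrated with a toy experiment. The Python sketch below is not code from the book; the quadratic objective, the step size, and the noise model are illustrative assumptions. It runs gradient descent with a gradient oracle corrupted by an error of norm at most delta and prints how the final distance to the minimizer tracks delta.

```python
# Toy illustration (not from the book): gradient descent with a gradient
# oracle whose error has norm at most delta. The iterates do not reach the
# exact minimizer, but they settle in a neighborhood controlled by delta.

import numpy as np

def inexact_gradient_descent(grad, x0, step, delta, n_steps, rng):
    """Gradient descent where each gradient evaluation carries an error of norm <= delta."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        e = rng.standard_normal(x.shape)
        e *= delta / max(np.linalg.norm(e), 1e-12)  # rescale noise so ||e|| <= delta
        x = x - step * (grad(x) + e)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x_star = np.array([1.0, -2.0])
    grad = lambda x: 2.0 * (x - x_star)  # exact gradient of f(x) = ||x - x_star||^2
    for delta in (0.0, 0.01, 0.1):
        x = inexact_gradient_descent(grad, np.zeros(2), step=0.1,
                                     delta=delta, n_steps=500, rng=rng)
        print(f"delta={delta:.2f}  distance to minimizer = {np.linalg.norm(x - x_star):.4f}")
```

With delta = 0 the iterates converge to the minimizer; for small positive delta they settle in a neighborhood whose radius shrinks with delta, which is the kind of behavior the book quantifies rigorously.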

This monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods, and Newton's method.
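As a hedged illustration of the first topic in that list, here is a minimal sketch of the subgradient projection idea with an inexact subgradient oracle. It is not the book's algorithm or notation; the l1 objective, the unit-ball constraint, and the 1/sqrt(k) step sizes are assumptions made for the example.

```python
# Minimal sketch (not the book's setup): projected subgradient steps where
# each subgradient evaluation carries an error of norm at most delta.

import numpy as np

def project_unit_ball(x):
    """Euclidean projection onto the closed unit ball {x : ||x|| <= 1}."""
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def subgradient_projection(subgrad, x0, delta, n_steps, rng):
    """Projected subgradient method with subgradient errors of norm <= delta."""
    x = np.asarray(x0, dtype=float)
    for k in range(1, n_steps + 1):
        e = rng.standard_normal(x.shape)
        e *= delta / max(np.linalg.norm(e), 1e-12)  # rescale noise so ||e|| <= delta
        x = project_unit_ball(x - (subgrad(x) + e) / np.sqrt(k))
    return x

if __name__ == "__main__":
    # Minimize f(x) = ||x||_1 over the unit ball; np.sign(x) is a subgradient of f.
    rng = np.random.default_rng(1)
    x = subgradient_projection(np.sign, 0.5 * np.ones(3),
                               delta=0.05, n_steps=2000, rng=rng)
    print(np.linalg.norm(x))  # near the minimizer 0, up to an error floor set by delta
```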

Contents

1 Introduction (p. 1)
2 Subgradient Projection Algorithm (p. 11)
3 The Mirror Descent Algorithm (p. 41)
4 Gradient Algorithm with a Smooth Objective Function (p. 59)
5 An Extension of the Gradient Algorithm (p. 73)
6 Weiszfeld's Method (p. 85)
7 The Extragradient Method for Convex Optimization (p. 105)
8 A Projected Subgradient Method for Nonsmooth Problems (p. 119)
9 Proximal Point Method in Hilbert Spaces (p. 137)
10 Proximal Point Methods in Metric Spaces (p. 149)
11 Maximal Monotone Operators and the Proximal Point Algorithm (p. 169)
12 The Extragradient Method for Solving Variational Inequalities (p. 183)
13 A Common Solution of a Family of Variational Inequalities (p. 205)
14 Continuous Subgradient Method (p. 225)
15 Penalty Methods (p. 239)
16 Newton's Method (p. 265)
References (p. 297)
Index (p. 302)
