Nonsmooth Optimization: Proceedings of a IIASA Workshop, March 28 - April 8, 1977
Claude Lemarechal, Robert Mifflin
Elsevier, May 19, 2014 - Technology & Engineering - 194 pages
Nonsmooth Optimization contains the proceedings of a workshop on nonsmooth optimization (NSO) held from March 28 to April 8, 1977 in Austria under the auspices of the International Institute for Applied Systems Analysis. The papers explore the techniques and theory of NSO and cover topics ranging from systems of inequalities to smooth approximation of nonsmooth functions, as well as quadratic programming and line searches.
Consisting of nine chapters, this volume begins with a survey of Soviet research on subgradient optimization carried out since 1962, followed by a discussion of rates of convergence in subgradient optimization. The reader is then introduced to the method of subgradient optimization in an abstract setting and the minimal hypotheses required to ensure convergence; NSO and nonlinear programming; and bundle methods in NSO. A feasible descent algorithm for linearly constrained least squares problems is described. The book also considers sufficient minimization of piecewise-linear univariate functions before concluding with a description of the method of parametric decomposition in mathematical programming.
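The subgradient optimization discussed throughout these chapters centers on the iteration x_{k+1} = x_k − t_k g_k, where g_k is any subgradient of a convex function f at x_k and the step sizes t_k diminish. The following is a minimal illustrative sketch of that iteration; the example function and the t_k = 1/k step rule are assumptions chosen for illustration, not taken from the proceedings.

```python
def f(x):
    # Nonsmooth convex example: f(x1, x2) = |x1| + 2|x2|
    return abs(x[0]) + 2 * abs(x[1])

def subgradient(x):
    # One valid subgradient of f at x (taking sign(0) := 0 is allowed,
    # since 0 lies in the subdifferential of |.| at 0).
    sgn = lambda t: (t > 0) - (t < 0)
    return (sgn(x[0]), 2 * sgn(x[1]))

def subgradient_method(x0, iters=500):
    # Basic subgradient iteration with diminishing, nonsummable steps
    # t_k = 1/k -- the classical setting in which convergence for
    # convex nonsmooth f can be guaranteed.
    x = x0
    best = x
    for k in range(1, iters + 1):
        g = subgradient(x)
        t = 1.0 / k
        x = (x[0] - t * g[0], x[1] - t * g[1])
        if f(x) < f(best):  # f(x_k) need not decrease monotonically
            best = x
    return best

x_best = subgradient_method((3.0, -2.0))
print(x_best, f(x_best))
```

Note that, unlike gradient descent on smooth functions, a fixed subgradient direction is not necessarily a descent direction, which is why the best iterate is tracked separately; the bundle methods of Chapter 5 address exactly this difficulty.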
This monograph will be of interest to mathematicians and mathematics students.
CHAPTER 2 NONDIFFERENTIABLE OPTIMIZATION AND THE RELAXATION METHOD
CHAPTER 3 AN EXTENSION OF THE METHOD OF SUBGRADIENTS
CHAPTER 4 NONSMOOTH OPTIMIZATION AND NONLINEAR PROGRAMMING
CHAPTER 5 BUNDLE METHODS IN NONSMOOTH OPTIMIZATION
CHAPTER 6 A FEASIBLE DESCENT ALGORITHM FOR LINEARLY CONSTRAINED LEAST SQUARES PROBLEMS
CHAPTER 7 SUFFICIENT MINIMIZATION OF PIECEWISE-LINEAR UNIVARIATE FUNCTIONS
THE NONCONVEX CASE
CHAPTER 9 A SET OF NONSMOOTH OPTIMIZATION TEST PROBLEMS
APPENDIX LIST OF PARTICIPANTS