Recent Advances in Nonsmooth Optimization
Nonsmooth optimization covers the minimization or maximization of functions that do not have the differentiability properties required by classical methods. The field is significant not only because nondifferentiable functions arise directly in applications, but also because several important methods for solving difficult smooth problems lead directly to nonsmooth problems, which are either smaller in dimension or simpler in structure.

This book contains twenty-five papers written by forty-six authors from twenty countries on five continents. It includes papers on theory, algorithms, and applications for problems with first-order nondifferentiability (the usual sense of nonsmooth optimization), second-order nondifferentiability, nonsmooth equations, nonsmooth variational inequalities, and other problems related to nonsmooth optimization.
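To make the idea concrete, here is a minimal sketch (not taken from the book) of the classic subgradient method applied to a nondifferentiable function, f(x) = |x - 2| + |x|. At the kinks x = 0 and x = 2 the gradient does not exist, so classical gradient descent does not directly apply; a subgradient, together with a diminishing step size, still drives the iterates toward a minimizer. All names below are illustrative choices, not notation from the book.

```python
def f(x):
    # Convex, piecewise-linear, nondifferentiable at x = 0 and x = 2.
    return abs(x - 2) + abs(x)

def subgradient(x):
    # A valid subgradient of f at x: sum of subgradients of each |.| term.
    g = 0.0
    g += 1.0 if x > 2 else (-1.0 if x < 2 else 0.0)
    g += 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)
    return g

def subgradient_method(x0, iters=2000):
    # Subgradient steps with diminishing step size 1/k; since the method is
    # not a descent method, track the best point seen so far.
    x, best_x, best_f = x0, x0, f(x0)
    for k in range(1, iters + 1):
        x = x - (1.0 / k) * subgradient(x)
        fx = f(x)
        if fx < best_f:
            best_f, best_x = fx, x
    return best_x, best_f

x_star, f_star = subgradient_method(x0=5.0)
print(f_star)  # the minimum value of f is 2, attained on the interval [0, 2]
```

Note the design point this illustrates: unlike gradient descent, the objective need not decrease at every subgradient step, which is why the best iterate is recorded separately; convergence of the best value relies on the diminishing step-size rule.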
Contents
Prederivatives and Second Order Conditions for Infinite
Necessary and Sufficient Conditions for Solution Stability
Miscellaneous Incidences of Convergence Theories in Optimization
On Regularized Duality in Convex Optimization
A Globally Convergent Newton Method for Solving Variational
Upper Bounds on a Parabolic Second Order Directional Derivative
An SLP Method with a Quadratic Correction Step for Nonsmooth
A Successive Approximation Quasi-Newton Process for Nonlinear
Second-Order Nonsmooth Analysis in Nonlinear Programming
Characterizations of Optimality for Homogeneous Programming