## Introductory Lectures on Convex Optimization: A Basic Course

It was in the middle of the 1980s when the seminal paper by Karmarkar opened a new epoch in nonlinear optimization. The importance of this paper, containing a new polynomial-time algorithm for linear optimization problems, was not only in its complexity bound. At that time, the most surprising feature of this algorithm was that the theoretical prediction of its high efficiency was supported by excellent computational results. This unusual fact dramatically changed the style and directions of research in nonlinear optimization. Thereafter it became more and more common that new methods were provided with a complexity analysis, which was considered a better justification of their efficiency than computational experiments. In the new, rapidly developing field that got the name "polynomial-time interior-point methods", such a justification was obligatory. After almost fifteen years of intensive research, the main results of this development started to appear in monographs [12, 14, 16, 17, 18, 19]. Approximately at that time the author was asked to prepare a new course on nonlinear optimization for graduate students. The idea was to create a course which would reflect the new developments in the field. Actually, this was a major challenge. At the time, only the theory of interior-point methods for linear optimization was polished enough to be explained to students. The general theory of self-concordant functions had appeared in print only once, in the form of the research monograph [12].


### Contents

| Section | Page |
| --- | --- |
| **1 Nonlinear Optimization** | |
| 1.1 World of nonlinear optimization | |
| 1.1.1 General formulation of the problem | 1 |
| 1.1.2 Performance of numerical methods | 4 |
| 1.1.3 Complexity bounds for global optimization | 7 |
| 1.1.4 Identity cards of the fields | 13 |
| 1.2.1 Relaxation and approximation | 15 |
| 1.2.2 Classes of differentiable functions | 20 |
| 1.2.3 Gradient method | 25 |
| 1.2.4 Newton method | 32 |
| 1.3.1 Gradient method and Newton method: What is different? | 37 |
| 1.3.2 Conjugate gradients | 42 |
| 1.3.3 Constrained minimization | 46 |
| **2 Smooth Convex Optimization** | |
| 2.1 Minimization of smooth functions | |
| 2.1.1 Smooth convex functions | 51 |
| 2.1.2 Lower complexity bounds for FL | 58 |
| 2.1.3 Strongly convex functions | 63 |
| 2.1.4 Lower complexity bounds for S | 66 |
| 2.1.5 Gradient method | 68 |
| 2.2.1 Optimal methods | 71 |
| 2.2.2 Convex sets | 81 |
| 2.2.3 Gradient mapping | 86 |
| 2.2.4 Minimization methods for simple sets | 87 |
| 2.3.1 Minimax problem | 90 |
| 2.3.2 Gradient mapping | 93 |
| 2.3.3 Minimization methods for minimax problem | 96 |
| 2.3.4 Optimization with functional constraints | 100 |
| 2.3.5 Method for constrained minimization | 105 |
| **3 Nonsmooth Convex Optimization** | |
| 3.1 General convex functions | |
| 3.1.1 Motivation and definitions | 111 |
| 3.1.2 Operations with convex functions | 117 |
| 3.1.3 Continuity and differentiability | 121 |
| 3.1.4 Separation theorems | 124 |
| 3.1.5 Subgradients | 126 |
| 3.1.6 Computing subgradients | 130 |
| 3.2.1 General lower complexity bounds | 135 |
| 3.2.2 Main lemma | 138 |
| 3.2.3 Subgradient method | 141 |
| 3.2.4 Minimization with functional constraints | 144 |
| 3.2.5 Complexity bounds in finite dimension | 146 |
| 3.2.6 Cutting plane schemes | 149 |
| 3.3 Methods with complete data | 156 |
| 3.3.1 Model of nonsmooth function | 157 |
| 3.3.2 Kelley method | 158 |
| 3.3.3 Level method | 160 |
| 3.3.4 Constrained minimization | 164 |
| **4 Structural Optimization** | |
| 4.1 Self-concordant functions | |
| 4.1.1 Black box concept in convex optimization | 171 |
| 4.1.2 What the Newton method actually does? | 173 |
| 4.1.3 Definition of self-concordant function | 175 |
| 4.1.4 Main inequalities | 181 |
| 4.1.5 Minimizing the self-concordant function | 187 |
| 4.2.1 Motivation | 192 |
| 4.2.2 Definition of self-concordant barriers | 193 |
| 4.2.3 Main inequalities | 196 |
| 4.2.4 Path-following scheme | 199 |
| 4.2.5 Finding the analytic center | 203 |
| 4.2.6 Problems with functional constraints | 206 |
| 4.3.1 Bounds on parameters of self-concordant barriers | 210 |
| 4.3.2 Linear and quadratic optimization | 213 |
| 4.3.3 Semidefinite optimization | 216 |
| 4.3.4 Extremal ellipsoids | 220 |
| 4.3.5 Separable optimization | 224 |
| 4.3.6 Choice of minimization scheme | 227 |



### References to this book

Walter Alt, *Numerische Verfahren der konvexen, nichtglatten Optimierung: Eine ...*, 2013.