Algorithms for Constrained Minimization of Smooth Nonlinear Functions
Albert G. Buckley, Jean-Louis Goffin
Contents
1 The watchdog technique for forcing convergence in algorithms (page 5)
2 Reduced quasi-Newton methods with feasibility improvement (page 18)
3 A superlinearly convergent algorithm for constrained optimization (page 45)
Copyright
Common terms and phrases
active constraints, active set, active set strategy, applied, approximation, augmented Lagrangian, bounded, codes, computational, conjugate gradient methods, constrained problems, defined, descent direction, E^(n+m), equality constraints, equality-constrained, equations, exact penalty function, exterior penalty, feasible point, first-order Kuhn-Tucker, hence, Hessian, I(z_k), inequality constraints, infeasible, Jacobian, Kuhn-Tucker point, Lagrange multiplier estimates, Lagrange multipliers, Lagrangian function, Lemma, line search, linear constraints, linearised, linearly constrained, major iteration, manifold, Mathematical Programming, matrix, minimize, nonlinear constraints, nonlinear programming, objective function, Optimization Theory, original problem, penalty function, penalty parameter, positive definite, procedure, Proof, Proposition, QP sub-problem, QP(x, quadratic programming, quasi-Newton method, rate of convergence, reduced gradient, reduced problem, relaxed criterion, S(z_k), satisfied, search direction, Section, sequence, solution, solving, step length, strict complementary slackness, subproblem, superbasic, superbasic variable, superlinear convergence, test problems, Theorem, unconstrained, update, variable metric, vector, watchdog technique, x_{k+1}, z_{k+1}