Contents
Unconstrained Optimization
Lagrange Multiplier Theory
6 other sections not shown
Common terms and phrases
algorithm, analysis, approximation, Armijo rule, assume, assumption, Bertsekas, computation, conjugate gradient method, constrained problem, constraint set, continuously differentiable, convergence rate, convex function, convex set, coordinate, corresponding, cost function, data blocks, defined, denote, descent direction, dual function, dual problem, eigenvalues, equal, equation, example, Exercise, exists, f wk, feasible direction, find, finite, first order, fixed, function f, given, global minimum, gradient projection method, Hessian, Hessian matrix, inequality constraints, integer, iteration, Lagrange multiplier, limit point, linear programming, matrix, minimization rule, minimize f, minimum of f, modified Newton's method, nonlinear programming, obtain, optimal solution, optimality conditions, optimization problem, penalty function, positive definite, positive scalar, primal problem, minimize, proof, Proposition, quadratic program, rate of convergence, result, satisfies, scalar, second order, Section, Show, solving, stationary point, steepest descent, stepsize rule, subgradient, subgradient method, subset, subspace, theorem, unconstrained, ∇²f, variables, vector, zero