Superlinearly Convergent Quasi-Newton Methods for Nonlinear Programming
Page 28
... quadratic termination property, i.e., they locate the minimum value of a convex quadratic function in a finite number of iterations. Another characteristic of the algorithms that we present below is that they work when the estimate ...
Page 38
... convex quadratic function in a finite number of iterations. The rate of convergence of the so-called Davidon-Fletcher-Powell algorithm is superlinear [Powell, 1971]. Several modified versions followed Davidon's paper and in ...
Page 44
... (Quadratic termination) Now we can prove: Assume that algorithm 5.1 has been used to solve (UP), where f is a convex quadratic function. If ∇f(x_n)^T H_n ∇f(x_n) ≥ ν ||∇f(x_n)||^2 and the sequence of vectors y_0, ..., y_{n-1} are ...
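The snippets above mention both the Davidon-Fletcher-Powell (DFP) update and its quadratic termination property. A minimal sketch of how the two fit together is below; this is not the book's algorithm 5.1, just an illustrative implementation of the standard DFP inverse-Hessian update with exact line searches on a convex quadratic f(x) = ½ xᵀAx − bᵀx, where termination in at most n steps can be observed numerically. The function name `dfp_minimize` is an assumption for this sketch.

```python
import numpy as np

def dfp_minimize(A, b, x0, tol=1e-10, max_iter=50):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with the DFP quasi-Newton method and exact line searches.

    Returns the final iterate and the number of iterations taken.
    """
    x = x0.astype(float)
    H = np.eye(len(b))                  # initial inverse-Hessian estimate
    g = A @ x - b                       # gradient of the quadratic
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            return x, k
        d = -H @ g                      # quasi-Newton search direction
        alpha = -(g @ d) / (d @ A @ d)  # exact line search on the quadratic
        s = alpha * d                   # step:      s_k = x_{k+1} - x_k
        x_new = x + s
        g_new = A @ x_new - b
        y = g_new - g                   # gradient change along the step
        Hy = H @ y
        # DFP update: H + s s^T / (s^T y) - (H y)(H y)^T / (y^T H y)
        H = H + np.outer(s, s) / (s @ y) - np.outer(Hy, Hy) / (y @ Hy)
        x, g = x_new, g_new
    return x, max_iter

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
x_star, iters = dfp_minimize(A, b, np.zeros(3))
# With exact line searches, DFP terminates on this 3-variable convex
# quadratic in at most n = 3 iterations (quadratic termination).
```

With exact arithmetic the iterates reach the minimizer A⁻¹b after at most n steps; in floating point the gradient norm simply falls below the tolerance by that point.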