## A Mathematical View of Interior-Point Methods in Convex Optimization

This compact book, through the simplifying perspective it presents, takes a reader who knows little of interior-point methods to within sight of the research frontier, developing key ideas that were more than a decade in the making by numerous interior-point-method researchers. It aims to develop a thorough understanding of the most general theory of interior-point methods, a class of algorithms for convex optimization problems. The study of these algorithms has dominated the continuous optimization literature for nearly 15 years. In that time the theory has matured tremendously, but much of the literature remains difficult to understand, even for specialists. By focusing on the essential elements of the theory and emphasizing the underlying geometry, *A Mathematical View of Interior-Point Methods in Convex Optimization* makes the theory accessible to a wide audience, allowing readers to quickly develop a fundamental understanding of the material.
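To give a concrete flavor of the subject, here is a toy sketch (not taken from the book) of the logarithmic-barrier path-following idea at the heart of interior-point methods: a linear objective is minimized over an interval by repeatedly taking Newton steps on a barrier-penalized function while the penalty weight `t` is increased, so the iterates trace the central path toward the optimum. All names and parameter choices below are illustrative assumptions.

```python
def newton_min(grad, hess, x, lo, hi, iters=25):
    """Damped Newton's method for a smooth convex function on (lo, hi)."""
    for _ in range(iters):
        step = grad(x) / hess(x)
        # Backtrack the step so the iterate stays strictly inside the domain.
        while not (lo < x - step < hi):
            step *= 0.5
        x -= step
    return x

def follow_path(t_max=1024.0):
    """Minimize x over [0, 2] by tracing minimizers of t*x - log(x) - log(2-x).

    As t grows, the minimizer of the barrier-penalized objective (a point on
    the central path) approaches the true constrained minimizer x = 0.
    """
    x, t = 1.0, 1.0  # start at the analytic center with a small penalty weight
    while t <= t_max:
        grad = lambda x, t=t: t - 1.0 / x + 1.0 / (2.0 - x)
        hess = lambda x: 1.0 / x**2 + 1.0 / (2.0 - x)**2
        x = newton_min(grad, hess, x, 0.0, 2.0)  # warm-started Newton solve
        t *= 2.0  # tighten the barrier and continue along the central path
    return x
```

Warm-starting each Newton solve from the previous central-path point is what keeps every solve cheap; the book's theory (self-concordance) explains precisely why this works.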


### Other editions

A Mathematical View of Interior-Point Methods in Convex Optimization, James Renegar, 2001
