Performance Estimation of second-order optimization methods on classes of univariate functions
We develop a principled approach to obtain exact computer-aided worst-case guarantees on the performance of second-order optimization methods on classes of univariate functions. We first present a generic technique to derive interpolation conditions for a wide range of univariate functions, and use it to obtain such conditions for generalized self-concordant functions (including self-concordant and quasi-self-concordant functions) and functions with Lipschitz Hessian (both convex and non-convex). We then exploit these conditions within the Performance Estimation framework to tightly analyze the convergence of second-order methods on univariate functions, including (Cubic Regularized) Newton's method and several of its variants. Thereby, we improve on existing convergence rates, exhibit univariate lower bounds (which therefore also hold in the multivariate case), and analyze the performance of these methods with respect to the same criteria.
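For orientation, the univariate function classes and the regularized Newton step mentioned above admit the standard textbook formulations sketched below. This is only an illustrative sketch: the constants M and L and the exact parametrization of the cubic step are assumptions made here, and the paper's own definitions may differ.

% Standard univariate forms (M, L and the cubic step parametrization are illustrative assumptions)
\begin{align*}
  &\text{self-concordant:} && |f'''(x)| \le 2\, f''(x)^{3/2}, \\
  &\text{quasi-self-concordant:} && |f'''(x)| \le M\, f''(x), \\
  &\text{Lipschitz Hessian:} && |f''(x) - f''(y)| \le L\,|x - y|, \\
  &\text{cubic regularized Newton step:} && x_{k+1} \in \arg\min_{y \in \mathbb{R}} \Big\{ f(x_k) + f'(x_k)(y - x_k) + \tfrac{1}{2} f''(x_k)(y - x_k)^2 + \tfrac{M}{6}\,|y - x_k|^3 \Big\}.
\end{align*}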
Anne Rubbens, Nizar Bousselmi, Julien M. Hendrickx, François Glineur
Mathematical computation techniques; computer technology
Anne Rubbens, Nizar Bousselmi, Julien M. Hendrickx, François Glineur. Performance Estimation of second-order optimization methods on classes of univariate functions [EB/OL]. (2025-06-28) [2025-07-17]. https://arxiv.org/abs/2506.22764.