Empirical and computer-aided robustness analysis of long-step and accelerated methods in smooth convex optimization

Source: arXiv

Abstract

This work assesses both empirically and theoretically, using the performance estimation methodology, how robust different first-order optimization methods are when subject to relative inexactness in their gradient computations. Relative inexactness occurs, for example, when compressing the gradient using fewer bits of information, which happens when dealing with large-scale problems on GPUs. Three major families of methods are analyzed: constant step gradient descent, long-step methods, and accelerated methods. The latter two are first shown to be theoretically not robust to inexactness. Then, a semi-heuristic shortening factor is introduced to improve their theoretical guarantees. All methods are subsequently tested on a concrete inexact problem, with two different types of relative inexactness, and it is observed that both accelerated methods are much more robust than expected, and that the shortening factor significantly helps the long-step methods. In the end, all shortened methods appear to be promising, even in this inexact setting.
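
As a rough illustration of the setting described in the abstract (not the paper's code), the sketch below runs constant-step gradient descent on a smooth convex quadratic while perturbing each gradient under a standard relative-inexactness model, ||g̃ - ∇f(x)|| ≤ ε·||∇f(x)||. The quadratic objective, the step size 1/L, and the random noise model are illustrative assumptions only.

```python
import numpy as np

def inexact_gradient(grad, eps, rng):
    """Return g_tilde with ||g_tilde - grad|| <= eps * ||grad|| (relative inexactness)."""
    noise = rng.standard_normal(grad.shape)
    noise *= eps * np.linalg.norm(grad) / (np.linalg.norm(noise) + 1e-16)
    return grad + noise

def gradient_descent_inexact(A, b, x0, L, eps, n_iters, seed=0):
    """Constant-step gradient descent (step 1/L) on f(x) = 0.5 x'Ax - b'x
    using relatively inexact gradients."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(n_iters):
        grad = A @ x - b                      # exact gradient of the quadratic
        g = inexact_gradient(grad, eps, rng)  # gradient corrupted by relative error
        x = x - g / L                         # classical step size 1/L
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 50
    M = rng.standard_normal((n, n))
    A = M.T @ M / n + np.eye(n)               # symmetric positive definite: smooth, convex
    b = rng.standard_normal(n)
    L = np.linalg.eigvalsh(A).max()           # smoothness constant of f
    x_star = np.linalg.solve(A, b)            # exact minimizer for comparison
    x = gradient_descent_inexact(A, b, np.zeros(n), L, eps=0.3, n_iters=500)
    print("distance to optimum:", np.linalg.norm(x - x_star))
```

Varying eps in this toy setup gives a simple empirical feel for how robustness to relative gradient error might be probed; the paper's actual experiments and the long-step and accelerated variants are more involved.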

Pierre Vernimmen, François Glineur

Computing technology, computer technology

Pierre Vernimmen, François Glineur. Empirical and computer-aided robustness analysis of long-step and accelerated methods in smooth convex optimization [EB/OL]. (2025-06-11) [2025-06-30]. https://arxiv.org/abs/2506.09730.
