Accelerated Gradient Methods Through Variable and Operator Splitting

Source: arXiv

Abstract

This paper introduces a unified framework for accelerated gradient methods through variable and operator splitting (VOS). The operator splitting decouples the optimization process into simpler subproblems, and, more importantly, the variable splitting leads to acceleration. The key contributions include the development of strong Lyapunov functions to analyze stability and convergence rates, as well as advanced discretization techniques such as Accelerated Over-Relaxation (AOR) and extrapolation by predictor-corrector methods (EPC). For the convex case, we introduce a dynamically updated parameter and a perturbed VOS flow. The framework effectively handles a wide range of optimization problems, including convex optimization, composite convex optimization, and saddle point systems with bilinear coupling.
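
As an illustrative sketch (not taken from the paper itself), the variable splitting referred to above typically rewrites a second-order accelerated gradient flow as a first-order system in the primal variable $x$ and an auxiliary variable $v$. For a $\mu$-strongly convex objective $f$, one common form in this line of work is

\[
x' = v - x, \qquad v' = x - v - \frac{1}{\mu}\nabla f(x),
\]

along which the Lyapunov function $\mathcal{L}(x,v) = f(x) - f(x^\star) + \frac{\mu}{2}\|v - x^\star\|^2$ satisfies $\mathcal{L}' \le -\mathcal{L}$, giving exponential decay along the continuous flow; accelerated rates for discrete methods then follow from discretizations that preserve this decay. The specific VOS flow, its parameters, and the AOR and EPC discretizations are defined in the paper itself.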

Jingrong Wei, Long Chen, Luo Hao

Subject: Fundamental Theory of Automation

Jingrong Wei, Long Chen, Luo Hao. Accelerated Gradient Methods Through Variable and Operator Splitting[EB/OL]. (2025-05-06)[2025-06-18]. https://arxiv.org/abs/2505.04065.
