
Distributed Composite Optimization with Sub-Weibull Noises


Source: arXiv
Abstract

With the rapid development of multi-agent distributed optimization (MA-DO) theory over the past decade, the distributed stochastic gradient method (DSGM) has come to occupy an important position. Although the theory of various DSGMs is well established, the mainstream results of existing work are still derived under the assumption of light-tailed stochastic gradient noise. A growing number of recent examples from various fields indicate that the light-tailed noise model is overly idealized in many practical instances, failing to capture the complexity and variability of noise in real-world scenarios, such as the presence of outliers or extreme values in data science and statistical learning. To address this issue, we propose a new DSGM framework that incorporates stochastic gradients under sub-Weibull randomness. We study a distributed composite stochastic mirror descent scheme with sub-Weibull gradient noise (DCSMD-SW) for solving a distributed composite optimization (DCO) problem over a time-varying multi-agent network. By investigating sub-Weibull randomness in DCSMD for the first time, we show that the algorithm is applicable in common heavy-tailed noise environments while still guaranteeing good convergence properties. We comprehensively study the convergence performance of DCSMD-SW and derive satisfactory high-probability convergence rates without any smoothness requirement. The work also offers a unified analysis framework covering several important cases of both algorithms and noise environments.
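To make the noise model concrete, here is a hedged single-agent sketch, not the paper's algorithm: DCSMD-SW is a *distributed* composite stochastic mirror descent over a time-varying network, whereas this toy uses one agent with the Euclidean Bregman divergence, under which the mirror step reduces to a proximal stochastic gradient step. The gradient oracle is corrupted by symmetric noise with Weibull tails, a simple instance of a sub-Weibull random variable; the step sizes, tail parameter, and test problem are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sub_weibull_noise(theta, size, rng):
    """Symmetric noise with Weibull tails P(|X| > t) = exp(-t^(1/theta));
    theta > 1 gives heavier-than-exponential (heavy) tails."""
    magnitude = rng.weibull(1.0 / theta, size=size)
    sign = rng.choice([-1.0, 1.0], size=size)
    return sign * magnitude

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1, the nonsmooth composite part."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def noisy_mirror_step(x, grad_f, eta, lam, theta, rng):
    """One step on f(x) + lam*||x||_1 with a sub-Weibull gradient oracle.
    With the Euclidean mirror map this is a proximal SGD step."""
    g = grad_f(x) + sub_weibull_noise(theta, x.shape, rng)
    return soft_threshold(x - eta * g, eta * lam)

# Toy composite problem: minimize 0.5*||x - b||^2 + lam*||x||_1,
# whose exact solution is soft_threshold(b, lam).
b = np.array([3.0, -2.0, 0.1])
lam, theta = 0.5, 2.0          # theta = 2: noticeably heavy-tailed noise
x = np.zeros(3)
x_avg, n_avg = np.zeros(3), 0
for t in range(1, 5001):
    x = noisy_mirror_step(x, lambda z: z - b, 1.0 / np.sqrt(t), lam, theta, rng)
    if t > 2500:               # average late iterates to tame the noise
        x_avg += x
        n_avg += 1
x_avg /= n_avg
print(x_avg)  # approaches soft_threshold(b, lam) = [2.5, -1.5, 0.0]
```

Despite individual gradient samples occasionally being very large (the heavy-tailed regime the paper targets), the averaged iterate still lands near the true solution, which is the qualitative behavior the high-probability rates in the paper formalize.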

Zhan Yu, Zhongjie Shi, Deming Yuan

Subject: Computing Technology; Computer Technology

Zhan Yu, Zhongjie Shi, Deming Yuan. Distributed Composite Optimization with Sub-Weibull Noises [EB/OL]. (2025-06-15) [2025-06-30]. https://arxiv.org/abs/2506.12901.
