Jointly Computation- and Communication-Efficient Distributed Learning
We address distributed learning problems over undirected networks. Specifically, we focus on designing a novel ADMM-based algorithm that is jointly computation- and communication-efficient. Our design guarantees computational efficiency by allowing agents to use stochastic gradients during local training. Moreover, communication efficiency is achieved as follows: i) the agents perform multiple training epochs between communication rounds, and ii) compressed transmissions are used. We prove exact linear convergence of the algorithm in the strongly convex setting. We corroborate our theoretical results with numerical comparisons against state-of-the-art techniques on a classification task.
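As an illustration of the general pattern the abstract describes (local stochastic-gradient epochs interleaved with compressed exchanges between agents), below is a minimal Python sketch. It is not the authors' ADMM algorithm: the two-agent topology, least-squares losses, top-k sparsifying compressor, and all step sizes are assumptions chosen only to make the pattern concrete.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 100
w_true = rng.normal(size=d)
# Two agents, each holding a private local dataset (assumed least-squares setup).
Xs = [rng.normal(size=(n, d)) for _ in range(2)]
ys = [X @ w_true + 0.01 * rng.normal(size=n) for X in Xs]

def top_k(v, k):
    """Keep the k largest-magnitude entries of v, zero the rest
    (a common sparsifying compressor; an assumption, not the paper's choice)."""
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

def local_epochs(w, X, y, epochs=3, lr=0.05, batch=20):
    """Several stochastic-gradient epochs on the local loss:
    ingredient (i), multiple training epochs between communication rounds."""
    for _ in range(epochs):
        for s in range(0, len(y), batch):
            Xb, yb = X[s:s + batch], y[s:s + batch]
            w = w - lr * Xb.T @ (Xb @ w - yb) / len(yb)
    return w

w = [np.zeros(d) for _ in range(2)]
for _ in range(50):  # communication rounds
    # (i) local training with stochastic gradients
    w = [local_epochs(w[i], Xs[i], ys[i]) for i in range(2)]
    # (ii) compressed transmission: each agent sends a sparsified model
    msgs = [top_k(wi, k=3) for wi in w]
    # simple consensus-style mixing with the neighbor's compressed message;
    # an ADMM scheme like the paper's would instead update dual variables,
    # and practical compressed methods often add error feedback (omitted here).
    w = [0.5 * (w[i] + msgs[1 - i]) for i in range(2)]

print([round(float(np.linalg.norm(wi - w_true)), 3) for wi in w])
```

Without error feedback, the top-k compression leaves a residual bias, so this toy scheme only approaches the true model; the paper's contribution is precisely an algorithm that retains exact linear convergence despite compression and multiple local epochs.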
Xiaoxing Ren, Nicola Bastianello, Karl H. Johansson, Thomas Parisini
Communication and computing technology; computer technology
Xiaoxing Ren, Nicola Bastianello, Karl H. Johansson, Thomas Parisini. Jointly Computation- and Communication-Efficient Distributed Learning [EB/OL]. (2025-08-21) [2025-09-02]. https://arxiv.org/abs/2508.15509