Cooperative Gradient Coding
This work studies gradient coding (GC) in the context of distributed training with unreliable communication. We propose cooperative GC (CoGC), a novel gradient-sharing-based GC framework that leverages cooperative communication among clients. This approach eliminates the need for dataset replication, making it both communication- and computation-efficient and well suited to federated learning (FL). By employing the standard GC decoding mechanism, CoGC yields strictly binary outcomes: either the global model is recovered exactly, or decoding fails entirely, with no intermediate results. This characteristic ensures the optimality of training and provides strong resilience to client-to-server communication failures when the communication channels among clients are in good condition. However, it may also cause communication inefficiency and hinder convergence due to its lack of flexibility, especially when the channels among clients are in poor condition. To overcome this limitation and further harness the potential of GC matrices, we propose a complementary decoding mechanism, termed GC$^+$, which leverages information that would otherwise be discarded upon GC decoding failures. This approach significantly improves system reliability under unreliable communication, as full recovery of the global model remains the dominant outcome under GC$^+$. In addition, this work establishes solid theoretical foundations for both CoGC and GC$^+$: we provide complete outage analyses for each decoding mechanism, along with a rigorous investigation of how outages affect the structure and performance of GC matrices, and, building on these analyses, we derive convergence bounds for both decoding mechanisms. Finally, the effectiveness of CoGC and GC$^+$ is validated through extensive simulations.
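For context, the binary-outcome property of standard GC decoding referred to in the abstract can be sketched as follows; the notation below is generic GC notation used purely for illustration and is not necessarily that of the paper. Let $\mathbf{G}$ stack the local gradients $\mathbf{g}_1,\dots,\mathbf{g}_N$, let $\mathbf{B}$ be the GC encoding matrix whose $i$-th row defines the coded message $\mathbf{b}_i^\top\mathbf{G}$ contributed by client $i$, and let $\mathcal{F}$ index the coded messages that actually reach the server. Standard GC decoding then yields
\[
\hat{\mathbf{g}} \;=\;
\begin{cases}
\mathbf{a}_{\mathcal{F}}^{\top}\mathbf{B}_{\mathcal{F}}\,\mathbf{G} \;=\; \sum_{k=1}^{N}\mathbf{g}_k, & \text{if some } \mathbf{a}_{\mathcal{F}} \text{ satisfies } \mathbf{a}_{\mathcal{F}}^{\top}\mathbf{B}_{\mathcal{F}} = \mathbf{1}^{\top},\\[2pt]
\text{failure (no partial result)}, & \text{otherwise},
\end{cases}
\]
that is, the global update is recovered exactly whenever the surviving rows $\mathbf{B}_{\mathcal{F}}$ can be linearly combined into the all-ones vector, and nothing is recovered otherwise.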
Shudi Weng, Ming Xiao, Chao Ren, Mikael Skoglund
Communications; Wireless Communications
Shudi Weng, Ming Xiao, Chao Ren, Mikael Skoglund. Cooperative Gradient Coding [EB/OL]. (2025-07-07) [2025-07-21]. https://arxiv.org/abs/2507.05230.