Supplementary File: Cooperative Gradient Coding for Semi-Decentralized Federated Learning
Straggler effects are known to degrade the performance of federated learning (FL). In this paper, we investigate FL over wireless networks in the presence of communication stragglers, where power-constrained clients collaboratively train a global model by iteratively optimizing local objective functions on their local datasets and transmitting local model updates to the central parameter server (PS) through fading channels. To mitigate communication stragglers without dataset sharing or prior knowledge of the network at the PS, we propose cooperative gradient coding (CoGC) for semi-decentralized FL, which enables exact global model recovery at the PS. We further provide a thorough theoretical analysis of the proposed approach: an outage analysis, followed by a convergence analysis based on the failure probability of global model recovery at the PS. Finally, simulation results demonstrate the superiority of the proposed approach in the presence of stragglers under imbalanced data distributions.
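The abstract does not spell out the CoGC encoding, but the exact-recovery guarantee it cites rests on the gradient coding principle: each worker transmits a coded combination of partition gradients so that the PS recovers the exact global gradient from any large-enough subset of non-straggling workers. Below is a minimal Python sketch of classical gradient coding (in the style of Tandon et al.) for 3 workers tolerating one straggler; the toy least-squares task, the encoding matrix B, and all names are illustrative assumptions, not the paper's CoGC construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares task split into 3 partitions; the global gradient
# is the sum of the per-partition gradients g1 + g2 + g3.
w = rng.standard_normal(5)
parts = [(rng.standard_normal((4, 5)), rng.standard_normal(4)) for _ in range(3)]

def grad(X, y, w):
    # Gradient of 0.5 * ||X w - y||^2 on one data partition.
    return X.T @ (X @ w - y)

g = np.stack([grad(X, y, w) for X, y in parts])  # rows: g1, g2, g3

# Encoding matrix: each worker sends one coded combination, and ANY 2
# of the 3 messages determine g1 + g2 + g3 exactly.
B = np.array([[0.5, 1.0,  0.0],   # worker 1 sends g1/2 + g2
              [0.0, 1.0, -1.0],   # worker 2 sends g2 - g3
              [0.5, 0.0,  1.0]])  # worker 3 sends g1/2 + g3
coded = B @ g

# Suppose worker 3 straggles: the PS decodes from workers 1 and 2 by
# solving for a decoding vector a with a^T B[survivors] = (1, 1, 1).
survivors = [0, 1]
a, *_ = np.linalg.lstsq(B[survivors].T, np.ones(3), rcond=None)
recovered = a @ coded[survivors]

assert np.allclose(recovered, g.sum(axis=0))  # exact global gradient
```

In this classical setting the decoder lives at the PS; per the abstract, CoGC adapts the idea to a semi-decentralized setting with client cooperation over fading channels, so that exact recovery holds without dataset sharing or prior network information at the PS.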
Chengxi Li, Shudi Weng, Ming Xiao, Mikael Skoglund
Subjects: Wireless communications; communications and computing technology; computer technology
Chengxi Li, Shudi Weng, Ming Xiao, Mikael Skoglund. Supplementary File: Cooperative Gradient Coding for Semi-Decentralized Federated Learning [EB/OL]. (2024-03-31) [2025-08-02]. https://arxiv.org/abs/2404.00780.