National Preprint Platform

The Panaceas for Improving Low-Rank Decomposition in Communication-Efficient Federated Learning

Source: arXiv

Abstract

To improve the training efficiency of federated learning (FL), previous research has employed low-rank decomposition techniques to reduce communication overhead. In this paper, we seek to enhance the performance of these low-rank decomposition methods. Specifically, we focus on three key issues related to decomposition in FL: what to decompose, how to decompose, and how to aggregate. Subsequently, we introduce three novel techniques: Model Update Decomposition (MUD), Block-wise Kronecker Decomposition (BKD), and Aggregation-Aware Decomposition (AAD), each targeting a specific issue. These techniques are complementary and can be applied simultaneously to achieve optimal performance. Additionally, we provide a rigorous theoretical analysis to ensure the convergence of the proposed MUD. Extensive experimental results show that our approach achieves faster convergence and superior accuracy compared to relevant baseline methods. The code is available at https://github.com/Leopold1423/fedmud-icml25.
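To illustrate the general idea behind low-rank decomposition for reducing communication overhead in FL (a minimal sketch of the generic technique, not the paper's specific MUD/BKD/AAD methods; all function and variable names below are hypothetical):

```python
import numpy as np

def compress_update(delta_w: np.ndarray, rank: int):
    """Approximate a model update delta_w by rank-r factors A, B with
    delta_w ≈ A @ B, so the client sends the factors instead of the
    full matrix. Uses a truncated SVD; the paper's actual decomposition
    schemes differ."""
    u, s, vt = np.linalg.svd(delta_w, full_matrices=False)
    a = u[:, :rank] * s[:rank]   # shape (m, r): left factor scaled by singular values
    b = vt[:rank, :]             # shape (r, n): right factor
    return a, b

# Toy example: a 512x256 layer update compressed to rank 8.
m, n, r = 512, 256, 8
delta_w = np.random.randn(m, n)
a, b = compress_update(delta_w, r)

full_cost = delta_w.size          # floats transmitted without compression
low_rank_cost = a.size + b.size   # floats transmitted with rank-r factors
print(full_cost, low_rank_cost)   # 131072 vs 6144: ~21x fewer parameters sent
```

The communication saving grows with the matrix dimensions: sending the factors costs r(m + n) floats instead of mn. The paper's three techniques address, respectively, which quantity to decompose this way, how to structure the factorization, and how the server should aggregate the received factors.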

Shiwei Li, Xiandi Luo, Haozhao Wang, Xing Tang, Shijie Xu, Weihong Luo, Yuhua Li, Xiuqiang He, Ruixuan Li

Subject: Communications and Electronic Technology Applications

Shiwei Li, Xiandi Luo, Haozhao Wang, Xing Tang, Shijie Xu, Weihong Luo, Yuhua Li, Xiuqiang He, Ruixuan Li. The Panaceas for Improving Low-Rank Decomposition in Communication-Efficient Federated Learning [EB/OL]. (2025-05-29) [2025-06-14]. https://arxiv.org/abs/2505.23176.
