
Trustworthy Efficient Communication for Distributed Learning using LQ-SGD Algorithm

Source: arXiv
Abstract

We propose LQ-SGD (Low-Rank Quantized Stochastic Gradient Descent), a communication-efficient gradient compression algorithm designed for distributed training. LQ-SGD builds on PowerSGD by incorporating low-rank approximation and log-quantization techniques, which drastically reduce communication overhead while preserving training convergence speed and model accuracy. In addition, LQ-SGD and other compression-based methods show stronger resistance to gradient-inversion attacks than traditional SGD, providing a more robust and efficient optimization path for distributed learning systems.
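
The paper appears here only at the level of its abstract, so the following is a minimal NumPy sketch of the two ingredients the abstract names: a PowerSGD-style low-rank approximation (a single power iteration producing factors P and Q) followed by log-quantization of those factors. All function names and parameters are illustrative assumptions, and the quantization scheme shown (storing a sign plus a log2 exponent rounded to the nearest integer) is one plausible reading of "log-quantization", not the authors' implementation.

import numpy as np

def log_quantize(x, eps=1e-12):
    # Hypothetical log-quantizer: keep the sign and the log2 magnitude
    # rounded to the nearest integer, each stored as an int8.
    sign = np.sign(x).astype(np.int8)
    exp = np.round(np.log2(np.maximum(np.abs(x), eps))).astype(np.int8)
    return sign, exp

def log_dequantize(sign_exp):
    sign, exp = sign_exp
    return sign * np.exp2(exp.astype(np.float32))

def compress(grad, rank=8):
    # PowerSGD-style single power iteration: grad (m x n) is approximated
    # as P @ Q.T with P (m x rank) and Q (n x rank), then both factors
    # are log-quantized before communication.
    m, n = grad.shape
    q = np.random.randn(n, rank)     # random right factor to start
    p = grad @ q                     # (m, rank)
    p, _ = np.linalg.qr(p)           # orthonormalize the columns of P
    q = grad.T @ p                   # (n, rank), so grad ~ P @ Q.T
    return log_quantize(p), log_quantize(q)

def decompress(p_q, q_q):
    return log_dequantize(p_q) @ log_dequantize(q_q).T

# Example: compress a synthetic gradient and check the reconstruction error.
rng = np.random.default_rng(0)
g = rng.standard_normal((256, 128)).astype(np.float32)
p_q, q_q = compress(g, rank=8)
g_hat = decompress(p_q, q_q)
print(f"relative error: {np.linalg.norm(g - g_hat) / np.linalg.norm(g):.3f}")

In a real distributed run, only the small quantized factors would be exchanged between workers rather than the full gradient matrix, which is where the communication savings come from; PowerSGD-style schemes typically also accumulate the local compression residual as error feedback to keep convergence on track.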

Hongyang Li, Lincen Bai, Caesar Wu, Mohammed Chadli, Said Mammar, Pascal Bouvry

Subjects: Communication; Wireless Communication

Hongyang Li, Lincen Bai, Caesar Wu, Mohammed Chadli, Said Mammar, Pascal Bouvry. Trustworthy Efficient Communication for Distributed Learning using LQ-SGD Algorithm [EB/OL]. (2025-06-22) [2025-07-16]. https://arxiv.org/abs/2506.17974.
