
TAH-QUANT: Effective Activation Quantization in Pipeline Parallelism over Slow Network


Source: arXiv
Abstract

Decentralized training of large language models offers the opportunity to pool computational resources across geographically distributed participants but faces significant network communication bottlenecks, particularly in pipeline-parallel settings. While pipeline parallelism partitions model layers across devices to handle large-scale models, it necessitates frequent communication of intermediate activations, creating challenges when network bandwidth is limited. Existing activation compression methods, such as AQ-SGD, mitigate quantization-induced errors through error compensation but impose prohibitive memory overhead by requiring storage of previous activations. To address these issues, we introduce TAH-Quant (Tile-wise Adaptive Hadamard Quantization), a novel activation quantization framework designed specifically for pipeline parallelism. Our approach integrates fine-grained tile-wise quantization for precise control, entropy-guided token-level adaptive bit allocation for optimal bit usage, and a Hadamard-based transform with pivot element swapping to effectively suppress quantization outliers. We further provide a theoretical analysis, proving that pipeline parallel training equipped with TAH-Quant maintains a convergence rate of $\mathcal{O}(1/\sqrt{T})$, matching that of vanilla stochastic gradient descent. Extensive experiments on diverse LLM tasks demonstrate that TAH-Quant achieves an aggressive activation quantization ratio (3-4 bits), providing up to 4.3$\times$ end-to-end speedup without compromising training convergence; it matches state-of-the-art methods, incurs no extra memory overhead, and generalizes well across different training scenarios.
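The abstract gives no code, so the following is only a minimal, hypothetical sketch (PyTorch-style Python) of the general idea it describes: split an activation matrix into tiles, rotate each tile with a Hadamard transform to spread outliers, and quantize each rotated tile to a few bits. All function names, the tile size, and the simple min-max quantizer are illustrative assumptions; the paper's entropy-guided token-level bit allocation and pivot element swapping are not reproduced here.

import torch

def hadamard_matrix(n: int) -> torch.Tensor:
    # Sylvester construction; n must be a power of two. Normalized so H is orthogonal.
    H = torch.ones(1, 1)
    while H.shape[0] < n:
        H = torch.cat([torch.cat([H, H], dim=1),
                       torch.cat([H, -H], dim=1)], dim=0)
    return H / (n ** 0.5)

def quantize_tile(tile: torch.Tensor, bits: int):
    # Uniform min-max quantization of one tile to integers in [0, 2^bits - 1].
    lo, hi = tile.min(), tile.max()
    scale = (hi - lo).clamp(min=1e-8) / (2 ** bits - 1)
    q = torch.round((tile - lo) / scale)
    return q, scale, lo

def dequantize_tile(q, scale, lo):
    return q * scale + lo

def tilewise_hadamard_quant(x: torch.Tensor, tile_size: int = 64, bits: int = 4):
    # 1) split x into tile_size x tile_size tiles, 2) rotate each tile with the
    # Hadamard transform to suppress outliers, 3) quantize each rotated tile.
    rows, cols = x.shape
    assert rows % tile_size == 0 and cols % tile_size == 0
    H = hadamard_matrix(tile_size)
    tiles = []
    for i in range(0, rows, tile_size):
        for j in range(0, cols, tile_size):
            t = x[i:i + tile_size, j:j + tile_size]
            t_rot = H @ t @ H.T                     # orthogonal rotation spreads outlier energy
            q, scale, lo = quantize_tile(t_rot, bits)
            tiles.append((i, j, q, scale, lo))
    return tiles, H

def tilewise_hadamard_dequant(tiles, H, shape, tile_size: int = 64):
    x_hat = torch.zeros(shape)
    for i, j, q, scale, lo in tiles:
        t_rot = dequantize_tile(q, scale, lo)
        x_hat[i:i + tile_size, j:j + tile_size] = H.T @ t_rot @ H   # H is orthogonal, so H.T undoes the rotation
    return x_hat

# Usage example on a fake activation matrix containing an outlier that plain
# per-tensor quantization would handle poorly.
x = torch.randn(256, 256)
x[0, 0] = 50.0
tiles, H = tilewise_hadamard_quant(x, tile_size=64, bits=4)
x_hat = tilewise_hadamard_dequant(tiles, H, x.shape, tile_size=64)
print("mean abs error:", (x - x_hat).abs().mean().item())

Because the rotation is orthogonal, quantization error introduced in the rotated domain is not amplified when the tile is rotated back, which is why such transforms are commonly used to tame activation outliers before low-bit quantization.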

Guangxin He, Yuan Cao, Yutong He, Tianyi Bai, Kun Yuan, Binhang Yuan

Computing technology; computer technology

Guangxin He, Yuan Cao, Yutong He, Tianyi Bai, Kun Yuan, Binhang Yuan. TAH-QUANT: Effective Activation Quantization in Pipeline Parallelism over Slow Network [EB/OL]. (2025-06-02) [2025-06-25]. https://arxiv.org/abs/2506.01352.
