X-EcoMLA: Upcycling Pre-Trained Attention into MLA for Efficient and Extreme KV Compression

Source: arXiv

Abstract

Multi-head latent attention (MLA) is designed to optimize KV cache memory through low-rank key-value joint compression. Rather than caching keys and values separately, MLA stores their compressed latent representations, reducing memory overhead while maintaining performance. While MLA improves memory efficiency without compromising language model accuracy, its major limitation is that it must be integrated during the pre-training phase, requiring models to be trained from scratch. This raises a key question: can we exploit MLA's benefits, fully or partially, in models that have already been pre-trained with different attention mechanisms? In this paper, we propose X-EcoMLA, which applies post-training distillation to upcycle Transformer-based attention into an efficient hybrid MLA variant through lightweight post-training adaptation, bypassing the need for extensive pre-training. We demonstrate that leveraging the dark knowledge of a well-trained model can enhance training accuracy and enable extreme KV cache compression in MLA without compromising model performance. The experimental results show that our proposed method can effectively compress the KV cache while preserving performance on the benchmarks; specifically, for the Llama3.2-1B-Instruct baseline, a 6.4x compression achieves the same average score using only 3.6B training tokens and 70 GPU hours on AMD MI300, whereas a 10.6x compression has less than a 0.1% average score drop with 7B training tokens and 140 GPU hours.
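
To make the compression idea concrete, here is a minimal, illustrative PyTorch sketch of low-rank joint KV compression in the MLA style: a single narrow latent vector per token is cached, and keys and values are reconstructed from it by up-projection at attention time. The class name `MLASketch`, all dimensions, and the cache interface are assumptions chosen for illustration; this is not the paper's X-EcoMLA implementation or its upcycling/distillation procedure.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MLASketch(nn.Module):
    """Sketch of MLA-style low-rank joint KV compression.

    Instead of caching per-head keys and values (2 * d_model values per
    token), only a d_latent-wide latent is cached; K and V are recovered
    from it with up-projections at attention time.
    """

    def __init__(self, d_model: int = 2048, n_heads: int = 32, d_latent: int = 128):
        super().__init__()
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        # Down-projection: hidden state -> compressed KV latent (this is what gets cached).
        self.w_down_kv = nn.Linear(d_model, d_latent, bias=False)
        # Up-projections: latent -> full-width keys and values.
        self.w_up_k = nn.Linear(d_latent, d_model, bias=False)
        self.w_up_v = nn.Linear(d_latent, d_model, bias=False)
        self.w_q = nn.Linear(d_model, d_model, bias=False)
        self.w_o = nn.Linear(d_model, d_model, bias=False)

    def forward(self, x: torch.Tensor, kv_cache: torch.Tensor | None = None):
        b, t, d = x.shape
        latent = self.w_down_kv(x)                       # (b, t, d_latent)
        if kv_cache is not None:
            # Append new latents to the cached ones from earlier tokens.
            latent = torch.cat([kv_cache, latent], dim=1)
        k = self.w_up_k(latent)                          # (b, T, d_model)
        v = self.w_up_v(latent)
        q = self.w_q(x)                                  # (b, t, d_model)

        def split(z: torch.Tensor) -> torch.Tensor:
            # (b, T, d_model) -> (b, n_heads, T, d_head)
            return z.view(b, -1, self.n_heads, self.d_head).transpose(1, 2)

        # Causal masking only for the prefill pass; single-token decoding
        # attends to the whole cache (a simplification for this sketch).
        attn = F.scaled_dot_product_attention(
            split(q), split(k), split(v), is_causal=kv_cache is None)
        out = attn.transpose(1, 2).reshape(b, t, d)
        return self.w_o(out), latent  # latent is the new KV cache


if __name__ == "__main__":
    mla = MLASketch()
    y, cache = mla(torch.randn(1, 16, 2048))   # prefill
    y2, cache = mla(torch.randn(1, 1, 2048), kv_cache=cache)  # decode step
```

In this toy configuration the per-token cache shrinks from 2 x 2048 values (separate K and V) to 128, so the compression ratio is set by the latent width; the 6.4x and 10.6x ratios reported in the abstract correspond to X-EcoMLA's own configuration choices.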

Guihong Li, Mehdi Rezagholizadeh, Mingyu Yang, Vikram Appia, Emad Barsoum

Subjects: Computing Technology, Computer Technology

Guihong Li, Mehdi Rezagholizadeh, Mingyu Yang, Vikram Appia, Emad Barsoum. X-EcoMLA: Upcycling Pre-Trained Attention into MLA for Efficient and Extreme KV Compression [EB/OL]. (2025-03-14) [2025-05-02]. https://arxiv.org/abs/2503.11132.
