
IMPACT: Importance-Aware Activation Space Reconstruction


Source: arXiv
Abstract

Large language models (LLMs) achieve strong performance across many domains but are difficult to deploy in resource-constrained settings due to their size. Low-rank weight matrix compression is a popular strategy for reducing model size, typically by minimizing weight reconstruction error under the assumption that weights are low-rank. However, this assumption often does not hold in LLMs. Instead, LLM activations exhibit stronger low-rank structure, prompting a shift toward minimizing activation reconstruction error. We show that this shift alone is insufficient: activation dimensions contribute unequally to model performance, and uniform reconstruction can harm performance. We propose IMPACT, a principled framework for importance-aware activation reconstruction that links model compression decisions to their impact on model behavior. IMPACT formulates an optimization problem that considers both activation structure and gradient sensitivity, and derives a closed-form solution where the optimal reconstruction bases are the eigenvectors of an importance-weighted activation covariance matrix. This enables low-rank approximations explicitly optimized to preserve accuracy. Experiments across diverse models and tasks show that IMPACT achieves up to 48.6% greater model size reduction with accuracy comparable to state-of-the-art baselines.
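To make the closed-form result concrete, below is a minimal NumPy sketch of one plausible reading of the abstract: per-dimension importance weights w (standing in for gradient sensitivity) define an importance-weighted activation covariance, and its top-r eigenvectors give the low-rank reconstruction basis. The names X, w, r and the synthetic data are illustrative assumptions, not the paper's actual formulation or implementation.

import numpy as np

rng = np.random.default_rng(0)
n, d, r = 2048, 64, 8

# Synthetic correlated "activations" (n samples, d dimensions) - placeholder data.
X = rng.standard_normal((n, d)) @ rng.standard_normal((d, d))

# Assumed per-dimension importance weights, standing in for gradient sensitivity.
w = rng.uniform(0.1, 2.0, size=d)
s = np.sqrt(w)

# Map activations into the importance-weighted space and form the
# importance-weighted activation covariance matrix.
Y = X * s
C = Y.T @ Y / n

# Top-r eigenvectors of the weighted covariance serve as the reconstruction basis.
eigvals, V = np.linalg.eigh(C)
V_r = V[:, np.argsort(eigvals)[::-1][:r]]

# Rank-r reconstruction, mapped back to the original activation space.
X_hat = ((Y @ V_r) @ V_r.T) / s

# Importance-weighted relative reconstruction error (the quantity being minimized).
err = np.linalg.norm((X - X_hat) * s) / np.linalg.norm(X * s)
print(f"weighted relative reconstruction error at rank {r}: {err:.3f}")

Setting w to all ones recovers plain activation-space PCA; uneven weights bias the basis toward dimensions whose reconstruction errors matter most for downstream accuracy, which is the intuition the abstract describes.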

Md Mokarram Chowdhury, Daniel Agyei Asante, Ernie Chang, Yang Li

Subject: Computing Technology, Computer Technology

Md Mokarram Chowdhury, Daniel Agyei Asante, Ernie Chang, Yang Li. IMPACT: Importance-Aware Activation Space Reconstruction [EB/OL]. (2025-07-04) [2025-07-16]. https://arxiv.org/abs/2507.03828.
