
WINA: Weight Informed Neuron Activation for Accelerating Large Language Model Inference

Source: arXiv

Abstract

The growing computational demands of large language models (LLMs) make efficient inference and activation strategies increasingly critical. While recent approaches such as Mixture-of-Experts (MoE) leverage selective activation, they require specialized training; training-free sparse activation methods, by contrast, offer broader applicability and superior resource efficiency through their plug-and-play design. However, many existing methods rely solely on hidden state magnitudes to determine activation, resulting in high approximation errors and suboptimal inference accuracy. To address these limitations, we propose WINA (Weight Informed Neuron Activation), a novel, simple, and training-free sparse activation framework that jointly considers hidden state magnitudes and the column-wise $\ell_2$-norms of weight matrices. We show that this leads to a sparsification strategy that obtains optimal approximation error bounds with theoretical guarantees tighter than existing techniques. Empirically, WINA also outperforms state-of-the-art methods (e.g., TEAL) by up to $2.94\%$ in average performance at the same sparsity levels, across a diverse set of LLM architectures and datasets. These results position WINA as a new performance frontier for training-free sparse activation in LLM inference, advancing training-free sparse activation methods and setting a robust baseline for efficient inference. The source code is available at https://github.com/microsoft/wina.
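To make the selection criterion concrete, below is a minimal PyTorch sketch of a WINA-style mask, not the authors' implementation (see https://github.com/microsoft/wina for that). It assumes a linear layer computing y = x @ W.T with W of shape (out_features, in_features), so the "column-wise" norm for input element i is ||W[:, i]||_2; the function name `wina_mask` and the `sparsity` parameter are illustrative choices.

```python
import torch

def wina_mask(x: torch.Tensor, weight: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Binary mask over the hidden state x: keep the entries with the largest
    joint score |x_i| * ||W[:, i]||_2 and zero out the rest (illustrative sketch)."""
    col_norms = weight.norm(dim=0)                    # ||W[:, i]||_2 for each input index i
    scores = x.abs() * col_norms                      # joint hidden-state / weight criterion
    k = max(1, int((1.0 - sparsity) * x.shape[-1]))   # number of entries to keep
    topk = scores.topk(k, dim=-1).indices
    mask = torch.zeros_like(x)
    mask.scatter_(-1, topk, 1.0)
    return mask

# Usage: sparsify the input to a linear layer before the matrix multiply.
x = torch.randn(1, 4096)
W = torch.randn(11008, 4096)
masked_x = x * wina_mask(x, W, sparsity=0.5)
y = masked_x @ W.T
```

The key difference from magnitude-only schemes is the `col_norms` factor: an input element with a small magnitude can still be kept if the corresponding weight column is large, since it contributes substantially to the layer output.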

Sihan Chen, Dan Zhao, Jongwoo Ko, Colby Banbury, Huiping Zhuang, Luming Liang, Tianyi Chen

Subjects: Computing Technology; Computer Technology

Sihan Chen, Dan Zhao, Jongwoo Ko, Colby Banbury, Huiping Zhuang, Luming Liang, Tianyi Chen. WINA: Weight Informed Neuron Activation for Accelerating Large Language Model Inference [EB/OL]. (2025-05-25) [2025-06-07]. https://arxiv.org/abs/2505.19427.
