National Preprint Platform

Efficient Unstructured Pruning of Mamba State-Space Models for Resource-Constrained Environments


Source: arXiv

Abstract

State-space models (SSMs), particularly the Mamba architecture, have emerged as powerful alternatives to Transformers for sequence modeling, offering linear-time complexity and competitive performance across diverse tasks. However, their large parameter counts pose significant challenges for deployment in resource-constrained environments. We propose a novel unstructured pruning framework tailored for Mamba models that achieves up to 70% parameter reduction while retaining over 95% of the original performance. Our approach integrates three key innovations: (1) a gradient-aware magnitude pruning technique that combines weight magnitude and gradient information to identify less critical parameters, (2) an iterative pruning schedule that gradually increases sparsity to maintain model stability, and (3) a global pruning strategy that optimizes parameter allocation across the entire model. Through extensive experiments on WikiText-103, Long Range Arena, and ETT time-series benchmarks, we demonstrate significant efficiency gains with minimal performance degradation. Our analysis of pruning effects on Mamba's components reveals critical insights into the architecture's redundancy and robustness, enabling practical deployment in resource-constrained settings while broadening Mamba's applicability.
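The three ingredients named in the abstract (a magnitude-times-gradient saliency score, a gradually increasing sparsity schedule, and a single global threshold across all layers) can be sketched as follows. This is a minimal NumPy illustration under assumed details, not the authors' implementation: the exact saliency formula, schedule shape (a cubic ramp is used here), and final sparsity target are assumptions.

```python
import numpy as np

def saliency(weights, grads):
    # Assumed gradient-aware score: parameters that are both small in
    # magnitude and have small gradients are considered least critical.
    return np.abs(weights) * np.abs(grads)

def global_prune(params, grads, sparsity):
    # Global strategy: rank every parameter in the model jointly and
    # zero out the lowest-scoring fraction, rather than pruning each
    # layer to the same sparsity independently.
    scores = np.concatenate([saliency(w, g).ravel()
                             for w, g in zip(params, grads)])
    k = int(sparsity * scores.size)
    threshold = np.partition(scores, k)[k] if k > 0 else -np.inf
    masks = [saliency(w, g) >= threshold for w, g in zip(params, grads)]
    return [w * m for w, m in zip(params, masks)], masks

def sparsity_schedule(step, total_steps, final_sparsity=0.7):
    # Iterative schedule: ramp sparsity up gradually (cubic here, an
    # assumption) so the model can recover between pruning rounds.
    t = min(step / total_steps, 1.0)
    return final_sparsity * (1.0 - (1.0 - t) ** 3)
```

In an actual pruning loop, `sparsity_schedule` would be called once per pruning round, `global_prune` applied with the resulting target, and the model fine-tuned for some steps before the next round, with the masks kept fixed so pruned weights stay zero.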

Ibne Farabi Shihab, Sanjeda Akter, Anuj Sharma

Subject: Computing Technology; Computer Technology

Ibne Farabi Shihab, Sanjeda Akter, Anuj Sharma. Efficient Unstructured Pruning of Mamba State-Space Models for Resource-Constrained Environments [EB/OL]. (2025-05-13) [2025-06-08]. https://arxiv.org/abs/2505.08299.