Quantizing Small-Scale State-Space Models for Edge AI

Source: arXiv

Abstract

State-space models (SSMs) have recently gained attention in deep learning for their ability to efficiently model long-range dependencies, making them promising candidates for edge-AI applications. In this paper, we analyze the effects of quantization on small-scale SSMs, focusing on reducing memory and computational costs while maintaining task performance. Using the S4D architecture, we first investigate post-training quantization (PTQ) and show that the state matrix A and internal state x are particularly sensitive to quantization. We then analyze the impact of different quantization techniques applied to the parameters and activations of the S4D architecture. To address the performance drop observed after PTQ, we apply quantization-aware training (QAT), which raises accuracy on the sequential MNIST benchmark at 8-bit precision from 40% (PTQ) to 96%. We further demonstrate the potential of QAT for enabling sub-8-bit precision and evaluate different parameterization schemes for QAT stability. Additionally, we propose a heterogeneous quantization strategy that assigns different precision levels to model components, reducing the overall memory footprint by a factor of 6 without sacrificing performance. Our results provide actionable insights for deploying quantized SSMs in resource-constrained environments.
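
As a rough illustration of the quantization regimes discussed in the abstract, the sketch below implements symmetric uniform fake-quantization with a straight-through estimator in PyTorch. Applying fake_quantize once to trained weights corresponds to PTQ; keeping it in the forward pass during training corresponds to QAT. This is a minimal sketch under stated assumptions, not the authors' implementation: the names fake_quantize and BITS are hypothetical, the per-component bit-widths are placeholders (the paper only reports that A and x are especially sensitive), and the real-valued diagonal recurrence simplifies S4D, whose state matrices are complex-valued.

    import torch

    def fake_quantize(t: torch.Tensor, bits: int) -> torch.Tensor:
        # Symmetric uniform quantizer; the scale is derived per tensor
        # from the maximum magnitude (a common PTQ calibration choice).
        qmax = 2 ** (bits - 1) - 1
        scale = t.detach().abs().max().clamp(min=1e-8) / qmax
        q = torch.clamp(torch.round(t / scale), -qmax - 1, qmax) * scale
        # Straight-through estimator: the forward pass sees the quantized
        # value q, while gradients flow to t unchanged, enabling QAT.
        return t + (q - t).detach()

    # Hypothetical heterogeneous bit allocation: more bits for the sensitive
    # state matrix A and internal state x, fewer for the other parameters.
    BITS = {"A": 8, "x": 8, "B": 4, "C": 4, "D": 4}

    # One step of a simplified (real-valued, diagonal) SSM recurrence:
    #   x' = A * x + B * u,   y = C * x + D * u
    n = 64
    A = torch.rand(n) * 0.99          # stable diagonal entries in [0, 1)
    B, C, D = torch.randn(n), torch.randn(n), torch.randn(1)
    x, u = torch.zeros(n), torch.randn(1)
    x = fake_quantize(A, BITS["A"]) * x + fake_quantize(B, BITS["B"]) * u
    x = fake_quantize(x, BITS["x"])   # the internal state is quantized too
    y = (fake_quantize(C, BITS["C"]) * x).sum() + fake_quantize(D, BITS["D"]) * u

One design note on this sketch: with per-tensor max scaling, quantizing the running state x under PTQ requires calibration data to estimate its dynamic range, whereas QAT trains the weights under the quantizer and adapts to it, which is consistent with the large accuracy gap (40% vs 96%) reported above.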

Leo Zhao, Tristan Torchet, Melika Payvand, Laura Kriener, Filippo Moro

Computing Technology, Computer Technology

Leo Zhao, Tristan Torchet, Melika Payvand, Laura Kriener, Filippo Moro. Quantizing Small-Scale State-Space Models for Edge AI [EB/OL]. (2025-06-14) [2025-06-28]. https://arxiv.org/abs/2506.12480.
