Block-Biased Mamba for Long-Range Sequence Processing

Source: arXiv
English Abstract

Mamba extends earlier state space models (SSMs) by introducing input-dependent dynamics, and has demonstrated strong empirical performance across a range of domains, including language modeling, computer vision, and foundation models. However, a surprising weakness remains: despite being built on architectures designed for long-range dependencies, Mamba performs poorly on long-range sequential tasks. Understanding and addressing this gap is important for improving Mamba's universality and versatility. In this work, we analyze Mamba's limitations through three perspectives: expressiveness, inductive bias, and training stability. Our theoretical results show how Mamba falls short in each of these aspects compared to earlier SSMs such as S4D. To address these issues, we propose $\text{B}_2\text{S}_6$, a simple extension of Mamba's S6 unit that combines block-wise selective dynamics with a channel-specific bias. We prove that these changes equip the model with a better-suited inductive bias and improve its expressiveness and stability. Empirically, $\text{B}_2\text{S}_6$ outperforms S4 and S4D on Long-Range Arena (LRA) tasks while maintaining Mamba's performance on language modeling benchmarks.
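The abstract describes $\text{B}_2\text{S}_6$ only at a high level. As an illustration of what "block-wise selective dynamics with a channel-specific bias" could look like in practice, below is a minimal PyTorch sketch of a diagonal selective state-space layer. The class name `BlockBiasedSSM`, the shapes, the choice to share the input-dependent selection projections (dt, B, C) within blocks of channels, and the learned per-channel output bias are all assumptions made for illustration; this is not the paper's exact $\text{B}_2\text{S}_6$ formulation.

```python
# Hypothetical sketch of a block-biased selective SSM unit.
# Names, shapes, and the discretization scheme are illustrative assumptions,
# not the exact B2S6 design from the paper.
import torch
import torch.nn as nn


class BlockBiasedSSM(nn.Module):
    """Diagonal selective state-space layer with
    (i) block-wise input-dependent dynamics: selection parameters (dt, B, C)
        are shared within blocks of channels, and
    (ii) a channel-specific additive bias on the output."""

    def __init__(self, d_model: int, d_state: int = 16, n_blocks: int = 4):
        super().__init__()
        assert d_model % n_blocks == 0
        self.d_model, self.d_state, self.n_blocks = d_model, d_state, n_blocks
        # One set of selection projections per block (input-dependent dynamics).
        self.dt_proj = nn.Linear(d_model, n_blocks)           # time step per block
        self.B_proj = nn.Linear(d_model, n_blocks * d_state)  # input matrix per block
        self.C_proj = nn.Linear(d_model, n_blocks * d_state)  # output matrix per block
        # Fixed diagonal state matrix with negative real part (for stability).
        self.log_A = nn.Parameter(torch.log(torch.rand(d_model, d_state) * 0.9 + 0.1))
        # Channel-specific bias added to the layer output.
        self.channel_bias = nn.Parameter(torch.zeros(d_model))

    def forward(self, x):                       # x: (batch, length, d_model)
        b, L, d = x.shape
        cpb = d // self.n_blocks                # channels per block
        A = -torch.exp(self.log_A)              # (d_model, d_state), Re(A) < 0
        dt = torch.nn.functional.softplus(self.dt_proj(x))          # (b, L, n_blocks)
        Bsel = self.B_proj(x).view(b, L, self.n_blocks, self.d_state)
        Csel = self.C_proj(x).view(b, L, self.n_blocks, self.d_state)
        # Broadcast each block-level selection to all channels in its block.
        dt = dt.repeat_interleave(cpb, dim=-1)                       # (b, L, d)
        Bsel = Bsel.repeat_interleave(cpb, dim=2)                    # (b, L, d, d_state)
        Csel = Csel.repeat_interleave(cpb, dim=2)
        # Zero-order-hold style discretization and a sequential scan.
        h = x.new_zeros(b, d, self.d_state)
        ys = []
        for t in range(L):
            Abar = torch.exp(dt[:, t].unsqueeze(-1) * A)             # (b, d, d_state)
            Bbar = dt[:, t].unsqueeze(-1) * Bsel[:, t]               # (b, d, d_state)
            h = Abar * h + Bbar * x[:, t].unsqueeze(-1)
            ys.append((Csel[:, t] * h).sum(-1))                      # (b, d)
        y = torch.stack(ys, dim=1)                                   # (b, L, d)
        return y + self.channel_bias                                 # per-channel bias
```

A quick shape check under these assumptions: `BlockBiasedSSM(d_model=64, n_blocks=4)(torch.randn(2, 128, 64))` returns a `(2, 128, 64)` tensor. The sequential loop is for clarity only; an efficient implementation would use a parallel or hardware-aware scan as in Mamba.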

Annan Yu, N. Benjamin Erichson

Computing technology; computer technology

Annan Yu, N. Benjamin Erichson. Block-Biased Mamba for Long-Range Sequence Processing [EB/OL]. (2025-05-13) [2025-06-23]. https://arxiv.org/abs/2505.09022.
