Bridging Expressivity and Scalability with Adaptive Unitary SSMs
Recent work has revealed that state space models (SSMs), while efficient for long-sequence processing, are fundamentally limited in their ability to represent formal languages, particularly due to their time-invariant and real-valued recurrence structures. In this work, we draw inspiration from adaptive and structured dynamics observed in biological neural systems and introduce the Adaptive Unitary State Space Model (AUSSM), a novel class of SSMs that leverages skew-symmetric, input-dependent recurrence to achieve unitary evolution and high expressive power. Using algebraic automata theory, we prove that AUSSM can perform modulo counting and simulate solvable group automata at finite precision, enabling SSMs to model a broad class of regular languages that are out of reach for other SSM architectures. To overcome the practical inefficiencies of adaptive recurrence, we develop a separable convolution formulation and a CUDA implementation that enable scalable parallel training. Empirically, we show that AUSSM, when interleaved with Mamba, outperforms prior SSMs on formal algorithmic tasks such as parity and modular arithmetic, and achieves competitive performance on real-world long time-series classification benchmarks. Our results demonstrate that adaptive unitary recurrence provides a powerful and efficient inductive bias for both symbolic and continuous sequence modeling.
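The unitarity claim at the heart of the abstract follows from a standard linear-algebra fact: the matrix exponential of a skew-symmetric generator is orthogonal (real unitary), so a recurrence built from such transitions preserves the state norm exactly. The sketch below illustrates this property only; it is not the paper's actual parameterization, and the input-to-generator map (`0.1 * x_t`) is a hypothetical choice for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def skew(theta):
    # A 2x2 skew-symmetric generator; its matrix exponential is a rotation.
    return np.array([[0.0, -theta], [theta, 0.0]])

def expm_series(S, terms=30):
    # Truncated Taylor series for the matrix exponential (illustration only;
    # a real implementation would use a dedicated expm routine).
    A = np.eye(S.shape[0])
    term = np.eye(S.shape[0])
    for k in range(1, terms):
        term = term @ S / k
        A = A + term
    return A

# Unitary (orthogonal) evolution: the generator depends on the current
# input x_t, yet the recurrent state norm is preserved at every step.
h = np.array([1.0, 0.0])
for x_t in rng.standard_normal(50):
    A = expm_series(skew(0.1 * x_t))   # input-dependent transition
    h = A @ h

print(round(float(np.linalg.norm(h)), 6))  # norm stays 1.0
```

Because every transition is orthogonal, no information is lost to decay or blow-up over long horizons, which is what makes norm-preserving recurrence attractive for tasks like modulo counting.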
Arjun Karuvally, Franz Nowak, Anderson T. Keller, Carmen Amo Alonso, Terrence J. Sejnowski, Hava T. Siegelmann
Computing Technology; Computer Technology
Arjun Karuvally, Franz Nowak, Anderson T. Keller, Carmen Amo Alonso, Terrence J. Sejnowski, Hava T. Siegelmann. Bridging Expressivity and Scalability with Adaptive Unitary SSMs [EB/OL]. (2025-07-07) [2025-07-16]. https://arxiv.org/abs/2507.05238.