Numerical Investigation of Sequence Modeling Theory using Controllable Memory Functions
The evolution of sequence modeling architectures, from recurrent neural networks and convolutional models to Transformers and structured state-space models, reflects ongoing efforts to address the diverse temporal dependencies inherent in sequential data. Despite this progress, systematically characterizing the strengths and limitations of these architectures remains a fundamental challenge. In this work, we propose a synthetic benchmarking framework to evaluate how effectively different sequence models capture distinct temporal structures. The core of this approach is to generate synthetic targets, each characterized by a memory function and a parameter that determines the strength of temporal dependence. This setup allows us to produce a continuum of tasks that vary in temporal complexity, enabling fine-grained analysis of model behavior with respect to specific memory properties. We focus on four representative memory functions, each corresponding to a distinct class of temporal structures. Experiments on several sequence modeling architectures confirm existing theoretical insights and reveal new findings. These results demonstrate the effectiveness of the proposed method in advancing theoretical understanding and highlight the importance of using controllable targets with clearly defined structures for evaluating sequence modeling architectures.
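To make the target-generation idea concrete, the following is a minimal sketch of one way such controllable targets could be constructed: the target is a linear functional of the input history weighted by a memory function, and a single decay parameter sets the strength of temporal dependence. The exponential memory function and the parameter `lam` here are illustrative choices, not necessarily the memory functions or parameterization used in the paper.

```python
import numpy as np

def exponential_memory(s, lam):
    """Illustrative memory function: weight assigned to the input s steps in the past.
    The decay rate lam controls how quickly temporal dependence fades."""
    return lam ** s

def synthetic_target(x, memory_fn, **params):
    """Generate a synthetic target y from an input sequence x, where
    y[t] = sum_{s=0}^{t} memory_fn(s) * x[t - s] (a linear functional of the past)."""
    T = len(x)
    y = np.zeros(T)
    for t in range(T):
        weights = np.array([memory_fn(s, **params) for s in range(t + 1)])
        y[t] = np.dot(weights, x[t::-1])  # pair weight s with x[t - s]
    return y

# Example: one random input sequence, two targets differing only in dependence strength.
rng = np.random.default_rng(0)
x = rng.standard_normal(128)
y_short = synthetic_target(x, exponential_memory, lam=0.5)   # fast-decaying memory
y_long  = synthetic_target(x, exponential_memory, lam=0.95)  # slow-decaying memory
```

Sweeping the decay parameter (or swapping in a different memory function) yields the continuum of tasks described above, so a model's fitting error can be traced against a single, interpretable measure of temporal complexity.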
Haotian Jiang, Zeyu Bao, Shida Wang, Qianxiao Li
Subject: Computing Technology, Computer Technology
Haotian Jiang, Zeyu Bao, Shida Wang, Qianxiao Li. Numerical Investigation of Sequence Modeling Theory using Controllable Memory Functions [EB/OL]. (2025-06-05) [2025-06-15]. https://arxiv.org/abs/2506.05678