Message-Passing State-Space Models: Improving Graph Learning with Modern Sequence Modeling
The recent success of State-Space Models (SSMs) in sequence modeling has motivated their adaptation to graph learning, giving rise to Graph State-Space Models (GSSMs). However, existing GSSMs operate by applying SSM modules to sequences extracted from graphs, often compromising core properties such as permutation equivariance, message-passing compatibility, and computational efficiency. In this paper, we introduce a new perspective by embedding the key principles of modern SSM computation directly into the Message-Passing Neural Network framework, resulting in a unified methodology for both static and temporal graphs. Our approach, MP-SSM, enables efficient, permutation-equivariant, and long-range information propagation while preserving the architectural simplicity of message passing. Crucially, MP-SSM enables an exact sensitivity analysis, which we use to theoretically characterize information flow and evaluate issues like vanishing gradients and over-squashing in the deep regime. Furthermore, our design choices allow for a highly optimized parallel implementation akin to modern SSMs. We validate MP-SSM across a wide range of tasks, including node classification, graph property prediction, long-range benchmarks, and spatiotemporal forecasting, demonstrating both its versatility and strong empirical performance.
Andrea Ceni, Alessio Gravina, Claudio Gallicchio, Davide Bacciu, Carola-Bibiane Schönlieb, Moshe Eliasof
Computing Technology, Computer Technology
Andrea Ceni, Alessio Gravina, Claudio Gallicchio, Davide Bacciu, Carola-Bibiane Schönlieb, Moshe Eliasof. Message-Passing State-Space Models: Improving Graph Learning with Modern Sequence Modeling [EB/OL]. (2025-05-24) [2025-06-12]. https://arxiv.org/abs/2505.18728.