
Physics-inspired Energy Transition Neural Network for Sequence Learning

Source: arXiv

English Abstract

Recently, the superior performance of Transformers has made them a more robust and scalable solution for sequence modeling than traditional recurrent neural networks (RNNs). However, the effectiveness of Transformers in capturing long-term dependencies is primarily attributed to their comprehensive pair-modeling process rather than inherent inductive biases toward sequence semantics. In this study, we explore the capabilities of pure RNNs and reassess their long-term learning mechanisms. Inspired by energy transition models in physics that track energy changes over time, we propose an effective recurrent structure called the "Physics-inspired Energy Transition Neural Network" (PETNN). We demonstrate that PETNN's memory mechanism effectively stores information across long-term dependencies. Experimental results indicate that PETNN outperforms Transformer-based methods across various sequence tasks. Furthermore, owing to its recurrent nature, PETNN exhibits significantly lower complexity. Our study presents an optimal foundational recurrent architecture and highlights the potential for developing effective recurrent neural networks in fields currently dominated by Transformers.
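The abstract does not specify PETNN's update equations, so the following is only a hypothetical sketch of what an "energy transition"-style recurrent cell could look like: a hidden "energy" state that is excited by the input and relaxes (decays) over time. The class name `EnergyCell`, the decay constant, and the update rule are all illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

# Illustrative only: a toy recurrent cell whose state behaves like stored
# "energy" -- excited by inputs, decaying spontaneously each step. This is
# NOT the PETNN from the paper; it merely sketches the physics analogy.
class EnergyCell:
    def __init__(self, input_dim, hidden_dim, decay=0.9, seed=0):
        rng = np.random.default_rng(seed)
        # Input projection ("excitation" weights), small random init.
        self.W_in = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
        # Fraction of stored energy retained per step (relaxation rate).
        self.decay = decay

    def step(self, energy, x):
        # Excitation: the input pumps bounded energy into the state.
        excitation = np.tanh(self.W_in @ x)
        # Relaxation: old energy decays, new excitation is mixed in.
        return self.decay * energy + (1.0 - self.decay) * excitation

# Run the cell over a short constant input sequence.
cell = EnergyCell(input_dim=4, hidden_dim=8)
state = np.zeros(8)
for t in range(10):
    state = cell.step(state, np.ones(4))
```

Because the excitation is squashed by `tanh` and the update is a convex mix, the state stays bounded in (-1, 1) regardless of sequence length, which loosely mirrors how a physical system's stored energy stays finite under repeated excitation and decay.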

Zhou Wu, Junyi An, Baile Xu, Furao Shen, Jian Zhao

Computing Technology; Computer Technology

Zhou Wu, Junyi An, Baile Xu, Furao Shen, Jian Zhao. Physics-inspired Energy Transition Neural Network for Sequence Learning [EB/OL]. (2025-05-06) [2025-07-02]. https://arxiv.org/abs/2505.03281.
