
Long-Sequence Memory with Temporal Kernels and Dense Hopfield Functionals

Source: arXiv
Abstract

In this study we introduce a novel energy functional for long-sequence memory, building on the framework of dense Hopfield networks, which achieve exponential storage capacity through higher-order interactions. Extending earlier work on long-sequence Hopfield memory models, we propose a temporal kernel $K(m, k)$ that incorporates temporal dependencies, enabling efficient sequential retrieval of patterns over extended sequences. We demonstrate the technique on the storage and sequential retrieval of movie frames, which are well suited to this setting because the high-dimensional vector representing each frame keeps even consecutive frames well separated. The technique has applications in modern transformer architectures, including efficient long-sequence modeling, memory augmentation, attention with temporal bias, and improved handling of long-term dependencies in time-series data. Our model offers a promising approach to addressing the limitations of transformers in long-context tasks, with potential implications for natural language processing, forecasting, and beyond.
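As a concrete, deliberately simplified illustration of the idea, the sketch below combines a dense-Hopfield-style retrieval step with a temporal kernel that biases the readout toward the successor frame. The specifics are assumptions made for illustration only: the Gaussian form of $K(m, k)$, the softmax-style separation function, and the names and parameters (`temporal_kernel`, `retrieve_step`, `beta`, `tau`) do not come from the paper.

```python
import numpy as np

def temporal_kernel(m, k, tau=0.5):
    """Illustrative temporal kernel K(m, k): a narrow Gaussian bump that
    couples the overlap measured against frame k to the successor frame
    m = k + 1. The kernel form and the width tau are assumptions of this
    sketch, not the paper's definition."""
    return np.exp(-((m - (k + 1)) ** 2) / (2.0 * tau ** 2))

def retrieve_step(patterns, state, beta=8.0, tau=0.5):
    """One retrieval step of a dense-Hopfield-style sequence memory.

    patterns : (N, d) array of stored frames (e.g. flattened movie frames)
    state    : (d,)   current, possibly noisy, query frame

    The state is compared with every stored frame, a sharp softmax-like
    separation (standing in for the higher-order interactions) picks out
    the best match, and the temporal kernel K(m, k) shifts the readout
    toward the successor frame.
    """
    N, _ = patterns.shape
    overlaps = patterns @ state                        # x_k^T s for all k
    sep = np.exp(beta * (overlaps - overlaps.max()))   # dense separation
    sep /= sep.sum()
    m_idx = np.arange(N)[:, None]                      # readout index m
    k_idx = np.arange(N)[None, :]                      # overlap index k
    K = temporal_kernel(m_idx, k_idx, tau)             # (N, N) kernel matrix
    weights = K @ sep                                  # mix overlaps over k
    weights /= weights.sum()
    return weights @ patterns                          # estimate of next frame

# Toy usage: store a short "movie" of random high-dimensional frames and
# recover the frame that follows a corrupted query frame.
rng = np.random.default_rng(0)
frames = rng.standard_normal((10, 512))
frames /= np.linalg.norm(frames, axis=1, keepdims=True)

noisy = frames[3] + 0.1 * rng.standard_normal(512)     # corrupted frame 3
nxt = retrieve_step(frames, noisy)
print("retrieved frame index:", int(np.argmax(frames @ nxt)))   # expect 4
```

Iterating `retrieve_step`, feeding each output back in as the next query, replays the stored sequence frame by frame; the high dimensionality of the frames is what keeps consecutive patterns separable under the overlap measure, as noted in the abstract.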

Ahmed Farooq

Subject: Computing technology, computer technology

Ahmed Farooq. Long-Sequence Memory with Temporal Kernels and Dense Hopfield Functionals [EB/OL]. (2025-06-27) [2025-07-16]. https://arxiv.org/abs/2507.01052.
