
Leveraging Historical and Current Interests for Continual Sequential Recommendation

Source: arXiv
Abstract

Sequential recommendation models based on the Transformer architecture show superior performance in harnessing long-range dependencies within user behavior via self-attention. However, naively updating them on continuously arriving non-stationary data streams incurs prohibitive computation costs or leads to catastrophic forgetting. To address this, we propose Continual Sequential Transformer for Recommendation (CSTRec) that effectively leverages well-preserved historical user interests while capturing current interests. At its core is Continual Sequential Attention (CSA), a linear attention mechanism that retains past knowledge without direct access to old data. CSA integrates two key components: (1) Cauchy-Schwarz Normalization that stabilizes training under uneven interaction frequencies, and (2) Collaborative Interest Enrichment that mitigates forgetting through shared, learnable interest pools. We further introduce a technique that facilitates learning for cold-start users by transferring historical knowledge from behaviorally similar existing users. Extensive experiments on three real-world datasets indicate that CSTRec outperforms state-of-the-art baselines in both knowledge retention and acquisition.
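
To make the core idea concrete, below is a minimal, hedged sketch of the kind of linear attention the abstract alludes to: a running key-value summary is updated as new interactions arrive, so attention over a user's full history can be computed without storing or replaying old data. The class and function names (LinearAttentionState, feature_map) and the elu+1 feature map are illustrative assumptions, not CSTRec's actual CSA, Cauchy-Schwarz Normalization, or Collaborative Interest Enrichment.

# Minimal sketch of state-based linear attention (assumed formulation, not the paper's code).
import torch
import torch.nn as nn


def feature_map(x: torch.Tensor) -> torch.Tensor:
    # A common positive feature map used in linear attention (assumption).
    return torch.nn.functional.elu(x) + 1.0


class LinearAttentionState(nn.Module):
    """Keeps running summaries S = sum phi(k) v^T and z = sum phi(k), so past
    interactions influence attention without being stored or revisited."""

    def __init__(self, dim: int):
        super().__init__()
        self.register_buffer("S", torch.zeros(dim, dim))  # key-value summary
        self.register_buffer("z", torch.zeros(dim))       # key normalizer

    @torch.no_grad()
    def update(self, k: torch.Tensor, v: torch.Tensor) -> None:
        # Fold a newly arrived block of interactions (shape [n, dim]) into the state.
        phi_k = feature_map(k)
        self.S += phi_k.transpose(0, 1) @ v
        self.z += phi_k.sum(dim=0)

    def attend(self, q: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
        # Attention output for queries [m, dim] computed from the summary state alone.
        phi_q = feature_map(q)
        return (phi_q @ self.S) / (phi_q @ self.z + eps).unsqueeze(-1)


if __name__ == "__main__":
    dim = 16
    state = LinearAttentionState(dim)
    # Simulate two arriving data blocks; earlier interactions are never replayed.
    for _ in range(2):
        k, v = torch.randn(8, dim), torch.randn(8, dim)
        state.update(k, v)
    out = state.attend(torch.randn(4, dim))
    print(out.shape)  # torch.Size([4, 16])

This illustrates only the "retain past knowledge without direct access to old data" property; the paper's normalization and shared interest pools would sit on top of such a mechanism.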

Gyuseok Lee, Hyunsik Yoo, Junyoung Hwang, SeongKu Kang, Hwanjo Yu

Subject: Computing Technology; Computer Technology

Gyuseok Lee, Hyunsik Yoo, Junyoung Hwang, SeongKu Kang, Hwanjo Yu. Leveraging Historical and Current Interests for Continual Sequential Recommendation [EB/OL]. (2025-06-09) [2025-07-16]. https://arxiv.org/abs/2506.07466.