
DenseRec: Revisiting Dense Content Embeddings for Sequential Transformer-based Recommendation

Source: arXiv
Abstract

Transformer-based sequential recommenders, such as SASRec or BERT4Rec, typically rely solely on learned item ID embeddings, making them vulnerable to the item cold-start problem, particularly in environments with dynamic item catalogs. While dense content embeddings from pre-trained models offer potential solutions, direct integration into transformer-based recommenders has consistently underperformed compared to ID-only approaches. We revisit this integration challenge and propose DenseRec, a simple yet effective method that introduces a dual-path embedding approach. DenseRec learns a linear projection from the dense embedding space into the ID embedding space during training, enabling seamless generalization to previously unseen items without requiring specialized embedding models or complex infrastructure. In experiments on three real-world datasets, we find DenseRec to consistently outperform an ID-only SASRec baseline, even without additional hyperparameter tuning and while using compact embedding models. Our analysis suggests improvements primarily arise from better sequence representations in the presence of unseen items, positioning DenseRec as a practical and robust solution for cold-start sequential recommendation.
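The abstract gives only a high-level description of the dual-path design, but a minimal PyTorch sketch of the idea might look as follows. The class name `DualPathItemEmbedding`, the `is_unseen` routing mask, and the use of `torch.where` as the mixing rule are illustrative assumptions, not the authors' implementation; what the sketch does reflect from the abstract is a learned linear projection from a frozen dense content space into the ID embedding space.

```python
import torch
import torch.nn as nn

class DualPathItemEmbedding(nn.Module):
    """Dual-path item embedding: learned ID embeddings for items seen in
    training, plus a linear projection that maps frozen pre-trained content
    embeddings into the same ID-embedding space for unseen items.
    (Sketch based on the abstract; details are assumptions.)"""

    def __init__(self, content_emb: torch.Tensor, id_dim: int):
        super().__init__()
        num_rows, content_dim = content_emb.shape  # row 0 reserved for padding
        # Path 1: ID embeddings, learned from scratch during training.
        self.id_emb = nn.Embedding(num_rows, id_dim, padding_idx=0)
        # Frozen dense content embeddings from a pre-trained model.
        self.content_emb = nn.Embedding.from_pretrained(content_emb, freeze=True)
        # Path 2: linear projection from content space into ID space,
        # learned jointly with the rest of the model.
        self.proj = nn.Linear(content_dim, id_dim)

    def forward(self, item_ids: torch.Tensor, is_unseen: torch.Tensor) -> torch.Tensor:
        id_path = self.id_emb(item_ids)                       # (B, L, id_dim)
        content_path = self.proj(self.content_emb(item_ids))  # (B, L, id_dim)
        # Route cold-start items through the projected content path,
        # warm items through their learned ID embeddings.
        return torch.where(is_unseen.unsqueeze(-1), content_path, id_path)

# Toy usage: 10 items (+ padding row) with 384-dim content vectors,
# projected into a 64-dim ID embedding space.
content = torch.randn(11, 384)
layer = DualPathItemEmbedding(content, id_dim=64)
item_ids = torch.tensor([[3, 7, 10]])
is_unseen = torch.tensor([[False, False, True]])  # item 10 is cold-start
out = layer(item_ids, is_unseen)  # shape (1, 3, 64)
```

At inference, the mask routes previously unseen items through the shared projection, which is exactly what allows generalization without retraining; seen items keep their learned ID embeddings, so the ID-only behavior is preserved for the warm catalog.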

Jan Malte Lichtenberg, Antonio De Candia, Matteo Ruffini

Subjects: Computing Technology; Computer Technology

Jan Malte Lichtenberg, Antonio De Candia, Matteo Ruffini. DenseRec: Revisiting Dense Content Embeddings for Sequential Transformer-based Recommendation [EB/OL]. (2025-08-25) [2025-09-06]. https://arxiv.org/abs/2508.18442.