Integrating Textual Embeddings from Contrastive Learning with Generative Recommender for Enhanced Personalization
Recent advances in recommender systems have highlighted the complementary strengths of generative modeling and pretrained language models. We propose a hybrid framework that augments the Hierarchical Sequential Transduction Unit (HSTU) generative recommender with BLaIR -- a contrastive text embedding model. This integration enriches item representations with semantic signals from textual metadata while preserving HSTU's powerful sequence modeling capabilities. We evaluate our method on two domains from the Amazon Reviews 2023 dataset, comparing it against the original HSTU and a variant that incorporates embeddings from OpenAI's state-of-the-art text-embedding-3-large model. While the OpenAI embedding model is likely trained on a substantially larger corpus with significantly more parameters, our lightweight BLaIR-enhanced approach -- pretrained on domain-specific data -- consistently achieves better performance, highlighting the effectiveness of contrastive text embeddings in compute-efficient settings.
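The abstract does not specify how the textual embeddings are injected into HSTU; a common pattern, sketched below under that assumption, is to concatenate a frozen text embedding (e.g. from a contrastive encoder such as BLaIR) with a learned item-ID embedding and project the result into the sequence model's input dimension. All names and dimensions here are hypothetical, in plain Python for illustration.

```python
# Hedged sketch (not the paper's implementation): fuse a frozen text embedding
# with a learnable ID embedding via concatenation + linear projection.
import random

random.seed(0)

D_TEXT, D_ID, D_MODEL = 4, 4, 8  # toy dimensions, chosen arbitrarily

def linear(x, W, b):
    """Dense layer: y = W x + b, on plain Python lists."""
    return [sum(wi * xi for wi, xi in zip(row, x)) + bi
            for row, bi in zip(W, b)]

# Frozen semantic embedding from item metadata, and a per-item ID embedding.
text_emb = [random.gauss(0, 1) for _ in range(D_TEXT)]
id_emb = [random.gauss(0, 1) for _ in range(D_ID)]

# Projection mapping the concatenated vector into the model dimension,
# so the fused vector can serve as the sequence model's item input.
W = [[random.gauss(0, 0.1) for _ in range(D_TEXT + D_ID)]
     for _ in range(D_MODEL)]
b = [0.0] * D_MODEL

fused = linear(text_emb + id_emb, W, b)
print(len(fused))  # 8
```

A sum or gated fusion would work equally well here; the point is only that the semantic signal enters before sequence modeling, leaving HSTU's transduction layers unchanged.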
Yijun Liu
Computing Technology; Computer Technology
Yijun Liu. Integrating Textual Embeddings from Contrastive Learning with Generative Recommender for Enhanced Personalization [EB/OL]. (2025-04-13) [2025-05-02]. https://arxiv.org/abs/2504.10545.