SemSR: Semantics aware robust Session-based Recommendations
Session-based recommendation (SR) models aim to recommend items to anonymous users based on their behavior during the current session. While various SR models in the literature utilize item sequences to predict the next item, they often fail to leverage semantic information from item titles or descriptions, impeding session-intent identification and interpretability. Recent research has explored Large Language Models (LLMs) as promising approaches to enhance session-based recommendations, with both prompt-based and fine-tuning-based methods being widely investigated. However, prompt-based methods struggle to identify optimal prompts that elicit correct reasoning and lack task-specific feedback at test time, resulting in sub-optimal recommendations. Fine-tuning methods incorporate domain-specific knowledge but incur significant computational costs for implementation and maintenance. In this paper, we present multiple approaches to utilizing LLMs for session-based recommendation: (i) in-context LLMs as recommendation agents, (ii) LLM-generated representations for semantic initialization of deep-learning SR models, and (iii) integration of LLMs with data-driven SR models. Through comprehensive experiments on two real-world, publicly available datasets, we demonstrate that LLM-based methods excel at coarse-level retrieval (high Recall), while traditional data-driven techniques perform well at fine-grained ranking (high Mean Reciprocal Rank). Furthermore, integrating LLMs with data-driven SR models significantly outperforms standalone LLM approaches, data-driven deep-learning models, and baseline SR models in terms of both Recall and MRR.
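Approach (ii) above, semantic initialization, can be illustrated with a minimal sketch: an SR model's item-embedding table is seeded from title embeddings rather than random weights, so semantically similar items start close together. Everything below is an illustrative assumption, not the paper's actual pipeline: `llm_title_embedding` is a deterministic stand-in for a real LLM embedding call, and the tiny catalog and mean-pooled session scoring are hypothetical.

```python
import zlib

import numpy as np


def llm_title_embedding(title: str, dim: int = 16) -> np.ndarray:
    """Stand-in for an LLM text-embedding call (hypothetical).

    A real system would query an embedding model with the item title;
    here a CRC32-seeded random vector keeps the sketch self-contained.
    """
    rng = np.random.default_rng(zlib.crc32(title.encode("utf-8")))
    vec = rng.normal(size=dim)
    return vec / np.linalg.norm(vec)


# Hypothetical catalog: item id -> item title.
catalog = {0: "wireless mouse", 1: "mechanical keyboard", 2: "usb-c hub"}

# Semantic initialization: the item-embedding table starts from
# LLM-derived title embeddings instead of random weights.
embedding_table = np.stack([llm_title_embedding(t) for t in catalog.values()])


def recommend(session_item_ids: list[int], k: int = 2) -> list[int]:
    """Rank candidates by cosine similarity to the mean session embedding."""
    session_vec = embedding_table[session_item_ids].mean(axis=0)
    session_vec /= np.linalg.norm(session_vec)
    scores = embedding_table @ session_vec  # rows are unit-norm, so this is cosine
    scores[session_item_ids] = -np.inf     # do not re-recommend seen items
    return np.argsort(-scores)[:k].tolist()


print(recommend([0]))
```

In a full model, this table would then be fine-tuned on session data, which is one way the paper's integration of LLM representations with data-driven SR models could combine semantic retrieval with behavioral ranking.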
Jyoti Narwariya, Priyanka Gupta, Muskan Gupta, Jyotsana Khatri, Lovekesh Vig
Computing technology, computer technology
Jyoti Narwariya, Priyanka Gupta, Muskan Gupta, Jyotsana Khatri, Lovekesh Vig. SemSR: Semantics aware robust Session-based Recommendations [EB/OL]. (2025-08-28) [2025-09-06]. https://arxiv.org/abs/2508.20587.