Bidirectional Knowledge Distillation for Enhancing Sequential Recommendation with Large Language Models
Large language models (LLMs) have demonstrated exceptional performance in understanding and generating semantic patterns, making them promising candidates for sequential recommendation tasks. However, when combined with conventional recommendation models (CRMs), LLM-based pipelines often suffer from high inference costs and static, one-way knowledge transfer. In this paper, we propose LLMD4Rec, a novel mutual distillation framework that enables dynamic, bidirectional knowledge exchange between LLM-centric and CRM-based recommenders. Unlike traditional unidirectional distillation methods, LLMD4Rec alternately refines the two models in an iterative optimization loop, enhancing the semantic understanding of CRMs while enriching LLMs with collaborative signals from user-item interactions. By leveraging sample-wise adaptive weighting and aligning output distributions, our approach ensures effective knowledge transfer without introducing additional parameters. Extensive experiments on real-world datasets demonstrate that LLMD4Rec significantly improves recommendation accuracy across multiple benchmarks without increasing inference costs, providing a scalable and efficient way to combine the strengths of LLMs and CRMs in sequential recommendation.
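The abstract does not spell out the training procedure, but one half of the alternating bidirectional step can be illustrated concretely. Below is a minimal PyTorch sketch of distilling one model's output distribution into the other with a sample-wise adaptive weight; the function name `distill_step`, the temperature value, and the confidence-based weighting scheme are all illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def distill_step(student_logits, teacher_logits, temperature=2.0):
    """One direction of distillation: align the student's output
    distribution with the teacher's via KL divergence.

    The sample-wise adaptive weighting here is an assumption: each
    example is weighted by the teacher's peak confidence, so samples
    where the teacher is unsure contribute less to the loss.
    """
    # Soften both distributions with a shared temperature.
    t_probs = F.softmax(teacher_logits / temperature, dim=-1)
    s_log_probs = F.log_softmax(student_logits / temperature, dim=-1)

    # Per-sample KL divergence, shape (batch,), instead of a scalar mean.
    kl = F.kl_div(s_log_probs, t_probs, reduction="none").sum(dim=-1)

    # Hypothetical sample-wise weights: the teacher's max probability,
    # detached so the weights receive no gradient.
    weights = t_probs.max(dim=-1).values.detach()

    # Standard temperature-squared scaling from knowledge distillation.
    return (weights * kl).mean() * temperature ** 2

# Example: CRM as student, LLM as teacher (one half of the alternation).
crm_logits = torch.randn(32, 1000)   # (batch, num_items)
llm_logits = torch.randn(32, 1000)
loss = distill_step(crm_logits, llm_logits.detach())
```

In an alternating scheme, the CRM would be updated with this distillation loss (plus its own recommendation loss) while the LLM's logits are frozen, and then the roles would be swapped so the LLM absorbs collaborative signals from the CRM.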
Jiongran Wu, Jiahao Liu, Dongsheng Li, Guangping Zhang, Mingzhe Han, Hansu Gu, Peng Zhang, Li Shang, Tun Lu, Ning Gu
Computing Technology, Computer Technology
Jiongran Wu, Jiahao Liu, Dongsheng Li, Guangping Zhang, Mingzhe Han, Hansu Gu, Peng Zhang, Li Shang, Tun Lu, Ning Gu. Bidirectional Knowledge Distillation for Enhancing Sequential Recommendation with Large Language Models [EB/OL]. (2025-05-23) [2025-06-06]. https://arxiv.org/abs/2505.18120.