Knowledge Distillation for Enhancing Walmart E-commerce Search Relevance Using Large Language Models
Ensuring that the products displayed in e-commerce search results are relevant to users' queries is crucial for improving the user experience. With their advanced semantic understanding, deep learning models have been widely used for relevance matching in search tasks. While large language models (LLMs) offer superior ranking capabilities, deploying them in real-time systems is challenging because of their strict latency requirements. To leverage the ranking power of LLMs while meeting the low-latency demands of production systems, we propose a novel framework that distills a high-performing LLM into a more efficient, low-latency student model. To help the student model learn more effectively from the teacher model, we first train the teacher LLM as a classification model with soft targets. We then train the student model to capture the relevance margin between pairs of products for a given query using a mean squared error loss. Instead of using the same training data as the teacher model, we significantly expand the student model's dataset by generating unlabeled data and labeling it with the teacher model's predictions. Experimental results show that the student model's performance continues to improve as the size of the augmented training data increases; with enough augmented data, the student model can outperform the teacher model. The student model has been successfully deployed in production at Walmart.com with significantly positive metrics.
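The abstract's distillation objective, matching the student's score margin between two products to the teacher's margin, can be illustrated with a minimal sketch. The snippet below assumes PyTorch; all tensor and function names are illustrative and the actual teacher/student architectures and data pipeline are not specified in the abstract.

```python
import torch
import torch.nn.functional as F

def margin_mse_loss(student_pos, student_neg, teacher_pos, teacher_neg):
    """MSE between the student's and the teacher's relevance-score margins
    for pairs of products retrieved for the same query (illustrative sketch)."""
    student_margin = student_pos - student_neg
    teacher_margin = teacher_pos - teacher_neg
    return F.mse_loss(student_margin, teacher_margin)

# Dummy relevance scores for a batch of (query, product_a, product_b) pairs.
# In practice the teacher scores would come from the LLM's soft predictions.
teacher_scores_a = torch.tensor([0.9, 0.7, 0.4])
teacher_scores_b = torch.tensor([0.2, 0.6, 0.5])
student_scores_a = torch.tensor([0.8, 0.5, 0.3], requires_grad=True)
student_scores_b = torch.tensor([0.3, 0.4, 0.6], requires_grad=True)

loss = margin_mse_loss(student_scores_a, student_scores_b,
                       teacher_scores_a, teacher_scores_b)
loss.backward()  # gradients flow only to the student's scores
```

Because only the margin between the two products is supervised, the student learns the teacher's relative ordering of products per query rather than its absolute score scale, which is what matters for ranking.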
Hongwei Shang, Nguyen Vo, Nitin Yadav, Tian Zhang, Ajit Puthenputhussery, Xunfan Cai, Shuyi Chen, Prijith Chandran, Changsung Kang
Computing Technology; Computer Technology
Hongwei Shang, Nguyen Vo, Nitin Yadav, Tian Zhang, Ajit Puthenputhussery, Xunfan Cai, Shuyi Chen, Prijith Chandran, Changsung Kang. Knowledge Distillation for Enhancing Walmart E-commerce Search Relevance Using Large Language Models [EB/OL]. (2025-05-11) [2025-06-18]. https://arxiv.org/abs/2505.07105