
Does Knowledge Distillation Matter for Large Language Model based Bundle Generation?

Kaidong Feng, Zhu Sun, Jie Yang, Hui Fang, Xinghua Qu, Wenyuan Liu

Citation

Kaidong Feng, Zhu Sun, Jie Yang, Hui Fang, Xinghua Qu, Wenyuan Liu. Does Knowledge Distillation Matter for Large Language Model based Bundle Generation? [EB/OL]. (2025-04-23) [2025-12-13]. https://arxiv.org/abs/2504.17220.

Subject Classification

Computing Technology; Computer Technology

First Posted: 2025-04-23