A Comprehensive Survey on Knowledge Distillation
Deep Neural Networks (DNNs) have achieved notable performance in computer vision and natural language processing, with wide-ranging applications in both academia and industry. However, with recent advances in DNNs and transformer models containing a tremendous number of parameters, deploying these large models on edge devices raises serious issues such as high runtime and memory consumption. This is especially concerning for recent large-scale foundation models, Vision-Language Models (VLMs), and Large Language Models (LLMs). Knowledge Distillation (KD) is one of the prominent techniques proposed to address these problems using a teacher-student architecture. More specifically, a lightweight student model is trained using additional knowledge from a cumbersome teacher model. In this work, a comprehensive survey of knowledge distillation methods is proposed, reviewing KD from different aspects: distillation sources, distillation schemes, distillation algorithms, distillation by modalities, applications of distillation, and comparisons among existing methods. In contrast to most existing surveys, which are either outdated or simply update former surveys, this work proposes a comprehensive survey with a new point of view and representation structure that categorizes and investigates the most recent methods in knowledge distillation. The survey considers several critically important subcategories, including KD for diffusion models, 3D inputs, foundation models, transformers, and LLMs. Furthermore, existing challenges in KD and possible future research directions are discussed. GitHub page of the project: https://github.com/IPL-Sharif/KD_Survey
Zeynab Yasamani Ghamchi, Amir M. Mansourian, Vida Ramezanian, Kimia Dinashi, Amir Mohammad Babaei, Alireza Taherian, Shohreh Kasaei, Elaheh Badali Golezani, Masoud Ghafouri, Amirali Miri, Rozhan Ahmadi
Subject: Computing technology; computer science
Zeynab Yasamani Ghamchi, Amir M. Mansourian, Vida Ramezanian, Kimia Dinashi, Amir Mohammad Babaei, Alireza Taherian, Shohreh Kasaei, Elaheh Badali Golezani, Masoud Ghafouri, Amirali Miri, Rozhan Ahmadi. A Comprehensive Survey on Knowledge Distillation [EB/OL]. (2025-03-15) [2025-05-12]. https://arxiv.org/abs/2503.12067.