
Intra-class Patch Swap for Self-Distillation

Source: arXiv
Abstract

Knowledge distillation (KD) is a valuable technique for compressing large deep learning models into smaller, edge-suitable networks. However, conventional KD frameworks rely on pre-trained high-capacity teacher networks, which introduce significant challenges such as increased memory/storage requirements, additional training costs, and ambiguity in selecting an appropriate teacher for a given student model. Although teacher-free distillation (self-distillation) has emerged as a promising alternative, many existing approaches still rely on architectural modifications or complex training procedures, which limit their generality and efficiency. To address these limitations, we propose a novel teacher-free distillation framework that operates with a single student network, without any auxiliary components, architectural modifications, or additional learnable parameters. Our approach is built on a simple yet highly effective augmentation, called intra-class patch swap. This augmentation simulates a teacher-student dynamic within a single model by generating pairs of intra-class samples with varying confidence levels, and then applying instance-to-instance distillation to align their predictive distributions. Our method is conceptually simple, model-agnostic, and easy to implement, requiring only a single augmentation function. Extensive experiments across image classification, semantic segmentation, and object detection show that our method consistently outperforms both existing self-distillation baselines and conventional teacher-based KD approaches. These results suggest that the success of self-distillation may hinge on the design of the augmentation itself. Our code is available at https://github.com/hchoi71/Intra-class-Patch-Swap.
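The abstract describes the mechanism only at a high level. Below is a minimal PyTorch sketch of how such an augmentation and distillation objective could be wired together. The function names (intra_class_patch_swap, swap_distillation_loss), the symmetric-KL formulation, and parameters such as patch, swap_ratio, and the temperature T are illustrative assumptions on our part, not the authors' implementation; the reference code is in the repository linked above.

```python
# Hedged sketch: intra-class patch swap + instance-to-instance distillation.
# All names and hyperparameters here are illustrative assumptions, not the
# authors' API; see https://github.com/hchoi71/Intra-class-Patch-Swap.
import torch
import torch.nn.functional as F

def intra_class_patch_swap(x1, x2, patch=32, swap_ratio=0.5):
    """Swap a random subset of aligned patches between two same-class
    images, producing two mixed views with differing confidence levels.
    x1, x2: (B, C, H, W) tensors of samples sharing the same label."""
    H, W = x1.shape[-2], x1.shape[-1]
    a, b = x1.clone(), x2.clone()
    # Grid of non-overlapping patch origins.
    coords = [(i, j) for i in range(0, H, patch) for j in range(0, W, patch)]
    k = max(1, int(len(coords) * swap_ratio))
    for n in torch.randperm(len(coords))[:k].tolist():
        i, j = coords[n]
        a[..., i:i + patch, j:j + patch] = x2[..., i:i + patch, j:j + patch]
        b[..., i:i + patch, j:j + patch] = x1[..., i:i + patch, j:j + patch]
    return a, b

def swap_distillation_loss(model, a, b, T=4.0):
    """Align the two views' predictive distributions with a symmetric,
    temperature-scaled KL term; each view serves as a soft target for
    the other (an assumed form of instance-to-instance distillation)."""
    pa, pb = model(a), model(b)
    la = F.kl_div(F.log_softmax(pa / T, dim=1),
                  F.softmax(pb.detach() / T, dim=1),
                  reduction="batchmean") * (T * T)
    lb = F.kl_div(F.log_softmax(pb / T, dim=1),
                  F.softmax(pa.detach() / T, dim=1),
                  reduction="batchmean") * (T * T)
    return 0.5 * (la + lb)
```

In a training loop, this term would presumably be combined with the usual cross-entropy on the true label of both views, e.g. loss = F.cross_entropy(model(a), y) + lam * swap_distillation_loss(model, a, b), where the weight lam is a hypothetical hyperparameter.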

Authors: Hongjun Choi, Eun Som Jeon, Ankita Shukla, Pavan Turaga

DOI: 10.1016/j.neucom.2025.130408

Subjects: Computing technology; computer technology

Cite as: Hongjun Choi, Eun Som Jeon, Ankita Shukla, Pavan Turaga. Intra-class Patch Swap for Self-Distillation [EB/OL]. (2025-05-20) [2025-07-16]. https://arxiv.org/abs/2505.14124.
