Distilled Circuits: A Mechanistic Study of Internal Restructuring in Knowledge Distillation

Source: arXiv

Abstract

Knowledge distillation compresses a larger neural model (teacher) into smaller, faster student models by training them to match teacher outputs. However, the internal computational transformations that occur during this process remain poorly understood. We apply techniques from mechanistic interpretability to analyze how internal circuits, representations, and activation patterns differ between teacher and student. Focusing on GPT2-small and its distilled counterpart DistilGPT2, we find that student models reorganize, compress, and discard teacher components, often resulting in stronger reliance on fewer individual components. To quantify functional alignment beyond output similarity, we introduce an alignment metric based on influence-weighted component similarity, validated across multiple tasks. Our findings reveal that while knowledge distillation preserves broad functional behaviors, it also causes significant shifts in internal computation, with important implications for the robustness and generalization capacity of distilled models.
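The abstract does not define the influence-weighted alignment metric in detail, so the following is only a minimal sketch of what such a score could look like: each teacher component's best-matching student component similarity is weighted by that component's influence. The cosine similarity, the best-match rule, and the ablation-style influence weights are all illustrative assumptions, not the authors' definition.

```python
import numpy as np

def cosine_sim(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two flattened component representations."""
    a, b = a.ravel(), b.ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def influence_weighted_alignment(
    teacher_components: list[np.ndarray],
    student_components: list[np.ndarray],
    influences: np.ndarray,
) -> float:
    """
    Hypothetical alignment score: for each teacher component (e.g., an attention
    head's mean activation on a task), take its similarity to the best-matching
    student component, weight it by the teacher component's influence (e.g., the
    output change when that component is ablated), and normalize by total influence.
    """
    weights = influences / (influences.sum() + 1e-12)
    score = 0.0
    for w, t_comp in zip(weights, teacher_components):
        best = max(cosine_sim(t_comp, s_comp) for s_comp in student_components)
        score += w * best
    return score

# Toy usage with random stand-ins for component representations and influences.
rng = np.random.default_rng(0)
teacher = [rng.normal(size=64) for _ in range(12)]   # e.g., 12 teacher heads
student = [rng.normal(size=64) for _ in range(6)]    # e.g., 6 student heads
influence = rng.uniform(size=12)                     # e.g., ablation-based influence
print(influence_weighted_alignment(teacher, student, influence))
```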

Reilly Haskins, Benjamin Adams

Computing Technology; Computer Technology

Reilly Haskins, Benjamin Adams. Distilled Circuits: A Mechanistic Study of Internal Restructuring in Knowledge Distillation [EB/OL]. (2025-05-15) [2025-06-27]. https://arxiv.org/abs/2505.10822.