On Explaining Knowledge Distillation: Measuring and Visualising the Knowledge Transfer Process

Gereziher Adhane, Mohammad Mahdi Dehshibi, Dennis Vetter, David Masip, Gemma Roig; Proceedings of the Winter Conference on Applications of Computer Vision (WACV), 2025, pp. 3467-3476

Abstract


Knowledge distillation (KD) remains challenging due to the opaque nature of the knowledge transfer process from a Teacher to a Student, making it difficult to address certain issues related to KD. To address this, we propose UniCAM, a novel gradient-based visual explanation method that effectively interprets the knowledge learned during KD. Our experimental results demonstrate that, with the guidance of the Teacher's knowledge, the Student model becomes more efficient, learning more relevant features while discarding those that are not relevant. We refer to the features learned with the Teacher's guidance as distilled features, and to the features irrelevant to the task and ignored by the Student as residual features. Distilled features focus on key aspects of the input, such as textures and parts of objects. In contrast, residual features exhibit more diffuse attention, often targeting irrelevant areas, including the backgrounds of the target objects. In addition, we propose two novel metrics: the feature similarity score (FSS) and the relevance score (RS), which quantify the relevance of the distilled knowledge. Experiments on the CIFAR10, ASIRRA, and Plant Disease datasets demonstrate that UniCAM and the two metrics offer valuable insights into explaining the KD process.
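
UniCAM's precise formulation appears in the full paper, not in this abstract. As a rough, non-authoritative illustration of what a gradient-based visual explanation method computes, below is a minimal Grad-CAM-style sketch in PyTorch; the model choice, the target layer (`model.layer4`), and the gradient-weighted-activation recipe are all assumptions made for illustration, not the authors' method.

```python
# Minimal Grad-CAM-style saliency sketch (illustrative only; UniCAM's actual
# formulation differs -- see the paper). Assumes a torchvision ResNet-18 and
# hooks on its last convolutional block.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

model = resnet18(weights=None).eval()
activations, gradients = {}, {}

def fwd_hook(module, inp, out):
    activations["feat"] = out            # feature maps: (B, C, H, W)

def bwd_hook(module, grad_in, grad_out):
    gradients["feat"] = grad_out[0]      # gradients w.r.t. feature maps

layer = model.layer4                     # assumed target layer
layer.register_forward_hook(fwd_hook)
layer.register_full_backward_hook(bwd_hook)

x = torch.randn(1, 3, 224, 224)          # dummy input image
logits = model(x)
logits[0, logits.argmax()].backward()    # gradient of the top class score

# Weight each channel by its average gradient, then combine and rectify.
weights = gradients["feat"].mean(dim=(2, 3), keepdim=True)   # (B, C, 1, 1)
cam = F.relu((weights * activations["feat"]).sum(dim=1))     # (B, H, W)
cam = F.interpolate(cam.unsqueeze(1), size=x.shape[-2:],
                    mode="bilinear", align_corners=False).squeeze(1)
cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)     # scale to [0, 1]
```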
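
Likewise, the abstract does not define how the feature similarity score (FSS) is computed. One plausible way to quantify similarity between Teacher and Student representations is linear centered kernel alignment (CKA); the sketch below uses that as an assumed stand-in, not necessarily the paper's definition, and the tensor shapes are hypothetical.

```python
# Linear CKA as a hypothetical feature-similarity measure between Teacher
# and Student features (an assumption; consult the paper for the FSS used).
import torch

def linear_cka(X: torch.Tensor, Y: torch.Tensor) -> torch.Tensor:
    """X: (n_samples, d1) Teacher features; Y: (n_samples, d2) Student
    features. Returns a scalar in [0, 1]; higher means more similar."""
    X = X - X.mean(dim=0, keepdim=True)   # center each feature dimension
    Y = Y - Y.mean(dim=0, keepdim=True)
    numerator = (Y.T @ X).norm(p="fro") ** 2
    denominator = (X.T @ X).norm(p="fro") * (Y.T @ Y).norm(p="fro")
    return numerator / (denominator + 1e-8)

# Usage with dummy pooled feature maps for a batch of 128 examples:
teacher_feats = torch.randn(128, 512)
student_feats = torch.randn(128, 256)
print(linear_cka(teacher_feats, student_feats))
```

A relevance score could analogously compare explanation maps against task-relevant regions, but the paper's RS definition should be consulted directly.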

Related Material


[pdf] [supp] [arXiv]
[bibtex]
@InProceedings{Adhane_2025_WACV,
    author    = {Adhane, Gereziher and Dehshibi, Mohammad Mahdi and Vetter, Dennis and Masip, David and Roig, Gemma},
    title     = {On Explaining Knowledge Distillation: Measuring and Visualising the Knowledge Transfer Process},
    booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV)},
    month     = {February},
    year      = {2025},
    pages     = {3467-3476}
}