What Role Does Data Augmentation Play in Knowledge Distillation?

Wei Li, Shitong Shao, Weiyan Liu, Ziming Qiu, Zhihao Zhu, Wei Huan; Proceedings of the Asian Conference on Computer Vision (ACCV), 2022, pp. 2204-2220

Abstract


Knowledge distillation is an effective way to transfer knowledge from a large model to a small one and can significantly improve the small model's performance. In recent years, contrastive learning-based knowledge distillation methods (e.g., SSKD and HSAKD) have achieved excellent performance by exploiting data augmentation. However, the value of data augmentation in knowledge distillation has largely been overlooked, and no prior work analyzes its role in detail. To fill this gap, we analyze the effect of data augmentation on knowledge distillation from multiple perspectives. In particular, we demonstrate the following properties of data augmentation: (a) data augmentation can effectively help knowledge distillation even when the teacher model has no information about the augmented samples, and our proposed diverse and rich Joint Data Augmentation (JDA) is more effective than a single rotation-based augmentation; (b) training the teacher model with diverse and rich augmented samples improves the teacher's own performance, but not the performance of the student model; (c) the student model achieves excellent performance when the proportion of augmented samples lies within a suitable range; (d) data augmentation enables knowledge distillation to work better in few-shot scenarios; (e) data augmentation is seamlessly compatible with some knowledge distillation methods and can potentially further improve their performance. Motivated by this analysis, we propose Cosine Confidence Distillation (CCD), which transfers the knowledge of augmented samples more reasonably. CCD outperforms the state-of-the-art method HSAKD on CIFAR-100 and ImageNet-1k while requiring less storage. Our code will be released.
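
The sketch below is a minimal, illustrative reading of two ideas mentioned in the abstract, not the paper's released implementation: composing several augmentations into one joint policy (in the spirit of JDA) and re-weighting the distillation loss on augmented samples by a teacher-confidence score (in the spirit of CCD). The specific transforms, the cosine-based confidence, and all hyper-parameters are assumptions made for illustration.

```python
# Minimal sketch, assuming a standard PyTorch KD setup; transforms and the
# confidence score are illustrative stand-ins, not the paper's exact JDA/CCD.
import torch
import torch.nn.functional as F
from torchvision import transforms

# A "diverse and rich" joint augmentation pipeline (stand-in for JDA).
jda_transform = transforms.Compose([
    transforms.RandomCrop(32, padding=4),
    transforms.RandomHorizontalFlip(),
    transforms.ColorJitter(brightness=0.4, contrast=0.4, saturation=0.4),
    transforms.RandomRotation(15),
    transforms.ToTensor(),
])

def distillation_loss(student_logits, teacher_logits, labels,
                      T=4.0, alpha=0.9, augmented=False):
    """KL-based KD loss; on augmented samples the KD term is re-weighted by a
    teacher-confidence score (hypothetical cosine-similarity formulation)."""
    ce = F.cross_entropy(student_logits, labels)
    # Per-sample KL divergence between softened student and teacher predictions.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="none",
    ).sum(dim=1) * (T * T)

    if augmented:
        # Confidence: cosine similarity between the teacher's softened prediction
        # and the one-hot label, so samples the teacher handles poorly contribute less.
        one_hot = F.one_hot(labels, teacher_logits.size(1)).float()
        confidence = F.cosine_similarity(
            F.softmax(teacher_logits / T, dim=1), one_hot, dim=1
        )
        kd = kd * confidence.clamp(min=0.0)

    return alpha * kd.mean() + (1.0 - alpha) * ce
```

In this reading, clean samples use the usual KD objective, while augmented samples that the teacher predicts poorly are down-weighted rather than discarded; how CCD actually scores and transfers augmented-sample knowledge is described in the paper itself.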

Related Material


[pdf] [supp] [code]
[bibtex]
@InProceedings{Li_2022_ACCV,
    author    = {Li, Wei and Shao, Shitong and Liu, Weiyan and Qiu, Ziming and Zhu, Zhihao and Huan, Wei},
    title     = {What Role Does Data Augmentation Play in Knowledge Distillation?},
    booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
    month     = {December},
    year      = {2022},
    pages     = {2204-2220}
}