Extracurricular Learning: Knowledge Transfer Beyond Empirical Distribution

Hadi Pouransari, Mojan Javaheripi, Vinay Sharma, Oncel Tuzel; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2021, pp. 3032-3042

Abstract

Knowledge distillation has been used to transfer knowledge learned by a sophisticated model (teacher) to a simpler model (student). This technique is widely used for model compression. However, in most applications the compressed student model suffers from an accuracy gap with its teacher. We propose extracurricular learning, a novel knowledge distillation method that bridges this gap by (1) modeling student and teacher output distributions; (2) sampling examples from an approximation to the underlying data distribution; and (3) matching student and teacher output distributions over this extended set, including uncertain samples. We conduct rigorous evaluations on regression and classification tasks and show that, compared to standard knowledge distillation, extracurricular learning reduces the gap by 46% to 68%. This leads to major accuracy improvements over empirical risk minimization-based training for various recent neural network architectures: a 16% regression error reduction on the MPIIGaze dataset, a +3.4% to +9.1% improvement in top-1 classification accuracy on the CIFAR100 dataset, and a +2.9% top-1 improvement on the ImageNet dataset.
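
The three-step recipe in the abstract can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the mixup-style sampler standing in for step (2), the temperature T, and the KL-based matching loss are assumptions added for illustration; the paper's method models full output distributions (including uncertainty) rather than only matching softened logits.

import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=4.0):
    # Match the softened student distribution to the teacher's via KL divergence
    # (standard knowledge distillation loss; T is an assumed temperature).
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

def sample_beyond_empirical(x, alpha=0.4):
    # Hypothetical stand-in for step (2): mixup-style convex combinations of
    # training inputs act as a crude approximation to samples from the
    # underlying data distribution, beyond the empirical training set.
    lam = torch.distributions.Beta(alpha, alpha).sample()
    perm = torch.randperm(x.size(0))
    return lam * x + (1.0 - lam) * x[perm]

def extracurricular_step(student, teacher, x, T=4.0):
    # Steps (1) and (3): evaluate both models on the extended sample set
    # (real plus sampled inputs) and match their output distributions.
    x_all = torch.cat([x, sample_beyond_empirical(x)], dim=0)
    with torch.no_grad():  # the teacher is frozen during distillation
        teacher_logits = teacher(x_all)
    student_logits = student(x_all)
    return kd_loss(student_logits, teacher_logits, T)

In a training loop, this loss would be backpropagated through the student only; the sampler and loss weighting are the natural knobs to tune under these assumptions.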

Related Material

@InProceedings{Pouransari_2021_CVPR,
  author    = {Pouransari, Hadi and Javaheripi, Mojan and Sharma, Vinay and Tuzel, Oncel},
  title     = {Extracurricular Learning: Knowledge Transfer Beyond Empirical Distribution},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2021},
  pages     = {3032-3042}
}