Distilling Global and Local Logits With Densely Connected Relations

Youmin Kim, Jinbae Park, YounHo Jang, Muhammad Ali, Tae-Hyun Oh, Sung-Ho Bae; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2021, pp. 6290-6300

Abstract


In prevalent knowledge distillation, the logits of most image recognition models are computed by global average pooling and then used to encode high-level, task-relevant knowledge. In this work, we address a limitation of such global logit transfer in the distillation context: it prevents the transfer of informative spatial information, which provides localized knowledge as well as rich relational information across contexts of an input scene. To exploit this rich spatial information, we propose a simple yet effective logit distillation approach. We add a local spatial pooling branch at the penultimate layer, thereby extending standard logit distillation so that the student learns both finely localized knowledge and a holistic representation. Our proposed method achieves favorable accuracy improvements over state-of-the-art methods on several image classification datasets. We further show that students distilled on the image classification task can be successfully leveraged for object detection and semantic segmentation, demonstrating the high transferability of our method.
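The abstract does not give implementation details, so the following is only a minimal PyTorch sketch of the general idea: global logits from global average pooling plus local logits from a coarse local pooling grid, both distilled with the standard temperature-scaled KD loss. The class name GlobalLocalLogitHead, the 2x2 grid, the shared 1x1 classifier, and the equal loss weights are all illustrative assumptions, and the sketch omits the paper's densely connected relation transfer entirely.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GlobalLocalLogitHead(nn.Module):
    """Hypothetical head producing global logits (global average pooling)
    and local logits (a coarse local pooling grid) from a backbone's
    penultimate feature map. Grid size and weight sharing are assumptions."""

    def __init__(self, in_channels: int, num_classes: int, grid: int = 2):
        super().__init__()
        self.grid = grid
        # A 1x1 conv acts as a classifier applied to every pooled cell,
        # so global and local logits share the same classifier weights.
        self.classifier = nn.Conv2d(in_channels, num_classes, kernel_size=1)

    def forward(self, feats: torch.Tensor):
        # feats: (B, C, H, W) penultimate feature map
        global_logits = self.classifier(F.adaptive_avg_pool2d(feats, 1)).flatten(1)  # (B, K)
        local_logits = self.classifier(F.adaptive_avg_pool2d(feats, self.grid))      # (B, K, g, g)
        local_logits = local_logits.flatten(2).transpose(1, 2)                       # (B, g*g, K)
        return global_logits, local_logits


def kd_loss(student_logits, teacher_logits, T: float = 4.0):
    """Standard temperature-scaled KL distillation loss (Hinton et al., 2015)."""
    p_t = F.softmax(teacher_logits / T, dim=-1)
    log_p_s = F.log_softmax(student_logits / T, dim=-1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (T * T)


if __name__ == "__main__":
    # Toy demo with random penultimate features (ResNet-like: 512 channels, 7x7).
    torch.manual_seed(0)
    student_head = GlobalLocalLogitHead(in_channels=512, num_classes=100)
    teacher_head = GlobalLocalLogitHead(in_channels=512, num_classes=100)
    s_feats, t_feats = torch.randn(8, 512, 7, 7), torch.randn(8, 512, 7, 7)
    labels = torch.randint(0, 100, (8,))

    s_glob, s_loc = student_head(s_feats)
    with torch.no_grad():
        t_glob, t_loc = teacher_head(t_feats)

    # Total loss: task loss on global logits plus KD on both global and local
    # logits (local cells are flattened into the batch dimension). Equal
    # weighting here is an assumption; the abstract does not specify weights.
    loss = (F.cross_entropy(s_glob, labels)
            + kd_loss(s_glob, t_glob)
            + kd_loss(s_loc.flatten(0, 1), t_loc.flatten(0, 1)))
    print(f"total loss: {loss.item():.4f}")
```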

Related Material


[bibtex]
@InProceedings{Kim_2021_ICCV,
    author    = {Kim, Youmin and Park, Jinbae and Jang, YounHo and Ali, Muhammad and Oh, Tae-Hyun and Bae, Sung-Ho},
    title     = {Distilling Global and Local Logits With Densely Connected Relations},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2021},
    pages     = {6290-6300}
}