Accumulation Knowledge Distillation for Conditional GAN Compression

Tingwei Gao, Rujiao Long; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2023, pp. 1302-1311

Abstract


This paper focuses on an efficient and high-performance compression method for conditional generative adversarial networks (cGANs) from the perspective of knowledge distillation. Previous cGAN compression approaches based on knowledge distillation typically transfer knowledge in a one-to-one manner, where a given student generator layer receives knowledge only from the teacher generator stage at the same depth. This fails to fully exploit the valuable dark knowledge embedded in the teacher generator's intermediate layers. To address this issue, a novel cGAN compression method based on accumulation knowledge distillation (ACKD) is proposed. ACKD accumulates knowledge from multiple teacher generator stages and then transfers it to the student generator. To this end, ACKD first extracts the essential knowledge from the different stages and subsequently unifies it, determining the relative importance of each stage. In this manner, ACKD effectively provides hierarchical, informative, and targeted knowledge to the compressed student generator. The compressed cGANs produced by ACKD demonstrate remarkable performance, surpassing other state-of-the-art methods on three benchmarks. Furthermore, ACKD compresses parameters by over 100x and MACs by over 50x, setting new records in cGAN compression.
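
The abstract does not include code, but the accumulation idea it describes (fusing features from several teacher stages, weighting their relative importance, and distilling the result into one student stage) can be illustrated with a minimal, hypothetical PyTorch sketch. The class name, the 1x1 projections, the softmax stage weighting, and the MSE matching loss below are illustrative assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AccumulationDistillLoss(nn.Module):
    """Sketch: accumulate several teacher generator stages, weight them by
    learned importance, and distill the fused feature into a student stage.
    Shapes, names, and the fusion rule are hypothetical."""

    def __init__(self, teacher_channels, student_channels, fused_channels=256):
        super().__init__()
        # 1x1 projections unify teacher stages of different widths.
        self.teacher_proj = nn.ModuleList(
            nn.Conv2d(c, fused_channels, kernel_size=1) for c in teacher_channels
        )
        # One logit per teacher stage; softmax gives relative importance.
        self.stage_logits = nn.Parameter(torch.zeros(len(teacher_channels)))
        # Project the student feature into the same space for comparison.
        self.student_proj = nn.Conv2d(student_channels, fused_channels, kernel_size=1)

    def forward(self, teacher_feats, student_feat):
        # Teacher is frozen, so its features are detached; every stage is
        # resized to the student's spatial resolution before fusion.
        target_size = student_feat.shape[-2:]
        projected = [
            F.interpolate(proj(f.detach()), size=target_size,
                          mode="bilinear", align_corners=False)
            for proj, f in zip(self.teacher_proj, teacher_feats)
        ]
        weights = torch.softmax(self.stage_logits, dim=0)
        fused = sum(w * f for w, f in zip(weights, projected))
        # Distillation term between accumulated teacher knowledge and student.
        return F.mse_loss(self.student_proj(student_feat), fused)
```

In training, a term like this would simply be added to the usual cGAN objectives of the student generator, with one such module per distilled student stage.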

Related Material


@InProceedings{Gao_2023_ICCV,
  author    = {Gao, Tingwei and Long, Rujiao},
  title     = {Accumulation Knowledge Distillation for Conditional GAN Compression},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
  month     = {October},
  year      = {2023},
  pages     = {1302-1311}
}