Learning Conditional Knowledge Distillation for Degraded-Reference Image Quality Assessment

Heliang Zheng, Huan Yang, Jianlong Fu, Zheng-Jun Zha, Jiebo Luo; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2021, pp. 10242-10251

Abstract


An important scenario for image quality assessment (IQA) is evaluating image restoration (IR) algorithms. State-of-the-art approaches adopt a full-reference paradigm that compares restored images with their corresponding pristine-quality images. However, pristine-quality images are usually unavailable in blind image restoration tasks and real-world scenarios. In this paper, we propose a practical solution named degraded-reference IQA (DR-IQA), which exploits degraded images, the inputs of IR models, as references. Specifically, we extract reference information from degraded images by distilling knowledge from pristine-quality images. The distillation is achieved by learning a reference space in which various degraded images are encouraged to share the same feature statistics with pristine-quality images. This reference space is optimized to capture deep image priors that are useful for quality assessment. Note that pristine-quality images are used only during training. Our work provides a powerful and differentiable metric for blind IR models, especially GAN-based methods. Extensive experiments show that our results can even approach the performance of full-reference settings.
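The abstract describes encouraging degraded-image features to share the same feature statistics as pristine-quality features in a learned reference space. A minimal sketch of such a statistics-matching distillation loss is shown below; this is not the authors' implementation, and all function names are hypothetical. It matches the first two moments (mean and variance) of two feature vectors, a common way to align feature statistics.

```python
def feature_stats(features):
    """Return the (mean, variance) of a flat list of feature activations."""
    n = len(features)
    mean = sum(features) / n
    var = sum((x - mean) ** 2 for x in features) / n
    return mean, var


def stats_matching_loss(degraded_feats, pristine_feats):
    """Squared distance between the (mean, variance) pairs of two feature
    vectors. Minimizing this loss encourages features of degraded images
    to share the same statistics as pristine-quality features (a sketch of
    the statistics-matching idea, not the paper's exact objective)."""
    m_d, v_d = feature_stats(degraded_feats)
    m_p, v_p = feature_stats(pristine_feats)
    return (m_d - m_p) ** 2 + (v_d - v_p) ** 2
```

For example, `stats_matching_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])` is `0.0`, and the loss grows as the two feature distributions drift apart. In practice such a loss would be applied to deep network activations and back-propagated, since the paper emphasizes that the metric is differentiable.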

Related Material


@InProceedings{Zheng_2021_ICCV,
    author    = {Zheng, Heliang and Yang, Huan and Fu, Jianlong and Zha, Zheng-Jun and Luo, Jiebo},
    title     = {Learning Conditional Knowledge Distillation for Degraded-Reference Image Quality Assessment},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2021},
    pages     = {10242-10251}
}