Hallucinated-IQA: No-Reference Image Quality Assessment via Adversarial Learning

Kwan-Yee Lin, Guanxiang Wang; The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2018, pp. 732-741

Abstract


No-reference image quality assessment (NR-IQA) is a fundamental yet challenging task in the low-level computer vision community. The difficulty stems from the limited information available, since the corresponding reference image for comparison is typically absent. Although previous methods have leveraged various feature extraction mechanisms, from natural scene statistics to deep neural networks, a performance bottleneck remains. In this work, we propose a hallucination-guided quality regression network to address the issue. We first generate a hallucinated reference conditioned on the distorted image, to compensate for the absence of the true reference. We then pair the hallucinated reference with the distorted image and forward them to the regressor, which learns the perceptual discrepancy under the guidance of an implicit ranking relationship within the generator and thereby produces a precise quality prediction. To demonstrate the effectiveness of our approach, we conduct comprehensive experiments on four popular image quality assessment benchmarks. Our method outperforms all previous state-of-the-art methods by large margins. The code and model are publicly available on the project page https://kwanyeelin.github.io/projects/HIQA/HIQA.html
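The pipeline the abstract describes can be illustrated with a toy sketch: a hand-coded blur stands in for the learned GAN generator that hallucinates a reference, and a fixed mapping from the discrepancy map to a score stands in for the learned quality regressor. All function names are illustrative, not from the paper's code.

```python
import random

def hallucinate_reference(distorted):
    """Stand-in for the learned generator: a 3x3 box blur that suppresses
    high-frequency distortion such as noise (edge pixels are replicated)."""
    h, w = len(distorted), len(distorted[0])
    def px(i, j):  # edge-replicating pixel access
        return distorted[min(max(i, 0), h - 1)][min(max(j, 0), w - 1)]
    return [[sum(px(i + di, j + dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)) / 9.0
             for j in range(w)] for i in range(h)]

def predict_quality(distorted):
    """Hallucinate a reference, then regress a quality score from the
    discrepancy between the distorted image and that reference."""
    hallucinated = hallucinate_reference(distorted)
    h, w = len(distorted), len(distorted[0])
    # Perceptual-discrepancy proxy: mean absolute pixel difference.
    diff = sum(abs(hallucinated[i][j] - distorted[i][j])
               for i in range(h) for j in range(w)) / (h * w)
    # Toy "regressor": monotone map of discrepancy to a score in (0, 1];
    # the actual method learns this mapping with a deep network.
    return 1.0 / (1.0 + diff)

random.seed(0)
clean = [[0.5] * 16 for _ in range(16)]                                  # flat image
noisy = [[0.5 + random.gauss(0, 0.3) for _ in range(16)] for _ in range(16)]
print(predict_quality(clean) > predict_quality(noisy))  # cleaner image scores higher
```

The key idea survives even in this caricature: without a true reference, a hallucinated one supplies a comparison target, and quality is read off the distorted-vs-hallucinated discrepancy rather than from the distorted image alone.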

Related Material


@InProceedings{Lin_2018_CVPR,
author = {Lin, Kwan-Yee and Wang, Guanxiang},
title = {Hallucinated-IQA: No-Reference Image Quality Assessment via Adversarial Learning},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2018}
}