Blind Predicting Similar Quality Map for Image Quality Assessment

Da Pan, Ping Shi, Ming Hou, Zefeng Ying, Sizhe Fu, Yuan Zhang; The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2018, pp. 6373-6382

Abstract


A key problem in blind image quality assessment (BIQA) is how to effectively model the properties of the human visual system in a data-driven manner. In this paper, we propose a simple and efficient BIQA model based on a novel framework that consists of a fully convolutional neural network (FCNN) and a pooling network. The FCNN predicts a pixel-wise similarity quality map from the distorted image alone, supervised by intermediate similarity maps derived from conventional full-reference image quality assessment (FR-IQA) methods. The predicted pixel-wise quality maps are well correlated with the actual distortions between the reference and distorted images. Finally, a deep pooling network regresses the quality map into a single score. Experiments demonstrate that our predictions outperform many state-of-the-art BIQA methods.
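For illustration, the kind of intermediate similarity map used as the FCNN's training target can be sketched as a pixel-wise gradient-magnitude similarity map, in the style of FR-IQA metrics such as GMSD. The function name and the stability constant `c` below are assumptions for this sketch, not the paper's exact choice of FR-IQA method:

```python
import numpy as np

def gradient_similarity_map(ref, dist, c=0.0026):
    """Pixel-wise gradient-magnitude similarity map with values in (0, 1].

    `ref` and `dist` are 2-D grayscale arrays; `c` is a small stability
    constant (a hypothetical choice here). Identical images yield a map
    of all ones; stronger local distortion pushes values toward zero.
    """
    # Gradient magnitudes of the reference and distorted images.
    g_ref = np.hypot(*np.gradient(ref.astype(float)))
    g_dist = np.hypot(*np.gradient(dist.astype(float)))
    # Pointwise similarity term, analogous to the SSIM/FSIM form
    # (2ab + c) / (a^2 + b^2 + c), which is 1 exactly when a == b.
    return (2.0 * g_ref * g_dist + c) / (g_ref**2 + g_dist**2 + c)
```

At training time the FCNN would see only `dist` and regress toward such a map; the pooling network then collapses the map into a scalar quality score.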

Related Material


[pdf] [arXiv]
[bibtex]
@InProceedings{Pan_2018_CVPR,
author = {Pan, Da and Shi, Ping and Hou, Ming and Ying, Zefeng and Fu, Sizhe and Zhang, Yuan},
title = {Blind Predicting Similar Quality Map for Image Quality Assessment},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2018}
}