A Content-Aware Metric for Stitched Panoramic Image Quality Assessment

Luyu Yang, Zhigang Tan, Zhe Huang, Gene Cheung; Proceedings of the IEEE International Conference on Computer Vision (ICCV) Workshops, 2017, pp. 2487-2494

Abstract


One key enabling component of an immersive VR visual experience is the construction of panoramic images, each stitched from multiple smaller viewpoint images into a single wide-angle image. To better evaluate and design stitching algorithms, a lightweight yet accurate quality metric is desirable. In this paper, we design a metric specifically for stitched images by fusing a perceptual geometric error metric and a local structure-guided metric into one. For the geometric error, we compute the local variance of the optical flow field energy. For the structure-guided metric, we compute intensity and chrominance gradients. The two metrics are combined content-adaptively based on the amount of image structure. Extensive experiments are conducted on our stitched image quality assessment (SIQA) dataset of 408 groups of examples. Results show that our proposed metric outperforms state-of-the-art metrics and that the two components of the metric complement each other. Our SIQA dataset is made publicly available as part of the submission.
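
To make the fusion concrete, the following is a minimal, illustrative Python/OpenCV sketch of the pipeline the abstract describes. It assumes a geometrically aligned, distortion-free reference view of the same size as the stitched result is available; the Farneback flow parameters, the 16-pixel window, the YCrCb gradient comparison, and the edge-density weight are stand-in choices of ours, not the paper's exact formulation.

import cv2
import numpy as np

def geometric_error(reference, stitched, window=16):
    # Dense optical flow between the reference view and the stitched result;
    # large local variance of the flow energy signals inconsistent warping.
    ref_gray = cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY)
    sti_gray = cv2.cvtColor(stitched, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(ref_gray, sti_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    energy = flow[..., 0] ** 2 + flow[..., 1] ** 2
    # Variance of the flow energy in non-overlapping windows, averaged over blocks.
    h, w = energy.shape
    h, w = h - h % window, w - w % window
    blocks = energy[:h, :w].reshape(h // window, window, w // window, window)
    return float(blocks.var(axis=(1, 3)).mean())

def structure_error(reference, stitched):
    # Compare intensity (Y) and chrominance (Cr, Cb) gradient magnitudes.
    def grad_mag(img):
        ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb).astype(np.float32)
        gx = cv2.Sobel(ycrcb, cv2.CV_32F, 1, 0, ksize=3)
        gy = cv2.Sobel(ycrcb, cv2.CV_32F, 0, 1, ksize=3)
        return np.sqrt(gx ** 2 + gy ** 2)
    return float(np.abs(grad_mag(reference) - grad_mag(stitched)).mean())

def content_aware_score(reference, stitched):
    # Content-adaptive fusion: the edge density of the reference (a simple
    # proxy for the amount of image structure) weights the structure term.
    # Both inputs are uint8 BGR images of identical size; lower is better.
    edges = cv2.Canny(cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY), 100, 200)
    alpha = edges.mean() / 255.0
    return ((1.0 - alpha) * geometric_error(reference, stitched)
            + alpha * structure_error(reference, stitched))

Under this sketch a lower score indicates better stitching; the edge-density weight shifts emphasis toward the structure-guided term in texture-rich scenes, mirroring the content-adaptive combination described above.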

Related Material


[pdf]
[bibtex]
@InProceedings{Yang_2017_ICCV,
author = {Yang, Luyu and Tan, Zhigang and Huang, Zhe and Cheung, Gene},
title = {A Content-Aware Metric for Stitched Panoramic Image Quality Assessment},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV) Workshops},
month = {Oct},
year = {2017}
}