Sensor-Realistic Synthetic Data Engine for Multi-Frame High Dynamic Range Photography

Jinhan Hu, Gyeongmin Choe, Zeeshan Nadir, Osama Nabil, Seok-Jun Lee, Hamid Sheikh, Youngjun Yoo, Michael Polley; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2020, pp. 516-517

Abstract


Deep learning-based mobile imaging applications are often limited by the lack of training data. To this end, researchers have resorted to using synthetic training data. However, purely synthetic data does not accurately mimic the distribution of real data. To improve the utility of synthetic data, we present a systematic pipeline that takes synthetic data produced by a game engine and transforms it into synthetic data with real sensor characteristics such as noise and color gamut. We validate the utility of our sensor-realistic synthetic data for multi-frame high dynamic range (HDR) photography using a Samsung Galaxy S10 Plus smartphone. Results from training two baseline neural networks on our sensor-realistic synthetic data modeled for the S10 Plus show that it improves the quality of HDR photography on the modeled device. The synthetic dataset is publicly available at https://github.com/nadir-zeeshan/sensor-realistic-synthetic-data.
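The abstract describes injecting real sensor characteristics, such as noise and color gamut, into clean game-engine renders. As a rough illustration of that idea (not the paper's actual pipeline or calibration), the sketch below applies a 3x3 color matrix followed by signal-dependent Gaussian noise; the `shot`, `read`, and `ccm` values are made-up placeholders that would normally be measured from the target sensor.

```python
import numpy as np

def add_sensor_noise(clean, shot=0.01, read=0.0005, rng=None):
    """Add heteroscedastic noise to a linear-intensity image in [0, 1].

    Variance grows with signal level (shot noise) plus a constant floor
    (read noise). Parameters here are illustrative, not calibrated.
    """
    rng = np.random.default_rng(rng)
    variance = shot * clean + read  # per-pixel noise variance
    return np.clip(clean + rng.normal(0.0, np.sqrt(variance)), 0.0, 1.0)

def apply_color_matrix(rgb, ccm):
    """Map engine RGB toward a sensor color space with a 3x3 matrix."""
    return np.clip(np.einsum('ij,hwj->hwi', ccm, rgb), 0.0, 1.0)

# Illustrative color matrix with slight channel cross-talk;
# NOT a calibrated Galaxy S10 Plus matrix.
ccm = np.array([[0.90, 0.08, 0.02],
                [0.05, 0.90, 0.05],
                [0.02, 0.08, 0.90]])

# A flat mid-gray synthetic frame as a stand-in for a game-engine render.
frame = np.full((4, 4, 3), 0.5)
out = add_sensor_noise(apply_color_matrix(frame, ccm), rng=0)
```

In a real pipeline, the noise parameters and color matrix would be estimated from calibration captures on the modeled device, so that networks trained on the adapted renders see statistics close to real sensor output.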

Related Material


[pdf] [supp]
[bibtex]
@InProceedings{Hu_2020_CVPR_Workshops,
author = {Hu, Jinhan and Choe, Gyeongmin and Nadir, Zeeshan and Nabil, Osama and Lee, Seok-Jun and Sheikh, Hamid and Yoo, Youngjun and Polley, Michael},
title = {Sensor-Realistic Synthetic Data Engine for Multi-Frame High Dynamic Range Photography},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2020},
pages = {516-517}
}