Underwater Light Field Retention: Neural Rendering for Underwater Imaging

Tian Ye, Sixiang Chen, Yun Liu, Yi Ye, Erkang Chen, Yuche Li; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2022, pp. 488-497

Abstract

Underwater image rendering aims to generate a true-to-life underwater image from a given clean one, with practical applications such as underwater image enhancement, camera filters, and virtual gaming. We explore two little-studied but challenging problems in underwater image rendering: i) how to render diverse underwater scenes with a single neural network, and ii) how to adaptively learn underwater light fields from natural exemplars, i.e., realistic underwater images. To this end, we propose a neural rendering method for underwater imaging, dubbed UWNR (Underwater Neural Rendering). Specifically, UWNR is a data-driven neural network that implicitly learns the natural degradation model from authentic underwater images, avoiding the erroneous biases introduced by hand-crafted imaging models. In contrast to existing underwater image generation methods, UWNR exploits the natural light field to capture the main characteristics of an underwater scene, and it can therefore synthesize a wide variety of underwater images from a single clean image guided by different realistic underwater exemplars. Extensive experiments demonstrate that our approach achieves better visual quality and quantitative metrics than previous methods. Moreover, we use UWNR to build an open Large Neural Rendering Underwater Dataset, dubbed LNRUD, covering diverse water types.
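The abstract describes UWNR only at a high level: a renderer conditioned on a light field drawn from a real underwater exemplar. As a rough mental model (a minimal sketch, not the authors' implementation), the PyTorch code below renders a clean image under the low-frequency color and illumination of an exemplar; extract_light_field, RenderNet, and the box-blur stand-in for light-field extraction are all hypothetical names and choices introduced here for illustration.

    # Hypothetical sketch, not the paper's code: a light-field-conditioned renderer.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def extract_light_field(exemplar: torch.Tensor, kernel_size: int = 31) -> torch.Tensor:
        """Crude stand-in for light-field extraction: a heavy box blur keeps only
        the low-frequency color/illumination of the underwater exemplar."""
        pad = kernel_size // 2
        weight = torch.full((3, 1, kernel_size, kernel_size), 1.0 / kernel_size ** 2)
        padded = F.pad(exemplar, (pad, pad, pad, pad), mode="reflect")
        return F.conv2d(padded, weight, groups=3)  # per-channel low-pass filter

    class RenderNet(nn.Module):
        """Toy encoder-decoder mapping (clean image, light field) -> rendered image."""
        def __init__(self, ch: int = 32):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(6, ch, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(ch, 3, 3, padding=1), nn.Sigmoid(),
            )

        def forward(self, clean: torch.Tensor, light_field: torch.Tensor) -> torch.Tensor:
            # Condition the renderer on the exemplar's light field by channel concat.
            return self.net(torch.cat([clean, light_field], dim=1))

    # One clean image plus different exemplars yields different water styles.
    clean = torch.rand(1, 3, 256, 256)
    exemplar = torch.rand(1, 3, 256, 256)  # a real underwater photo in practice
    rendered = RenderNet()(clean, extract_light_field(exemplar))

Swapping in light fields from different exemplars while holding the clean image fixed is what gives the one-to-many diversity claimed above; the actual UWNR learns this conditioning from authentic underwater images rather than using a fixed blur.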

Related Material

[pdf] [arXiv]
[bibtex]
@InProceedings{Ye_2022_CVPR,
    author    = {Ye, Tian and Chen, Sixiang and Liu, Yun and Ye, Yi and Chen, Erkang and Li, Yuche},
    title     = {Underwater Light Field Retention: Neural Rendering for Underwater Imaging},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2022},
    pages     = {488-497}
}