MINE: Towards Continuous Depth MPI With NeRF for Novel View Synthesis
Abstract
In this paper, we propose MINE to perform novel view synthesis and depth estimation via dense 3D reconstruction from a single image. Our approach is a continuous-depth generalization of Multiplane Images (MPI) obtained by introducing NEural radiance fields (NeRF). Given a single image as input, MINE predicts a 4-channel image (RGB and volume density) at arbitrary depth values to jointly reconstruct the camera frustum and fill in occluded content. The reconstructed and inpainted frustum can then be easily rendered into novel RGB or depth views using differentiable rendering. Extensive experiments on RealEstate10K, KITTI and Flowers Light Fields show that MINE outperforms the state of the art by a large margin in novel view synthesis. We also achieve competitive results in depth estimation on iBims-1 and NYU-v2 without annotated depth supervision. Our source code is available at https://github.com/vincentfung13/MINE.
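The abstract describes the rendering step in prose; below is a minimal sketch, assuming a PyTorch tensor layout, of how per-plane (RGB, volume density) predictions at sampled depths can be composited into a novel RGB image and depth map using NeRF-style volume rendering weights. The function name composite_planes and all shapes are illustrative assumptions, not MINE's actual implementation; in the paper, the planes live in the source camera frustum and are warped to the target pose before compositing, a step this sketch omits.

# Hypothetical sketch of MPI-style volume rendering: composite per-plane
# (RGB, sigma) predictions at sampled depths into a rendered view.
# Names and tensor shapes are illustrative, not MINE's actual code.
import torch

def composite_planes(rgb, sigma, depths):
    """rgb: (N, 3, H, W), sigma: (N, 1, H, W), depths: (N,) sorted near-to-far."""
    # Spacing between consecutive depth samples (the last interval reuses the previous one).
    deltas = torch.cat([depths[1:] - depths[:-1], depths[-1:] - depths[-2:-1]])
    deltas = deltas.view(-1, 1, 1, 1)
    # Per-plane opacity from volume density, as in NeRF: alpha = 1 - exp(-sigma * delta).
    alpha = 1.0 - torch.exp(-sigma * deltas)                        # (N, 1, H, W)
    # Transmittance: probability a ray reaches plane i without being absorbed earlier.
    trans = torch.cumprod(
        torch.cat([torch.ones_like(alpha[:1]), 1.0 - alpha + 1e-10], dim=0),
        dim=0,
    )[:-1]
    weights = alpha * trans                                         # (N, 1, H, W)
    # Weighted sums give the rendered color and the expected depth per pixel.
    image = (weights * rgb).sum(dim=0)                              # (3, H, W)
    depth_map = (weights * depths.view(-1, 1, 1, 1)).sum(dim=0)     # (1, H, W)
    return image, depth_map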
Related Material
[pdf] [supp] [arXiv] [bibtex]
@InProceedings{Li_2021_ICCV,
  author    = {Li, Jiaxin and Feng, Zijian and She, Qi and Ding, Henghui and Wang, Changhu and Lee, Gim Hee},
  title     = {MINE: Towards Continuous Depth MPI With NeRF for Novel View Synthesis},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  month     = {October},
  year      = {2021},
  pages     = {12578-12588}
}