NeLF-Pro: Neural Light Field Probes for Multi-Scale Novel View Synthesis

Zinuo You, Andreas Geiger, Anpei Chen; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024, pp. 19833-19843

Abstract


We present NeLF-Pro, a novel representation for modeling and reconstructing light fields in diverse natural scenes that vary in extent and spatial granularity. In contrast to previous fast reconstruction methods that represent the 3D scene globally, we model the light field of a scene as a set of local light field feature probes, parameterized with position and multi-channel 2D feature maps. Our central idea is to bake the scene's light field into spatially varying learnable representations and to query point features by weighted blending of probes close to the camera, allowing for mipmap representation and rendering. We introduce a novel vector-matrix-matrix (VMM) factorization technique that effectively represents the light field feature probes as products of core factors (i.e., VM) shared among local feature probes and a basis factor (i.e., M), efficiently encoding internal relationships and patterns within the scene. Experimentally, we demonstrate that NeLF-Pro significantly boosts the performance of feature grid-based representations and achieves fast reconstruction with better rendering quality while maintaining compact modeling. Project page: sinoyou.github.io/nelf-pro
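The sketch below is a minimal, illustrative toy (not the authors' code) of the two ideas the abstract describes: composing a probe's 2D feature map from shared core factors and a probe-specific basis factor (a VMM-style product), and querying a 3D point by distance-weighted blending of the probes nearest to the camera. All shapes, names (e.g. num_probes, blend_features), and the simple spherical lookup are assumptions made for illustration only.

```python
import numpy as np

# Assumed toy dimensions, not taken from the paper.
H = W = 32          # resolution of each probe's 2D feature map
C = 8               # feature channels
R = 4               # number of factorized components
num_probes = 16

rng = np.random.default_rng(0)

# Shared core factors (VM): a vector factor and a matrix factor per
# component, shared across all local probes.
core_v = rng.standard_normal((R, C))        # vector factors
core_m = rng.standard_normal((R, H, W))     # matrix factors

# Probe-specific basis factors (M) and probe centers in world space.
basis_m = rng.standard_normal((num_probes, R, H, W))
probe_pos = rng.uniform(-1.0, 1.0, size=(num_probes, 3))

def probe_feature_map(k):
    """Compose probe k's (C, H, W) feature map from the shared core
    factors and its own basis factor (toy VMM-style product)."""
    maps = core_m * basis_m[k]                       # (R, H, W)
    return np.einsum('rc,rhw->chw', core_v, maps)    # (C, H, W)

def sample_probe(k, point):
    """Sample probe k's feature map along the direction from the probe
    center to the query point (nearest-neighbor equirectangular lookup)."""
    d = point - probe_pos[k]
    d = d / (np.linalg.norm(d) + 1e-8)
    u = (np.arctan2(d[1], d[0]) / (2 * np.pi) + 0.5) * (W - 1)
    v = (np.arccos(np.clip(d[2], -1.0, 1.0)) / np.pi) * (H - 1)
    fmap = probe_feature_map(k)
    return fmap[:, int(round(v)), int(round(u))]     # (C,)

def blend_features(point, cam_pos, top_k=3):
    """Query a point feature by distance-weighted blending of the
    probes closest to the camera."""
    dist = np.linalg.norm(probe_pos - cam_pos, axis=1)
    nearest = np.argsort(dist)[:top_k]
    w = 1.0 / (dist[nearest] + 1e-8)
    w = w / w.sum()
    feats = np.stack([sample_probe(k, point) for k in nearest])
    return (w[:, None] * feats).sum(axis=0)          # (C,)

feature = blend_features(point=np.array([0.2, -0.1, 0.4]),
                         cam_pos=np.array([0.0, 0.0, 1.0]))
print(feature.shape)  # (C,)
```

In this toy version, sharing core_v and core_m across probes while keeping basis_m per probe mirrors the abstract's split into shared core factors and a per-probe basis factor; the actual parameterization, sampling, and blending weights in NeLF-Pro differ and should be taken from the paper and project page.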

Related Material


[pdf] [supp]
[bibtex]
@InProceedings{You_2024_CVPR,
    author    = {You, Zinuo and Geiger, Andreas and Chen, Anpei},
    title     = {NeLF-Pro: Neural Light Field Probes for Multi-Scale Novel View Synthesis},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2024},
    pages     = {19833-19843}
}