Factored-NeuS: Reconstructing Surfaces, Illumination, and Materials of Possibly Glossy Objects

Yue Fan, Ningjing Fan, Ivan Skorokhodov, Oleg Voynov, Savva Ignatyev, Evgeny Burnaev, Peter Wonka, Yiqun Wang; Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR), 2025, pp. 21317-21327

Abstract


We develop a method that recovers the surface, materials, and illumination of a scene from its posed multi-view images. In contrast to prior work, it does not require any additional data and can handle glossy objects or bright lighting. It is a progressive inverse rendering approach that consists of three stages. In the first stage, we reconstruct the scene radiance and signed distance function (SDF) with a novel regularization strategy for specular reflections. We propose to explain a pixel color using both surface and volume rendering jointly, which allows complex view-dependent lighting effects to be handled during surface reconstruction. In the second stage, we distill light visibility and indirect illumination from the learned SDF and radiance field using learnable mapping functions. Finally, we design a method for estimating the ratio of incoming direct light that is reflected in a specular manner and use it to reconstruct the materials and direct illumination. Experimental results demonstrate that the proposed method outperforms the current state of the art in recovering surfaces, materials, and lighting without relying on any additional data.
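
The joint use of surface and volume rendering in the first stage can be illustrated with a minimal sketch: the same SDF and radiance networks produce a volume-rendered color along each ray and a surface color at an estimated surface point, and both are supervised against the observed pixel. The network sizes, the NeuS-style opacity conversion, the surface-point estimate, and the blend weight lambda_surf below are illustrative assumptions, not the authors' implementation.

# Minimal sketch (assumed, not the authors' code): supervise one pixel with
# both a volume-rendered color and a surface-rendered color from a shared SDF.
import torch
import torch.nn as nn

class TinyMLP(nn.Module):
    def __init__(self, d_in, d_out):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(d_in, 64), nn.ReLU(),
                                 nn.Linear(64, d_out))
    def forward(self, x):
        return self.net(x)

sdf_net = TinyMLP(3, 1)    # point -> signed distance
color_net = TinyMLP(6, 3)  # (point, view direction) -> radiance

def render_ray(o, d, n_samples=64, near=0.5, far=3.0, s=32.0):
    t = torch.linspace(near, far, n_samples)
    pts = o + t[:, None] * d                       # samples along the ray
    sdf = sdf_net(pts).squeeze(-1)

    # NeuS-style opacity from consecutive SDF values (logistic CDF, slope s).
    cdf = torch.sigmoid(s * sdf)
    alpha = ((cdf[:-1] - cdf[1:]) / (cdf[:-1] + 1e-6)).clamp(0.0, 1.0)
    trans = torch.cumprod(torch.cat([torch.ones(1), 1.0 - alpha]), dim=0)[:-1]
    w = trans * alpha                              # volume-rendering weights

    rgb = torch.sigmoid(color_net(torch.cat([pts, d.expand_as(pts)], dim=-1)))
    c_vol = (w[:, None] * rgb[:-1]).sum(dim=0)     # volume-rendered color

    # Surface color: evaluate radiance at the expected surface point.
    t_surf = (w * t[:-1]).sum() / (w.sum() + 1e-6)
    x_surf = o + t_surf * d
    c_surf = torch.sigmoid(color_net(torch.cat([x_surf, d], dim=-1)))
    return c_vol, c_surf

def joint_color_loss(c_vol, c_surf, c_gt, lambda_surf=0.5):
    # Both renderings must explain the same observed pixel color.
    return ((c_vol - c_gt) ** 2).mean() + lambda_surf * ((c_surf - c_gt) ** 2).mean()

if __name__ == "__main__":
    o = torch.zeros(3)
    d = torch.tensor([0.0, 0.0, 1.0])
    c_vol, c_surf = render_ray(o, d)
    print(joint_color_loss(c_vol, c_surf, torch.tensor([0.5, 0.5, 0.5])))

In this sketch the surface branch penalizes view-dependent color that the volume branch alone could absorb, which is one plausible reading of the abstract's joint-rendering regularization for specular reflections.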

Related Material


@InProceedings{Fan_2025_CVPR,
    author    = {Fan, Yue and Fan, Ningjing and Skorokhodov, Ivan and Voynov, Oleg and Ignatyev, Savva and Burnaev, Evgeny and Wonka, Peter and Wang, Yiqun},
    title     = {Factored-NeuS: Reconstructing Surfaces, Illumination, and Materials of Possibly Glossy Objects},
    booktitle = {Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR)},
    month     = {June},
    year      = {2025},
    pages     = {21317-21327}
}