NLOS-NeuS: Non-line-of-sight Neural Implicit Surface

Yuki Fujimura, Takahiro Kushida, Takuya Funatomi, Yasuhiro Mukaigawa; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2023, pp. 10532-10541

Abstract


Non-line-of-sight (NLOS) imaging infers invisible scenes from indirect light observed on visible objects. The neural transient field (NeTF) was proposed to represent NLOS scenes as neural radiance fields. We propose NLOS neural implicit surface (NLOS-NeuS), which extends the NeTF to neural implicit surfaces with a signed distance function (SDF) for reconstructing three-dimensional surfaces in NLOS scenes. We introduce two constraints as loss functions to correctly learn an SDF and avoid spurious non-zero level-set surfaces. We also introduce a lower-bound constraint on the SDF based on the geometry of the first-returning photons. The experimental results indicate that these constraints are essential for learning a correct SDF in NLOS scenes. Compared with previous methods with discretized representations, NLOS-NeuS with its continuous neural representation reconstructs smooth surfaces while preserving fine details in NLOS scenes. To the best of our knowledge, this is the first study on neural implicit surfaces with volume rendering in NLOS scenes. Project page: https://yfujimura.github.io/nlos-neus/
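As a rough illustration of the lower-bound idea mentioned in the abstract (not the authors' exact formulation): the first-returning photon at a relay-wall point p gives the shortest path length r_first to any hidden surface, so no surface lies within a sphere of radius r_first around p; by the triangle inequality, the SDF at any query point x inside that sphere is at least r_first - ||x - p||. The following is a minimal PyTorch-style sketch of such a constraint as a hinge penalty, where `sdf_net`, the sampling scheme, and all variable names are assumptions for illustration.

```python
import torch

def sdf_lower_bound_loss(sdf_net, wall_pts, r_first, n_samples=64):
    """Hinge penalty encouraging sdf(x) >= r_first - ||x - p|| (a hedged
    reading of the first-returning-photon lower bound; hypothetical API).

    wall_pts: (B, 3) sampled points p on the relay wall.
    r_first:  (B,)  distance of the first-returning photon at each p,
              i.e. no hidden surface lies closer than r_first to p.
    """
    B = wall_pts.shape[0]
    # Sample query points inside each "empty" sphere of radius r_first around p.
    dirs = torch.randn(B, n_samples, 3)
    dirs = dirs / dirs.norm(dim=-1, keepdim=True)
    radii = torch.rand(B, n_samples, 1) * r_first[:, None, None]
    x = wall_pts[:, None, :] + radii * dirs               # (B, n_samples, 3)

    sdf = sdf_net(x.reshape(-1, 3)).reshape(B, n_samples)
    bound = r_first[:, None] - radii.squeeze(-1)          # triangle-inequality bound
    # Penalize only violations of the lower bound.
    return torch.relu(bound - sdf).mean()
```

This sketch only conveys the geometric intuition; the paper's actual losses (including the two constraints for avoiding non-zero level-set surfaces) are defined in the full text.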

Related Material


[pdf] [supp]
[bibtex]
@InProceedings{Fujimura_2023_ICCV,
    author    = {Fujimura, Yuki and Kushida, Takahiro and Funatomi, Takuya and Mukaigawa, Yasuhiro},
    title     = {NLOS-NeuS: Non-line-of-sight Neural Implicit Surface},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2023},
    pages     = {10532-10541}
}