Through the Curved Cover: Synthesizing Cover Aberrated Scenes with Refractive Field
Abstract
Recent extended reality headsets and field robots have adopted covers to protect the front-facing cameras from environmental hazards and falls. The surface irregularities on the cover can lead to optical aberrations such as blurring and non-parametric distortions. Novel view synthesis methods like NeRF and 3D Gaussian Splatting are ill-equipped to synthesize from sequences with optical aberrations. To address this challenge, we introduce SynthCover to enable novel view synthesis through protective covers for downstream extended reality applications. SynthCover employs a Refractive Field that estimates the cover's geometry, enabling precise analytical calculation of refracted rays. Experiments on synthetic and real-world scenes demonstrate our method's ability to accurately model scenes viewed through protective covers, achieving a significant improvement in rendering quality compared to prior methods. We also show that the model adjusts well to various cover geometries, as demonstrated on synthetic sequences captured with covers of different surface curvatures. To motivate further studies on this problem, we provide the benchmarked dataset containing real and synthetic walkable scenes captured with protective cover optical aberrations.
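The abstract does not detail the Refractive Field itself; the "analytical calculation of refracted rays" it enables can, however, be expressed with the standard vector form of Snell's law. The sketch below is illustrative only (NumPy; the normals n_outer and n_inner and the refractive index of 1.5 are assumptions, not values from the paper) and is not the authors' implementation.

    import numpy as np

    def refract(d, n, eta):
        """Refract unit direction d at a surface with unit normal n.

        eta = n_incident / n_transmitted (e.g. 1.0 / 1.5 when entering glass).
        Returns the transmitted direction, or None on total internal reflection.
        """
        d = d / np.linalg.norm(d)
        n = n / np.linalg.norm(n)
        cos_i = -np.dot(n, d)
        if cos_i < 0.0:              # normal points away from the incident side; flip it
            n, cos_i = -n, -cos_i
        sin2_t = eta ** 2 * (1.0 - cos_i ** 2)
        if sin2_t > 1.0:             # total internal reflection: no transmitted ray
            return None
        cos_t = np.sqrt(1.0 - sin2_t)
        return eta * d + (eta * cos_i - cos_t) * n

    # Hypothetical usage: bend a camera ray at the outer and inner cover surfaces
    # (normals predicted from the estimated cover geometry) before scene sampling.
    d0 = np.array([0.0, 0.0, 1.0])
    n_outer = np.array([0.1, 0.0, -1.0])     # assumed outer-surface normal
    n_inner = np.array([0.05, 0.0, -1.0])    # assumed inner-surface normal
    d1 = refract(d0, n_outer, 1.0 / 1.5)     # air -> cover material
    d2 = refract(d1, n_inner, 1.5 / 1.0)     # cover material -> air

Applying the refraction twice, once per cover surface, gives the bent ray that would then be traced through the scene representation in place of the original camera ray.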
Related Material

[pdf] [arXiv] [bibtex]

@InProceedings{Xie_2025_WACV,
    author    = {Xie, Liuyue and Guo, Jiancong and Jeni, L\'aszl\'o A. and Jia, Zhiheng and Li, Mingyang and Zhou, Yunwen and Guo, Chao},
    title     = {Through the Curved Cover: Synthesizing Cover Aberrated Scenes with Refractive Field},
    booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV)},
    month     = {February},
    year      = {2025},
    pages     = {9632-9641}
}