[pdf]
[supp]
[bibtex]
@InProceedings{Deng_2025_CVPR,
    author    = {Deng, Jiaxi and Wang, Yushen and Meng, Haitao and Hou, Zuoxun and Chang, Yi and Chen, Gang},
    title     = {OmniStereo: Real-time Omnidirectional Depth Estimation with Multiview Fisheye Cameras},
    booktitle = {Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR)},
    month     = {June},
    year      = {2025},
    pages     = {1003-1012}
}
OmniStereo: Real-time Omnidirectional Depth Estimation with Multiview Fisheye Cameras
Abstract
Fast and reliable omnidirectional 3D sensing is essential to many applications such as autonomous driving, robotics, and drone navigation. While many well-recognized methods produce high-quality omnidirectional 3D information, they are too slow for real-time computation, limiting their feasibility in practical applications. Motivated by these shortcomings, we propose an efficient omnidirectional depth sensing framework, called OmniStereo, which generates high-quality 3D information in real time. Unlike prior works, OmniStereo employs the Cassini projection to simplify photometric matching and uses a lightweight stereo matching network to minimize computational overhead. Additionally, OmniStereo introduces a novel fusion method to handle depth discontinuities and invalid pixels, complemented by a refinement module that reduces mapping-introduced errors and recovers fine details. As a result, OmniStereo achieves state-of-the-art (SOTA) accuracy, surpassing the second-best method by over 32% in MAE, while maintaining real-time efficiency. It runs more than 16.5x faster on a TITAN RTX than the second-most-accurate method and achieves 12.3 FPS on the embedded Jetson AGX Orin, underscoring its suitability for real-world deployment. The code is available at https://github.com/DengJiaxi1/OmniStereo.
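To illustrate why the Cassini projection simplifies photometric matching, the sketch below maps unit ray directions to Cassini image coordinates. This is a minimal illustration, not the paper's implementation: the baseline axis, angle conventions, and the function name are assumptions made for the example. When two cameras are displaced along the chosen axis, every epipolar plane contains that axis, so each epipolar great circle maps to a line of constant angle around the axis, and stereo matching reduces to a 1D search along the other coordinate.

```python
import numpy as np

def dir_to_cassini(d, width, height):
    """Map unit ray directions to Cassini-projection pixel coordinates.

    Hypothetical sketch: assumes the stereo baseline lies along the
    x-axis, so the Cassini projection (the transverse aspect of the
    equirectangular map) turns epipolar great circles into lines of
    constant v, enabling a 1D photometric search along u.

    d: (N, 3) array of unit direction vectors (x, y, z).
    Returns an (N, 2) array of (u, v) pixel coordinates.
    """
    x, y, z = d[:, 0], d[:, 1], d[:, 2]
    # Angle toward the baseline axis (x): this is the direction of the
    # 1D disparity search after projection.
    phi = np.arcsin(np.clip(x, -1.0, 1.0))      # in [-pi/2, pi/2]
    # Angle around the baseline axis: constant along each epipolar circle.
    theta = np.arctan2(y, z)                    # in (-pi, pi]
    # Map angles to pixel coordinates (image size and layout are assumptions).
    u = (phi / np.pi + 0.5) * width
    v = (theta / (2.0 * np.pi) + 0.5) * height
    return np.stack([u, v], axis=-1)
```

Under these assumptions, corresponding pixels in the two views share the same v and differ only in u, which is what allows a lightweight 1D stereo matching network to be used in place of a full 2D search.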