MV-Map: Offboard HD-Map Generation with Multi-view Consistency

Ziyang Xie, Ziqi Pang, Yu-Xiong Wang; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2023, pp. 8658-8668

Abstract


While bird's-eye-view (BEV) perception models can be useful for building high-definition maps (HD-Maps) with less human labor, their results are often unreliable and demonstrate noticeable inconsistencies in the predicted HD-Maps from different viewpoints. This is because BEV perception is typically set up in an "onboard" manner, which restricts the computation and consequently prevents algorithms from reasoning over multiple views simultaneously. This paper overcomes these limitations and advocates a more practical "offboard" HD-Map generation setup that removes the computation constraints, based on the fact that HD-Maps are commonly reusable infrastructures built offline in data centers. To this end, we propose a novel offboard pipeline called MV-Map that capitalizes on multi-view consistency and can handle an arbitrary number of frames with the key design of a "region-centric" framework. In MV-Map, the target HD-Maps are created by aggregating all the frames of onboard predictions, weighted by the confidence scores assigned by an "uncertainty network." To further enhance multi-view consistency, we augment the uncertainty network with the global 3D structure optimized by a voxelized neural radiance field (Voxel-NeRF). Extensive experiments on nuScenes show that our MV-Map significantly improves the quality of HD-Maps, further highlighting the importance of offboard methods for HD-Map generation.
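The core aggregation step described above, fusing per-frame onboard predictions weighted by confidence scores, can be sketched roughly as follows. This is a minimal illustration with hypothetical names (`aggregate_bev_frames`), assuming each frame's BEV prediction has already been warped into a shared global grid; the paper's actual confidence weights come from a learned uncertainty network, not a fixed rule:

```python
import numpy as np

def aggregate_bev_frames(bev_preds, confidences, eps=1e-8):
    """Fuse per-frame onboard BEV predictions into one global map.

    bev_preds:   (N, H, W) per-frame occupancy scores in [0, 1],
                 assumed already aligned to a shared global BEV grid.
    confidences: (N, H, W) per-pixel confidence weights, standing in
                 for the scores an uncertainty network would assign.
    Returns an (H, W) confidence-weighted average map.
    """
    bev_preds = np.asarray(bev_preds, dtype=np.float64)
    confidences = np.asarray(confidences, dtype=np.float64)
    weighted_sum = (bev_preds * confidences).sum(axis=0)
    total_weight = confidences.sum(axis=0)
    # eps guards against grid cells no frame observed (zero total weight)
    return weighted_sum / (total_weight + eps)
```

Because the fusion is a running weighted sum, it scales to an arbitrary number of frames, which is what makes the offboard, multi-frame setting tractable.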

Related Material


[pdf] [supp]
[bibtex]
@InProceedings{Xie_2023_ICCV,
    author    = {Xie, Ziyang and Pang, Ziqi and Wang, Yu-Xiong},
    title     = {MV-Map: Offboard HD-Map Generation with Multi-view Consistency},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2023},
    pages     = {8658-8668}
}