Long-Term Visual Map Sparsification With Heterogeneous GNN

Ming-Fang Chang, Yipu Zhao, Rajvi Shah, Jakob J. Engel, Michael Kaess, Simon Lucey; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022, pp. 2406-2415

Abstract


We address the problem of map sparsification for long-term visual localization. A common assumption in map sparsification is that the pre-built map and the later-captured localization queries are consistent. However, this assumption is easily violated in a dynamic world. Additionally, the map size grows as new data accumulate over time, causing large data overhead in the long term. In this paper, we aim to overcome environmental changes and reduce the map size at the same time by selecting points that are valuable to future localization. Inspired by recent progress in Graph Neural Networks (GNNs), we propose the first work that models SfM maps as heterogeneous graphs and predicts 3D point importance scores with a GNN, which enables us to directly exploit the rich information in the SfM map graph. Two novel supervision terms are proposed: 1) a data-fitting term that selects points valuable to future localization based on training queries; 2) a K-Cover term that selects sparse points with full-map coverage. In experiments on a long-term dataset with environmental changes, our method selected map points on stable and widely visible structures and outperformed baselines in localization performance. This work connects SfM maps with modern GNN techniques and opens a new research avenue.
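To make the heterogeneous-graph formulation concrete, the following is a minimal sketch (not the authors' code) of how an SfM map might be represented as a heterogeneous graph and scored with a GNN, assuming PyTorch Geometric. The node types ('camera', 'point'), edge types ('observes', 'observed_by'), feature dimensions, and the linear scoring head are illustrative assumptions, not details taken from the paper.

import torch
from torch_geometric.data import HeteroData
from torch_geometric.nn import HeteroConv, SAGEConv

data = HeteroData()

# Hypothetical node types: camera poses and 3D map points, with made-up feature dims.
num_cams, num_points = 100, 5000
data['camera'].x = torch.randn(num_cams, 16)   # e.g. pose / intrinsics features (assumed)
data['point'].x = torch.randn(num_points, 8)   # e.g. descriptor / track statistics (assumed)

# Hypothetical edge type: a 3D point is observed by a camera (2D-3D correspondences from SfM).
num_obs = 20000
obs = torch.stack([
    torch.randint(0, num_points, (num_obs,)),  # source: point indices
    torch.randint(0, num_cams, (num_obs,)),    # destination: camera indices
])
data['point', 'observed_by', 'camera'].edge_index = obs
# Reverse edges so messages flow in both directions.
data['camera', 'observes', 'point'].edge_index = obs.flip(0)

# One heterogeneous message-passing layer; per-edge-type convolutions are aggregated by sum.
conv = HeteroConv({
    ('camera', 'observes', 'point'): SAGEConv((16, 8), 32),
    ('point', 'observed_by', 'camera'): SAGEConv((8, 16), 32),
}, aggr='sum')
out = conv(data.x_dict, data.edge_index_dict)

# Per-point importance scores from a simple linear head on the 'point' embeddings (assumed).
scores = torch.nn.Linear(32, 1)(out['point']).sigmoid()

In the paper, such per-point scores would be trained with the two supervision terms described above (data-fitting and K-Cover); the exact architecture and losses are detailed in the full text.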

Related Material


[bibtex]
@InProceedings{Chang_2022_CVPR,
    author    = {Chang, Ming-Fang and Zhao, Yipu and Shah, Rajvi and Engel, Jakob J. and Kaess, Michael and Lucey, Simon},
    title     = {Long-Term Visual Map Sparsification With Heterogeneous GNN},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {2406-2415}
}