EM-Fusion: Dynamic Object-Level SLAM With Probabilistic Data Association

Michael Strecke, Jörg Stückler; The IEEE International Conference on Computer Vision (ICCV), 2019, pp. 5865-5874

Abstract


The majority of approaches for acquiring dense 3D environment maps with RGB-D cameras assumes static environments or rejects moving objects as outliers. The representation and tracking of moving objects, however, has significant potential for applications in robotics or augmented reality. In this paper, we propose a novel approach to dynamic SLAM with dense object-level representations. We represent rigid objects in local volumetric signed distance function (SDF) maps, and formulate multi-object tracking as direct alignment of RGB-D images with the SDF representations. Our main novelty is a probabilistic formulation which naturally leads to strategies for data association and occlusion handling. We analyze our approach in experiments and demonstrate that it compares favorably with state-of-the-art methods in terms of robustness and accuracy.
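The probabilistic data association the abstract refers to can be illustrated with a minimal sketch. Under the (assumed) model that a back-projected depth measurement lying on an object's surface has a signed distance near zero in that object's SDF volume, a soft E-step assigns each pixel association probabilities across objects plus a uniform outlier class. The function name, the Gaussian measurement model, and the parameters `sigma` and `uniform_outlier` below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def association_weights(sdf_values, sigma=0.02, uniform_outlier=1e-3):
    """Soft per-pixel association of one depth measurement to K object
    SDF maps plus a uniform outlier class (hypothetical E-step sketch).

    sdf_values: length-K array of signed distances of the back-projected
    point, looked up in each object's local SDF volume (in meters).
    """
    # Assumed measurement model: a Gaussian on the signed distance,
    # so points near an object's surface (SDF ~ 0) score highest.
    obj_lik = np.exp(-0.5 * (np.asarray(sdf_values, dtype=float) / sigma) ** 2)
    # Append a constant outlier likelihood so far-away or dynamic
    # measurements are not forced onto any object model.
    lik = np.append(obj_lik, uniform_outlier)
    # Normalize to obtain association probabilities (last entry: outlier).
    return lik / lik.sum()

# Example: a point 5 mm from object 0's surface and 30 cm from object 1's.
w = association_weights([0.005, 0.30])
```

In an alternating (EM-style) scheme, weights like these would then downweight poorly associated pixels in the subsequent direct alignment of the RGB-D image against each object's SDF.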

Related Material


[bibtex]
@InProceedings{Strecke_2019_ICCV,
author = {Strecke, Michael and St{\"u}ckler, J{\"o}rg},
title = {EM-Fusion: Dynamic Object-Level SLAM With Probabilistic Data Association},
booktitle = {The IEEE International Conference on Computer Vision (ICCV)},
month = {October},
year = {2019}
}