Tracking an RGB-D Camera Using Points and Planes

Esra Ataer-Cansizoglu, Yuichi Taguchi, Srikumar Ramalingam, Tyler Garaas; Proceedings of the IEEE International Conference on Computer Vision (ICCV) Workshops, 2013, pp. 51-58


Planes are dominant in most indoor and outdoor scenes, and a hybrid algorithm that incorporates both point and plane features offers numerous advantages. We therefore present a tracking algorithm for RGB-D cameras that uses both points and planes as primitives. We show how to extend the standard prediction-and-correction framework to include planes in addition to points. By fitting planes, we implicitly handle the depth noise typical of many commercially available 3D sensors. Compared with techniques that use only points, our tracking algorithm has fewer failure modes, and our reconstructed model is compact and more accurate. The tracking algorithm is supported by relocalization and bundle adjustment processes, yielding a real-time simultaneous localization and mapping (SLAM) system using a hand-held or robot-mounted RGB-D camera. Our experiments show large-scale indoor reconstruction results as point-based and plane-based 3D models, and demonstrate an improvement over point-based tracking algorithms on a benchmark for RGB-D cameras.
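The plane-fitting step mentioned above can be illustrated with a minimal least-squares fit via SVD: the plane normal is the singular vector associated with the smallest singular value of the centered point cloud. This is a generic sketch of plane fitting to noisy depth points, not the paper's actual implementation, which operates within a full prediction-and-correction tracking pipeline.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit to an (N, 3) array of 3D points.

    Returns (normal, d) such that the plane satisfies normal . x + d = 0.
    """
    centroid = points.mean(axis=0)
    # SVD of the centered points; the normal is the right singular
    # vector corresponding to the smallest singular value.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    d = -normal.dot(centroid)
    return normal, d

# Synthetic example: noisy depth samples on the plane z = 1.
rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, size=(200, 3))
pts[:, 2] = 1.0 + 0.01 * rng.standard_normal(200)  # simulated sensor noise
normal, d = fit_plane(pts)
```

Averaging over many depth measurements in this way is what makes plane primitives robust to the per-pixel noise of consumer 3D sensors.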

Related Material

@InProceedings{Ataer-Cansizoglu_2013_ICCV_Workshops,
    author = {Esra Ataer-Cansizoglu and Yuichi Taguchi and Srikumar Ramalingam and Tyler Garaas},
    title = {Tracking an RGB-D Camera Using Points and Planes},
    booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV) Workshops},
    month = {June},
    year = {2013},
    pages = {51-58}
}