Dense Continuous-Time Tracking and Mapping With Rolling Shutter RGB-D Cameras
Christian Kerl, Jörg Stückler, Daniel Cremers; Proceedings of the IEEE International Conference on Computer Vision (ICCV), 2015, pp. 2264-2272
Abstract
We propose a dense continuous-time tracking and mapping method for RGB-D cameras. We parametrize the camera trajectory using continuous B-splines and optimize the trajectory through dense, direct image alignment. Our method also directly models rolling shutter in both RGB and depth images within the optimization, which improves tracking and reconstruction quality for low-cost CMOS sensors. Using a continuous trajectory representation has a number of advantages over a discrete-time representation (e.g., one camera pose per frame). With splines, fewer variables need to be optimized than with a discrete representation, since the trajectory can be represented with fewer control points than frames. Splines also naturally impose smoothness constraints on the derivatives of the trajectory estimate. Finally, the continuous trajectory representation makes it possible to compensate for rolling shutter effects, since a pose estimate is available at any exposure time within an image. Our approach demonstrates superior tracking and reconstruction quality compared to approaches that assume discrete-time poses or a global shutter.
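To make the continuous-time idea concrete, the following is a minimal sketch of cumulative cubic B-spline pose interpolation on SE(3), the kind of parameterization used in continuous-time SLAM (e.g., the Spline Fusion formulation of Lovegrove et al.): given four control poses around a query time, it returns a camera pose at any time stamp, which is what enables per-row rolling shutter compensation. This is only an illustration under assumed uniform knot spacing, not the authors' implementation; the function names (se3_exp, se3_log, spline_pose) and interfaces are hypothetical.

import numpy as np

def hat(w):
    # 3x3 skew-symmetric matrix of a 3-vector.
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def se3_exp(xi):
    # Exponential map se(3) -> SE(3); xi = (v, w) with translational part v
    # and rotation vector w, returned as a 4x4 homogeneous transform.
    v, w = xi[:3], xi[3:]
    W = hat(w)
    th = np.linalg.norm(w)
    if th < 1e-8:
        R = np.eye(3) + W
        V = np.eye(3) + 0.5 * W
    else:
        R = (np.eye(3) + np.sin(th) / th * W
             + (1.0 - np.cos(th)) / th**2 * W @ W)
        V = (np.eye(3) + (1.0 - np.cos(th)) / th**2 * W
             + (th - np.sin(th)) / th**3 * W @ W)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = V @ v
    return T

def se3_log(T):
    # Logarithm map SE(3) -> se(3), inverse of se3_exp.
    R, t = T[:3, :3], T[:3, 3]
    th = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if th < 1e-8:
        W = np.zeros((3, 3))
        Vinv = np.eye(3)
    else:
        W = th / (2.0 * np.sin(th)) * (R - R.T)
        Vinv = (np.eye(3) - 0.5 * W
                + (1.0 - th * np.sin(th) / (2.0 * (1.0 - np.cos(th)))) / th**2 * W @ W)
    return np.concatenate([Vinv @ t, [W[2, 1], W[0, 2], W[1, 0]]])

# Cumulative basis matrix of a uniform cubic B-spline.
C = np.array([[6.0, 0.0, 0.0, 0.0],
              [5.0, 3.0, -3.0, 1.0],
              [1.0, 3.0, 3.0, -2.0],
              [0.0, 0.0, 0.0, 1.0]]) / 6.0

def spline_pose(t, t0, dt, control_poses):
    # Camera pose T(t) of a cumulative cubic B-spline over 4x4 control poses
    # at uniform knot times t0 + k * dt (assumed here for simplicity).
    # The pose at time t depends on the four control poses T_{i-1} .. T_{i+2}.
    i = int((t - t0) // dt)               # segment index, t in [t_i, t_{i+1})
    u = (t - (t0 + i * dt)) / dt          # normalized segment time in [0, 1)
    B = C @ np.array([1.0, u, u**2, u**3])
    T = control_poses[i - 1].copy()
    for j in range(1, 4):
        # Relative increment between consecutive control poses, scaled by its
        # cumulative basis value and accumulated on the manifold.
        d = se3_log(np.linalg.inv(control_poses[i - 2 + j]) @ control_poses[i - 1 + j])
        T = T @ se3_exp(B[j] * d)
    return T

For rolling shutter, each image row y has its own capture time, e.g. t_row = t_frame_start + y * line_delay (line_delay is an assumed sensor parameter), and the corresponding pixels are warped with spline_pose(t_row, ...). In the paper, the control poses themselves are the variables optimized by dense, direct image alignment; this sketch only shows how a pose is queried from them.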
Related Material
[pdf] [bibtex]
@InProceedings{Kerl_2015_ICCV,
author = {Kerl, Christian and St{\"u}ckler, J{\"o}rg and Cremers, Daniel},
title = {Dense Continuous-Time Tracking and Mapping With Rolling Shutter RGB-D Cameras},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
month = {December},
year = {2015}
}