Realtime Edge-Based Visual Odometry for a Monocular Camera

Juan Jose Tarrio, Sol Pedre; Proceedings of the IEEE International Conference on Computer Vision (ICCV), 2015, pp. 702-710

Abstract


In this work we present a novel algorithm for realtime visual odometry for a monocular camera. The main idea is to develop an approach that sits between classical feature-based visual odometry systems and modern direct dense/semi-dense methods, benefiting from the best attributes of both. Like feature-based systems, we extract information from the images rather than working with raw image intensities as direct methods do. In particular, the extracted information is the set of edges present in the image, and the rest of the algorithm is designed to take advantage of the structural information provided when pixels are treated as edges. Edge extraction is an efficient and highly parallelizable operation. The extracted edge depth information is dense enough to allow acceptable surface fitting, similar to modern semi-dense methods; this is a valuable attribute that feature-based odometry lacks. Experimental results show that the proposed method has drift similar to state-of-the-art feature-based and direct methods, and is a simple algorithm that runs in realtime and can be parallelized. Finally, we have also developed an inertial-aided version that successfully stabilizes an unmanned aerial vehicle in complex indoor environments using only a frontal camera, while running the complete solution on the embedded hardware on board the vehicle.
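To make the edge-extraction step described above concrete, the following is a minimal sketch of extracting edge pixels and their gradient directions from a grayscale frame. It assumes OpenCV's Canny and Sobel operators as illustrative stand-ins; the paper's own detector, sub-pixel edge handling, and odometry back end are not reproduced here, and the input filename is hypothetical.

```python
# Minimal sketch of an edge-extraction front end (not the authors' implementation).
# Canny/Sobel are used as stand-ins for the paper's edge detector.
import cv2
import numpy as np

def extract_edge_pixels(gray, low_thresh=50, high_thresh=150):
    """Return (N, 2) edge pixel coordinates and their gradient directions."""
    edges = cv2.Canny(gray, low_thresh, high_thresh)      # binary edge map
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)       # horizontal gradient
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)       # vertical gradient
    ys, xs = np.nonzero(edges)                            # coordinates of edge pixels
    directions = np.arctan2(gy[ys, xs], gx[ys, xs])       # gradient angle per edge pixel
    return np.stack([xs, ys], axis=1), directions

if __name__ == "__main__":
    frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input frame
    pts, dirs = extract_edge_pixels(frame)
    print(f"{len(pts)} edge pixels extracted")
```

Because the operation reduces to per-pixel filtering and thresholding, it maps naturally onto SIMD or GPU hardware, which is what makes the front end suitable for embedded, onboard execution.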

Related Material


[pdf]
[bibtex]
@InProceedings{Tarrio_2015_ICCV,
author = {Tarrio, Juan Jose and Pedre, Sol},
title = {Realtime Edge-Based Visual Odometry for a Monocular Camera},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
month = {December},
year = {2015}
}