Visual Navigation Aid for the Blind in Dynamic Environments

Tung-Sing Leung, Gerard Medioni; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2014, pp. 565-572

Abstract

We describe a robust method to estimate egomotion in highly dynamic environments. Our application is a head-mounted stereo system designed to help the visually impaired navigate. Instead of computing egomotion from 3D point correspondences in consecutive frames, we propose to find the ground plane, then decompose the 6-DoF egomotion into the motion of the ground plane and a planar motion on the ground plane. The ground plane is estimated at each frame by analysis of the disparity array. Next, we estimate the normal to the ground plane, either from the visual data or from the IMU reading. We evaluate the results on both synthetic and real scenes, and compare the results of the direct 6-DoF estimate with our plane-based approach, with and without the IMU. We conclude that egomotion estimation using this new approach produces significantly better results, both in simulation and on real data sets.
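To make the decomposition concrete, the sketch below (an illustrative reconstruction, not the authors' code; the function names and the Rodrigues-based alignment are assumptions) shows how a 3-DoF planar motion (a rotation about the ground-plane normal plus an in-plane translation) can be lifted back to a full 6-DoF rigid motion once the plane normal is known:

```python
import numpy as np

def align_to_z(n):
    """Rotation matrix mapping unit normal n onto the z-axis (Rodrigues formula)."""
    n = n / np.linalg.norm(n)
    z = np.array([0.0, 0.0, 1.0])
    v = np.cross(n, z)                 # rotation axis (unnormalized)
    c = float(np.dot(n, z))            # cosine of the rotation angle
    if np.linalg.norm(v) < 1e-12:      # n already (anti)parallel to z
        return np.eye(3) if c > 0 else np.diag([1.0, -1.0, -1.0])
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + K @ K / (1.0 + c)

def planar_to_6dof(n, theta, tx, ty):
    """Lift a 3-DoF planar motion (theta, tx, ty) on the ground plane
    with normal n into a 6-DoF rigid motion (R, t)."""
    A = align_to_z(n)
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
    R = A.T @ Rz @ A                   # rotation about the plane normal
    t = A.T @ np.array([tx, ty, 0.0])  # translation confined to the plane
    return R, t
```

By construction, the resulting rotation leaves the plane normal fixed and the translation stays in the plane, which is what reduces the estimation problem from six degrees of freedom to three once the normal is supplied by vision or the IMU.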

Related Material

[pdf]
[bibtex]
@InProceedings{Leung_2014_CVPR_Workshops,
author = {Leung, Tung-Sing and Medioni, Gerard},
title = {Visual Navigation Aid for the Blind in Dynamic Environments},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2014}
}