Learning Event-Based Height From Plane and Parallax

Kenneth Chaney, Alex Zihao Zhu, Kostas Daniilidis; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2019, pp. 0-0


Event-based cameras are a novel asynchronous sensing modality that offers exciting benefits, such as the ability to track fast-moving objects without motion blur, low latency, high dynamic range, and low power consumption. Given their low latency and ability to work in challenging lighting conditions, these cameras are a natural fit for reactive problems such as fast local structure estimation. In this work, we propose a fast method to perform structure estimation for vehicles traveling in a roughly 2D environment (e.g., one with a ground plane). Our method transfers plane and parallax to events: given the homography to a ground plane and the pose of the camera, it warps the events so that the optical flow of events on the ground plane is removed, while events above the ground plane retain a residual flow. We then estimate dense flow in this warped space using a self-supervised neural network, which yields the height of every point in the scene. We evaluate our method on the Multi Vehicle Stereo Event Camera dataset and show its ability to rapidly estimate scene structure both at high speeds and in low lighting conditions.
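The core geometric step described above, warping events by a ground-plane homography so that ground-plane events carry zero residual flow, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name and the assumption that the homography `H` (current view to reference view, induced by the ground plane) is already known are our own for illustration.

```python
import numpy as np

def warp_events_by_homography(events_xy, H):
    """Warp event pixel coordinates by a 3x3 ground-plane homography.

    Under plane and parallax, events generated by points on the ground
    plane map exactly to their reference-frame locations (zero residual
    flow), while events from points above the plane keep a residual
    parallax related to their height. `H` is assumed to map the current
    view to the reference view via the ground plane.
    """
    n = events_xy.shape[0]
    pts_h = np.hstack([events_xy, np.ones((n, 1))])  # homogeneous coords
    warped = (H @ pts_h.T).T                         # apply homography
    return warped[:, :2] / warped[:, 2:3]            # dehomogenize
```

For example, the identity homography leaves all event coordinates unchanged, while a purely translational homography shifts every event by the same offset; the interesting cases are the general homographies induced by camera motion over the ground plane.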

Related Material

@InProceedings{Chaney_2019_CVPR_Workshops,
  author    = {Chaney, Kenneth and Zihao Zhu, Alex and Daniilidis, Kostas},
  title     = {Learning Event-Based Height From Plane and Parallax},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2019}
}