Robust Feature Tracking in DVS Event Stream using Bezier Mapping

Hochang Seok, Jongwoo Lim; The IEEE Winter Conference on Applications of Computer Vision (WACV), 2020, pp. 1658-1667

Abstract


Unlike conventional cameras, event cameras capture the intensity changes at each pixel with very little delay. Such changes are recorded continuously as an event stream, each event carrying its position, timestamp, and polarity, so there is no notion of a 'frame' as in conventional cameras. As many applications, including 3D pose estimation, use 2D trajectories of feature points, it is necessary to detect and track the feature points robustly and accurately in a continuous event stream. In conventional feature tracking algorithms for event streams, the events in fixed time intervals are converted into event images by stacking the events at their pixel locations, and the features are tracked in these event images. Such simple stacking of events yields blurry event images due to the camera motion, which can significantly degrade the tracking quality. We propose to align the events in each time interval along Bezier curves to minimize the misalignment. Since the camera motion is unknown, the Bezier curve is estimated to maximize the variance of the warped event pixels. Instead of the initial patches for tracking, we use temporally integrated template patches, as they capture rich texture information from accurately aligned events. Extensive experimental evaluations of 2D feature tracking as well as 3D pose estimation show that our method significantly outperforms the conventional approaches.
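The core idea of the abstract can be sketched in code: each event is shifted back along a Bezier displacement curve before being stacked into an event image, and the curve's control points are chosen to maximize the variance (sharpness) of that image. The sketch below is a minimal illustration, not the paper's implementation; it assumes a quadratic Bezier with the first control point fixed at zero displacement, and the function and parameter names (`warp_events`, `variance_objective`, `ctrl`) are hypothetical.

```python
import numpy as np

def warp_events(xs, ys, ts, ctrl, H, W):
    """Warp events along a quadratic Bezier displacement curve and stack
    them into an event image (illustrative sketch, not the paper's code)."""
    # Normalize timestamps of the interval to [0, 1].
    t = (ts - ts.min()) / max(ts.max() - ts.min(), 1e-9)
    p0 = np.zeros(2)             # displacement at the interval start is zero
    p1, p2 = ctrl[:2], ctrl[2:]  # free control points (4 parameters total)
    # Quadratic Bezier: B(t) = (1-t)^2 p0 + 2 t (1-t) p1 + t^2 p2
    disp = (((1 - t) ** 2)[:, None] * p0
            + (2 * t * (1 - t))[:, None] * p1
            + (t ** 2)[:, None] * p2)
    # Undo the motion-induced displacement, then stack into an image.
    wx = np.round(xs - disp[:, 0]).astype(int)
    wy = np.round(ys - disp[:, 1]).astype(int)
    img = np.zeros((H, W))
    ok = (wx >= 0) & (wx < W) & (wy >= 0) & (wy < H)
    np.add.at(img, (wy[ok], wx[ok]), 1.0)
    return img

def variance_objective(ctrl, xs, ys, ts, H, W):
    """Objective to maximize: a well-aligned warp concentrates events on
    few pixels, producing a sharper image with higher variance."""
    return warp_events(xs, ys, ts, ctrl, H, W).var()
```

For a feature moving linearly, control points matching the true motion collapse all its events onto one pixel, so `variance_objective` is higher there than for the zero (no-warp) curve; in practice the control points would be found with a generic optimizer over this objective.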

Related Material


[pdf]
[bibtex]
@InProceedings{Seok_2020_WACV,
author = {Seok, Hochang and Lim, Jongwoo},
title = {Robust Feature Tracking in DVS Event Stream using Bezier Mapping},
booktitle = {The IEEE Winter Conference on Applications of Computer Vision (WACV)},
month = {March},
year = {2020}
}