Fine-Grained Pointing Recognition for Natural Drone Guidance

Oscar L. Barbed, Pablo Azagra, Lucas Teixeira, Margarita Chli, Javier Civera, Ana C. Murillo; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2020, pp. 1040-1041

Abstract


Human action recognition systems are typically focused on identifying different actions, rather than fine-grained variations of the same action. This work explores strategies to identify different pointing directions in order to build a natural interaction system to guide autonomous systems such as drones. Commanding a drone with hand-held panels or tablets is common practice, but intuitive user-drone interfaces might have significant benefits. The system proposed in this work requires the user only to provide occasional high-level navigation commands by pointing towards the desired motion direction. Due to the lack of data for these settings, we present a new benchmarking video dataset to validate our framework and facilitate future research in the area. Our results show good accuracy for pointing direction recognition, while running at interactive rates and exhibiting robustness to variability in user appearance, viewpoint, camera distance, and scenery.
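The abstract does not detail the recognition pipeline, but the overall idea of turning a pointing gesture into a discrete navigation command can be illustrated with a minimal sketch. The snippet below is not the authors' method: it assumes 2D arm keypoints (shoulder, wrist) are already available from some off-the-shelf pose estimator, and the discrete command set and angle binning are purely hypothetical choices for illustration.

```python
# Illustrative sketch only: NOT the method proposed in the paper.
# Assumes 2D arm keypoints from an external pose estimator, and quantizes
# the arm direction in the image plane into a few discrete drone commands.
import math
from typing import Tuple

# Hypothetical discrete command set; the paper's actual classes may differ.
COMMANDS = ["left", "forward-left", "forward", "forward-right", "right"]


def pointing_angle(shoulder: Tuple[float, float], wrist: Tuple[float, float]) -> float:
    """Angle (degrees) of the shoulder->wrist vector in the image plane."""
    dx = wrist[0] - shoulder[0]
    dy = wrist[1] - shoulder[1]
    return math.degrees(math.atan2(-dy, dx))  # image y-axis points down


def angle_to_command(angle_deg: float) -> str:
    """Quantize an arm angle into one of the discrete pointing commands."""
    # Map [0, 180] degrees (pointing right to pointing left) onto the bins.
    clipped = min(max(angle_deg, 0.0), 180.0)
    bin_width = 180.0 / len(COMMANDS)
    index = min(int(clipped // bin_width), len(COMMANDS) - 1)
    return COMMANDS[::-1][index]  # 0 deg ~ "right", 180 deg ~ "left"


if __name__ == "__main__":
    # Example: arm raised roughly 45 degrees up and to the right.
    shoulder, wrist = (320.0, 240.0), (420.0, 140.0)
    angle = pointing_angle(shoulder, wrist)
    print(angle, angle_to_command(angle))  # ~45.0 forward-right
```

In a real user-drone interaction loop, such a command would only be issued occasionally, e.g. when the gesture is held stably for a short time, before being translated into a high-level motion goal for the drone.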

Related Material


[pdf]
[bibtex]
@InProceedings{Barbed_2020_CVPR_Workshops,
author = {Barbed, Oscar L. and Azagra, Pablo and Teixeira, Lucas and Chli, Margarita and Civera, Javier and Murillo, Ana C.},
title = {Fine-Grained Pointing Recognition for Natural Drone Guidance},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2020}
}