Convolutional Neural Network-Based Deep Urban Signatures With Application to Drone Localization

Karim Amer, Mohamed Samy, Reda ElHakim, Mahmoud Shaker, Mohamed ElHelw; Proceedings of the IEEE International Conference on Computer Vision (ICCV) Workshops, 2017, pp. 2138-2145

Abstract


Most commercial Small Unmanned Aerial Vehicles (SUAVs) rely solely on Global Navigation Satellite Systems (GNSSs) - such as GPS and GLONASS - to perform localization during autonomous navigation. Despite being fast and accurate, satellite-based navigation systems have well-known vulnerabilities and pitfalls in urban settings that may prevent successful drone localization. This paper presents the novel concept of "Deep Urban Signatures", where a deep convolutional neural network computes a unique characterization for each urban area or district based on the visual appearance of its architecture and landscape style. This information is used to identify the district and subsequently perform localization. The paper presents the methodology for computing the signatures and discusses experiments carried out using Google Maps and Bing Maps imagery, with the latter used to simulate footage captured by SUAVs at different altitudes and/or camera zoom levels. The results obtained demonstrate that Deep Urban Signatures can successfully accomplish district-level aerial drone localization, with future work comprising accurate localization within each identified district.
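
The abstract does not specify the network architecture or the matching procedure, so the following Python sketch only illustrates the general idea under stated assumptions: a pretrained ResNet-18 (a stand-in backbone, not the paper's network) embeds aerial image patches, each district's signature is taken to be the mean embedding of its patches, and a query patch is assigned to the district whose signature it most resembles. The cosine-similarity matching rule and all file names are hypothetical.

# Hypothetical sketch (not the paper's implementation): a district "signature"
# is the mean CNN embedding of that district's aerial image patches, and a
# query patch is assigned to the district with the most similar signature.
# The ResNet-18 backbone, 224x224 input size, and cosine-similarity matching
# are illustrative assumptions.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Generic pretrained feature extractor (assumes torchvision >= 0.13).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()   # keep the 512-d pooled features
backbone.eval().to(device)

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def embed(image_path):
    """Return an L2-normalized embedding for one aerial image patch."""
    img = Image.open(image_path).convert("RGB")
    feat = backbone(preprocess(img).unsqueeze(0).to(device)).squeeze(0)
    return feat / feat.norm()

def district_signature(patch_paths):
    """Average a district's patch embeddings into a single signature vector."""
    sig = torch.stack([embed(p) for p in patch_paths]).mean(dim=0)
    return sig / sig.norm()

def localize(query_path, signatures):
    """Return the name of the district whose signature best matches the query."""
    q = embed(query_path)
    return max(signatures, key=lambda name: torch.dot(q, signatures[name]).item())

# Usage with hypothetical file names:
# signatures = {"district_a": district_signature(["a_001.png", "a_002.png"]),
#               "district_b": district_signature(["b_001.png", "b_002.png"])}
# print(localize("query_patch.png", signatures))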

Related Material


[pdf]
[bibtex]
@InProceedings{Amer_2017_ICCV,
author = {Amer, Karim and Samy, Mohamed and ElHakim, Reda and Shaker, Mahmoud and ElHelw, Mohamed},
title = {Convolutional Neural Network-Based Deep Urban Signatures With Application to Drone Localization},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV) Workshops},
month = {Oct},
year = {2017}
}