DrIFT: Autonomous Drone Dataset with Integrated Real and Synthetic Data, Flexible Views, and Transformed Domains

Fardad Dadboud, Hamid Azad, Varun Mehta, Miodrag Bolic, Iraj Mantegh; Proceedings of the Winter Conference on Applications of Computer Vision (WACV), 2025, pp. 6900-6910

Abstract


Dependable visual drone detection is crucial for the secure integration of drones into the airspace. However, drone detection accuracy is significantly affected by domain shifts due to environmental changes, varied points of view, and background shifts. To address these challenges, we present the DrIFT dataset, specifically developed for visual drone detection under domain shifts. DrIFT includes fourteen distinct domains, each characterized by shifts in point of view, synthetic-to-real data, season, and adverse weather. DrIFT uniquely emphasizes background shift by providing background segmentation maps to enable background-wise metrics and evaluation. Our new uncertainty estimation metric, MCDO-map, features lower post-processing complexity, surpassing traditional methods. We use the MCDO-map in our uncertainty-aware unsupervised domain adaptation method, demonstrating superior performance to SOTA unsupervised domain adaptation techniques. The dataset is available at: https://github.com/CARG-uOttawa/DrIFT.git.
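
For readers unfamiliar with the underlying mechanism, the sketch below illustrates the general idea behind Monte Carlo Dropout (MCDO) uncertainty maps: dropout is kept active at inference time, several stochastic forward passes are run, and the per-pixel variance of the dense output is used as a spatial uncertainty map. This is a minimal, illustrative sketch of the generic technique only; the model, the number of passes, and the aggregation shown here are assumptions, not the paper's exact MCDO-map definition or post-processing.

    # Minimal MC-Dropout uncertainty-map sketch (generic, not the paper's exact method).
    import torch
    import torch.nn as nn

    def enable_dropout(model: nn.Module) -> None:
        """Keep dropout layers stochastic while the rest of the model stays in eval mode."""
        for module in model.modules():
            if isinstance(module, nn.Dropout):
                module.train()

    @torch.no_grad()
    def mcdo_uncertainty_map(model: nn.Module, image: torch.Tensor, passes: int = 10) -> torch.Tensor:
        """Run `passes` stochastic forward passes on a dense-output model and
        return the per-pixel variance, averaged over channels, as an uncertainty map.
        Assumes model(image) returns a tensor of shape (B, C, H, W)."""
        model.eval()
        enable_dropout(model)
        outputs = torch.stack([model(image) for _ in range(passes)], dim=0)  # (T, B, C, H, W)
        return outputs.var(dim=0).mean(dim=1)  # (B, H, W)
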

Related Material


[pdf] [supp] [arXiv]
[bibtex]
@InProceedings{Dadboud_2025_WACV,
    author    = {Dadboud, Fardad and Azad, Hamid and Mehta, Varun and Bolic, Miodrag and Mantegh, Iraj},
    title     = {DrIFT: Autonomous Drone Dataset with Integrated Real and Synthetic Data, Flexible Views, and Transformed Domains},
    booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV)},
    month     = {February},
    year      = {2025},
    pages     = {6900-6910}
}