Aerial Vehicle Tracking by Adaptive Fusion of Hyperspectral Likelihood Maps

Burak Uzkent, Aneesh Rangnekar, Matthew Hoffman; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2017, pp. 39-48

Abstract


Hyperspectral cameras provide unique spectral signatures that can be exploited for surveillance tasks. This paper proposes a novel real-time hyperspectral likelihood-maps-aided tracking method (HLT) inspired by an adaptive hyperspectral sensor. We focus on the target detection part of the tracking system and remove the need to build offline classifiers or tune a large number of hyper-parameters, instead learning a generative target model online over hyperspectral channels ranging from visible to infrared wavelengths. The key idea is that our adaptive fusion method combines the likelihood maps from multiple hyperspectral bands into a single, more distinctive representation, increasing the margin between the mean values of foreground and background pixels in the fused map. Experimental results show that HLT not only outperforms established fusion methods but is also on par with current state-of-the-art hyperspectral target tracking frameworks.
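As a rough illustration of the adaptive fusion idea described in the abstract, the Python/NumPy sketch below weights each band's likelihood map by its foreground-to-background mean margin before summing the maps. The function name `fuse_likelihood_maps` and the margin-based weighting rule are assumptions for illustration only, not the authors' exact formulation.

```python
import numpy as np

def fuse_likelihood_maps(likelihood_maps, fg_mask):
    """
    Fuse per-band likelihood maps into a single map by weighting each band
    according to how well it separates foreground from background.

    likelihood_maps : array of shape (B, H, W), one likelihood map per band
    fg_mask         : boolean array of shape (H, W), True inside the target box

    Illustrative sketch only; not the paper's exact fusion rule.
    """
    weights = []
    for lmap in likelihood_maps:
        fg_mean = lmap[fg_mask].mean()          # mean likelihood over target pixels
        bg_mean = lmap[~fg_mask].mean()         # mean likelihood over background pixels
        margin = max(fg_mean - bg_mean, 0.0)    # bands that separate better get more weight
        weights.append(margin)
    weights = np.asarray(weights)
    if weights.sum() == 0.0:                    # fall back to uniform weights
        weights = np.ones_like(weights)
    weights /= weights.sum()
    # Weighted sum across bands yields one fused, more distinctive map.
    return np.tensordot(weights, likelihood_maps, axes=1)


# Toy usage: 8 bands, 64x64 maps, a 10x10 target region.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    maps = rng.random((8, 64, 64))
    mask = np.zeros((64, 64), dtype=bool)
    mask[27:37, 27:37] = True
    maps[:, mask] += np.linspace(0.0, 0.5, 8)[:, None]   # some bands are more informative
    fused = fuse_likelihood_maps(maps, mask)
    print("fused margin:", fused[mask].mean() - fused[~mask].mean())
```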

Related Material


[pdf] [arXiv]
[bibtex]
@InProceedings{Uzkent_2017_CVPR_Workshops,
author = {Uzkent, Burak and Rangnekar, Aneesh and Hoffman, Matthew},
title = {Aerial Vehicle Tracking by Adaptive Fusion of Hyperspectral Likelihood Maps},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {July},
year = {2017}
}