Deep Adaptive Fusion Network for High Performance RGBT Tracking

Yuan Gao, Chenglong Li, Yabin Zhu, Jin Tang, Tao He, Futian Wang; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2019

Abstract


Due to the complementarity of RGB and thermal data, RGBT tracking has received increasing attention in recent years because it can effectively mitigate the degradation of tracking performance in dark environments and adverse weather conditions. Effectively fusing the information from the RGB and thermal modalities is the key to exploiting their complementarity for robust RGBT tracking. In this paper, we propose a high-performance RGBT tracking framework based on a novel deep adaptive fusion network, named DAFNet. Our DAFNet consists of a recursive fusion chain that adaptively integrates features from all layers in an end-to-end manner. Thanks to the simple yet effective operations in DAFNet, our tracker runs at near-real-time speed. Compared with state-of-the-art trackers on two public datasets, our DAFNet tracker achieves outstanding performance and sets a new state of the art in RGBT tracking.
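The abstract only sketches the architecture, so the following is a minimal, hypothetical PyTorch-style sketch of what adaptive RGB/thermal fusion with a recursive chain over layer features might look like. The module names (AdaptiveFusionBlock, RecursiveFusionChain), the channel-wise gating, and the 1x1 projection convolutions are illustrative assumptions, not the operations specified by the paper.

# Hypothetical sketch of adaptive RGB/thermal fusion with a recursive chain
# over layers. Block names, gating scheme, and projections are assumptions
# for illustration, not the actual DAFNet operations.
import torch
import torch.nn as nn


class AdaptiveFusionBlock(nn.Module):
    """Fuses RGB and thermal features of one layer with learned channel gates."""

    def __init__(self, channels):
        super().__init__()
        # Predict per-channel weights for each modality from the concatenated input.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * channels, 2 * channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, rgb, thermal):
        w = self.gate(torch.cat([rgb, thermal], dim=1))
        w_rgb, w_t = torch.chunk(w, 2, dim=1)
        return w_rgb * rgb + w_t * thermal


class RecursiveFusionChain(nn.Module):
    """Recursively aggregates the fused features of consecutive layers into one map."""

    def __init__(self, channels_per_layer):
        super().__init__()
        self.fuse = nn.ModuleList(AdaptiveFusionBlock(c) for c in channels_per_layer)
        # 1x1 convs project the running state to the next layer's channel count.
        self.proj = nn.ModuleList(
            nn.Conv2d(c_prev, c_next, kernel_size=1)
            for c_prev, c_next in zip(channels_per_layer[:-1], channels_per_layer[1:])
        )

    def forward(self, rgb_feats, thermal_feats):
        # rgb_feats / thermal_feats: lists of per-layer feature maps, paired in size.
        state = self.fuse[0](rgb_feats[0], thermal_feats[0])
        for i in range(1, len(rgb_feats)):
            fused = self.fuse[i](rgb_feats[i], thermal_feats[i])
            state = nn.functional.interpolate(
                state, size=fused.shape[-2:], mode="bilinear", align_corners=False
            )
            state = fused + self.proj[i - 1](state)  # recursive accumulation
        return state

For example, given per-layer RGB and thermal features with 64, 128, and 256 channels, RecursiveFusionChain([64, 128, 256]) would return a single fused map at the last layer's resolution; the actual fusion operators and layer choices in DAFNet may differ.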

Related Material


[pdf]
[bibtex]
@InProceedings{Gao_2019_ICCV,
author = {Gao, Yuan and Li, Chenglong and Zhu, Yabin and Tang, Jin and He, Tao and Wang, Futian},
title = {Deep Adaptive Fusion Network for High Performance RGBT Tracking},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
month = {Oct},
year = {2019}
}