Multi-Camera Vehicle Tracking Based on Occlusion-Aware and Inter-Vehicle Information

Yuming Liu, Xiaochun Zhang, Bingzhen Zhang, Xiaoyong Zhang, Sen Wang, Jianrong Xu; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2022, pp. 3257-3264

Abstract


With the demand for analyzing and predicting traffic flow in smart-city applications, Multi-Target Multi-Camera vehicle Tracking (MTMCT) at the city scale has become a fundamental problem. MTMCT is challenging due to view variations, frequent occlusions, and similar vehicle models appearing in the same camera. This work proposes an MTMCT framework based on occlusion-aware and inter-vehicle information that can effectively match vehicle tracklets. The occlusion-aware module segments the tracklets of an occluded and occluding vehicle pair and recalculates the similarity of the completed tracklets, which handles occlusions and suppresses false detections. This work also proposes an inter-vehicle information module that improves matching accuracy by enhancing the ability to distinguish similar vehicles appearing under the same camera at different times. The complete framework consists of four modules: (1) vehicle detection and feature extraction with re-identification models, (2) single-camera tracking (SCT) with an occlusion-aware module to produce initial tracklets, (3) tracklet similarity computation with inter-vehicle association, and (4) clustering across adjacent cameras for multi-camera tracklet matching. The proposed method obtains an IDF1 score of 0.8285 on the Track-1 multi-camera vehicle tracking task of the 2022 AI City Challenge.
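
The following is a minimal sketch of the four-stage pipeline described in the abstract, not the authors' implementation. The module boundaries follow the abstract; all function names, the use of averaged ReID features with cosine similarity, the agglomerative clustering step, and the thresholds are assumptions for illustration, and the occlusion-aware and inter-vehicle information terms are omitted.

```python
# Hypothetical MTMCT pipeline skeleton (stages (3)-(4) of the abstract).
# Stages (1)-(2), detection and single-camera tracking, are assumed to have
# already produced per-camera tracklets with per-frame ReID features.
from dataclasses import dataclass, field
from typing import List
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform


@dataclass
class Tracklet:
    camera_id: int
    track_id: int
    features: List[np.ndarray] = field(default_factory=list)  # per-frame ReID features

    def mean_feature(self) -> np.ndarray:
        # L2-normalized average feature of the tracklet.
        f = np.mean(self.features, axis=0)
        return f / (np.linalg.norm(f) + 1e-12)


def tracklet_similarity(a: Tracklet, b: Tracklet) -> float:
    # (3) Tracklet similarity: cosine similarity of averaged ReID features.
    # The paper's inter-vehicle information term is not modeled here.
    return float(a.mean_feature() @ b.mean_feature())


def cluster_across_cameras(tracklets: List[Tracklet], threshold: float = 0.5) -> List[int]:
    # (4) Multi-camera matching: agglomerative clustering on the pairwise
    # distance (1 - similarity) between tracklets; one cluster = one identity.
    n = len(tracklets)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d = 1.0 - tracklet_similarity(tracklets[i], tracklets[j])
            dist[i, j] = dist[j, i] = d
    Z = linkage(squareform(dist), method="average")
    return list(fcluster(Z, t=threshold, criterion="distance"))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two synthetic vehicles, each observed by two cameras.
    base = [rng.normal(size=128) for _ in range(2)]
    tracklets = [
        Tracklet(camera_id=c, track_id=v,
                 features=[base[v] + 0.05 * rng.normal(size=128) for _ in range(5)])
        for v in range(2) for c in range(2)
    ]
    # Tracklets of the same vehicle should receive the same cluster label.
    print(cluster_across_cameras(tracklets))
```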

Related Material


[bibtex]
@InProceedings{Liu_2022_CVPR,
  author    = {Liu, Yuming and Zhang, Xiaochun and Zhang, Bingzhen and Zhang, Xiaoyong and Wang, Sen and Xu, Jianrong},
  title     = {Multi-Camera Vehicle Tracking Based on Occlusion-Aware and Inter-Vehicle Information},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2022},
  pages     = {3257-3264}
}