DiffMOT: A Real-time Diffusion-based Multiple Object Tracker with Non-linear Prediction

Weiyi Lv, Yuhang Huang, Ning Zhang, Ruei-Sung Lin, Mei Han, Dan Zeng; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024, pp. 19321-19330

Abstract


In Multiple Object Tracking, objects often exhibit non-linear motion with acceleration, deceleration, and irregular direction changes. Tracking-by-detection (TBD) trackers that rely on Kalman Filter motion prediction work well in pedestrian-dominant scenarios but fall short in complex situations where multiple objects perform non-linear and diverse motion simultaneously. To tackle such complex non-linear motion, we propose a real-time diffusion-based MOT approach named DiffMOT. Specifically, for the motion predictor component, we propose a novel Decoupled Diffusion-based Motion Predictor (D^2MP). It models the entire distribution of the various motions presented by the data as a whole, and it predicts an individual object's motion conditioned on that object's historical motion information. Furthermore, it optimizes the diffusion process with far fewer sampling steps. As an MOT tracker, DiffMOT runs in real time at 22.7 FPS and outperforms the state of the art on the DanceTrack and SportsMOT datasets with 62.3% and 76.2% HOTA, respectively. To the best of our knowledge, DiffMOT is the first to introduce a diffusion probabilistic model into MOT to tackle non-linear motion prediction.
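
The following Python sketch (an illustration for readers, not the authors' released code) shows one way a conditional diffusion motion predictor of this kind could be structured: an MLP denoiser is conditioned on an encoding of an object's recent box offsets, and a few reverse sampling steps turn Gaussian noise into a predicted next-frame offset. All class and function names here are hypothetical, and the simplified update rule only stands in for the paper's decoupled diffusion formulation.

import torch
import torch.nn as nn


class ConditionalDenoiser(nn.Module):
    """Predicts the noise added to a 4-D box offset (dx, dy, dw, dh)."""

    def __init__(self, history_len: int = 5, hidden: int = 128):
        super().__init__()
        # Encode the last `history_len` box offsets into a condition vector.
        self.history_encoder = nn.Sequential(
            nn.Linear(history_len * 4, hidden), nn.ReLU(), nn.Linear(hidden, hidden)
        )
        # Denoiser input: noisy offset (4) + condition (hidden) + timestep (1).
        self.net = nn.Sequential(
            nn.Linear(4 + hidden + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 4),
        )

    def forward(self, noisy_offset, history, t):
        cond = self.history_encoder(history.flatten(1))
        t = t.float().unsqueeze(-1) / 1000.0  # crude scalar timestep embedding
        return self.net(torch.cat([noisy_offset, cond, t], dim=-1))


@torch.no_grad()
def sample_motion(model, history, num_steps: int = 4):
    """Few-step reverse sampling: start from Gaussian noise and iteratively
    remove the predicted noise to obtain a motion offset for the next frame."""
    x = torch.randn(history.shape[0], 4)
    for step in reversed(range(num_steps)):
        t = torch.full((history.shape[0],), step)
        pred_noise = model(x, history, t)
        x = x - pred_noise / num_steps  # simplified update, not the exact D^2MP rule
    return x


if __name__ == "__main__":
    model = ConditionalDenoiser()
    history = torch.randn(8, 5, 4)   # 8 tracks, 5 past frames of box offsets
    print(sample_motion(model, history).shape)  # torch.Size([8, 4])

In a TBD pipeline, the sampled offset would be added to each track's last box to produce a motion prediction, which is then matched against the detections of the new frame.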

Related Material


[bibtex]
@InProceedings{Lv_2024_CVPR,
    author    = {Lv, Weiyi and Huang, Yuhang and Zhang, Ning and Lin, Ruei-Sung and Han, Mei and Zeng, Dan},
    title     = {DiffMOT: A Real-time Diffusion-based Multiple Object Tracker with Non-linear Prediction},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2024},
    pages     = {19321-19330}
}