MPM: Joint Representation of Motion and Position Map for Cell Tracking

Junya Hayashida, Kazuya Nishimura, Ryoma Bise; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020, pp. 3823-3832

Abstract


Conventional cell tracking methods detect multiple cells in each frame (detection) and then associate the detection results across successive frames (association). Most cell tracking methods perform the association task independently of the detection task. However, there is no guarantee that coherence between these tasks is preserved, and a lack of coherence may adversely affect tracking performance. In this paper, we propose the Motion and Position Map (MPM), which jointly represents both detection and association for not only migration but also cell division. It guarantees coherence: if a cell is detected, the corresponding motion flow can always be obtained. It is a simple but powerful method for multi-object tracking in dense environments. We compared the proposed method with current tracking methods under various conditions on real biological images and found that it outperformed the state of the art (+5.2% improvement over the second-best method).
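
To make the idea concrete, the sketch below shows one way a joint motion-and-position map could be decoded into detections and per-cell motion vectors. It is not the authors' implementation: the array layout (H x W x 3, with vector magnitude read as position likelihood and direction as motion toward the previous frame), the peak_thresh and nms_size parameters, and the decode_mpm helper are all assumptions made for illustration. What it demonstrates is the coherence property stated in the abstract: every detected peak automatically comes with a motion vector, because both are read from the same map.

# Minimal decoding sketch (assumptions noted above), not the paper's code.
import numpy as np
from scipy.ndimage import maximum_filter

def decode_mpm(mpm, peak_thresh=0.5, nms_size=5):
    """Decode detections and motion vectors from a joint map mpm (H x W x 3)."""
    mag = np.linalg.norm(mpm, axis=-1)            # position-likelihood map
    # Local maxima above the threshold serve as detected cell positions.
    peaks = (mag == maximum_filter(mag, size=nms_size)) & (mag > peak_thresh)
    ys, xs = np.nonzero(peaks)
    positions = np.stack([ys, xs], axis=1)
    # Because detection and association share one map, every detected peak
    # carries its own motion vector by construction (the coherence property).
    vecs = mpm[ys, xs]                            # 3D vector stored at each peak
    flow = vecs[:, :2] / np.maximum(vecs[:, 2:3], 1e-6)  # hypothetical 2D projection
    return positions, flow

# Toy usage with a random map, only to show the call signature.
dummy = np.random.rand(64, 64, 3).astype(np.float32)
positions, flow = decode_mpm(dummy)
print(positions.shape, flow.shape)

In practice such a map would be predicted by a network from a pair of frames; the decoding step above is where the joint representation pays off, since no separate detection-to-detection matching stage is needed.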

Related Material


[bibtex]
@InProceedings{Hayashida_2020_CVPR,
author = {Hayashida, Junya and Nishimura, Kazuya and Bise, Ryoma},
title = {MPM: Joint Representation of Motion and Position Map for Cell Tracking},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2020}
}