Robust Trajectory Clustering for Motion Segmentation

Feng Shi, Zhong Zhou, Jiangjian Xiao, Wei Wu; Proceedings of the IEEE International Conference on Computer Vision (ICCV), 2013, pp. 3088-3095

Abstract


Due to occlusions and non-rigid deformation of objects in the scene, motion trajectories obtained from common trackers may contain a number of missing or mis-associated entries. Clustering such corrupted point-based trajectories into multiple motions remains a hard problem. In this paper, we present an approach that exploits the temporal and spatial characteristics of tracked points to segment incomplete and corrupted trajectories, thereby obtaining highly robust results even under severe missing data and noise. Our method first uses Discrete Cosine Transform (DCT) bases as a temporal smoothness constraint on the trajectory projection, ensuring that the resulting components are valid for repairing pathological trajectories. Then, based on the observation that foreground and background trajectories in a scene may have different spatial distributions, we propose a two-stage clustering strategy that first performs foreground-background separation and then segments the remaining foreground trajectories. We show that, with this new clustering strategy, sequences with complex motions can be accurately segmented even when using a simple translational model. Finally, a series of experiments on the Hopkins 155 dataset and the Berkeley motion segmentation dataset shows the advantage of our method over other state-of-the-art motion segmentation algorithms in terms of both effectiveness and robustness.
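
To make the trajectory-repair idea concrete, below is a minimal Python sketch (not the authors' implementation) of completing a corrupted track with low-frequency DCT bases: the DCT coefficients are fit by least squares on the frames where the tracker succeeded, and the missing frames are filled from the smooth reconstruction. The helper names, the basis count num_bases, and the boolean-mask convention are illustrative assumptions, not details taken from the paper.

import numpy as np

def dct_basis(num_frames, num_bases):
    # Orthonormal DCT-II bases: columns are low-frequency cosines over the frames.
    t = (np.arange(num_frames) + 0.5) / num_frames
    B = np.cos(np.pi * np.outer(t, np.arange(num_bases)))
    B[:, 0] *= 1.0 / np.sqrt(2.0)
    return B * np.sqrt(2.0 / num_frames)

def complete_track(track, observed, num_bases=10):
    # track:    (F,) coordinate values, arbitrary at unobserved frames
    # observed: (F,) boolean mask of frames where tracking succeeded
    F = len(track)
    B = dct_basis(F, min(num_bases, int(observed.sum())))
    # Least-squares fit of DCT coefficients using only the observed frames.
    coeffs, *_ = np.linalg.lstsq(B[observed], track[observed], rcond=None)
    completed = track.copy()
    completed[~observed] = B[~observed] @ coeffs  # smooth in-fill of missing frames
    return completed

# Toy usage: a smooth 1-D track with roughly 30% of its frames dropped.
F = 60
truth = np.sin(np.linspace(0, 2 * np.pi, F)) * 50 + 100
mask = np.random.rand(F) > 0.3
corrupted = np.where(mask, truth, 0.0)
recovered = complete_track(corrupted, mask)

In the paper the smoothness constraint is applied jointly over whole trajectory matrices rather than one coordinate at a time; the per-track version above is only meant to show why a few DCT bases suffice to in-paint short gaps in smooth motion.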

Related Material


[pdf]
[bibtex]
@InProceedings{Shi_2013_ICCV,
author = {Shi, Feng and Zhou, Zhong and Xiao, Jiangjian and Wu, Wei},
title = {Robust Trajectory Clustering for Motion Segmentation},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
month = {December},
year = {2013}
}