Grouping Crowd-Sourced Mobile Videos for Cross-Camera Tracking

Nathan Frey, Matthew Antone; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2013, pp. 800-807

Abstract


Public adoption of camera-equipped mobile phones has given ordinary observers of an event the ability to capture their own perspective and upload the video for online viewing (e.g., YouTube). When traditional wide-area surveillance systems fail to capture an area or time of interest, crowd-sourced videos can provide the information needed for event reconstruction. This paper presents the first end-to-end method for automatic cross-camera tracking from crowd-sourced mobile video data. Our processing (1) sorts videos into overlapping space-time groups, (2) finds the inter-camera relationships from objects within each view, and (3) provides an end user with multiple stabilized views of tracked objects. We demonstrate the system's effectiveness on a real dataset collected from YouTube.

Related Material


@InProceedings{Frey_2013_CVPR_Workshops,
author = {Frey, Nathan and Antone, Matthew},
title = {Grouping Crowd-Sourced Mobile Videos for Cross-Camera Tracking},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2013}
}