EgoPoints: Advancing Point Tracking for Egocentric Videos

Ahmad Darkhalil, Rhodri Guerrier, Adam W. Harley, Dima Damen; Proceedings of the Winter Conference on Applications of Computer Vision (WACV), 2025, pp. 8545-8554

Abstract


We introduce EgoPoints, a benchmark for point tracking in egocentric videos. We annotate 4.7K challenging tracks in egocentric sequences. Compared to the popular TAP-Vid-DAVIS evaluation benchmark, we include 9x more points that go out-of-view and 59x more points that require re-identification (ReID) after returning to view. To measure the performance of models on these challenging points, we introduce evaluation metrics that specifically monitor tracking performance on points in-view, points out-of-view, and points that require re-identification. We then propose a pipeline to create semi-real sequences with automatic ground truth. We generate 11K such sequences by combining dynamic Kubric objects with scene points from EPIC Fields. When fine-tuning point tracking methods on these sequences and evaluating on our annotated EgoPoints sequences, we improve CoTracker across all metrics, including the tracking accuracy δ*_avg by 2.7 percentage points and the accuracy on ReID sequences (ReIDδ_avg) by 2.4 points. We also improve the δ*_avg and ReIDδ_avg of PIPs++ by 0.3 and 2.8 points respectively.

Related Material


[pdf] [supp] [arXiv]
[bibtex]
@InProceedings{Darkhalil_2025_WACV,
    author    = {Darkhalil, Ahmad and Guerrier, Rhodri and Harley, Adam W. and Damen, Dima},
    title     = {EgoPoints: Advancing Point Tracking for Egocentric Videos},
    booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV)},
    month     = {February},
    year      = {2025},
    pages     = {8545-8554}
}