Human Hands As Probes for Interactive Object Understanding

Mohit Goyal, Sahil Modi, Rishabh Goyal, Saurabh Gupta; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022, pp. 3293-3303

Abstract


Interactive object understanding, or what we can do to objects and how, is a long-standing goal of computer vision. In this paper, we tackle this problem through observation of human hands in in-the-wild egocentric videos. We demonstrate that observing what human hands interact with and how can provide both the relevant data and the necessary supervision. Attending to hands readily localizes and stabilizes active objects for learning and reveals the places where interactions with objects occur. Analyzing the hands shows what we can do to objects and how. We apply these basic principles to the EPIC-KITCHENS dataset and successfully learn state-sensitive features and object affordances (regions of interaction and afforded grasps), purely by observing hands in egocentric videos.
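To make the "attending to hands" idea concrete, the sketch below shows one way a detected hand bounding box could be used to crop a nearby active-object region from a frame; such hand-anchored crops, collected across a clip, would serve as stabilized views of the manipulated object for representation learning. This is a minimal illustration, not the authors' implementation: the box format, the fixed expansion heuristic, and the function name are assumptions.

from PIL import Image

def active_object_crop(frame, hand_box, expand=1.5):
    """Expand a hand bounding box and crop the surrounding region,
    assuming the manipulated object lies in the hand's vicinity."""
    x1, y1, x2, y2 = hand_box                      # hand box in pixel coordinates
    cx, cy = (x1 + x2) / 2, (y1 + y2) / 2          # box center
    w, h = (x2 - x1) * expand, (y2 - y1) * expand  # enlarged extent
    W, H = frame.size
    # Clip the enlarged box to the image bounds before cropping.
    left   = max(0, int(cx - w / 2))
    top    = max(0, int(cy - h / 2))
    right  = min(W, int(cx + w / 2))
    bottom = min(H, int(cy + h / 2))
    return frame.crop((left, top, right, bottom))

# Usage with a stand-in frame; in practice the frame and hand box would come
# from an egocentric video and an off-the-shelf hand detector.
frame = Image.new("RGB", (640, 480))
crop = active_object_crop(frame, (300, 200, 380, 300))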

Related Material


[pdf] [supp] [arXiv]
@InProceedings{Goyal_2022_CVPR,
  author    = {Goyal, Mohit and Modi, Sahil and Goyal, Rishabh and Gupta, Saurabh},
  title     = {Human Hands As Probes for Interactive Object Understanding},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2022},
  pages     = {3293-3303}
}