Neural Task Graphs: Generalizing to Unseen Tasks From a Single Video Demonstration

De-An Huang, Suraj Nair, Danfei Xu, Yuke Zhu, Animesh Garg, Li Fei-Fei, Silvio Savarese, Juan Carlos Niebles; The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2019, pp. 8565-8574

Abstract


Our goal is to generate a policy to complete an unseen task given just a single video demonstration of the task in a given domain. We hypothesize that to successfully generalize to unseen complex tasks from a single video demonstration, it is necessary to explicitly incorporate the compositional structure of the tasks into the model. To this end, we propose Neural Task Graph (NTG) Networks, which use a conjugate task graph as the intermediate representation to modularize both the video demonstration and the derived policy. We empirically show that NTG achieves inter-task generalization on two complex tasks: Block Stacking in BulletPhysics and Object Collection in AI2-THOR. NTG improves data efficiency with visual input and achieves strong generalization without the need for dense hierarchical supervision. We further show that similar performance trends hold when applied to real-world data. We show that NTG can effectively predict task structure on the JIGSAWS surgical dataset and generalize to unseen tasks.
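To make the intermediate representation concrete: a conjugate task graph is the dual of a standard task graph, with nodes representing actions and edges representing valid transitions between them. The following is a minimal illustrative sketch of this data structure, not the authors' implementation; the class, method names, and the toy block-stacking action labels are all hypothetical.

```python
# Illustrative sketch (not the authors' implementation) of a conjugate
# task graph: nodes are actions and edges encode valid transitions,
# the dual of a standard task graph whose nodes are states.
from collections import defaultdict

class ConjugateTaskGraph:
    def __init__(self):
        # Maps an action to the set of actions observed to follow it.
        self.edges = defaultdict(set)

    def add_demonstration(self, action_sequence):
        """Accumulate transitions observed in one demonstration."""
        for a, b in zip(action_sequence, action_sequence[1:]):
            self.edges[a].add(b)

    def valid_next_actions(self, action):
        """Actions the graph allows after `action` (a toy stand-in for a policy)."""
        return sorted(self.edges[action])

# Build the graph from a single toy block-stacking demonstration.
graph = ConjugateTaskGraph()
graph.add_demonstration(["pick(A)", "place(A,B)", "pick(C)", "place(C,A)"])
print(graph.valid_next_actions("pick(A)"))  # ['place(A,B)']
```

In NTG the graph is generated and executed by learned neural modules rather than constructed by rule as above; the sketch only conveys why an action-centric graph modularizes a demonstration into reusable pieces.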

Related Material


[bibtex]
@InProceedings{Huang_2019_CVPR,
author = {Huang, De-An and Nair, Suraj and Xu, Danfei and Zhu, Yuke and Garg, Animesh and Fei-Fei, Li and Savarese, Silvio and Niebles, Juan Carlos},
title = {Neural Task Graphs: Generalizing to Unseen Tasks From a Single Video Demonstration},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2019}
}