Taskonomy: Disentangling Task Transfer Learning
Amir R. Zamir, Alexander Sax, William Shen, Leonidas J. Guibas, Jitendra Malik, Silvio Savarese; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2018, pp. 3712-3722
Abstract
Do visual tasks have a relationship, or are they unrelated? For instance, could having surface normals simplify estimating the depth of an image? Intuition answers these questions positively, implying the existence of a structure among visual tasks. Knowing this structure has notable value: it is the concept underlying transfer learning and, for example, provides a principled way to reuse supervision among related tasks, to find which tasks transfer well to an arbitrary target task, or to solve many tasks in one system without piling up complexity. This paper proposes a fully computational approach for finding the structure of the space of visual tasks. This is done via a sampled dictionary of twenty-six 2D, 2.5D, 3D, and semantic tasks, and by modeling their (first- and higher-order) transfer dependencies in a latent space. The product can be viewed as a computational taxonomic map for task transfer learning. We study the consequences of this structure, e.g., the nontrivial relationships that emerge, and exploit them to reduce the demand for labeled data. For example, we show that the total number of labeled datapoints needed for solving a set of 10 tasks can be reduced by roughly 2/3 while keeping performance nearly the same. Users can employ a provided Binary Integer Programming solver that leverages the taxonomy to find efficient supervision policies for their own use cases.
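The Binary Integer Programming solver mentioned above can be illustrated with a minimal sketch (not the authors' released tool): select which source tasks to fully supervise and which transfers to use so that every target task is covered, subject to a supervision budget. The task counts, affinity matrix, and budget below are hypothetical placeholders, and the generic optimizer is SciPy's milp rather than anything specific to Taskonomy.

# Minimal sketch of a taxonomy-style supervision-policy BIP (assumed setup, not the paper's code).
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

rng = np.random.default_rng(0)
sources, targets = 5, 3                      # hypothetical numbers of source/target tasks
affinity = rng.random((sources, targets))    # placeholder transfer-quality scores
budget = 2                                   # max number of fully supervised source tasks

# Decision variables: x_s = supervise source s; y_st = use transfer s -> t.
n = sources + sources * targets
x_idx = lambda s: s
y_idx = lambda s, t: sources + s * targets + t

# Objective: maximize total affinity of chosen transfers (milp minimizes, so negate).
c = np.zeros(n)
for s in range(sources):
    for t in range(targets):
        c[y_idx(s, t)] = -affinity[s, t]

constraints = []

# Each target task must be solved by exactly one transfer.
A = np.zeros((targets, n))
for t in range(targets):
    for s in range(sources):
        A[t, y_idx(s, t)] = 1
constraints.append(LinearConstraint(A, 1, 1))

# A transfer s -> t may only be used if its source is supervised: y_st - x_s <= 0.
A = np.zeros((sources * targets, n))
for s in range(sources):
    for t in range(targets):
        row = s * targets + t
        A[row, y_idx(s, t)] = 1
        A[row, x_idx(s)] = -1
constraints.append(LinearConstraint(A, -np.inf, 0))

# Respect the labeling budget on fully supervised sources.
A = np.zeros((1, n))
A[0, :sources] = 1
constraints.append(LinearConstraint(A, 0, budget))

res = milp(c=c, constraints=constraints,
           integrality=np.ones(n), bounds=Bounds(0, 1))
chosen_sources = [s for s in range(sources) if res.x[x_idx(s)] > 0.5]
print("supervised sources:", chosen_sources)

Under these assumptions, the solver trades off transfer quality against labeling cost; the paper's actual formulation additionally accounts for per-task supervision costs and collective target performance.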
Related Material
[pdf]
[arXiv]
[video]
[bibtex]
@InProceedings{Zamir_2018_CVPR,
author = {Zamir, Amir R. and Sax, Alexander and Shen, William and Guibas, Leonidas J. and Malik, Jitendra and Savarese, Silvio},
title = {Taskonomy: Disentangling Task Transfer Learning},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2018}
}