@InProceedings{Renggli_2022_CVPR,
  author    = {Renggli, Cedric and Pinto, Andr\'e Susano and Rimanic, Luka and Puigcerver, Joan and Riquelme, Carlos and Zhang, Ce and Lu\v{c}i\'c, Mario},
  title     = {Which Model To Transfer? Finding the Needle in the Growing Haystack},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2022},
  pages     = {9205-9214}
}
Which Model To Transfer? Finding the Needle in the Growing Haystack
Abstract
Transfer learning has recently been popularized as a data-efficient alternative to training models from scratch, in particular for computer vision tasks, where it provides a remarkably solid baseline. The emergence of rich model repositories, such as TensorFlow Hub, enables practitioners and researchers to unleash the potential of these models across a wide range of downstream tasks. As these repositories keep growing exponentially, efficiently selecting a good model for the task at hand becomes paramount. We provide a formalization of this problem through a familiar notion of regret and introduce the predominant strategies, namely task-agnostic (e.g., ranking models by their ImageNet performance) and task-aware search strategies (such as linear or kNN evaluation). We conduct a large-scale empirical study and show that both task-agnostic and task-aware methods can yield high regret. We then propose a simple and computationally efficient hybrid search strategy which outperforms the existing approaches. We highlight the practical benefits of the proposed solution on a set of 19 diverse vision tasks.
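The task-aware strategies mentioned above score each candidate model by how well a cheap probe (e.g., a kNN classifier) performs on that model's frozen embeddings of the downstream data, then pick the top-scoring model. A minimal sketch of this idea, using random arrays as stand-ins for the embeddings each candidate model would actually produce (the model names and embedding dimensions here are hypothetical, not from the paper):

```python
# Hedged sketch: task-aware model selection via a kNN proxy score.
# The random embeddings below are placeholders for features extracted
# by each candidate pre-trained model on the same downstream examples.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def knn_proxy_score(embeddings, labels, k=5):
    """Proxy ranking signal: held-out accuracy of a k-NN classifier
    fit on the candidate model's frozen embeddings."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        embeddings, labels, test_size=0.3, random_state=0)
    clf = KNeighborsClassifier(n_neighbors=k).fit(X_tr, y_tr)
    return clf.score(X_te, y_te)

# Hypothetical candidate pool: three models, each yielding 64-d
# features for the same 300 labelled downstream examples.
labels = rng.integers(0, 4, size=300)
candidates = {f"model_{i}": rng.normal(size=(300, 64)) for i in range(3)}

scores = {name: knn_proxy_score(emb, labels)
          for name, emb in candidates.items()}
best = max(scores, key=scores.get)  # model selected for fine-tuning
```

The regret of such a strategy is then the gap between the fine-tuned performance of `best` and that of the truly optimal model in the pool; the paper's hybrid strategy combines this kind of task-aware proxy with task-agnostic rankings to keep that gap small at low computational cost.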