Efficient Transferability Assessment for Selection of Pre-Trained Detectors

Zhao Wang, Aoxue Li, Zhenguo Li, Qi Dou; Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2024, pp. 1679-1689

Abstract


Large-scale pre-training followed by downstream fine-tuning is an effective solution for transferring deep-learning-based models. Since fine-tuning all possible pre-trained models is computationally costly, we aim to predict the transferability performance of these pre-trained models in a computationally efficient manner. Unlike previous work that seeks out suitable models for downstream classification and segmentation tasks, this paper studies the efficient transferability assessment of pre-trained object detectors. To this end, we build a detector transferability benchmark that contains a large and diverse zoo of pre-trained detectors with various architectures, source datasets and training schemes. Given this zoo, we adopt 6 target datasets from 5 diverse domains as the downstream target tasks for evaluation. Further, we propose to assess classification and regression sub-tasks simultaneously in a unified framework. Additionally, we design a complementary metric for evaluating tasks with varying objects. Experimental results demonstrate that our method outperforms other state-of-the-art approaches in assessing transferability under different target domains, while reducing wall-clock time by 32x and requiring only 5.2% of the memory footprint of brute-force fine-tuning of all pre-trained detectors. Our assessment code and benchmark will be publicly available.
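To make the overall workflow concrete, below is a minimal sketch of ranking a detector zoo by a transferability proxy computed on frozen models, instead of fine-tuning each one. The proxy used here (a between-class vs. within-class feature-variance ratio) and the helper names `proxy_score` and `rank_detector_zoo` are illustrative assumptions, not the paper's unified classification-and-regression metric; the sketch only shows the "score then rank, no fine-tuning" interface the abstract describes.

```python
# Hypothetical sketch: rank pre-trained detectors on a target dataset by a
# cheap proxy score computed from frozen-model outputs, without fine-tuning.
# The scoring function below is a placeholder stand-in, NOT the paper's metric.
import numpy as np


def proxy_score(features: np.ndarray, labels: np.ndarray) -> float:
    """Placeholder transferability proxy: ratio of between-class to
    within-class feature variance on the target data. The paper's own
    assessment additionally covers the box-regression sub-task."""
    overall_mean = features.mean(axis=0)
    between, within = 0.0, 0.0
    for c in np.unique(labels):
        cls = features[labels == c]
        between += len(cls) * np.sum((cls.mean(axis=0) - overall_mean) ** 2)
        within += np.sum((cls - cls.mean(axis=0)) ** 2)
    return between / (within + 1e-12)


def rank_detector_zoo(detector_outputs: dict) -> list:
    """Rank detectors by proxy score. `detector_outputs` maps a detector name
    to (features, labels) extracted on the target dataset with frozen weights."""
    scores = {name: proxy_score(f, y) for name, (f, y) in detector_outputs.items()}
    return sorted(scores, key=scores.get, reverse=True)


if __name__ == "__main__":
    # Synthetic stand-in for frozen-detector features on a target dataset.
    rng = np.random.default_rng(0)
    zoo = {f"detector_{i}": (rng.normal(size=(200, 16)), rng.integers(0, 5, 200))
           for i in range(3)}
    print(rank_detector_zoo(zoo))  # detectors ordered by predicted transferability
```

The key design point the sketch mirrors is that each candidate detector is evaluated with a single forward pass over the target data, which is what yields the reported 32x wall-clock and 5.2% memory savings relative to fine-tuning every model in the zoo.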

Related Material


@InProceedings{Wang_2024_WACV,
    author    = {Wang, Zhao and Li, Aoxue and Li, Zhenguo and Dou, Qi},
    title     = {Efficient Transferability Assessment for Selection of Pre-Trained Detectors},
    booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
    month     = {January},
    year      = {2024},
    pages     = {1679-1689}
}