Rethinking the Evaluation Protocol of Domain Generalization
Abstract
Domain generalization aims to address the challenge of Out-of-Distribution (OOD) generalization by leveraging common knowledge learned from multiple training domains to generalize to unseen test domains. To accurately evaluate OOD generalization ability, test data information must be unavailable during training and model selection. However, the current domain generalization protocol may still permit test data information leakage. This paper examines the risks of such leakage in two aspects of the current evaluation protocol: supervised pretraining on ImageNet and oracle model selection. We propose two modifications to the current protocol: employ self-supervised pretraining or train from scratch instead of the current supervised pretraining, and use multiple test domains. These changes enable a more precise evaluation of OOD generalization ability. We also rerun the algorithms under the modified protocol and introduce new leaderboards to encourage future research in domain generalization with fairer comparisons.
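As a rough illustration (not the authors' released code), the following Python sketch shows the modified protocol the abstract describes: candidate models are trained on the training domains only, either from scratch or from a self-supervised checkpoint; model selection uses held-out validation domains rather than an oracle on the test data; and accuracy is averaged over multiple unseen test domains. The names evaluate_protocol, train_fn, and eval_fn are hypothetical placeholders.

import random

def evaluate_protocol(domains, train_fn, eval_fn, num_test=2, num_val=1, seed=0):
    """Split domains into train/val/test, select the model on the
    validation domains, and report accuracy on the unseen test domains."""
    rng = random.Random(seed)
    shuffled = domains[:]
    rng.shuffle(shuffled)
    test_domains = shuffled[:num_test]                      # multiple unseen test domains
    val_domains = shuffled[num_test:num_test + num_val]     # held-out selection domains
    train_domains = shuffled[num_test + num_val:]

    # Train candidate models from scratch (or from a self-supervised
    # checkpoint) on the training domains only; train_fn is hypothetical.
    candidates = train_fn(train_domains)

    # Oracle selection on the test domains is disallowed; pick the
    # candidate that performs best on the validation domains instead.
    best = max(candidates, key=lambda m: eval_fn(m, val_domains))

    # Report the average accuracy over all unseen test domains.
    return sum(eval_fn(best, [d]) for d in test_domains) / num_test

A usage example with dummy stand-ins (purely illustrative): evaluate_protocol(["art", "cartoon", "photo", "sketch", "clipart", "real"], train_fn=lambda tr: [0.1, 0.2, 0.3], eval_fn=lambda m, ds: m * len(ds)) trains three dummy "models" and reports the mean score of the selected one over the two test domains.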
Related Material
[pdf]
[supp]
[arXiv]
[bibtex]
@InProceedings{Yu_2024_CVPR,
  author    = {Yu, Han and Zhang, Xingxuan and Xu, Renzhe and Liu, Jiashuo and He, Yue and Cui, Peng},
  title     = {Rethinking the Evaluation Protocol of Domain Generalization},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2024},
  pages     = {21897-21908}
}