OVANet: One-vs-All Network for Universal Domain Adaptation

Kuniaki Saito, Kate Saenko; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2021, pp. 9000-9009

Abstract


Universal Domain Adaptation (UNDA) aims to handle both domain-shift and category-shift between two datasets, where the main challenge is to transfer knowledge while rejecting "unknown" classes that are absent in the labeled source data but present in the unlabeled target data. Existing methods manually set a threshold to reject "unknown" samples based on validation or a pre-defined ratio of "unknown" samples, but this strategy is not practical. In this paper, we propose a method to learn the threshold using source samples and to adapt it to the target domain. Our idea is that a minimum inter-class distance in the source domain should be a good threshold to decide between "known" and "unknown" in the target. To learn the inter- and intra-class distance, we propose to train a one-vs-all classifier for each class using labeled source data. Then, we adapt the open-set classifier to the target domain by minimizing class entropy. The resulting framework is the simplest among UNDA baselines and is insensitive to the value of its hyper-parameter, yet outperforms the baselines by a large margin.
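The abstract's core idea can be illustrated with a minimal NumPy sketch (not the authors' implementation; all names and the simple 0.5 rejection rule are illustrative assumptions): each class gets a binary one-vs-all classifier producing a positive/negative logit pair, a target sample is rejected as "unknown" when the OVA classifier of its closed-set prediction scores it below 0.5, and adaptation minimizes the binary entropy of the OVA outputs on unlabeled target samples.

```python
import numpy as np

def ova_scores(logits):
    """Per-class one-vs-all 'known' probabilities.

    logits: (num_classes, 2) array; row k holds the [positive, negative]
    logits of class k's binary "this class vs. not this class" classifier.
    Returns a (num_classes,) array of p(known under class k).
    """
    z = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    p = e / e.sum(axis=1, keepdims=True)            # per-row binary softmax
    return p[:, 0]                                  # positive ("known") side

def predict_open_set(logits, closed_set_pred):
    """Accept the closed-set prediction only if its own OVA classifier
    scores the sample as known (p > 0.5); otherwise return -1 ("unknown").
    The fixed 0.5 threshold is an illustrative assumption, standing in for
    the learned source-driven threshold described in the abstract."""
    p_known = ova_scores(logits)
    return closed_set_pred if p_known[closed_set_pred] > 0.5 else -1

def open_set_entropy(logits):
    """Mean binary entropy of the OVA classifiers. Minimizing this on
    unlabeled target samples pushes each sample toward a confident
    known/unknown decision, sketching the entropy-minimization step."""
    p = np.clip(ova_scores(logits), 1e-8, 1 - 1e-8)
    h = -(p * np.log(p) + (1 - p) * np.log(1 - p))
    return h.mean()
```

For example, a sample whose predicted class's OVA classifier is confidently positive is kept, while one scored negative by every OVA classifier is rejected as unknown, and the latter also has higher open-set entropy.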

Related Material


BibTeX:

@InProceedings{Saito_2021_ICCV,
    author    = {Saito, Kuniaki and Saenko, Kate},
    title     = {OVANet: One-vs-All Network for Universal Domain Adaptation},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2021},
    pages     = {9000-9009}
}