Gradual Source Domain Expansion for Unsupervised Domain Adaptation

Thomas Westfechtel, Hao-Wei Yeh, Dexuan Zhang, Tatsuya Harada; Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2024, pp. 1946-1955

Abstract


Unsupervised domain adaptation (UDA) tries to overcome the need for a large labeled dataset by transferring knowledge from a source dataset, with abundant labeled data, to a target dataset that has no labeled data. Since there are no labels in the target domain, early misalignment can propagate into later stages and lead to an error build-up. To overcome this problem, we propose a gradual source domain expansion (GSDE) algorithm. GSDE trains the UDA task several times from scratch, each time expanding the source dataset with target data. In particular, the highest-scoring target samples of the previous run are employed as pseudo-source samples with their respective pseudo-labels. Using this strategy, the pseudo-source samples carry knowledge extracted from the previous run into the new training right from the start. This helps align the two domains better, especially in the early training epochs. In this study, we first introduce a strong baseline network and apply our GSDE strategy to it. We conduct experiments and ablation studies on three benchmarks (Office-31, OfficeHome, and DomainNet) and outperform state-of-the-art methods. We further show that the proposed GSDE strategy can improve the accuracy of a variety of different state-of-the-art UDA approaches.
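The staging logic described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: a nearest-centroid classifier stands in for the full UDA network, negative distance stands in for the network's confidence score, and the stage count and expansion fraction are assumed hyperparameters. Only the outer loop (retrain from scratch, then expand the source set with the highest-confidence target predictions) reflects the GSDE idea.

```python
import numpy as np

def gsde(source_x, source_y, target_x, num_stages=3, frac_step=1/3):
    """Sketch of gradual source domain expansion (GSDE).

    Each stage retrains from scratch on source + pseudo-source data,
    then promotes the highest-scoring target samples (with their
    predicted pseudo-labels) into the pseudo-source set for the next
    stage, growing the promoted fraction stage by stage.
    """
    classes = np.unique(source_y)
    pseudo_x = np.empty((0, source_x.shape[1]))
    pseudo_y = np.empty((0,), dtype=source_y.dtype)
    preds = None
    for stage in range(1, num_stages + 1):
        # Train "from scratch" on the expanded source set.
        train_x = np.vstack([source_x, pseudo_x])
        train_y = np.concatenate([source_y, pseudo_y])
        centroids = np.stack(
            [train_x[train_y == c].mean(axis=0) for c in classes]
        )
        # Score every target sample; confidence here is the negative
        # distance to the nearest centroid (stand-in for softmax score).
        dists = np.linalg.norm(
            target_x[:, None, :] - centroids[None, :, :], axis=2
        )
        preds = classes[dists.argmin(axis=1)]
        conf = -dists.min(axis=1)
        # Expand the pseudo-source set with the top-scoring fraction.
        k = int(len(target_x) * min(1.0, frac_step * stage))
        top = np.argsort(-conf)[:k]
        pseudo_x, pseudo_y = target_x[top], preds[top]
    return preds
```

Because each stage restarts training rather than fine-tuning, early pseudo-label mistakes are not baked into the model weights; only the (progressively more reliable) pseudo-source set is carried forward.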

Related Material


[bibtex]
@InProceedings{Westfechtel_2024_WACV,
  author    = {Westfechtel, Thomas and Yeh, Hao-Wei and Zhang, Dexuan and Harada, Tatsuya},
  title     = {Gradual Source Domain Expansion for Unsupervised Domain Adaptation},
  booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
  month     = {January},
  year      = {2024},
  pages     = {1946-1955}
}