Memory-Efficient Pseudo-Labeling for Online Source-Free Universal Domain Adaptation using a Gaussian Mixture Model

Pascal Schlachter, Simon Wagner, Bin Yang; Proceedings of the Winter Conference on Applications of Computer Vision (WACV), 2025, pp. 6425-6434

Abstract


In practice, domain shifts are likely to occur between training and test data, necessitating domain adaptation (DA) to adjust the pre-trained source model to the target domain. Recently, universal domain adaptation (UniDA) has gained attention for addressing the possibility of an additional category (label) shift between the source and target domain. This means that new classes can appear in the target data, some source classes may no longer be present, or both at the same time. For practical applicability, UniDA methods must handle both source-free and online scenarios, enabling adaptation without access to the source data and performing batch-wise updates in parallel with prediction. In an online setting, preserving knowledge across batches is crucial. However, existing methods often require substantial memory, which is impractical because memory is limited and valuable, in particular on embedded systems. Therefore, we consider memory efficiency as an additional constraint. To achieve memory-efficient online source-free universal domain adaptation (SF-UniDA), we propose a novel method that continuously captures the distribution of known classes in the feature space using a Gaussian mixture model (GMM). This approach, combined with entropy-based out-of-distribution detection, allows for the generation of reliable pseudo-labels. Finally, we combine a contrastive loss with a KL divergence loss to perform the adaptation. Our approach not only achieves state-of-the-art results in all experiments on the DomainNet and Office-Home datasets, but also significantly outperforms the existing methods on the challenging VisDA-C dataset, setting a new benchmark for online SF-UniDA. Our code is available at https://github.com/pascalschlachter/GMM.
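
The sketch below illustrates one way the GMM-based pseudo-labeling with entropy-based out-of-distribution rejection described in the abstract could look in PyTorch. It is a minimal sketch under our own assumptions (diagonal per-class covariances, equal class priors, a momentum update `alpha`, and an entropy threshold `tau` are illustrative choices), not the authors' implementation; the actual code is in the linked repository.

```python
# Minimal sketch (not the authors' code): batch-wise pseudo-labeling with a
# class-conditional Gaussian mixture in feature space plus entropy-based
# rejection of likely unknown-class samples. `tau` and `alpha` are assumed
# hyperparameters, not values from the paper.
import torch
import torch.nn.functional as F

class GMMPseudoLabeler:
    def __init__(self, num_known_classes, feat_dim, tau=0.5, alpha=0.9):
        self.C = num_known_classes
        self.tau = tau      # normalized-entropy threshold for "unknown" rejection
        self.alpha = alpha  # momentum for the running GMM statistics
        self.means = torch.zeros(num_known_classes, feat_dim)
        self.vars = torch.ones(num_known_classes, feat_dim)  # diagonal covariances

    def update(self, feats, labels):
        """Update per-class Gaussian statistics with confidently pseudo-labeled features."""
        for c in range(self.C):
            mask = labels == c
            if mask.any():
                self.means[c] = self.alpha * self.means[c] + (1 - self.alpha) * feats[mask].mean(0)
                self.vars[c] = self.alpha * self.vars[c] + (1 - self.alpha) * feats[mask].var(0, unbiased=False)

    def pseudo_label(self, feats):
        """Return pseudo-labels; samples with high posterior entropy are marked unknown (-1)."""
        var = self.vars[None].clamp_min(1e-6)
        # log N(x | mu_c, diag(sigma_c^2)) up to an additive constant, per class
        log_probs = -0.5 * ((((feats[:, None, :] - self.means[None]) ** 2) / var).sum(-1)
                            + var.log().sum(-1))
        posterior = F.softmax(log_probs, dim=1)  # equal class priors assumed
        entropy = -(posterior * posterior.clamp_min(1e-12).log()).sum(1)
        entropy = entropy / torch.log(torch.tensor(float(self.C)))  # normalize to [0, 1]
        labels = posterior.argmax(1)
        labels[entropy > self.tau] = -1  # -1 = rejected as out-of-distribution / unknown
        return labels, posterior
```

The resulting pseudo-labels (with -1 marking rejected samples) could then drive the adaptation step; per the abstract, the paper combines a contrastive loss with a KL divergence loss for this, whose exact formulation is given in the paper and the released code.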

Related Material


@InProceedings{Schlachter_2025_WACV,
  author    = {Schlachter, Pascal and Wagner, Simon and Yang, Bin},
  title     = {Memory-Efficient Pseudo-Labeling for Online Source-Free Universal Domain Adaptation using a Gaussian Mixture Model},
  booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV)},
  month     = {February},
  year      = {2025},
  pages     = {6425-6434}
}