[pdf]
[bibtex]
@InProceedings{Fu_2024_ACCV,
    author    = {Fu, Zhuobin and Chang, Kan and Ling, Mingyang and Zhang, Qingzhi and Qi, Enze},
    title     = {Auxiliary Domain-guided Adaptive Detection in Adverse Weather Conditions},
    booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
    month     = {December},
    year      = {2024},
    pages     = {3964-3981}
}
Auxiliary Domain-guided Adaptive Detection in Adverse Weather Conditions
Abstract
To enhance detection accuracy in adverse weather conditions, domain adaptation methods that extract domain-invariant features from both the source and target domains have been proposed for one-stage detectors. However, the use of pseudo-labels in instance-level domain adaptation inevitably introduces noise. To tackle this challenge, we propose an auxiliary domain-guided adaptive one-stage detection method. First, a generative network transforms source domain images into the auxiliary domain. To form a suitable auxiliary domain that can provide reliable guidance for instance-level adaptation of the detector, the generated images are required to share the style of the target domain while being constrained to preserve the same object categories and locations as the source domain images. Second, for instance-level adaptation, we treat the same object from different domains as positive samples and different objects as negative samples, and apply contrastive learning so that adaptation reduces only the domain shift, rather than other differences in data distribution between distinct instances. Experimental results demonstrate that the proposed algorithm achieves a significant improvement over state-of-the-art (SOTA) algorithms on real datasets captured under adverse weather conditions.
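To illustrate the instance-level adaptation described in the abstract, the following is a minimal PyTorch sketch of a symmetric InfoNCE-style contrastive loss in which the same object seen in the source and auxiliary domains forms a positive pair and all other objects in the batch act as negatives. The function name `instance_contrastive_loss`, the temperature `tau`, and the assumption that source and auxiliary RoI features are already aligned row-by-row are illustrative choices, not the authors' exact formulation.

```python
import torch
import torch.nn.functional as F


def instance_contrastive_loss(feats_src, feats_aux, tau=0.1):
    """Generic instance-level contrastive (InfoNCE) sketch.

    feats_src: (N, D) RoI features of N objects from a source image.
    feats_aux: (N, D) features of the same N objects taken from the
               auxiliary-domain (style-translated) version of that image,
               so row i of both tensors describes the same ground-truth box.
    Matching rows are positives; every other object in the batch is a negative.
    """
    z_src = F.normalize(feats_src, dim=1)
    z_aux = F.normalize(feats_aux, dim=1)

    # Cosine-similarity logits between every source/auxiliary instance pair.
    logits = z_src @ z_aux.t() / tau          # (N, N)
    targets = torch.arange(z_src.size(0), device=logits.device)

    # Symmetric InfoNCE: pull matching instances together across domains,
    # push different instances apart.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))


if __name__ == "__main__":
    # Toy usage with random tensors standing in for RoI-pooled detector features.
    src = torch.randn(8, 256)
    aux = torch.randn(8, 256)
    print(instance_contrastive_loss(src, aux).item())
```

Because the auxiliary domain keeps the source annotations while adopting the target style, positive pairs in this sketch can be formed from ground-truth boxes alone, avoiding the pseudo-label noise that the paper identifies in instance-level adaptation.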