PLReMix: Combating Noisy Labels with Pseudo-Label Relaxed Contrastive Representation Learning
Abstract
Recently, the use of Contrastive Representation Learning (CRL) as a pre-training technique has improved the performance of learning with noisy labels (LNL) methods. However, when the CRL loss is instead combined trivially with LNL methods in an end-to-end framework, empirical experiments show severe performance degeneration. We verify through experiments that this issue is caused by optimization conflicts between the losses and propose an end-to-end PLReMix framework that introduces a Pseudo-Label Relaxed (PLR) contrastive loss. The PLR loss constructs a reliable negative set for each sample by filtering out its inappropriate negative pairs, alleviating the conflicts that arise when these losses are combined trivially. The proposed PLR loss is pluggable, and we have integrated it into other LNL methods, observing improved performance. Furthermore, a two-dimensional Gaussian Mixture Model is adopted to distinguish clean from noisy samples by leveraging semantic information and model outputs simultaneously. Experiments on multiple benchmark datasets demonstrate the effectiveness of the proposed method. Code will be available.
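To make the relaxation idea concrete, below is a minimal PyTorch sketch of a pseudo-label relaxed contrastive loss. It assumes a SimCLR-style setup with two augmented views and drops from the negative set any pair whose top-k predicted classes overlap with the anchor's; the function name, the top-k overlap rule, and the hyperparameters are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn.functional as F

def plr_loss(z1, z2, logits, k=2, temperature=0.25):
    """Hypothetical PLR-style contrastive loss.

    z1, z2: (N, D) embeddings of two augmented views of the same batch.
    logits: (N, C) classifier outputs used to derive candidate pseudo-labels.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    n = z1.size(0)
    # Candidate pseudo-label set of each sample: its top-k predicted classes.
    topk = logits.topk(k, dim=1).indices                      # (N, k)
    # overlap[i, j] is True when samples i and j share any top-k class,
    # i.e. the pair is an unreliable negative and should be filtered out.
    overlap = (topk[:, None, :, None] == topk[None, :, None, :]).any(-1).any(-1)
    eye = torch.eye(n, dtype=torch.bool, device=z1.device)
    neg_mask = ~(overlap | eye)                               # reliable negatives only
    pos = (z1 * z2).sum(dim=1, keepdim=True) / temperature    # (N, 1) positive-pair logits
    neg = (z1 @ z2.t()) / temperature                         # (N, N) cross-view similarities
    neg = neg.masked_fill(~neg_mask, float('-inf'))           # exclude conflicting negatives
    out = torch.cat([pos, neg], dim=1)                        # positive sits at index 0
    target = torch.zeros(n, dtype=torch.long, device=z1.device)
    return F.cross_entropy(out, target)                       # InfoNCE over the relaxed set
```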
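Similarly, a hedged sketch of the two-component, two-dimensional GMM sample selection. It assumes the two per-sample signals are a classification loss and a semantic agreement score, with the lower-loss component treated as clean; the exact signals used by the paper may differ.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def split_clean_noisy(losses, semantic_scores, threshold=0.5):
    """Fit a 2-component 2D GMM and return a clean/noisy split.

    losses, semantic_scores: (N,) arrays of per-sample signals (assumed here).
    """
    X = np.stack([losses, semantic_scores], axis=1)       # (N, 2) joint features
    gmm = GaussianMixture(n_components=2, covariance_type="full").fit(X)
    clean_comp = np.argmin(gmm.means_[:, 0])              # component with lower mean loss
    p_clean = gmm.predict_proba(X)[:, clean_comp]         # posterior of being clean
    return p_clean > threshold, p_clean
```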
Related Material

[pdf] [supp] [arXiv]

[bibtex]
@InProceedings{Liu_2025_WACV,
    author    = {Liu, Xiaoyu and Zhou, Beitong and Yue, Zuogong and Cheng, Cheng},
    title     = {PLReMix: Combating Noisy Labels with Pseudo-Label Relaxed Contrastive Representation Learning},
    booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV)},
    month     = {February},
    year      = {2025},
    pages     = {6517-6527}
}