[pdf] [supp] [bibtex]

@InProceedings{Park_2025_CVPR,
  author    = {Park, Jae Hyeon and Jeon, Joo Hyeon and Lee, Jae Yun and Ahn, Sangyeon and Cha, Min Hee and Kim, Min Geol and Nam, Hyeok and Cho, Sung In},
  title     = {Dynamic Pseudo Labeling via Gradient Cutting for High-Low Entropy Exploration},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2025},
  pages     = {20602-20611}
}
Dynamic Pseudo Labeling via Gradient Cutting for High-Low Entropy Exploration
Abstract
This study addresses the limitations of existing dynamic pseudo-labeling (DPL) techniques, which typically rely on static or dynamic thresholds for confident sample selection. These methods fail to capture the non-linear relationship between task accuracy and model confidence, particularly under overconfidence, which limits the model's opportunity to learn from high-entropy samples that strongly influence its generalization ability. To overcome this, we propose a novel gradient-pass-based DPL technique that incorporates these typically overlooked high-entropy samples. Our approach introduces two classifiers, a low gradient pass (LGP) and a high gradient pass (HGP) classifier, to derive over- and under-confident dynamic thresholds, respectively, which indicate class-wise overconfidence acceleration. By combining the under- and over-confident states from the GP classifiers, we obtain a more adaptive and accurate pseudo-labeling (PL) method. Our main contributions highlight the importance of considering both low- and high-confidence samples in enhancing the model's robustness and generalization for improved PL performance.
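To make the dual-threshold idea concrete, the sketch below shows a generic dynamic pseudo-label selector that maintains per-class upper and lower confidence thresholds and admits both low-entropy (confident) and higher-entropy (exploratory) samples. This is not the paper's LGP/HGP method: the class DualThresholdSelector, the EMA-style threshold update, and all default values are illustrative assumptions only; the paper instead derives its thresholds from gradient-pass classifiers, which are not reproduced here.

# Illustrative sketch only (not the paper's LGP/HGP algorithm): a generic
# dynamic pseudo-label selector with per-class upper ("over-confident") and
# lower ("under-confident") thresholds. The EMA-based threshold update and
# the default values are placeholder assumptions.
import numpy as np


def softmax(logits, axis=-1):
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)


class DualThresholdSelector:
    def __init__(self, num_classes, tau_high=0.95, tau_low=0.60, momentum=0.9):
        self.tau_high = np.full(num_classes, tau_high)  # confident-side threshold
        self.tau_low = np.full(num_classes, tau_low)    # exploratory-side threshold
        self.momentum = momentum

    def _update(self, conf, preds):
        # Drift each class's thresholds toward its running mean confidence
        # (a stand-in for a learned, gradient-based criterion).
        for c in np.unique(preds):
            mean_conf = conf[preds == c].mean()
            self.tau_high[c] = self.momentum * self.tau_high[c] + (1 - self.momentum) * max(mean_conf, 0.90)
            self.tau_low[c] = self.momentum * self.tau_low[c] + (1 - self.momentum) * min(mean_conf, 0.70)

    def select(self, logits):
        probs = softmax(logits)
        preds = probs.argmax(axis=1)
        conf = probs.max(axis=1)
        self._update(conf, preds)
        confident = conf >= self.tau_high[preds]                  # low-entropy picks
        exploratory = (conf >= self.tau_low[preds]) & ~confident  # high-entropy picks
        return preds, confident, exploratory


# Toy usage: 8 unlabeled samples, 3 classes, random logits.
rng = np.random.default_rng(0)
selector = DualThresholdSelector(num_classes=3)
preds, confident, exploratory = selector.select(rng.normal(size=(8, 3)))
print(preds, confident, exploratory)

In practice, the two selected subsets would typically be weighted differently in the unsupervised loss so that high-entropy (exploratory) pseudo-labels contribute to training without dominating it.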