TiDAL: Learning Training Dynamics for Active Learning

Seong Min Kye, Kwanghee Choi, Hyeongmin Byun, Buru Chang; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2023, pp. 22335-22345

Abstract


Active learning (AL) aims to select the most useful data samples from an unlabeled data pool and annotate them to expand the labeled dataset under a limited budget. In particular, uncertainty-based methods choose the most uncertain samples, which are known to be effective in improving model performance. However, the AL literature often overlooks training dynamics (TD), defined as the ever-changing model behavior during optimization via stochastic gradient descent, even though other research areas have empirically shown that TD provides important clues for measuring data uncertainty. In this paper, we first provide theoretical and empirical evidence for the usefulness of utilizing the ever-changing model behavior rather than the fully trained model snapshot. We then propose a novel AL method, Training Dynamics for Active Learning (TiDAL), which efficiently predicts the training dynamics of unlabeled data to estimate their uncertainty. Experimental results show that TiDAL achieves better or comparable performance on both balanced and imbalanced benchmark datasets compared to state-of-the-art AL methods, which estimate data uncertainty using only static information available after model training.
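To make the core idea concrete, below is a minimal sketch of how training dynamics can be summarized into a per-sample uncertainty score for acquisition. This is not the authors' TiDAL module (which learns to predict the TD of unlabeled data rather than logging it directly); it simply assumes per-sample softmax outputs have been recorded at each epoch, and the function names (`td_entropy_uncertainty`, `select_for_labeling`) and the entropy-of-averaged-predictions proxy are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def td_entropy_uncertainty(prob_history):
    """Summarize training dynamics into an uncertainty score.

    prob_history: array of shape (epochs, num_samples, num_classes)
        holding softmax outputs recorded for each sample at the end of
        every training epoch (hypothetical logging, not the paper's API).
    Returns one score per sample: the entropy of the epoch-averaged
    predictive distribution, so samples whose predictions keep shifting
    during training tend to score higher than consistently confident ones.
    """
    mean_probs = prob_history.mean(axis=0)        # (num_samples, num_classes)
    eps = 1e-12                                   # numerical safety for log
    return -(mean_probs * np.log(mean_probs + eps)).sum(axis=1)

def select_for_labeling(prob_history, budget):
    """Pick the `budget` most uncertain samples for annotation."""
    scores = td_entropy_uncertainty(prob_history)
    return np.argsort(scores)[::-1][:budget]

# Toy usage: 5 epochs, 100 samples, 10 classes of synthetic dynamics.
rng = np.random.default_rng(0)
logits = rng.normal(size=(5, 100, 10))
probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)
print(select_for_labeling(probs, budget=8))
```

The key difference in TiDAL, per the abstract, is efficiency: instead of storing and replaying dynamics for the entire unlabeled pool, a module is trained to predict them, so scores like the one above can be estimated without tracking every sample across epochs.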

Related Material


[pdf] [supp] [arXiv]
[bibtex]
@InProceedings{Kye_2023_ICCV,
  author    = {Kye, Seong Min and Choi, Kwanghee and Byun, Hyeongmin and Chang, Buru},
  title     = {TiDAL: Learning Training Dynamics for Active Learning},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  month     = {October},
  year      = {2023},
  pages     = {22335-22345}
}