Measuring Asymmetric Gradient Discrepancy in Parallel Continual Learning

Fan Lyu, Qing Sun, Fanhua Shang, Liang Wan, Wei Feng; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2023, pp. 11411-11420

Abstract


In Parallel Continual Learning (PCL), multiple parallel tasks start and end training at unpredictable times, which leads to training conflicts and catastrophic forgetting. Both issues arise because the gradients from parallel tasks differ in direction and magnitude. In this paper, we therefore formulate PCL as a minimum-distance optimization problem among gradients and propose an explicit Asymmetric Gradient Distance (AGD) to evaluate gradient discrepancy in PCL. AGD accounts for both gradient magnitude ratios and directions, and tolerates updates along a small gradient of inverse direction, which reduces the imbalanced influence of gradients on parallel task training. Moreover, we propose a novel Maximum Discrepancy Optimization (MaxDO) strategy to minimize the maximum discrepancy among multiple gradients. By solving MaxDO with AGD, parallel training reduces the impact of training conflicts and suppresses catastrophic forgetting of finished tasks. Extensive experiments on three image recognition datasets validate the effectiveness of our approach.
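The abstract describes an asymmetric distance that combines gradient direction with a magnitude ratio, plus a MaxDO strategy that targets the largest pairwise discrepancy. The sketch below is a hypothetical illustration of that idea, not the paper's actual AGD formula: the function names, the cosine-plus-ratio combination, and the tolerance mechanism are all assumptions made for exposition.

```python
import numpy as np


def asymmetric_gradient_distance(g_update, g_other, eps=1e-12):
    """Illustrative asymmetric discrepancy between two task gradients.

    Combines a direction term (1 - cosine similarity) with the ratio of
    the other gradient's magnitude to the updating gradient's magnitude.
    The ratio makes the measure asymmetric: a small opposing gradient
    contributes a small penalty (the "tolerance"), while a large opposing
    gradient contributes a large one. Hypothetical sketch only.
    """
    g_update = np.asarray(g_update, dtype=float)
    g_other = np.asarray(g_other, dtype=float)
    n_update = np.linalg.norm(g_update) + eps
    n_other = np.linalg.norm(g_other) + eps
    cos = float(g_update @ g_other) / (n_update * n_other)
    direction = 1.0 - cos            # 0 when aligned, 2 when opposite
    ratio = n_other / n_update       # swaps under argument order -> asymmetry
    return ratio * direction


def max_pairwise_discrepancy(gradients):
    """The quantity a MaxDO-style strategy would minimize: the largest
    asymmetric discrepancy over all ordered pairs of task gradients."""
    worst = 0.0
    for i, g_i in enumerate(gradients):
        for j, g_j in enumerate(gradients):
            if i != j:
                worst = max(worst, asymmetric_gradient_distance(g_i, g_j))
    return worst
```

Under this toy definition, a small gradient pointing against a large one is cheap to update through, while the reverse ordering is expensive, mirroring the tolerance property the abstract attributes to AGD.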

Related Material


[bibtex]
@InProceedings{Lyu_2023_ICCV,
  author    = {Lyu, Fan and Sun, Qing and Shang, Fanhua and Wan, Liang and Feng, Wei},
  title     = {Measuring Asymmetric Gradient Discrepancy in Parallel Continual Learning},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  month     = {October},
  year      = {2023},
  pages     = {11411-11420}
}