On Generalizing Beyond Domains in Cross-Domain Continual Learning

Christian Simon, Masoud Faraki, Yi-Hsuan Tsai, Xiang Yu, Samuel Schulter, Yumin Suh, Mehrtash Harandi, Manmohan Chandraker; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022, pp. 9265-9274

Abstract


In the real world, humans have the ability to accumulate new knowledge under any conditions. Deep learning, however, suffers from so-called catastrophic forgetting of previously observed knowledge after learning a new task. Many recent methods focus on preventing catastrophic forgetting under the typical assumption that the train and test data follow a similar distribution. In this work, we consider the more realistic scenario of continual learning under domain shifts, where the model must generalize its inference to an unseen domain. To this end, we propose to make use of sample correlations of the learning tasks in the classifiers, where the subsequent optimization is performed over similarity measures obtained in a fashion similar to the Mahalanobis distance computation. In addition, we propose an approach based on the exponential moving average of the parameters for better knowledge distillation, allowing further adaptation to the old model. We demonstrate in our experiments that, to a great extent, past continual learning algorithms fail to handle forgetting under multiple distributions, while our proposed approach identifies the task under domain shift and in some cases boosts performance by up to 10% on challenging datasets, e.g., DomainNet and OfficeHome.
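The abstract mentions two mechanisms: similarity measures computed in a Mahalanobis-like fashion, and an exponential moving average (EMA) of model parameters used for knowledge distillation. The sketch below illustrates both ideas in isolation, under assumptions of our own (per-class feature statistics and a simple parameter dictionary); it is not the paper's implementation, whose details are in the full text.

```python
import numpy as np

def mahalanobis_similarity(x, mean, cov, eps=1e-6):
    """Negated squared Mahalanobis distance, so larger means more similar.

    A small ridge (eps * I) keeps the covariance invertible.
    """
    cov_inv = np.linalg.inv(cov + eps * np.eye(cov.shape[0]))
    diff = x - mean
    return -float(diff @ cov_inv @ diff)

def ema_update(old_params, new_params, decay=0.99):
    """EMA of parameters: a slowly moving copy of the current model,
    usable as a distillation teacher."""
    return {k: decay * old_params[k] + (1 - decay) * new_params[k]
            for k in old_params}

# Hypothetical per-class statistics estimated from task features.
rng = np.random.default_rng(0)
features = rng.normal(size=(100, 4))
mean = features.mean(axis=0)
cov = np.cov(features, rowvar=False)

near = mean.copy()          # a sample at the class center
far = mean + 5.0            # a sample far from the center
assert mahalanobis_similarity(near, mean, cov) > mahalanobis_similarity(far, mean, cov)

# EMA keeps the teacher between its old value and the new parameters.
teacher = ema_update({"w": 0.0}, {"w": 1.0}, decay=0.9)
assert 0.0 < teacher["w"] < 1.0
```

A query would then be matched to the class (or task) whose statistics give the highest similarity, while the EMA copy of the network supplies stable targets for distilling old-task knowledge.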

Related Material


[pdf] [supp] [arXiv]
[bibtex]
@InProceedings{Simon_2022_CVPR,
    author    = {Simon, Christian and Faraki, Masoud and Tsai, Yi-Hsuan and Yu, Xiang and Schulter, Samuel and Suh, Yumin and Harandi, Mehrtash and Chandraker, Manmohan},
    title     = {On Generalizing Beyond Domains in Cross-Domain Continual Learning},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {9265-9274}
}