Faster convergence and Uncorrelated gradients in Self-Supervised Online Continual Learning

Koyo Imai, Naoto Hayashi, Tsubasa Hirakawa, Takayoshi Yamashita, Hironobu Fujiyoshi; Proceedings of the Asian Conference on Computer Vision (ACCV), 2024, pp. 436-453

Abstract


Self-Supervised Online Continual Learning (SSOCL) focuses on continuously training neural networks from data streams. This presents a more realistic Self-Supervised Learning (SSL) problem setting, in which the goal is to learn directly from real-world data streams. Conventional SSL, however, requires multiple offline training sessions on fixed IID datasets to acquire appropriate feature representations. In contrast, SSOCL involves learning from a non-IID data stream in which the data distribution changes over time and new data arrives sequentially. The resulting challenges are insufficient learning under changing data distributions and the acquisition of inferior feature representations from non-IID data streams. In this study, we propose a method to address these challenges in SSOCL. The proposed method consists of a Multi-Crop Contrastive Loss, a TCR Loss, and data selection based on cosine similarity to representative features. The Multi-Crop Contrastive Loss and TCR Loss enable quick adaptation to changes in the data distribution, while cosine similarity-based data selection ensures that diverse data is stored in the replay buffer, facilitating learning from non-IID data streams. The proposed method achieves superior accuracy compared with existing methods in evaluations on CIFAR-10, CIFAR-100, ImageNet-100, and CORe50.
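
The abstract does not include implementation details, but the following is a minimal sketch of what cosine similarity-based selection of diverse samples for a replay buffer could look like. The function names, the similarity threshold, and the use of already-stored buffer features as the "representative features" are assumptions made for illustration, not the authors' exact procedure.

    # Hypothetical sketch: diversity-aware replay-buffer selection by cosine
    # similarity. All names and the thresholding rule are assumptions; the
    # paper's exact procedure may differ.
    import numpy as np


    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> np.ndarray:
        """Cosine similarity between each row of `a` and each row of `b`."""
        a_norm = a / (np.linalg.norm(a, axis=1, keepdims=True) + 1e-8)
        b_norm = b / (np.linalg.norm(b, axis=1, keepdims=True) + 1e-8)
        return a_norm @ b_norm.T


    def select_for_buffer(candidate_feats: np.ndarray,
                          buffer_feats: np.ndarray,
                          max_similarity: float = 0.9) -> np.ndarray:
        """Return indices of candidates whose features are sufficiently
        dissimilar from the representative (here: already-stored) features,
        so that the replay buffer stays diverse."""
        if buffer_feats.size == 0:
            return np.arange(len(candidate_feats))
        sims = cosine_similarity(candidate_feats, buffer_feats)  # (n_cand, n_buf)
        # Keep a candidate only if it is not too similar to anything stored.
        keep = sims.max(axis=1) < max_similarity
        return np.nonzero(keep)[0]


    # Usage: filter a mini-batch of stream features before storing them.
    rng = np.random.default_rng(0)
    stream_feats = rng.normal(size=(32, 128))   # features of incoming samples
    stored_feats = rng.normal(size=(200, 128))  # features already in the buffer
    idx = select_for_buffer(stream_feats, stored_feats)
    print(f"storing {len(idx)} of {len(stream_feats)} incoming samples")

In this sketch, an incoming sample is stored only if its feature is not too similar to anything already in the buffer, which is one simple way to keep the stored data diverse under a non-IID stream.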

Related Material


[pdf] [supp]
[bibtex]
@InProceedings{Imai_2024_ACCV,
    author    = {Imai, Koyo and Hayashi, Naoto and Hirakawa, Tsubasa and Yamashita, Takayoshi and Fujiyoshi, Hironobu},
    title     = {Faster convergence and Uncorrelated gradients in Self-Supervised Online Continual Learning},
    booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
    month     = {December},
    year      = {2024},
    pages     = {436-453}
}