Lark: Low-Rank Updates After Knowledge Localization for Few-shot Class-Incremental Learning

Jinxin Shi, Jiabao Zhao, Yifan Yang, Xingjiao Wu, Jiawen Li, Liang He; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2025, pp. 3607-3617

Abstract


For Few-Shot Class-Incremental Learning (FSCIL), direct fine-tuning causes significant parameter shifts, resulting in catastrophic forgetting and increased resource consumption. Meanwhile, freezing the pre-trained backbone exacerbates the inconsistency between the backbone and the evolving classifier. To overcome these challenges, we introduce a method called Low-Rank updates after knowledge localization (Lark). In the knowledge localization phase, the Fisher Information Matrix is calculated to measure the sensitivity of parameters in different layers to previously acquired knowledge. This phase ultimately identifies the parameters within the model that are most suitable for learning new knowledge. In the subsequent incremental editing phase, a low-rank incremental update strategy is applied. This strategy constrains the model parameter updates to a rank-one matrix structure, minimizing alterations to the original parameters and thereby enabling the model to integrate new knowledge while retaining as much of the previous knowledge as possible. Extensive experimental results demonstrate that the Lark method achieves significant performance improvements on the CIFAR100, mini-ImageNet, and CUB200 datasets, surpassing current state-of-the-art methods.
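The two phases described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the toy layers, the stand-in per-sample gradients, and the selection rule (pick the layer with the smallest diagonal-Fisher score, i.e. the one least sensitive to old knowledge) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: two weight matrices standing in for two layers.
layers = {"layer1": rng.normal(size=(4, 4)), "layer2": rng.normal(size=(4, 4))}

def fisher_score(grads):
    # Diagonal Fisher approximation: expected squared gradient,
    # summed over the layer's parameters (stand-in for the paper's
    # sensitivity measure on previously acquired knowledge).
    return np.mean([np.sum(g ** 2) for g in grads])

# Hypothetical per-sample gradients from old-task data; layer2's are
# scaled down so it looks less sensitive to old knowledge.
grads = {name: [rng.normal(size=layers[name].shape) * s for _ in range(8)]
         for name, s in zip(layers, (1.0, 0.1))}

# Phase 1: knowledge localization — choose the layer whose parameters
# are LEAST sensitive to old knowledge (smallest Fisher score).
scores = {name: fisher_score(grads[name]) for name in layers}
target = min(scores, key=scores.get)

# Phase 2: incremental editing — constrain the update to rank one,
# Delta W = u v^T, so the bulk of the original weights is preserved.
u = rng.normal(size=(4, 1))
v = rng.normal(size=(4, 1))
delta = u @ v.T
layers[target] = layers[target] + delta
```

In this toy setup the rank-one constraint means the update touches a one-dimensional subspace of the weight matrix, which is the mechanism the abstract credits for limiting forgetting.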

Related Material


[bibtex]
@InProceedings{Shi_2025_ICCV,
    author    = {Shi, Jinxin and Zhao, Jiabao and Yang, Yifan and Wu, Xingjiao and Li, Jiawen and He, Liang},
    title     = {Lark: Low-Rank Updates After Knowledge Localization for Few-shot Class-Incremental Learning},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2025},
    pages     = {3607-3617}
}