Meta Module Generation for Fast Few-Shot Incremental Learning

Shudong Xie, Yiqun Li, Dongyun Lin, Tin Lay Nwe, Sheng Dong; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2019

Abstract


There are two challenging problems in applying standard Deep Neural Networks (DNNs) to incremental learning from a few examples: (i) DNNs do not perform well when little training data is available; (ii) DNNs suffer from catastrophic forgetting when used for incremental class learning. To address both problems simultaneously, we propose Meta Module Generation (MetaMG), a meta-learning method that enables a module generator to rapidly generate a category module from a few examples, allowing a scalable classification network to recognize a new category. Old categories are not forgotten after new categories are added. Comprehensive experiments conducted on four datasets show that our method is promising for fast incremental learning in the few-shot setting. Further experiments on the miniImageNet dataset show that, even though it is not specially designed for the N-way K-shot learning problem, MetaMG can still perform relatively well, especially in the 20-way K-shot setting.
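The abstract does not give implementation details, but the overall idea it describes (a generator that maps a few examples of a new category to a per-category module, which is then appended to a growing classifier without altering existing modules) can be illustrated with a minimal PyTorch-style sketch. All class names, layer choices, and shapes below are assumptions for illustration, not the authors' architecture.

```python
import torch
import torch.nn as nn

class ModuleGenerator(nn.Module):
    """Maps the averaged features of a few support examples to the
    parameters of a new per-category module (illustrative sketch only)."""
    def __init__(self, feat_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, feat_dim), nn.ReLU(),
            nn.Linear(feat_dim, feat_dim),
        )

    def forward(self, support_feats: torch.Tensor) -> torch.Tensor:
        # support_feats: (K, feat_dim) features of K examples of one new class
        return self.net(support_feats.mean(dim=0))  # (feat_dim,) category weights


class ScalableClassifier(nn.Module):
    """Keeps one generated weight vector per category; adding a category
    never modifies the old vectors, so previously learned classes persist."""
    def __init__(self, feat_dim: int):
        super().__init__()
        self.category_weights = nn.ParameterList()

    def add_category(self, weight: torch.Tensor) -> None:
        # Freeze the generated module so later additions leave it untouched.
        self.category_weights.append(
            nn.Parameter(weight.detach(), requires_grad=False))

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, feat_dim) -> logits over all categories added so far
        w = torch.stack(list(self.category_weights), dim=0)  # (C, feat_dim)
        return feats @ w.t()


# Usage sketch: register one new class from K support examples,
# then classify query features over all classes added so far.
feat_dim, K = 64, 5
generator = ModuleGenerator(feat_dim)
classifier = ScalableClassifier(feat_dim)
support_feats = torch.randn(K, feat_dim)           # e.g. from a frozen backbone
classifier.add_category(generator(support_feats))  # new class usable immediately
logits = classifier(torch.randn(8, feat_dim))      # (8, num_classes_so_far)
```

In this sketch, meta-training would optimize the generator so that the modules it produces classify well from only K examples; the incremental property comes from the fact that old category modules are never updated when new ones are appended.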

Related Material


[bibtex]
@InProceedings{Xie_2019_ICCV,
author = {Xie, Shudong and Li, Yiqun and Lin, Dongyun and Lay Nwe, Tin and Dong, Sheng},
title = {Meta Module Generation for Fast Few-Shot Incremental Learning},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
month = {Oct},
year = {2019}
}