Density Map Distillation for Incremental Object Counting

Chenshen Wu, Joost van de Weijer; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2023, pp. 2506-2515

Abstract


In this paper, we investigate the problem of incremental learning for object counting, where a method must learn to count a variety of object classes from a sequence of datasets. A naive approach to incremental object counting would suffer from catastrophic forgetting: a dramatic performance drop on previously learned tasks. To address this, we propose a new exemplar-free functional regularization method, called Density Map Distillation (DMD). During training, we introduce a new counter head for each task and apply a distillation loss to prevent forgetting of previous tasks. As an additional novelty, we introduce a cross-task adaptor that projects the features of the current backbone to the previous backbone. This projector allows the model to learn new features while the backbone retains the features relevant to previous tasks. Finally, we set up experiments on incremental learning for counting new objects. Results confirm that our method greatly reduces catastrophic forgetting and outperforms existing methods.
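The abstract describes combining a counting loss on the current task with a distillation term that keeps the current model's density maps for old tasks close to those of the frozen previous model. The following is a minimal, hypothetical pure-Python sketch of that loss structure, not the paper's implementation; all names (`dmd_loss`, `lam`, the flattened-list density maps) are illustrative assumptions, and the cross-task adaptor and per-task heads are abstracted away:

```python
def mse(a, b):
    """Mean squared error between two flattened density maps."""
    assert len(a) == len(b), "density maps must have the same size"
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)


def dmd_loss(current_map, frozen_prev_map, gt_map, lam=1.0):
    """Illustrative DMD-style objective (hypothetical names):
    a counting loss against the ground-truth density map of the
    current task, plus a distillation term pulling the current
    model's prediction toward the frozen previous model's density
    map, weighted by lam."""
    counting_loss = mse(current_map, gt_map)
    distill_loss = mse(current_map, frozen_prev_map)
    return counting_loss + lam * distill_loss
```

With `lam = 0` this reduces to plain fine-tuning on the new task; increasing `lam` trades new-task accuracy for stability on previous tasks.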

Related Material


[bibtex]
@InProceedings{Wu_2023_CVPR,
  author    = {Wu, Chenshen and van de Weijer, Joost},
  title     = {Density Map Distillation for Incremental Object Counting},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2023},
  pages     = {2506-2515}
}