Semi-Supervised Deep Learning with Memory

Yanbei Chen, Xiatian Zhu, Shaogang Gong; Proceedings of the European Conference on Computer Vision (ECCV), 2018, pp. 268-283

Abstract


We consider the semi-supervised multi-class classification problem of learning from sparse labelled and abundant unlabelled training data. To address this problem, existing semi-supervised deep learning methods often rely on the up-to-date “network-in-training” to formulate the semi-supervised learning objective. This ignores both the discriminative feature representations and the model inference uncertainty revealed by the network in preceding learning iterations, collectively referred to as the memory of model learning. In this work, we propose a novel Memory-Assisted Deep Neural Network (MA-DNN) capable of exploiting the memory of model learning to enable semi-supervised learning. Specifically, we introduce a memory mechanism into the network training process, formulated as an assimilation-accommodation interaction between the network and an external memory module. Experiments demonstrate the advantages of the proposed MA-DNN model over state-of-the-art semi-supervised deep learning methods on three image classification benchmark datasets: SVHN, CIFAR10, and CIFAR100.
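
The abstract describes the memory mechanism only at a high level, so the following is a minimal sketch of the general idea rather than the authors' implementation: an external memory keeps per-class feature prototypes and accumulated soft predictions, the network writes its latest labelled-batch statistics into the memory (accommodation), and unlabelled samples are regularised towards memory-derived soft targets (assimilation). The class ClassMemory, the function semi_supervised_step, the assumption that the backbone returns (features, logits), and all hyperparameters are illustrative choices, not taken from the paper.

# Minimal PyTorch-style sketch of a memory-assisted semi-supervised step.
# All names and hyperparameters here are assumptions for illustration.
import torch
import torch.nn.functional as F


class ClassMemory:
    """Per-class memory of feature prototypes and averaged soft predictions."""

    def __init__(self, num_classes, feat_dim, momentum=0.9):
        self.momentum = momentum
        self.prototypes = torch.zeros(num_classes, feat_dim)          # key slots
        self.soft_labels = torch.full((num_classes, num_classes),
                                      1.0 / num_classes)              # value slots

    @torch.no_grad()
    def accommodate(self, feats, probs, labels):
        """Write labelled-batch features and predictions into the memory slots."""
        for c in labels.unique():
            mask = labels == c
            self.prototypes[c] = (self.momentum * self.prototypes[c]
                                  + (1 - self.momentum) * feats[mask].mean(0))
            self.soft_labels[c] = (self.momentum * self.soft_labels[c]
                                   + (1 - self.momentum) * probs[mask].mean(0))

    def assimilate(self, feats, temperature=1.0):
        """Read the memory: soft targets obtained by similarity-weighted addressing
        of the class prototypes (memory tensors assumed on the same device as feats)."""
        sims = F.normalize(feats, dim=1) @ F.normalize(self.prototypes, dim=1).t()
        weights = F.softmax(sims / temperature, dim=1)                 # addressing
        return weights @ self.soft_labels                              # soft targets


def semi_supervised_step(model, memory, x_l, y_l, x_u, optimizer, lam=1.0):
    """One step: supervised cross-entropy on labelled data plus a KL term
    pulling unlabelled predictions towards memory-derived soft targets."""
    optimizer.zero_grad()
    feats_l, logits_l = model(x_l)   # backbone assumed to return (features, logits)
    feats_u, logits_u = model(x_u)

    sup_loss = F.cross_entropy(logits_l, y_l)
    with torch.no_grad():
        targets_u = memory.assimilate(feats_u)
    unsup_loss = F.kl_div(F.log_softmax(logits_u, dim=1), targets_u,
                          reduction="batchmean")

    loss = sup_loss + lam * unsup_loss
    loss.backward()
    optimizer.step()

    # Accommodation: fold the latest labelled-batch statistics into the memory.
    memory.accommodate(feats_l.detach(), F.softmax(logits_l, dim=1).detach(), y_l)
    return loss.item()

In such a setup, a memory like ClassMemory(num_classes=10, feat_dim=128) would be updated once per iteration alongside the optimiser, so the unsupervised targets reflect statistics accumulated over preceding iterations rather than only the current network state.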

Related Material


[bibtex]
@InProceedings{Chen_2018_ECCV,
author = {Chen, Yanbei and Zhu, Xiatian and Gong, Shaogang},
title = {Semi-Supervised Deep Learning with Memory},
booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
month = {September},
year = {2018}
}