Deep Transfer Learning for Multiple Class Novelty Detection

Pramuditha Perera, Vishal M. Patel; The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2019, pp. 11544-11552

Abstract


We propose a transfer learning-based solution to the problem of multiple class novelty detection. In particular, we propose an end-to-end deep learning-based approach in which we investigate how the knowledge contained in an external, out-of-distribution dataset can be used to improve the performance of a deep network for visual novelty detection. Our solution differs from standard deep classification networks in two respects. First, we train the network with a novel loss function, the membership loss, in addition to the classical cross-entropy loss. Second, we use the knowledge from the external dataset more effectively to learn globally negative filters, i.e., filters that respond to generic objects outside the known class set. We show that thresholding the maximal activation of the proposed network effectively identifies novel objects. Extensive experiments on four publicly available novelty detection datasets show that the proposed method achieves significant improvements over state-of-the-art methods.
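To make the two key ideas concrete, the sketch below illustrates (a) a membership-style loss, here written as a per-class sigmoid binary cross-entropy that pushes the true class's activation toward 1 and all others toward 0, and (b) novelty detection by thresholding the maximal (softmax) activation. This is a minimal illustration under assumptions, not the paper's exact formulation: the precise membership loss in the paper, its weighting against cross-entropy, and the threshold value (0.5 here) are all simplified or hypothetical.

```python
import numpy as np

def softmax(logits):
    # numerically stable softmax over class activations
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

def membership_loss(logits, label):
    """Per-class sigmoid BCE: true class activation -> 1, others -> 0.

    A plausible membership-style loss; the paper's exact form may differ.
    """
    probs = 1.0 / (1.0 + np.exp(-logits))          # independent per-class sigmoids
    targets = np.zeros_like(probs)
    targets[label] = 1.0
    eps = 1e-12                                     # avoid log(0)
    return -np.mean(targets * np.log(probs + eps)
                    + (1.0 - targets) * np.log(1.0 - probs + eps))

def is_novel(logits, threshold=0.5):
    # flag the input as novel when the maximal activation is weak;
    # the threshold value is a hypothetical choice for illustration
    return float(softmax(logits).max()) < threshold

# A confident in-distribution prediction vs. a flat, uncertain one
confident = np.array([6.0, -4.0, -4.0])
uncertain = np.array([0.4, 0.3, 0.3])
print(membership_loss(confident, 0))  # small: activations match targets
print(membership_loss(uncertain, 0))  # larger: activations far from 0/1
print(is_novel(confident), is_novel(uncertain))
```

In practice the membership loss would be combined with cross-entropy during training, and the threshold tuned on held-out data; globally negative filters (learned from the external dataset) further suppress activations for unknown objects, sharpening this max-activation test.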

Related Material


[bibtex]
@InProceedings{Perera_2019_CVPR,
  author    = {Perera, Pramuditha and Patel, Vishal M.},
  title     = {Deep Transfer Learning for Multiple Class Novelty Detection},
  booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2019}
}