Unsupervised Out-of-Distribution Detection by Maximum Classifier Discrepancy

Qing Yu, Kiyoharu Aizawa; The IEEE International Conference on Computer Vision (ICCV), 2019, pp. 9518-9526

Abstract

Since deep learning models have been deployed in many commercial applications, it is important to detect out-of-distribution (OOD) inputs correctly to maintain model performance, ensure the quality of collected data, and prevent the applications from being used for unintended purposes. In this work, we propose a two-head deep convolutional neural network (CNN) and maximize the discrepancy between its two classifiers to detect OOD inputs. The network consists of one common feature extractor and two classifiers that have different decision boundaries but both classify in-distribution (ID) samples correctly. Unlike previous methods, we also exploit unlabeled data for unsupervised training: we use these data to maximize the discrepancy between the decision boundaries of the two classifiers, which pushes OOD samples outside the manifold of ID samples and enables us to detect OOD samples that lie far from the support of the ID samples. Overall, our approach significantly outperforms other state-of-the-art methods on several OOD detection benchmarks and in two real-world simulations.
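The detection rule described above can be sketched in a few lines: two classifier heads share one feature extractor, and the discrepancy between their class-probability outputs serves as the OOD score (disagreement is low on ID samples, high on OOD samples). This is a minimal toy sketch, not the paper's trained model; the random weight matrices `F`, `C1`, and `C2` are hypothetical stand-ins for the shared feature extractor and the two classifier heads.

```python
import math
import random

def softmax(logits):
    # numerically stable softmax for a single sample
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def discrepancy(p1, p2):
    # L1 distance between the two heads' class-probability outputs;
    # in the paper this quantity is maximized on unlabeled data during
    # training and thresholded at test time as the OOD score
    return sum(abs(a - b) for a, b in zip(p1, p2))

def matvec(x, W):
    # x (len n) times W (n rows, m cols) -> vector of length m
    return [sum(xi * wij for xi, wij in zip(x, col)) for col in zip(*W)]

# toy two-head model: random weights stand in for trained networks
random.seed(0)
DIM_IN, DIM_FEAT, N_CLASSES = 4, 6, 3
F  = [[random.gauss(0, 1) for _ in range(DIM_FEAT)] for _ in range(DIM_IN)]
C1 = [[random.gauss(0, 1) for _ in range(N_CLASSES)] for _ in range(DIM_FEAT)]
C2 = [[random.gauss(0, 1) for _ in range(N_CLASSES)] for _ in range(DIM_FEAT)]

def ood_score(x):
    feat = [math.tanh(v) for v in matvec(x, F)]   # shared features
    p1 = softmax(matvec(feat, C1))                # head 1 prediction
    p2 = softmax(matvec(feat, C2))                # head 2 prediction
    return discrepancy(p1, p2)                    # higher => more OOD-like

x = [0.5, -1.0, 0.3, 2.0]
print(ood_score(x))
```

Note that the L1 discrepancy between two probability distributions is bounded in [0, 2], so a detection threshold can be swept over that fixed range regardless of the number of classes.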

Related Material


@InProceedings{Yu_2019_ICCV,
author = {Yu, Qing and Aizawa, Kiyoharu},
title = {Unsupervised Out-of-Distribution Detection by Maximum Classifier Discrepancy},
booktitle = {The IEEE International Conference on Computer Vision (ICCV)},
month = {October},
year = {2019}
}