Hyperparameter-Free Out-of-Distribution Detection Using Cosine Similarity
Engkarat Techapanurak, Masanori Suganuma, Takayuki Okatani; Proceedings of the Asian Conference on Computer Vision (ACCV), 2020
Abstract
The ability to detect out-of-distribution (OOD) samples is vital to securing the reliability of deep neural networks in real-world applications. Given the nature of OOD samples, detection methods should not rely on hyperparameters that must be tuned for the OOD samples they will encounter. However, most recently proposed methods do not meet this requirement, which compromises their performance in real-world applications. In this paper, we propose a simple, computationally efficient, and hyperparameter-free method that uses cosine similarity. Although recent studies have shown its effectiveness for metric learning, it remains unclear whether cosine similarity also works well for OOD detection and, if so, why. We provide an intuitive explanation of why cosine similarity works better than the standard approaches that use the maximum of the softmax outputs or logits. Our method also differs from the standard design of the output layer in several respects, and these differences are essential for achieving the best performance. Experiments show that our method outperforms existing methods on the evaluation test recently proposed by Shafaei et al., which takes the above issue of hyperparameter dependency into account, and that it achieves at least comparable performance to the state of the art on the conventional test, where all methods but ours are allowed to use explicit OOD samples to determine their hyperparameters.
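The abstract contrasts cosine-similarity scoring with the standard maximum-softmax baseline but does not spell out the scoring rule. The sketch below is a minimal illustration of that contrast, assuming the common formulation in which the OOD score is the maximum cosine similarity between a sample's penultimate-layer feature and the L2-normalized class weight vectors; all function names are illustrative, and random tensors stand in for a trained network's features and weights.

```python
import torch
import torch.nn.functional as F

def ood_score_max_softmax(logits: torch.Tensor) -> torch.Tensor:
    # Standard baseline: maximum softmax probability per sample.
    # A low maximum suggests the input is out-of-distribution.
    return F.softmax(logits, dim=1).max(dim=1).values

def ood_score_cosine(features: torch.Tensor, class_weights: torch.Tensor) -> torch.Tensor:
    # Cosine-similarity scoring (hypothetical sketch): L2-normalize the
    # feature vectors and the class weight vectors, compute all pairwise
    # cosine similarities, and take the maximum over classes per sample.
    f = F.normalize(features, dim=1)       # (batch, dim)
    w = F.normalize(class_weights, dim=1)  # (num_classes, dim)
    cosines = f @ w.t()                    # (batch, num_classes)
    return cosines.max(dim=1).values       # low max cosine => likely OOD

# Toy usage: random stand-ins for a trained classifier's penultimate
# features (batch of 8, 512-dim) and final-layer weights (10 classes).
features = torch.randn(8, 512)
class_weights = torch.randn(10, 512)
scores = ood_score_cosine(features, class_weights)
print(scores)  # one score per input; threshold it to flag OOD samples
```

Note that, unlike the softmax baseline, the cosine score needs no temperature or perturbation hyperparameters tuned against held-out OOD data, which is the property the abstract emphasizes.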
Related Material

[pdf] [supp] [code]

[bibtex]
@InProceedings{Techapanurak_2020_ACCV,
  author    = {Techapanurak, Engkarat and Suganuma, Masanori and Okatani, Takayuki},
  title     = {Hyperparameter-Free Out-of-Distribution Detection Using Cosine Similarity},
  booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
  month     = {November},
  year      = {2020}
}