Simple and Effective Out-of-Distribution Detection via Cosine-based Softmax Loss

SoonCheol Noh, DongEon Jeong, Jee-Hyong Lee; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2023, pp. 16560-16569

Abstract


Deep learning models need to detect out-of-distribution (OOD) data at inference time because they are trained to estimate the training distribution and are only reliable on data sampled from that distribution. Many OOD detection methods have been proposed, but they have limitations such as requiring additional data, input preprocessing, or high computational cost. Moreover, most methods have user-set hyperparameters that significantly affect the detection rate. We propose a simple and effective OOD detection method that combines the feature norm and the Mahalanobis distance obtained from classification models trained with the cosine-based softmax loss. Our method is practical: it uses no additional data for training, it is about three times faster at inference than methods that rely on input preprocessing, and it is easy to apply because it has no hyperparameters for OOD detection. Our experiments confirm that our method is superior or at least comparable to state-of-the-art OOD detection methods.
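To make the two ingredients named in the abstract concrete, below is a minimal sketch, assuming a PyTorch setup: a cosine-based softmax head that produces scaled cosine-similarity logits, and an OOD score built from the feature norm and a class-conditional Mahalanobis distance. The scale factor, the tied-covariance estimate, and the way the two signals are combined here are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of (1) a cosine-based softmax head and (2) an OOD score combining
# the feature norm with a Mahalanobis distance in feature space.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CosineSoftmaxHead(nn.Module):
    """Classification head producing scaled cosine-similarity logits."""

    def __init__(self, feat_dim: int, num_classes: int, scale: float = 16.0):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.scale = scale  # assumed fixed here; it could also be learned

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between L2-normalized features and class weights.
        cos = F.normalize(features, dim=1) @ F.normalize(self.weight, dim=1).t()
        return self.scale * cos  # fed into the standard cross-entropy loss


def fit_gaussian_stats(features: torch.Tensor, labels: torch.Tensor, num_classes: int):
    """Class-wise means and a shared (tied) covariance from training features."""
    means = torch.stack([features[labels == c].mean(dim=0) for c in range(num_classes)])
    centered = features - means[labels]
    cov = centered.t() @ centered / features.shape[0]
    precision = torch.linalg.pinv(cov)
    return means, precision


def ood_score(features: torch.Tensor, means: torch.Tensor, precision: torch.Tensor) -> torch.Tensor:
    """Higher score = more likely in-distribution (illustrative combination)."""
    diff = features.unsqueeze(1) - means.unsqueeze(0)             # (N, C, D)
    maha = torch.einsum("ncd,de,nce->nc", diff, precision, diff)  # squared distances
    min_maha = maha.min(dim=1).values                             # closest class
    feat_norm = features.norm(dim=1)
    return feat_norm - min_maha  # assumed way of combining the two signals
```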

Related Material


@InProceedings{Noh_2023_ICCV,
  author    = {Noh, SoonCheol and Jeong, DongEon and Lee, Jee-Hyong},
  title     = {Simple and Effective Out-of-Distribution Detection via Cosine-based Softmax Loss},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  month     = {October},
  year      = {2023},
  pages     = {16560-16569}
}