@InProceedings{Udayangani_2025_WACV,
  author    = {Udayangani, Nimeshika and Dolatabadi, Hadi Mohaghegh and Erfani, Sarah and Leckie, Christopher},
  title     = {Exploiting Inter-Sample Information for Long-Tailed Out-of-Distribution Detection},
  booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV)},
  month     = {February},
  year      = {2025},
  pages     = {8535-8544}
}
Exploiting Inter-Sample Information for Long-Tailed Out-of-Distribution Detection
Abstract
Detecting out-of-distribution (OOD) data is essential for the safe deployment of deep neural networks (DNNs). This problem becomes particularly challenging in the presence of long-tailed in-distribution (ID) datasets, often leading to high false positive rates (FPR) and low tail-class ID classification accuracy. In this paper, we demonstrate that exploiting inter-sample relationships using a graph-based representation can significantly improve OOD detection in long-tailed recognition of vision datasets. To this end, we use the feature space of a pre-trained model to initialize our graph structure. We account for the differences between the activation-layer distributions of the pre-training and training data, and actively introduce Gaussianization to alleviate any deviations from a standard normal distribution in the activation layers of the pre-trained model. We then refine this initial graph representation using graph convolutional networks (GCNs) to arrive at a feature space suitable for long-tailed OOD detection. This allows us to address the inferior performance observed on ID tail classes in existing OOD detection methods. Experiments on three benchmarks, CIFAR10-LT, CIFAR100-LT, and ImageNet-LT, demonstrate that our method outperforms state-of-the-art approaches by a large margin in terms of FPR and tail-class ID classification accuracy.
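The Gaussianization step mentioned above can be illustrated with a rank-based inverse normal transform, a standard way to push a one-dimensional activation distribution toward a standard normal. This is a minimal sketch of that general technique, not the paper's exact procedure; the function name `gaussianize` and the per-dimension treatment are illustrative assumptions.

```python
from statistics import NormalDist

def gaussianize(values):
    """Rank-based inverse normal transform (a generic Gaussianization
    sketch; the paper's actual method may differ).

    Each value is replaced by the standard-normal quantile of its
    empirical rank, so the output is approximately N(0, 1) regardless
    of the input distribution.
    """
    n = len(values)
    nd = NormalDist()  # standard normal, mean 0, sigma 1
    # Sort indices by value to obtain each element's rank.
    order = sorted(range(n), key=lambda i: values[i])
    out = [0.0] * n
    for rank, idx in enumerate(order):
        q = (rank + 0.5) / n       # offset keeps quantiles strictly in (0, 1)
        out[idx] = nd.inv_cdf(q)   # map rank quantile through the inverse CDF
    return out
```

Applied per feature dimension to the pre-trained model's activations, this would remove heavy tails or skew before the graph is built, so that downstream scores that implicitly assume normality behave consistently across pre-training and training data.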