Test-Time Linear Out-of-Distribution Detection
Abstract
Out-of-Distribution (OOD) detection aims to address the overconfident predictions of neural networks by triggering an alert when an input sample deviates significantly from the training distribution (in-distribution), indicating that the output may not be reliable. Current OOD detection approaches explore all kinds of cues to identify OOD data, such as finding irregular patterns in the feature space, logit space, gradient space, or the raw image space. Surprisingly, we observe a linear trend between the OOD score produced by current OOD detection algorithms and the network features on several datasets. We conduct a thorough theoretical and empirical investigation to analyze and understand the meaning of such a linear trend in OOD detection. This paper proposes a Robust Test-time Linear method (RTL) to exploit this linear trend as a `free lunch' when a batch of data is available for OOD detection. By using a simple linear regression as a test-time adaptation, we can make more precise OOD predictions. We further propose an online variant of the proposed method, which achieves promising performance and is more practical for real applications. Theoretical analysis is given to prove the effectiveness of our methods. Extensive experiments on several OOD datasets show the efficacy of RTL for OOD detection tasks, significantly improving the results of base OOD detectors. The project will be available at https://github.com/kfan21/RTL.
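To make the test-time linear regression idea concrete, below is a minimal sketch, assuming one plausible reading of the abstract: the base OOD scores of a test batch are regressed on the corresponding network features, and the values implied by the fitted linear trend serve as refined scores. This is an illustration only, not the authors' implementation (see the GitHub repository for that); the function name `rtl_refine_scores`, the use of scikit-learn, the Huber regressor as the "robust" fit, and the choice of fitted values as the refined scores are all assumptions made for the sketch.

```python
# Illustrative sketch only (not the authors' code): refine base OOD scores by
# fitting a linear regression from network features to those scores over a
# test batch, then using the values implied by the fitted linear trend.
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression


def rtl_refine_scores(features: np.ndarray, base_scores: np.ndarray,
                      robust: bool = True) -> np.ndarray:
    """features: (N, D) network features for a test batch.
    base_scores: (N,) OOD scores from any base detector (e.g. MSP or energy).
    Returns (N,) refined scores from the per-batch linear fit (a sketch of the
    test-time linear regression idea; the exact usage may differ in the paper)."""
    reg = HuberRegressor() if robust else LinearRegression()
    reg.fit(features, base_scores)   # estimate the feature-to-score linear trend
    return reg.predict(features)     # scores implied by the fitted linear trend


# Hypothetical usage with stand-in data, just to show the expected shapes.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    feats = rng.normal(size=(256, 128))                   # stand-in batch features
    scores = feats @ rng.normal(size=128) * 0.05 + rng.normal(size=256) * 0.1
    refined = rtl_refine_scores(feats, scores)
    print(refined.shape)                                  # (256,)
```

An online variant, as mentioned in the abstract, would update such a fit incrementally as test batches arrive rather than refitting from scratch; the details are in the paper and repository.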
Related Material
[pdf] [supp] [bibtex]
@InProceedings{Fan_2024_CVPR,
    author    = {Fan, Ke and Liu, Tong and Qiu, Xingyu and Wang, Yikai and Huai, Lian and Shangguan, Zeyu and Gou, Shuang and Liu, Fengjian and Fu, Yuqian and Fu, Yanwei and Jiang, Xingqun},
    title     = {Test-Time Linear Out-of-Distribution Detection},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2024},
    pages     = {23752-23761}
}