ID-like Prompt Learning for Few-Shot Out-of-Distribution Detection
Abstract
Out-of-distribution (OOD) detection methods often exploit auxiliary outliers to train a model to identify OOD samples, in particular by discovering challenging outliers from an auxiliary outlier dataset to improve OOD detection. However, they may still face limitations in effectively distinguishing the most challenging OOD samples, those that closely resemble in-distribution (ID) data, i.e., ID-like samples. To this end, we propose a novel OOD detection framework that uses CLIP to discover ID-like outliers from the vicinity space of the ID samples, thus helping to identify these most challenging OOD samples. We then propose a prompt learning framework that utilizes the identified ID-like outliers to further leverage the capabilities of CLIP for OOD detection. Benefiting from the powerful CLIP, we only need a small number of ID samples to learn the prompts of the model, without exposing any other auxiliary outlier dataset. By focusing on the most challenging ID-like OOD samples and elegantly exploiting the capabilities of CLIP, our method achieves superior few-shot learning performance on various real-world image datasets (e.g., in 4-shot OOD detection on the ImageNet-1k dataset, our method reduces the average FPR95 by 12.16% and improves the average AUROC by 2.76% compared to state-of-the-art methods).
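The sketch below is not the authors' released code; it is a minimal illustration of the kind of CLIP-based OOD scoring the abstract describes, assuming learned text embeddings for the ID classes and for the ID-like (outlier) prompts are already available. The function name `ood_score` and the arguments `id_text_features` and `ood_text_features` are hypothetical.

```python
# Minimal sketch (assumptions: OpenAI CLIP package and PyTorch; precomputed
# text embeddings for learned ID prompts and learned ID-like outlier prompts).
import torch
import clip
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/16", device=device)


@torch.no_grad()
def ood_score(image: Image.Image,
              id_text_features: torch.Tensor,    # (num_id_classes, d), L2-normalized
              ood_text_features: torch.Tensor,   # (num_outlier_prompts, d), L2-normalized
              temperature: float = 0.01) -> float:
    """Return an ID-ness score; lower values suggest the image is OOD."""
    image_features = model.encode_image(preprocess(image).unsqueeze(0).to(device))
    image_features = image_features / image_features.norm(dim=-1, keepdim=True)

    # Cosine similarities against both ID and ID-like prompt embeddings.
    all_text = torch.cat([id_text_features, ood_text_features], dim=0)
    sims = (image_features @ all_text.T) / temperature

    # Softmax over all prompts; the maximum probability assigned to an ID
    # class serves as the ID-ness score, so mass captured by the ID-like
    # prompts pushes ambiguous samples toward an OOD decision.
    probs = sims.softmax(dim=-1)
    id_probs = probs[:, : id_text_features.shape[0]]
    return id_probs.max(dim=-1).values.item()
```

A sample would then be flagged as OOD when `ood_score(...)` falls below a threshold chosen on held-out ID data (e.g., the value giving 95% ID recall, as in the FPR95 metric).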
Related Material
[pdf] [supp] [arXiv]
[bibtex]
@InProceedings{Bai_2024_CVPR,
  author    = {Bai, Yichen and Han, Zongbo and Cao, Bing and Jiang, Xiaoheng and Hu, Qinghua and Zhang, Changqing},
  title     = {ID-like Prompt Learning for Few-Shot Out-of-Distribution Detection},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2024},
  pages     = {17480-17489}
}