SOoD: Self-Supervised Out-of-Distribution Detection Under Domain Shift for Multi-Class Colorectal Cancer Tissue Types
The goal of out-of-distribution (OoD) detection is to identify inputs from unseen categories, different from those seen during training, which is an important requirement for the safe deployment of deep neural networks in computational pathology. Moreover, in clinical applications, OoD detection must cope with shifts in the image data distribution. This paper argues that practical OoD detection should handle both semantic shift and data distribution shift simultaneously. We propose a new self-supervised OoD detector for colorectal cancer tissue types based on a clustering scheme. Central to our approach is multi-view consistency learning with a supplementary view based on style augmentation, which mitigates domain shift. The learned representation is then adapted on the target domain by minimizing the predictive entropy of images, separating in-distribution examples from OoD samples. We evaluated our method on two public colorectal tissue-type datasets, where it achieved state-of-the-art OoD detection performance over various self-supervised baselines. The code, data, and models are available at https://github.com/BehzadBozorgtabar/SOoD.
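The predictive-entropy criterion mentioned above can be sketched as follows. This is a minimal, illustrative example of scoring samples by the Shannon entropy of their softmax outputs and flagging high-entropy inputs as OoD; the function names and the fixed threshold are hypothetical and do not reproduce the paper's full training pipeline.

```python
import numpy as np

def predictive_entropy(logits: np.ndarray) -> np.ndarray:
    """Shannon entropy of the softmax distribution for each row of logits."""
    shifted = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
    return -(probs * np.log(probs + 1e-12)).sum(axis=1)

def flag_ood(logits: np.ndarray, threshold: float) -> np.ndarray:
    """Mark samples whose predictive entropy exceeds the threshold as OoD."""
    return predictive_entropy(logits) > threshold

# A confident (peaked) prediction has near-zero entropy; a uniform one
# over 3 classes has entropy ln(3) ~= 1.10, so it is flagged as OoD.
logits = np.array([[10.0, 0.0, 0.0],   # in-distribution-like
                   [1.0, 1.0, 1.0]])   # ambiguous / OoD-like
print(flag_ood(logits, threshold=0.5))  # -> [False  True]
```

In practice the threshold would be chosen on held-out in-distribution data (e.g., at a fixed true-positive rate) rather than set by hand.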