SK-RD4AD : Skip-Connected Reverse Distillation For Robust One-Class Anomaly Detection

EunJu Park, Taekyung Kim, Minju Kim, Hojun Lee, Gil-Jun Lee; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2025, pp. 3984-3992

Abstract


Anomaly detection plays a critical role in industrial, healthcare, and security applications by enabling early identification of defects. While Reverse Knowledge Distillation (KD) has shown promise for one-class anomaly detection, existing models often suffer from deep feature loss due to excessive compression in the Student network, limiting their ability to detect fine-grained anomalies. We propose SK-RD4AD, a novel framework that introduces non-corresponding skip connections from intermediate Teacher layers to deeper Student layers. This cross-hierarchical feature transfer preserves multi-scale representations, enhancing both semantic alignment and anomaly localization. Extensive experiments on MVTec-AD, VisA, and VAD demonstrate that SK-RD4AD consistently outperforms prior methods. Specifically, it improves AUROC by +3.5% on VAD, boosts AUPRO by +21% on VisA, and achieves +1% gains on MVTec-AD. The model shows particular robustness on challenging cases such as the Transistor category in MVTec-AD and generalizes well across diverse domains. Our results establish SK-RD4AD as a robust and scalable solution for real-world one-class anomaly detection. Code is available at: https://github.com/pej0918/SK-RD4AD
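To make the skip-connection idea concrete, here is a minimal toy sketch of skip-connected reverse distillation in PyTorch: a frozen multi-scale teacher encoder feeds a compressed bottleneck and a student decoder, and an intermediate teacher layer is routed into a deeper, non-corresponding student stage. The three-stage encoder, channel widths, 1x1 bottleneck, and cosine-distance loss are illustrative assumptions for exposition, not the authors' architecture; see the linked repository for the official implementation.

```python
# Illustrative sketch of skip-connected reverse distillation, assuming a generic
# three-stage CNN teacher; layer widths, the 1x1 bottleneck, and the loss below
# are placeholders, not the paper's exact design.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyTeacher(nn.Module):
    """Frozen encoder producing three feature maps at decreasing resolution."""
    def __init__(self):
        super().__init__()
        self.stage1 = nn.Sequential(nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU())
        self.stage2 = nn.Sequential(nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU())
        self.stage3 = nn.Sequential(nn.Conv2d(128, 256, 3, stride=2, padding=1), nn.ReLU())

    def forward(self, x):
        f1 = self.stage1(x)   # shallow, high-resolution features
        f2 = self.stage2(f1)  # intermediate features
        f3 = self.stage3(f2)  # deep, low-resolution features
        return f1, f2, f3


class SkipStudent(nn.Module):
    """Student decoder that reconstructs the teacher features in reverse order.

    The non-corresponding skip routes the *intermediate* teacher features (f2)
    into the *deeper* student stage that reconstructs f1, so detail squeezed out
    by the bottleneck can still reach the reconstruction.
    """
    def __init__(self):
        super().__init__()
        self.bottleneck = nn.Conv2d(256, 256, 1)  # compressed one-class embedding
        self.up3 = nn.Sequential(nn.Conv2d(256, 256, 3, padding=1), nn.ReLU())
        self.up2 = nn.Sequential(nn.Conv2d(256, 128, 3, padding=1), nn.ReLU())
        self.up1 = nn.Sequential(nn.Conv2d(128 + 128, 64, 3, padding=1), nn.ReLU())

    def forward(self, f1, f2, f3):
        s3 = self.up3(self.bottleneck(f3))  # matches f3
        s2 = self.up2(F.interpolate(s3, size=f2.shape[-2:],
                                    mode="bilinear", align_corners=False))  # matches f2
        skip = F.interpolate(f2, size=f1.shape[-2:],
                             mode="bilinear", align_corners=False)  # non-corresponding skip
        s1 = self.up1(torch.cat(
            [F.interpolate(s2, size=f1.shape[-2:], mode="bilinear", align_corners=False),
             skip], dim=1))  # matches f1
        return s1, s2, s3


def distance_maps(teacher_feats, student_feats):
    """Per-pixel cosine distance between teacher and student features; averaged
    for the training loss, upsampled and aggregated for an anomaly map."""
    return [1 - F.cosine_similarity(t, s, dim=1)
            for t, s in zip(teacher_feats, student_feats)]


if __name__ == "__main__":
    teacher, student = ToyTeacher().eval(), SkipStudent()
    x = torch.randn(2, 3, 64, 64)
    with torch.no_grad():
        feats = teacher(x)
    preds = student(*feats)
    loss = sum(d.mean() for d in distance_maps(feats, preds))
    print(float(loss))
```

Routing f2 into the stage that reconstructs f1, rather than into its own level, is what keeps the skip non-corresponding: the student still has to reconstruct each scale instead of copying it, while deeper stages retain access to intermediate detail.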

Related Material


[pdf]
[bibtex]
@InProceedings{Park_2025_CVPR,
    author    = {Park, EunJu and Kim, Taekyung and Kim, Minju and Lee, Hojun and Lee, Gil-Jun},
    title     = {SK-RD4AD : Skip-Connected Reverse Distillation For Robust One-Class Anomaly Detection},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2025},
    pages     = {3984-3992}
}