Attention-Guided Prototype Mixing: Diversifying Minority Context on Imbalanced Whole Slide Images Classification Learning

Farchan Hakim Raswa, Chun-Shien Lu, Jia-Ching Wang; Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2024, pp. 7624-7633

Abstract


Real-world medical datasets often suffer from class imbalance, which degrades performance because samples of the minority class are scarce. In a separate line of research, Transformer-based multiple instance learning (Transformer-MIL) has shown promise in modeling the pairwise correlations between instances in medical whole slide images (WSIs), which have gigapixel resolution and non-uniform sizes. However, these characteristics pose challenges for state-of-the-art (SOTA) oversampling methods that aim to diversify the minority context in imbalanced WSI datasets. In this paper, we propose an Attention-Guided Prototype Mixing scheme at the WSI level. We leverage Transformer-MIL training to estimate the distribution of semantic instances and to identify relevant instances for cutting and pasting across different WSIs (bags of instances). Applying Transformers is often limited by memory requirements and time complexity, particularly when dealing with gigabyte-sized WSIs. We therefore introduce prototype instances, which have smaller representations while preserving a uniform size and the intrinsic features of the WSI. We demonstrate that our proposed method boosts performance compared to competitive SOTA oversampling and augmentation methods on imbalanced WSI classification.
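The two ideas in the abstract can be sketched in a few lines: (1) use per-instance attention scores from a Transformer-MIL pass to cut the most informative minority-class instances and paste them into another bag, and (2) compress a variable-size bag of patch embeddings into a fixed number of prototype instances. The sketch below is illustrative only, under assumed interfaces; the function names (`attention_guided_mix`, `to_prototypes`), the top-k/bottom-k selection rule, and the use of plain k-means for prototypes are our simplifications, not the paper's exact algorithm.

```python
import numpy as np

def attention_guided_mix(bag_a, attn_a, bag_b, attn_b, k=8):
    """Cut the k highest-attention instances from minority bag_a and
    paste them over the k lowest-attention instances of bag_b.
    bag_*: (n_i, d) arrays of patch embeddings; attn_*: (n_i,) scores
    (assumed to come from a trained Transformer-MIL model)."""
    top_a = np.argsort(attn_a)[-k:]   # most informative minority instances
    low_b = np.argsort(attn_b)[:k]    # least informative slots in target bag
    mixed = bag_b.copy()
    mixed[low_b] = bag_a[top_a]       # synthesize a new minority-context bag
    return mixed

def to_prototypes(bag, m=64, iters=10, seed=0):
    """Reduce a variable-size bag (n, d) to m prototype instances (m, d)
    via a simple k-means, giving every WSI a uniform, smaller size."""
    rng = np.random.default_rng(seed)
    centroids = bag[rng.choice(len(bag), size=m, replace=False)].copy()
    for _ in range(iters):
        dist = ((bag[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        assign = dist.argmin(1)
        for j in range(m):
            members = bag[assign == j]
            if len(members):          # keep old centroid if cluster is empty
                centroids[j] = members.mean(0)
    return centroids
```

Running `to_prototypes` on each WSI first would let bags of wildly different instance counts be mixed and batched at a fixed size, which is the memory motivation the abstract states.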

Related Material


@InProceedings{Raswa_2024_WACV,
  author    = {Raswa, Farchan Hakim and Lu, Chun-Shien and Wang, Jia-Ching},
  title     = {Attention-Guided Prototype Mixing: Diversifying Minority Context on Imbalanced Whole Slide Images Classification Learning},
  booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
  month     = {January},
  year      = {2024},
  pages     = {7624-7633}
}