Maximum Entropy Information Bottleneck for Uncertainty-Aware Stochastic Embedding

Sungtae An, Nataraj Jammalamadaka, Eunji Chong; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2023, pp. 3809-3818

Abstract


Stochastic embedding offers several advantages over deterministic embedding, such as the ability to associate uncertainty with the resulting embedding and robustness to noisy data. This is especially useful when the input data is ambiguous (e.g., blurry or corrupted), as often happens in in-the-wild settings. Many existing methods for stochastic embedding are limited by the assumption, under the variational information bottleneck principle, that the embedding follows a standard normal distribution. We present a different variational approach to stochastic embedding in which maximum entropy acts as the bottleneck, which we call the Maximum Entropy Information Bottleneck (MEIB). We show that models trained with the MEIB objective outperform existing methods in terms of regularization, perturbation robustness, probabilistic contrastive learning, and risk-controlled recognition performance.
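
The abstract contrasts the usual variational information bottleneck (VIB) term, which pulls the embedding distribution toward a standard normal prior, with a bottleneck built from maximum entropy. The sketch below is a minimal illustration of that distinction for a Gaussian embedding head; the layer sizes, the beta weight, and the exact form of the MEIB term are illustrative assumptions based on the abstract, not the authors' reference implementation.

# Minimal sketch (not the authors' code): a Gaussian stochastic embedding head.
# VIB regularizes with KL(q(z|x) || N(0, I)); the MEIB idea described in the
# abstract instead rewards high entropy of q(z|x). Sizes and beta are illustrative.
import torch
import torch.nn as nn
from torch.distributions import Normal, kl_divergence


class StochasticEmbedding(nn.Module):
    def __init__(self, in_dim: int = 512, z_dim: int = 64):
        super().__init__()
        self.mu = nn.Linear(in_dim, z_dim)         # embedding mean
        self.log_sigma = nn.Linear(in_dim, z_dim)  # log scale, i.e. per-dimension uncertainty

    def forward(self, feats: torch.Tensor) -> Normal:
        return Normal(self.mu(feats), self.log_sigma(feats).exp())


def vib_penalty(q: Normal) -> torch.Tensor:
    # Classic VIB term: pull q(z|x) toward a standard normal prior.
    prior = Normal(torch.zeros_like(q.loc), torch.ones_like(q.scale))
    return kl_divergence(q, prior).sum(-1).mean()


def meib_bonus(q: Normal) -> torch.Tensor:
    # Entropy-as-bottleneck term (our reading of the abstract): reward high
    # differential entropy of q(z|x) instead of matching a fixed prior.
    return q.entropy().sum(-1).mean()


if __name__ == "__main__":
    head = StochasticEmbedding()
    feats = torch.randn(8, 512)                    # stand-in backbone features
    q = head(feats)
    z = q.rsample()                                # reparameterized sample used downstream
    task_loss = z.pow(2).mean()                    # placeholder for the task loss
    beta = 1e-2                                    # illustrative trade-off weight
    loss_vib = task_loss + beta * vib_penalty(q)   # VIB-style objective
    loss_meib = task_loss - beta * meib_bonus(q)   # MEIB-style objective: maximizing entropy
    print(loss_vib.item(), loss_meib.item())

In this sketch the only change between the two objectives is the regularizer, which mirrors the abstract's framing: the uncertainty-aware embedding head is identical, and only the bottleneck term differs.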

Related Material


BibTeX
@InProceedings{An_2023_CVPR,
  author    = {An, Sungtae and Jammalamadaka, Nataraj and Chong, Eunji},
  title     = {Maximum Entropy Information Bottleneck for Uncertainty-Aware Stochastic Embedding},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2023},
  pages     = {3809-3818}
}