CrossNorm and SelfNorm for Generalization Under Distribution Shifts

[pdf] [arXiv] [bibtex]

@InProceedings{Tang_2021_ICCV,
  author    = {Tang, Zhiqiang and Gao, Yunhe and Zhu, Yi and Zhang, Zhi and Li, Mu and Metaxas, Dimitris N.},
  title     = {CrossNorm and SelfNorm for Generalization Under Distribution Shifts},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  month     = {October},
  year      = {2021},
  pages     = {52-61}
}
Abstract
Traditional normalization techniques (e.g., Batch Normalization and Instance Normalization) generally assume that training and test data follow the same distribution. Because distribution shifts are inevitable in real-world applications, models trained with these normalization methods can perform poorly in new environments. Can we develop new normalization methods to improve generalization robustness under distribution shifts? In this paper, we answer this question by proposing CrossNorm and SelfNorm. CrossNorm exchanges channel-wise mean and variance between feature maps to enlarge the training distribution, while SelfNorm uses attention to recalibrate the statistics to bridge the gap between training and test distributions. Although they use feature statistics in different directions, CrossNorm and SelfNorm complement each other. Extensive experiments across fields (vision and language), tasks (classification and segmentation), settings (supervised and semi-supervised), and distribution shift types (synthetic and natural) demonstrate their effectiveness. Code is available at https://github.com/amazon-research/crossnorm-selfnorm
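To make the statistics-exchange idea concrete, the following PyTorch sketch swaps channel-wise mean and standard deviation between two feature maps, which is the core operation the abstract attributes to CrossNorm. It is a minimal illustration written for this page, not the authors' code: the names channel_stats and crossnorm are invented here, and details such as cropping, direction sampling, and SelfNorm's attention-based recalibration are omitted; see the GitHub repository above for the official implementation.

# Illustrative CrossNorm-style statistics exchange (assumption: (N, C, H, W) tensors).
import torch

def channel_stats(x, eps=1e-5):
    # Per-sample, per-channel mean and std over the spatial dimensions.
    mean = x.mean(dim=(2, 3), keepdim=True)
    std = x.var(dim=(2, 3), keepdim=True).add(eps).sqrt()
    return mean, std

def crossnorm(a, b):
    # Normalize each feature map with its own statistics, then re-style it
    # with the other map's mean and std, enlarging the training distribution.
    mean_a, std_a = channel_stats(a)
    mean_b, std_b = channel_stats(b)
    a_styled = (a - mean_a) / std_a * std_b + mean_b
    b_styled = (b - mean_b) / std_b * std_a + mean_a
    return a_styled, b_styled

if __name__ == "__main__":
    a, b = torch.randn(4, 8, 16, 16), torch.randn(4, 8, 16, 16)
    a_styled, b_styled = crossnorm(a, b)
    print(a_styled.shape, b_styled.shape)  # both torch.Size([4, 8, 16, 16])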