Delving Into the Estimation Shift of Batch Normalization in a Network

Lei Huang, Yi Zhou, Tian Wang, Jie Luo, Xianglong Liu; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022, pp. 763-772

Abstract


Batch normalization (BN) is a milestone technique in deep learning. It normalizes the activations using mini-batch statistics during training but uses estimated population statistics during inference. This paper focuses on investigating the estimation of population statistics. We define the estimation shift magnitude of BN to quantitatively measure the difference between its estimated population statistics and the expected ones. Our primary observation is that the estimation shift can accumulate as BN layers are stacked in a network, which has detrimental effects on test performance. We further find that a batch-free normalization (BFN) can block such an accumulation of estimation shift. These observations motivate our design of XBNBlock, which replaces one BN with a BFN in the bottleneck block of residual-style networks. Experiments on the ImageNet and COCO benchmarks show that XBNBlock consistently improves the performance of different architectures, including ResNet and ResNeXt, by a significant margin, and appears to be more robust to distribution shift.
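The architectural change described in the abstract is easy to sketch. Below is a minimal, hypothetical PyTorch rendering of an XBNBlock-style bottleneck, assuming GroupNorm as the batch-free normalization; the choice of GroupNorm and which of the three normalization layers is replaced are illustrative assumptions, not details taken from the abstract. Because GroupNorm computes statistics per sample, it behaves identically at training and inference time and so contributes no estimation shift of its own.

```python
import torch
import torch.nn as nn


class XBNBottleneck(nn.Module):
    """Sketch of a residual bottleneck in the spirit of XBNBlock:
    one BN is swapped for a batch-free normalization (GroupNorm here,
    a stand-in; the paper's exact BFN and placement may differ)."""

    def __init__(self, in_ch: int, mid_ch: int, out_ch: int, groups: int = 32):
        super().__init__()
        # mid_ch must be divisible by `groups` for GroupNorm.
        self.conv1 = nn.Conv2d(in_ch, mid_ch, 1, bias=False)
        self.bn1 = nn.BatchNorm2d(mid_ch)
        self.conv2 = nn.Conv2d(mid_ch, mid_ch, 3, padding=1, bias=False)
        # Batch-free normalization in place of the middle BN: per-sample
        # statistics, so no running estimates and no estimation shift.
        self.bfn = nn.GroupNorm(groups, mid_ch)
        self.conv3 = nn.Conv2d(mid_ch, out_ch, 1, bias=False)
        self.bn3 = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)
        # 1x1 projection on the shortcut when channel counts differ.
        self.proj = (nn.Conv2d(in_ch, out_ch, 1, bias=False)
                     if in_ch != out_ch else nn.Identity())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.relu(self.bfn(self.conv2(out)))
        out = self.bn3(self.conv3(out))
        return self.relu(out + self.proj(x))


if __name__ == "__main__":
    block = XBNBottleneck(in_ch=256, mid_ch=64, out_ch=256)
    y = block(torch.randn(2, 256, 56, 56))
    print(y.shape)  # torch.Size([2, 256, 56, 56])
```

The intuition behind mixing the two normalizers, per the abstract, is that the remaining BN layers keep BN's training benefits while the interleaved batch-free layer stops per-layer estimation errors from compounding through the stack.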

Related Material


[bibtex]
@InProceedings{Huang_2022_CVPR,
    author    = {Huang, Lei and Zhou, Yi and Wang, Tian and Luo, Jie and Liu, Xianglong},
    title     = {Delving Into the Estimation Shift of Batch Normalization in a Network},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {763-772}
}