Improving Accuracy of Binary Neural Networks Using Unbalanced Activation Distribution

Hyungjun Kim, Jihoon Park, Changhun Lee, Jae-Joon Kim; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2021, pp. 7862-7871

Abstract


Binarization of neural network models is considered one of the most promising methods for deploying deep neural networks in resource-constrained environments such as mobile devices. However, Binary Neural Networks (BNNs) tend to suffer severe accuracy degradation compared to their full-precision counterparts. Several techniques have been proposed to improve the accuracy of BNNs. One approach is to balance the distribution of binary activations so that the amount of information in the binary activations is maximized. Based on extensive analysis, and in stark contrast to previous work, we argue that an unbalanced activation distribution can actually improve the accuracy of BNNs. We also show that adjusting the threshold values of the binary activation functions produces an unbalanced distribution of binary activations, which increases the accuracy of BNN models. Experimental results show that the accuracy of previous BNN models (e.g., XNOR-Net and Bi-Real-Net) can be improved by simply shifting the threshold values of the binary activation functions, without requiring any other modification.
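
As a rough illustration of the idea described above (not the authors' implementation), the sketch below shows a PyTorch binary activation whose threshold tau can be shifted away from zero, which makes the resulting +1/-1 distribution unbalanced. The straight-through estimator used for the backward pass and the example threshold value are assumptions made for this sketch.

import torch


class ShiftedBinaryActivation(torch.autograd.Function):
    """Binarize inputs to {-1, +1} around an adjustable threshold tau (illustrative sketch)."""

    @staticmethod
    def forward(ctx, x, tau):
        ctx.save_for_backward(x)
        ctx.tau = tau
        # Shifting tau away from 0 makes the +1/-1 activation distribution unbalanced.
        return torch.where(x >= tau, torch.ones_like(x), -torch.ones_like(x))

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Straight-through estimator (an assumption here): pass gradients where |x - tau| <= 1.
        mask = (x - ctx.tau).abs() <= 1.0
        return grad_output * mask.to(grad_output.dtype), None


if __name__ == "__main__":
    x = torch.randn(4, 8, requires_grad=True)
    y = ShiftedBinaryActivation.apply(x, 0.2)  # tau = 0.2 is an illustrative value
    print("fraction of +1 activations:", (y > 0).float().mean().item())
    y.sum().backward()

With tau = 0 this reduces to the usual sign-based binary activation; a nonzero tau skews the fraction of +1 versus -1 activations, which is the kind of unbalanced distribution the paper studies.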

Related Material


[bibtex]
@InProceedings{Kim_2021_CVPR,
    author    = {Kim, Hyungjun and Park, Jihoon and Lee, Changhun and Kim, Jae-Joon},
    title     = {Improving Accuracy of Binary Neural Networks Using Unbalanced Activation Distribution},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2021},
    pages     = {7862-7871}
}