Fast Walsh-Hadamard Transform and Smooth-Thresholding Based Binary Layers in Deep Neural Networks

Hongyi Pan, Diaa Badawi, Ahmet Enis Cetin; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2021, pp. 4650-4659

Abstract


In this paper, we propose a novel layer based on the fast Walsh-Hadamard transform (WHT) and smooth-thresholding to replace 1×1 convolution layers in deep neural networks. In the WHT domain, we denoise the transform-domain coefficients using a new smooth-thresholding non-linearity, a smoothed version of the well-known soft-thresholding operator. We also introduce a family of multiplication-free operators derived from the basic 2×2 Hadamard transform to implement 3×3 depthwise separable convolution layers. Using these two types of layers, we replace the bottleneck layers in MobileNet-V2 to reduce the network's number of parameters with a slight loss in accuracy. For example, by replacing the final third of the bottleneck layers, we reduce the number of parameters from 2.270M to 947K, which lowers the accuracy from 95.21% to 92.88% on the CIFAR-10 dataset. Our approach significantly improves the speed of data processing. The fast Walsh-Hadamard transform has a computational complexity of $O(m\log_2 m)$, so it is computationally more efficient than the 1×1 convolution layer. The fast Walsh-Hadamard layer processes a tensor in $\mathbb{R}^{10\times 32\times 32\times 1024}$ about two times faster than the 1×1 convolution layer on an NVIDIA Jetson Nano computer board.
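As a rough illustration of the ideas in the abstract (not the authors' implementation), the sketch below implements the $O(m\log_2 m)$ butterfly fast Walsh-Hadamard transform in NumPy, together with classic soft-thresholding and a smooth-thresholding variant in which the sign(x) factor is replaced by tanh(x) to make the operator differentiable. The tanh-based form, the threshold value, and the tensor shapes are assumptions for illustration; the paper defines the exact operator and the trainable thresholds.

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform along the last axis.
    Length must be a power of two; butterfly stages give O(m log2 m) cost."""
    x = np.array(x, dtype=np.float64, copy=True)
    m = x.shape[-1]
    assert m & (m - 1) == 0, "transform length must be a power of two"
    h = 1
    while h < m:
        for i in range(0, m, 2 * h):
            a = x[..., i:i + h].copy()
            b = x[..., i + h:i + 2 * h].copy()
            x[..., i:i + h] = a + b          # butterfly sum
            x[..., i + h:i + 2 * h] = a - b  # butterfly difference
        h *= 2
    return x

def soft_threshold(x, T):
    """Classic soft-thresholding: sign(x) * max(|x| - T, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - T, 0.0)

def smooth_threshold(x, T):
    """Smooth-thresholding sketch (assumed form): tanh(x) replaces sign(x),
    so the non-linearity is differentiable everywhere."""
    return np.tanh(x) * np.maximum(np.abs(x) - T, 0.0)

# Example: a 1x1-convolution replacement along the channel axis.
# Shapes match the abstract's R^{10x32x32x1024} tensor; T = 0.1 is hypothetical.
feat = np.random.randn(10, 32, 32, 1024)    # NHWC tensor
coeffs = fwht(feat)                          # transform the channel dimension
denoised = smooth_threshold(coeffs, T=0.1)   # denoise in the WHT domain
out = fwht(denoised) / feat.shape[-1]        # inverse WHT (H H = m I)
```

Because the Hadamard matrix satisfies $H H = m I$, the same butterfly routine serves as the inverse up to a factor of $1/m$, and no multiplications beyond that scaling are needed in the transform itself.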

Related Material


[bibtex]
@InProceedings{Pan_2021_CVPR,
    author    = {Pan, Hongyi and Badawi, Diaa and Cetin, Ahmet Enis},
    title     = {Fast Walsh-Hadamard Transform and Smooth-Thresholding Based Binary Layers in Deep Neural Networks},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2021},
    pages     = {4650-4659}
}