Searching for Energy-Efficient Hybrid Adder-Convolution Neural Networks

Wenshuo Li, Xinghao Chen, Jinyu Bai, Xuefei Ning, Yunhe Wang; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2022, pp. 1943-1952

Abstract


As convolutional neural networks (CNNs) become ever more widely used in computer vision, their energy consumption has become a focus of research. For edge devices, both battery life and inference latency are critical and directly affect user experience. Recently, great progress has been made in the design of neural architectures and of new operators. The emergence of neural architecture search has steadily improved network performance and, to a certain extent, freed engineers from manual design. New operators, such as those in AdderNets, make it possible to further improve the energy efficiency of neural networks. In this paper, we explore fusing the new adder operator with the common convolution operator in the state-of-the-art lightweight network GhostNet, searching for models with better energy efficiency and performance. Our proposed search equilibrium strategy ensures that adder and convolution operators are treated fairly during the search; the resulting model matches GhostNet's accuracy of 73.9% on the ImageNet dataset at an extremely low energy consumption of 0.612 mJ. At the same energy consumption, the accuracy reaches 74.3%, which is 0.4% higher than the original GhostNet.
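For context on the operator that the hybrid search space mixes with ordinary convolutions, the sketch below illustrates an AdderNet-style layer: instead of multiply-accumulate correlations, each output is the negative L1 distance between an input patch and a filter, so the layer relies on additions and subtractions rather than multiplications. This is a minimal illustrative sketch in PyTorch, not the authors' implementation; the function name `adder2d` and the unfold-based formulation are our own assumptions.

```python
import torch
import torch.nn.functional as F


def adder2d(x, weight, stride=1, padding=0):
    """AdderNet-style layer (illustrative sketch only).

    x:      input of shape (N, C_in, H, W)
    weight: filters of shape (C_out, C_in, kH, kW)
    Output: (N, C_out, H_out, W_out), where each value is the
    negative sum of absolute differences between a patch and a filter.
    """
    n, _, h, w = x.shape
    c_out, _, kh, kw = weight.shape

    # Extract sliding patches: (N, C_in*kH*kW, L) with L = H_out * W_out
    patches = F.unfold(x, (kh, kw), stride=stride, padding=padding)
    w_flat = weight.view(c_out, -1)  # (C_out, C_in*kH*kW)

    # Negative L1 distance between every patch and every filter
    out = -(patches.unsqueeze(1) - w_flat.unsqueeze(0).unsqueeze(-1)).abs().sum(dim=2)

    h_out = (h + 2 * padding - kh) // stride + 1
    w_out = (w + 2 * padding - kw) // stride + 1
    return out.view(n, c_out, h_out, w_out)


# Usage example (hypothetical shapes): a 3x3 adder layer on a small feature map
x = torch.randn(1, 16, 32, 32)
weight = torch.randn(32, 16, 3, 3)
y = adder2d(x, weight, stride=1, padding=1)  # -> (1, 32, 32, 32)
```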

Related Material


[bibtex]
@InProceedings{Li_2022_CVPR,
    author    = {Li, Wenshuo and Chen, Xinghao and Bai, Jinyu and Ning, Xuefei and Wang, Yunhe},
    title     = {Searching for Energy-Efficient Hybrid Adder-Convolution Neural Networks},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2022},
    pages     = {1943-1952}
}