BibTeX:
@InProceedings{Atabey_2025_ICCV,
  author    = {Atabey, Salih and Akag\"und\"uz, Erdem},
  title     = {Binary SqueezeNet: Enhancing Parameter Efficiency in Binary Neural Networks},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
  month     = {October},
  year      = {2025},
  pages     = {4111-4120}
}
Binary SqueezeNet: Enhancing Parameter Efficiency in Binary Neural Networks
Abstract
Binarization, one of the most aggressive neural network compression techniques, reduces precision from 32-bit floating point to just 1 bit. The primary and most challenging trade-off of this technique is between model performance and the number of parameters and operations. This study focuses on building a binary convolutional neural network with significantly improved memory efficiency for effective deployment. To this end, we propose a novel architecture by introducing several modifications to the binarized SqueezeNet. These modifications, including narrowed and expanded residuals, parallel branching, differentiable binary activation, ReAct operations, and scaling factors, significantly reduce the number of parameters while preserving accuracy. We conduct an ablation study to analyze the effects of the proposed and employed methodologies. As a result, we present two models with parameter sizes of 11.1 and 18.6 megabits, achieving a top-1 accuracy of 45.1% on the ImageNet dataset, outperforming most binary neural networks with comparable accuracy in terms of hardware utilization.
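To make the binarization idea concrete, the sketch below shows a generic weight-binarization scheme with per-output-channel scaling factors, in the spirit of XNOR-Net-style methods that the abstract's "scaling factors" refers to. This is an illustrative assumption, not the paper's exact formulation; the function name and tensor layout are hypothetical.

```python
import numpy as np

def binarize_weights(W):
    """Approximate a real-valued weight tensor W with alpha * sign(W),
    where alpha is the mean absolute value per output channel.
    (Generic XNOR-Net-style sketch; not the paper's exact method.)"""
    # alpha: one float scale per output channel (axis 0 = output channels)
    alpha = np.mean(np.abs(W), axis=tuple(range(1, W.ndim)), keepdims=True)
    Wb = np.sign(W)
    Wb[Wb == 0] = 1  # map zeros to +1 so every weight fits in exactly 1 bit
    return alpha * Wb

# Example: a small conv weight tensor of shape (out_ch, in_ch, kH, kW)
W = np.random.randn(4, 3, 3, 3)
W_approx = binarize_weights(W)
# Each output channel now needs 1 bit per weight plus one float scale,
# which is the source of the memory savings discussed in the abstract.
```

The scaling factor keeps the binarized layer's output magnitudes close to the full-precision layer's, which is why such schemes preserve far more accuracy than plain sign binarization.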
