QBitOpt: Fast and Accurate Bitwidth Reallocation During Training

Jorn Peters, Marios Fournarakis, Markus Nagel, Mart van Baalen, Tijmen Blankevoort; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2023, pp. 1282-1291

Abstract


Quantizing neural networks is one of the most effective methods for achieving efficient inference on mobile and embedded devices. In particular, mixed-precision quantized (MPQ) networks, whose layers can be quantized to different bitwidths, achieve better task performance for the same resource constraint compared to networks with homogeneous bitwidths. However, finding the optimal bitwidth allocation is a challenging problem as the search space grows exponentially with the number of layers in the network. In this paper, we propose QBitOpt, a novel algorithm for updating bitwidths during quantization-aware training (QAT). We formulate the bitwidth allocation problem as a constrained optimization problem. By combining fast-to-compute sensitivities with efficient solvers during QAT, QBitOpt can produce mixed-precision networks with high task performance guaranteed to satisfy strict resource constraints. This contrasts with existing mixed-precision methods that learn bitwidths using gradients and cannot provide such guarantees. We evaluate QBitOpt on ImageNet and confirm we outperform fixed-precision methods. We also achieve comparable accuracy to other mixed-precision methods while always meeting the exact resource constraint without the need for hyperparameter search over regularization strength.
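To make the constrained-allocation idea concrete, here is a minimal illustrative sketch, not the paper's actual algorithm: it assumes a per-layer sensitivity score, models quantization error as decaying like 4^(-b) with bitwidth b, and greedily spends a fixed average-bitwidth budget where the estimated error drops the most. The objective, the greedy solver, and all names (`allocate_bitwidths`, `avg_budget`) are assumptions chosen for illustration.

```python
# Illustrative sketch only: a greedy solver for a toy version of the
# sensitivity-based, constrained bitwidth-allocation problem the abstract
# describes. The error model (sensitivity * 4**-b, i.e. quantization noise
# shrinking as 2**-2b) is an assumption, not the paper's formulation.

def allocate_bitwidths(sensitivities, avg_budget, b_min=2, b_max=8):
    """Assign each layer an integer bitwidth in [b_min, b_max] so the mean
    bitwidth stays within avg_budget, granting extra bits greedily to the
    layer whose estimated error decreases the most per bit."""
    n = len(sensitivities)
    bits = [b_min] * n
    # total number of extra bits available beyond the minimum allocation
    extra = int(avg_budget * n) - b_min * n

    def gain(i):
        # estimated error reduction from giving layer i one more bit
        b = bits[i]
        return sensitivities[i] * (4.0 ** -b - 4.0 ** -(b + 1))

    for _ in range(extra):
        candidates = [i for i in range(n) if bits[i] < b_max]
        if not candidates:
            break
        bits[max(candidates, key=gain)] += 1
    return bits

# Sensitive layers receive more bits; the average-bitwidth constraint
# (4 bits here) is met exactly by construction.
bits = allocate_bitwidths([10.0, 1.0, 0.1, 5.0], avg_budget=4)
print(bits)
assert sum(bits) / len(bits) == 4
```

Because the allocation is produced by a solver rather than learned via gradients, the resource constraint holds by construction at every reallocation step, which is the guarantee the abstract contrasts against gradient-based bitwidth learning.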

Related Material


[pdf] [supp] [arXiv]
[bibtex]
@InProceedings{Peters_2023_ICCV,
    author    = {Peters, Jorn and Fournarakis, Marios and Nagel, Markus and van Baalen, Mart and Blankevoort, Tijmen},
    title     = {QBitOpt: Fast and Accurate Bitwidth Reallocation During Training},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
    month     = {October},
    year      = {2023},
    pages     = {1282-1291}
}