[bibtex]
@InProceedings{Li_2023_CVPR,
  author    = {Li, Zhangheng and Gong, Yu and Zhang, Zhenyu and Xue, Xingyun and Chen, Tianlong and Liang, Yi and Yuan, Bo and Wang, Zhangyang},
  title     = {Accelerable Lottery Tickets With the Mixed-Precision Quantization},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2023},
  pages     = {4604-4612}
}
Accelerable Lottery Tickets With the Mixed-Precision Quantization
Abstract
In recent years, the lottery ticket hypothesis has gained widespread popularity as a means of network compression. However, applying lottery tickets for hardware acceleration in practice is difficult because of their element-wise, unstructured sparsity. In this paper, we argue that network pruning can be seen as a special case of network quantization, and relax hard network pruning into mixed-precision quantization in an unstructured manner, which makes real hardware acceleration possible. We validate the wide existence of quantized lottery tickets, namely MPQ-tickets, that match or even surpass the performance of the corresponding full-precision dense networks on various representative benchmarks. We also demonstrate that MPQ-tickets are far more flexible than vanilla lottery tickets and benefit substantially from pruning compared to quantized neural networks (QNNs). Moreover, MPQ-tickets achieve up to 8x faster inference and 14x lower memory consumption than full-precision models.
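The pruning-as-quantization view in the abstract can be made concrete: if each weight is assigned its own bit-width, then assigning 0 bits zeroes the weight out, so element-wise unstructured pruning falls out as the 0-bit special case of mixed-precision quantization. The following is a minimal NumPy sketch of that idea; the function name, the per-weight bit-width assignment, and the uniform symmetric rounding scheme are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def mixed_precision_quantize(weights, bit_widths):
    """Quantize each weight to its assigned bit-width (>= 2 here) using
    uniform symmetric rounding. A 0-bit assignment leaves the weight at
    zero, so unstructured pruning is the 0-bit case of quantization.
    Illustrative sketch only, not the paper's exact scheme."""
    out = np.zeros_like(weights, dtype=np.float64)
    for b in np.unique(bit_widths):
        if b == 0:
            continue  # 0-bit weights stay zero: this is pruning
        mask = bit_widths == b
        w = weights[mask]
        # symmetric integer grid: levels in [-(2^(b-1)-1), 2^(b-1)-1]
        max_level = 2 ** (b - 1) - 1  # e.g. 127 for 8-bit
        scale = np.abs(w).max() / max_level
        out[mask] = np.round(w / scale) * scale
    return out

# One weight gets 0 bits (pruned); the rest get mixed precision.
w = np.array([0.9, -0.4, 0.05, 0.7])
bits = np.array([8, 4, 0, 2])
q = mixed_precision_quantize(w, bits)
```

Here `q[2]` is exactly zero (the pruned weight), while the remaining weights are snapped to their per-weight quantization grids, mirroring how an MPQ-ticket mixes pruned and low-bit weights in one unstructured mask.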