@InProceedings{Xiang_2024_CVPR,
    author    = {Xiang, Jingyang and Li, Siqi and Chen, Junhao and Chen, Zhuangzhi and Huang, Tianxin and Peng, Linpeng and Liu, Yong},
    title     = {MaxQ: Multi-Axis Query for N:M Sparsity Network},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2024},
    pages     = {15845-15854}
}
MaxQ: Multi-Axis Query for N:M Sparsity Network
Abstract
N:M sparsity has received increasing attention because of its favorable trade-off between performance and latency compared with structured and unstructured sparsity. However, existing N:M sparsity methods do not differentiate the relative importance of weights among blocks, leaving important weights underappreciated. Moreover, they apply N:M sparsity to the whole network directly, which causes severe information loss. They are therefore still sub-optimal. In this paper, we propose an efficient and effective Multi-Axis Query methodology, dubbed MaxQ, to rectify these problems. During training, MaxQ dynamically generates soft N:M masks that account for weight importance across multiple axes; this enhances the more important weights and ensures more effective updates. Meanwhile, a sparsity strategy that gradually increases the percentage of N:M weight blocks is applied, allowing the network to heal progressively from pruning-induced damage. At runtime, the N:M soft masks can be precomputed as constants and folded into the weights without distorting the sparse pattern or incurring additional computational overhead. Comprehensive experiments demonstrate that MaxQ achieves consistent improvements across diverse CNN architectures on various computer vision tasks, including image classification, object detection, and instance segmentation. For ResNet50 with a 1:16 sparse pattern, MaxQ achieves 74.6% top-1 accuracy on ImageNet, improving on the state of the art by more than 2.8%. Code and checkpoints are available at https://github.com/JingyangXiang/MaxQ.
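To make the mechanics summarized above concrete, below is a minimal PyTorch-style sketch of the three ideas in the abstract: a soft N:M mask that keeps the exact N:M pattern while boosting the kept weights with an importance score queried along several axes, a schedule that gradually raises the fraction of blocks placed under the N:M constraint, and the folding of the precomputed mask into the weights at deployment. The specific importance formula, the cubic ramp, and all function names here are illustrative assumptions, not the authors' exact implementation; see the linked repository for the actual method.

import torch


def soft_nm_mask(weight: torch.Tensor, n: int = 1, m: int = 16,
                 temperature: float = 1.0) -> torch.Tensor:
    """Illustrative soft N:M mask for a 2-D weight of shape (out_features, in_features).

    Within every block of m consecutive input weights, the n largest-magnitude
    entries are kept and all others are zeroed, so the exact N:M pattern is
    preserved. Kept entries are additionally scaled by an importance factor
    built from statistics along multiple axes (here: per-block and per-output-
    channel magnitudes); this is only one plausible realization of the
    multi-axis query described in the paper.
    """
    out_f, in_f = weight.shape
    assert in_f % m == 0, "in_features must be divisible by m"
    n_blocks = in_f // m
    scores = weight.detach().abs().reshape(out_f, n_blocks, m)

    # Hard N:M pattern: 1 for the n largest-magnitude weights in each block.
    topk = scores.topk(n, dim=-1).indices
    hard = torch.zeros_like(scores).scatter(-1, topk, 1.0)

    # Importance queried along two axes (assumed form): how strong a block is
    # relative to the average block of its output channel.
    block_axis = scores.sum(dim=-1, keepdim=True)          # (out_f, n_blocks, 1)
    channel_axis = scores.sum(dim=(1, 2), keepdim=True)    # (out_f, 1, 1)
    importance = torch.sigmoid(
        block_axis / (channel_axis / n_blocks + 1e-12) / temperature)

    # Soft mask: pruned positions stay exactly zero, kept weights are enhanced.
    soft = hard * (1.0 + importance)
    return soft.reshape(out_f, in_f)


def nm_sparsity_schedule(epoch: int, total_epochs: int,
                         final_ratio: float = 1.0) -> float:
    """Fraction of weight blocks constrained to N:M at a given epoch.

    A simple cubic ramp (an assumption; the paper only states that the
    percentage of N:M blocks increases gradually so the network can recover
    from pruning-induced damage).
    """
    t = min(max(epoch / max(total_epochs, 1), 0.0), 1.0)
    return final_ratio * (1.0 - (1.0 - t) ** 3)


@torch.no_grad()
def fold_mask_into_weight(weight: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """At deployment the soft mask is a constant, so it is folded into the
    weights once; inference then runs on a plain N:M-sparse tensor with no
    extra cost and no change to the sparse pattern."""
    return weight * mask


if __name__ == "__main__":
    w = torch.randn(64, 128)
    mask = soft_nm_mask(w, n=1, m=16)
    w_deploy = fold_mask_into_weight(w, mask)
    # Every block of 16 input weights now has exactly one non-zero entry.
    nnz_per_block = (w_deploy.reshape(64, -1, 16) != 0).sum(-1)
    assert torch.all(nnz_per_block == 1)

During training, only the fraction of blocks given by nm_sparsity_schedule would be masked in this way, with the remaining blocks left dense until later epochs; the sketch above applies the constraint to all blocks for brevity.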