Towards Compact CNNs via Collaborative Compression
Abstract
Channel pruning and tensor decomposition have received extensive attention in convolutional neural network compression. However, these two techniques are traditionally deployed in isolation, leading to a significant accuracy drop when pursuing high compression rates. In this paper, we propose a Collaborative Compression (CC) scheme that unifies channel pruning and tensor decomposition to compress CNN models by simultaneously learning the model sparsity and low-rankness. Specifically, we first investigate the compression sensitivity of each layer in the network, and then propose Global Compression Rate Optimization, which transforms the compression-rate decision problem into an optimization problem. After that, we propose a multi-step heuristic compression that removes redundant compression units step by step while fully accounting for the remaining compression space (i.e., the compression units not yet removed). Our method demonstrates superior performance over previous approaches on various datasets and backbone architectures. For example, we achieve a 52.9% FLOPs reduction by removing 48.4% of the parameters of ResNet-50, with only a 0.56% Top-1 accuracy drop on ImageNet 2012.
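To make the two "compression units" the abstract combines concrete, below is a minimal NumPy sketch that prunes the least important output channels of a convolutional weight tensor and then truncates its unfolded matrix with an SVD. All names (compress_conv_weight, keep_channels, rank) are hypothetical, the L1-norm pruning criterion and plain SVD truncation are illustrative stand-ins, and the sketch does not implement the paper's sensitivity analysis, Global Compression Rate Optimization, or multi-step heuristic compression.

# Illustrative sketch (not the authors' implementation): shrinking a conv
# weight by channel pruning followed by low-rank (SVD) truncation.
import numpy as np

def compress_conv_weight(weight, keep_channels, rank):
    """weight: (out_channels, in_channels, k, k) conv kernel.
    Returns the compressed (reconstructed) weight and its relative error."""
    # Channel pruning: keep the output channels with the largest L1 norms.
    norms = np.abs(weight).sum(axis=(1, 2, 3))
    kept = np.sort(np.argsort(norms)[-keep_channels:])
    pruned = weight[kept]

    # Low-rank step: unfold to a 2-D matrix and keep the top singular values.
    mat = pruned.reshape(keep_channels, -1)
    u, s, vt = np.linalg.svd(mat, full_matrices=False)
    approx = (u[:, :rank] * s[:rank]) @ vt[:rank]

    err = np.linalg.norm(mat - approx) / np.linalg.norm(mat)
    return approx.reshape(pruned.shape), err

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 32, 3, 3))
w_cc, err = compress_conv_weight(w, keep_channels=48, rank=24)
print(w_cc.shape, f"relative error: {err:.3f}")

In the paper's setting, the per-layer equivalents of keep_channels and rank would not be fixed by hand as above, but would instead follow from the global optimization over per-layer compression rates.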
Related Material

@InProceedings{Li_2021_CVPR,
  author    = {Li, Yuchao and Lin, Shaohui and Liu, Jianzhuang and Ye, Qixiang and Wang, Mengdi and Chao, Fei and Yang, Fan and Ma, Jincheng and Tian, Qi and Ji, Rongrong},
  title     = {Towards Compact CNNs via Collaborative Compression},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2021},
  pages     = {6438-6447}
}