VACL: Variance-Aware Cross-Layer Regularization for Pruning Deep Residual Networks

Susan Gao, Xin Liu, Lung-Sheng Chien, William Zhang, Jose M. Alvarez; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2019

Abstract


Improving weight sparsity is a common strategy for producing lightweight deep neural networks. However, pruning models with residual learning is more challenging. In this paper, we introduce a novel approach to address this problem. Our method puts the i-th filters of layers connected by skip-connections into one regularization group. Additionally, we define Variance-Aware Cross-Layer (VACL) regularization, which takes into account both the first- and second-order statistics of the connected layers to constrain the variance within a group. Our approach effectively improves the structural sparsity of residual models. For CIFAR-10, the proposed method reduces a ResNet model by up to 79.5% with no accuracy drop, and reduces a ResNeXt model by up to 82% with less than 1% accuracy drop. For ImageNet, it yields a pruning ratio of up to 63.3% with less than 1% top-5 accuracy drop. Our experimental results show that the proposed approach significantly outperforms other state-of-the-art methods in terms of overall model size and accuracy.
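The abstract describes grouping the i-th filters of skip-connected layers into a single regularization group and penalizing both the magnitude of the group and the variance within it. The PyTorch sketch below is a minimal illustration of that idea, not the paper's exact formulation: the function name vacl_penalty, the weights lambda_group and lambda_var, and the specific choice of a group-lasso term plus a variance penalty over per-layer filter norms are assumptions made for illustration.

import torch
import torch.nn as nn

def vacl_penalty(conv_layers, lambda_group=1e-4, lambda_var=1e-4):
    # conv_layers: nn.Conv2d modules whose outputs are tied by skip-connections,
    # so they all share the same number of output filters.
    # Per-layer, per-filter L2 norms, shape (num_layers, num_filters).
    norms = torch.stack([layer.weight.flatten(1).norm(dim=1) for layer in conv_layers])
    # Group term: one group per filter index i, spanning every connected layer.
    group_term = norms.norm(dim=0).sum()
    # Variance-aware term (assumed form): penalize the spread of the per-layer
    # norms inside each cross-layer group so the whole group shrinks together.
    var_term = norms.var(dim=0, unbiased=False).sum()
    return lambda_group * group_term + lambda_var * var_term

In training, such a penalty would simply be added to the task loss, e.g. loss = criterion(logits, targets) + vacl_penalty(tied_convs); after training, cross-layer groups whose norms fall below a threshold can be pruned across all connected layers at once.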

Related Material


[bibtex]
@InProceedings{Gao_2019_ICCV,
author = {Gao, Susan and Liu, Xin and Chien, Lung-Sheng and Zhang, William and Alvarez, Jose M.},
title = {VACL: Variance-Aware Cross-Layer Regularization for Pruning Deep Residual Networks},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
month = {Oct},
year = {2019}
}