Channel Pruning for Accelerating Convolutional Neural Networks via Wasserstein Metric

Haoran Duan, Hui Li; Proceedings of the Asian Conference on Computer Vision (ACCV), 2020

Abstract


Channel pruning is an effective way to accelerate deep convolutional neural networks. However, reducing computational complexity while preserving the performance of deep models remains a challenge. In this paper, we propose a novel channel pruning method based on the Wasserstein metric. First, the output features of a channel are aggregated through the Wasserstein barycenter, which is called the basic response of the channel. Then, a channel discrepancy based on the Wasserstein distance is introduced to measure channel importance, taking into account both a channel's feature representation ability and the substitutability of its basic response. Finally, the channels with the least discrepancy are removed directly, and the accuracy lost by pruning is regained through fine-tuning. Extensive experiments on popular benchmarks and various network architectures demonstrate that the proposed approach outperforms existing methods.
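
The abstract outlines the pipeline at a high level; the minimal Python sketch below (not the authors' released code) illustrates one way the two Wasserstein-based steps could be realized, assuming per-channel activations are treated as 1-D empirical distributions, the barycenter is computed by quantile (sorted-sample) averaging, and a channel's discrepancy is approximated by the distance from its basic response to the nearest other channel's response. The function names and the discrepancy definition are illustrative assumptions, not the paper's exact formulation.

# Minimal sketch (assumptions noted above), using NumPy and SciPy's 1-D Wasserstein distance.
import numpy as np
from scipy.stats import wasserstein_distance

def basic_responses(features):
    """features: array of shape (num_images, num_channels, spatial_size).
    Returns one 'basic response' per channel: the 1-D Wasserstein barycenter of its
    per-image activation distributions, obtained by averaging sorted samples."""
    num_images, num_channels, _ = features.shape
    responses = []
    for c in range(num_channels):
        sorted_samples = np.sort(features[:, c, :], axis=1)   # sort each image's activations
        responses.append(sorted_samples.mean(axis=0))         # quantile averaging = 1-D W2 barycenter
    return np.stack(responses)

def channel_discrepancies(responses):
    """Score each channel by the Wasserstein distance from its basic response to the
    closest other channel's response (a simple proxy for substitutability)."""
    n = responses.shape[0]
    scores = np.full(n, np.inf)
    for i in range(n):
        for j in range(n):
            if i != j:
                d = wasserstein_distance(responses[i], responses[j])
                scores[i] = min(scores[i], d)
    return scores

if __name__ == "__main__":
    # Toy usage: channels with the smallest discrepancy scores are the pruning candidates.
    rng = np.random.default_rng(0)
    feats = rng.standard_normal((8, 16, 49))                  # e.g. 8 images, 16 channels, 7x7 maps
    scores = channel_discrepancies(basic_responses(feats))
    prune_ratio = 0.25
    to_prune = np.argsort(scores)[: int(prune_ratio * len(scores))]
    print("channels to prune:", to_prune)

In this sketch, pruning the lowest-scoring channels mirrors the paper's idea that channels whose basic responses are easily substituted by others contribute least and can be removed before fine-tuning.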

Related Material


[pdf]
[bibtex]
@InProceedings{Duan_2020_ACCV,
    author    = {Duan, Haoran and Li, Hui},
    title     = {Channel Pruning for Accelerating Convolutional Neural Networks via Wasserstein Metric},
    booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
    month     = {November},
    year      = {2020}
}