Neural Network Compression Using Higher-Order Statistics and Auxiliary Reconstruction Losses

Christos Chatzikonstantinou, Georgios Th. Papadopoulos, Kosmas Dimitropoulos, Petros Daras; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2020, pp. 716-717

Abstract


In this paper, the problem of pruning and compressing the weights of various layers of deep neural networks is investigated. The proposed method aims to remove redundant filters from the network in order to reduce computational complexity and storage requirements, while improving the performance of the original network. More specifically, a novel filter selection criterion is introduced based on the observation that filters whose weights follow a Gaussian distribution correspond to hidden units that do not capture important aspects of the data. To this end, Higher Order Statistics (HOS) are used, and filters with low cumulant values, i.e., those that do not deviate significantly from the Gaussian distribution, are identified and removed from the network. In addition, a novel pruning strategy is proposed that determines the pruning ratio of each individual layer using the Shapiro-Wilk normality test. The use of auxiliary MSE losses (intermediate and after the softmax layer) during the fine-tuning phase further improves the overall performance of the compressed network. Extensive experiments with different network architectures and comparisons with state-of-the-art approaches on well-known public datasets, such as CIFAR-10, CIFAR-100 and ILSVRC-12, demonstrate the great potential of the proposed approach.
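The abstract names three components: a HOS-based filter-selection criterion, a Shapiro-Wilk-driven per-layer pruning ratio, and auxiliary MSE losses during fine-tuning. The following is a minimal Python sketch of how these ideas could fit together, not the authors' implementation: the kurtosis-based score, the p-value ratio heuristic, the loss weight alpha, and all function names are illustrative assumptions.

import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from scipy.stats import kurtosis, shapiro

def filter_cumulant_scores(conv: nn.Conv2d) -> np.ndarray:
    # Magnitude of the fourth-order cumulant (excess kurtosis) per filter.
    # Near-zero values indicate near-Gaussian weights, which the paper's
    # criterion treats as uninformative and hence prunable.
    w = conv.weight.detach().cpu().numpy()    # (out_ch, in_ch, kH, kW)
    flat = w.reshape(w.shape[0], -1)          # one row per filter
    return np.abs(kurtosis(flat, axis=1, fisher=True))

def layer_pruning_ratio(conv: nn.Conv2d, base_ratio: float = 0.3) -> float:
    # Illustrative heuristic (an assumption, not the paper's exact rule):
    # scale a base ratio by the Shapiro-Wilk p-value, so layers whose
    # weights look more Gaussian overall are pruned more aggressively.
    w = conv.weight.detach().cpu().numpy().ravel()
    sample = np.random.choice(w, size=min(w.size, 5000), replace=False)
    _, p_value = shapiro(sample)              # large p => Gaussian-like
    return min(1.0, base_ratio * (1.0 + p_value))

def select_prunable_filters(conv: nn.Conv2d, base_ratio: float = 0.3) -> np.ndarray:
    # Rank filters by cumulant magnitude and mark the most Gaussian ones
    # (lowest scores) for removal, up to the layer's pruning ratio.
    scores = filter_cumulant_scores(conv)
    n_prune = int(layer_pruning_ratio(conv, base_ratio) * scores.size)
    return np.argsort(scores)[:n_prune]

def fine_tune_loss(student_logits, teacher_logits,
                   student_feat, teacher_feat, labels, alpha=0.1):
    # Task loss plus the two auxiliary MSE reconstruction losses mentioned
    # in the abstract: one on an intermediate feature map and one after the
    # softmax, both computed against the original (uncompressed) network.
    ce = F.cross_entropy(student_logits, labels)
    mse_soft = F.mse_loss(student_logits.softmax(dim=1),
                          teacher_logits.softmax(dim=1))
    mse_feat = F.mse_loss(student_feat, teacher_feat)
    return ce + alpha * (mse_feat + mse_soft)

# Tiny usage example on a randomly initialised layer:
conv = nn.Conv2d(16, 32, kernel_size=3)
print(select_prunable_filters(conv))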

Related Material


@InProceedings{Chatzikonstantinou_2020_CVPR_Workshops,
author = {Chatzikonstantinou, Christos and Papadopoulos, Georgios Th. and Dimitropoulos, Kosmas and Daras, Petros},
title = {Neural Network Compression Using Higher-Order Statistics and Auxiliary Reconstruction Losses},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2020}
}