Can Unstructured Pruning Reduce the Depth in Deep Neural Networks?

Zhu Liao, Victor Quétu, Van-Tam Nguyen, Enzo Tartaglione; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2023, pp. 1402-1406

Abstract


Pruning is a widely used technique for reducing the size of deep neural networks while maintaining their performance. However, despite its ability to massively compress deep models, pruning (even structured pruning) rarely removes entire layers from a model: is this an addressable task? In this study, we introduce EGP, an innovative Entropy Guided Pruning algorithm aimed at reducing the size of deep neural networks while preserving their performance. The key focus of EGP is to prioritize pruning connections in layers with low entropy, ultimately leading to their complete removal. Through extensive experiments conducted on popular models like ResNet-18 and Swin-T, our findings demonstrate that EGP effectively compresses deep neural networks while maintaining competitive performance levels. Our results not only shed light on the underlying mechanism behind the advantages of unstructured pruning, but also pave the way for further investigations into the intricate relationship between entropy, pruning techniques, and deep learning performance. The EGP algorithm and its insights hold great promise for advancing the field of network compression and optimization.
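The abstract only summarizes EGP's key idea: prune more aggressively in layers whose entropy is low, so that such layers can eventually be removed entirely. The following is a minimal NumPy sketch of that idea, not the paper's implementation: it uses the entropy of each layer's weight-magnitude histogram as an illustrative proxy (the paper computes entropy over neuron activation states), and the inverse-entropy budget allocation rule is a hypothetical choice for demonstration.

```python
import numpy as np

def layer_entropy(weights, bins=32):
    """Shannon entropy (bits) of the layer's weight-magnitude histogram.
    Illustrative proxy only; EGP's actual entropy is defined over
    activation states, which this sketch does not reproduce."""
    mags = np.abs(weights).ravel()
    hist, _ = np.histogram(mags, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def entropy_guided_prune(layers, total_sparsity=0.8):
    """Magnitude-prune a list of weight arrays, giving low-entropy
    layers a larger share of the global sparsity budget
    (hypothetical allocation rule, for illustration)."""
    entropies = np.array([layer_entropy(w) for w in layers])
    # Inverse-entropy shares: lower entropy -> more pruning.
    inv = 1.0 / (entropies + 1e-8)
    shares = inv / inv.sum()
    pruned = []
    for w, share in zip(layers, shares):
        # Per-layer sparsity, scaled so the shares average to the budget.
        rate = min(1.0, total_sparsity * share * len(layers))
        k = int(rate * w.size)
        if k > 0:
            # Threshold at the k-th smallest magnitude; zero out below it.
            thresh = np.sort(np.abs(w).ravel())[k - 1]
            w = w * (np.abs(w) > thresh)
        pruned.append(w)
    return pruned
```

Under this rule, a layer whose entropy drops toward zero receives an ever-larger share of the budget, so its weights are driven entirely to zero and the layer becomes removable, which is the depth-reduction effect the paper investigates.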

Related Material


[pdf]
[bibtex]
@InProceedings{Liao_2023_ICCV,
  author    = {Liao, Zhu and Qu\'etu, Victor and Nguyen, Van-Tam and Tartaglione, Enzo},
  title     = {Can Unstructured Pruning Reduce the Depth in Deep Neural Networks?},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
  month     = {October},
  year      = {2023},
  pages     = {1402-1406}
}