@InProceedings{Shang_2024_CVPR,
  author    = {Shang, Yuzhang and Xu, Dan and Liu, Gaowen and Kompella, Ramana Rao and Yan, Yan},
  title     = {Efficient Multitask Dense Predictor via Binarization},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2024},
  pages     = {15899-15908}
}
Efficient Multitask Dense Predictor via Binarization
Abstract
Multi-task learning for dense prediction has emerged as a pivotal area in computer vision, enabling simultaneous processing of diverse yet interrelated pixel-wise prediction tasks. However, the substantial computational demands of state-of-the-art (SoTA) models often limit their widespread deployment. This paper addresses this challenge by introducing network binarization to compress resource-intensive multi-task dense predictors. Specifically, our goal is to significantly accelerate multi-task dense prediction models via Binary Neural Networks (BNNs) while maintaining, or even improving, model performance. To reach this goal, we propose a Binary Multi-task Dense Predictor, Bi-MTDP, and several variants of Bi-MTDP, in which a multi-task dense predictor is constructed via specified binarized modules. Our systematic analysis of this predictor reveals that the performance drop from binarization is primarily caused by severe information degradation. To address this issue, we introduce a deep information bottleneck layer that enforces the representations for downstream tasks to satisfy a Gaussian distribution during forward propagation. Moreover, we introduce a knowledge distillation mechanism to correct the direction of information flow in backward propagation. Intriguingly, one variant of Bi-MTDP outperforms the full-precision (FP) multi-task dense prediction SoTAs ARTC (CNN-based) and InvPT (ViT-based). This result indicates that Bi-MTDP is not merely a naive trade-off between performance and efficiency, but rather benefits from the redundant information flow afforded by the multi-task architecture.
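For context, the BNNs referenced in the abstract typically replace full-precision weights with their sign scaled by a per-tensor factor. The following is a minimal NumPy sketch of this common XNOR-Net-style binarization (the `binarize` helper and the choice of scaling factor are illustrative assumptions; the paper's exact binarized modules are not specified in the abstract):

```python
import numpy as np

def binarize(w):
    """Binarize a real-valued weight tensor to {-1, +1}, scaled by
    alpha = mean(|w|) — a common choice in BNNs; the paper's exact
    scheme may differ."""
    alpha = np.abs(w).mean()      # per-tensor scaling factor
    return alpha * np.sign(w), alpha

# Toy example: a 2x2 full-precision weight matrix.
w = np.array([[0.3, -1.2],
              [0.7, -0.1]])
w_bin, alpha = binarize(w)
# Each entry of w_bin is +alpha or -alpha, so the matrix multiply
# can be implemented with bitwise XNOR/popcount at inference time.
```

The speedup claimed for BNNs comes from this structure: once weights (and activations) are constrained to two values, dense multiply-accumulates reduce to bit operations plus a single floating-point rescale.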