Multi-Level Dispersion Residual Network for Efficient Image Super-Resolution

Yanyu Mao, Nihao Zhang, Qian Wang, Bendu Bai, Wanying Bai, Haonan Fang, Peng Liu, Mingyue Li, Shengbo Yan; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2023, pp. 1660-1669


Recently, single image super-resolution (SISR) has made great progress, especially through combinations of convolutional neural networks (CNNs) and Transformers, but the resulting model complexity is undesirable for efficient image super-resolution (EISR) and unaffordable for edge devices. As a result, many lightweight methods, such as distillation and pruning, have been investigated for EISR. However, designing more powerful attention mechanisms is another promising way to improve network efficiency. In this paper, we propose a multi-level dispersion residual network (MDRN) for EISR. Its basic block, the enhanced attention distillation block (EADB), contains two proposed attention modules: multi-level dispersion spatial attention (MDSA) and enhanced contrast-aware channel attention (ECCA). MDSA introduces multi-scale and variance information to obtain a more accurate spatial attention distribution. ECCA combines lightweight convolution layers and residual connections to improve the efficiency of channel attention. Experimental results show that the proposed methods are effective and that MDRN achieves a better balance between performance and complexity than state-of-the-art (SOTA) models. In addition, we won first place in the model complexity track of the NTIRE 2023 Efficient SR Challenge.
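The abstract only names the two attention ideas; the exact MDSA and ECCA architectures appear in the paper body, not here. As a rough, hypothetical illustration of the underlying cues the abstract mentions — per-pixel variance across channels as a spatial attention signal, and mean-plus-standard-deviation ("contrast") pooling for channel attention — a minimal NumPy sketch might look like the following. All function names, weight shapes, and activation choices are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def variance_spatial_attention(feat):
    """Scale a (C, H, W) feature map by a variance-based spatial map.

    Per-pixel variance across channels serves as the dispersion cue;
    a centred sigmoid turns it into an attention map in (0, 1).
    """
    var = feat.var(axis=0)                             # (H, W) dispersion cue
    attn = 1.0 / (1.0 + np.exp(-(var - var.mean())))   # centred sigmoid
    return feat * attn[None, :, :]

def contrast_channel_attention(feat, w1, b1, w2, b2):
    """Contrast-aware channel attention on a (C, H, W) feature map.

    Pools each channel by mean + standard deviation ("contrast"),
    squeezes/excites through two small linear layers, and rescales
    the input channels by the resulting sigmoid weights.
    """
    c = feat.shape[0]
    flat = feat.reshape(c, -1)
    contrast = flat.mean(axis=1) + flat.std(axis=1)     # (C,) contrast pooling
    hidden = np.maximum(0.0, w1 @ contrast + b1)        # ReLU squeeze
    scale = 1.0 / (1.0 + np.exp(-(w2 @ hidden + b2)))   # sigmoid excite
    return feat * scale[:, None, None]
```

Both functions preserve the input shape, so they drop into a residual block as elementwise rescalings of the feature map; the real modules additionally use convolutions and multi-scale branches as described in the paper.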

Related Material

@InProceedings{Mao_2023_CVPR,
    author    = {Mao, Yanyu and Zhang, Nihao and Wang, Qian and Bai, Bendu and Bai, Wanying and Fang, Haonan and Liu, Peng and Li, Mingyue and Yan, Shengbo},
    title     = {Multi-Level Dispersion Residual Network for Efficient Image Super-Resolution},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2023},
    pages     = {1660-1669}
}