Reciprocal Attention Mixing Transformer for Lightweight Image Restoration

Haram Choi, Cheolwoong Na, Jihyeon Oh, Seungjae Lee, Jinseop Kim, Subeen Choe, Jeongmin Lee, Taehoon Kim, Jihoon Yang; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2024, pp. 5992-6002

Abstract


Although many recent works have made advancements in the image restoration (IR) field, they often suffer from an excessive number of parameters. Another issue is that most Transformer-based IR methods focus only on either local or global features, leading to limited receptive fields or parameter-deficiency issues. To address these problems, we propose a lightweight network, Reciprocal Attention Mixing Transformer (RAMiT). It employs our proposed dimensional reciprocal attention mixing Transformer (D-RAMiT) blocks, which compute bi-dimensional self-attentions in parallel with different numbers of multi-heads. The bi-dimensional attentions help each other to complement their counterparts' drawbacks and are then mixed. Additionally, we introduce a hierarchical reciprocal attention mixing (H-RAMi) layer that compensates for pixel-level information losses and utilizes semantic information while maintaining an efficient hierarchical structure. Furthermore, we revisit and modify MobileNetV2 to attach efficient convolutions to our proposed components. The experimental results demonstrate that RAMiT achieves state-of-the-art performance on multiple lightweight IR tasks, including super-resolution, low-light enhancement, deraining, color denoising, and grayscale denoising.
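
To make the "bi-dimensional attentions computed in parallel and then mixed" idea concrete, below is a minimal PyTorch sketch, assuming the two dimensions are spatial and channel self-attention with separate head counts. This is an illustrative approximation, not the authors' implementation; the class name BiDimAttentionSketch, the linear mixing layer, and all hyperparameters are hypothetical.

import torch
import torch.nn as nn


class BiDimAttentionSketch(nn.Module):
    """Sketch: spatial and channel self-attention in parallel, then mixed."""

    def __init__(self, dim: int, spatial_heads: int = 2, channel_heads: int = 2):
        super().__init__()
        # Spatial branch: standard multi-head attention over token positions.
        self.spatial_attn = nn.MultiheadAttention(dim, spatial_heads, batch_first=True)
        # Channel branch: attention computed across channels (global receptive field).
        self.channel_heads = channel_heads
        self.qkv = nn.Linear(dim, dim * 3)
        # Hypothetical mixing layer fusing the two attention outputs.
        self.mix = nn.Linear(dim * 2, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, N, C) tokens, where N = H * W spatial positions.
        s_out, _ = self.spatial_attn(x, x, x)

        B, N, C = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        h = self.channel_heads
        # Reshape to (B, h, C//h, N): channels become the attended axis,
        # so every output channel aggregates over all N positions.
        q = q.view(B, N, h, C // h).permute(0, 2, 3, 1)
        k = k.view(B, N, h, C // h).permute(0, 2, 3, 1)
        v = v.view(B, N, h, C // h).permute(0, 2, 3, 1)
        attn = (q @ k.transpose(-2, -1)) * (N ** -0.5)
        c_out = attn.softmax(dim=-1) @ v              # (B, h, C//h, N)
        c_out = c_out.permute(0, 3, 1, 2).reshape(B, N, C)

        # Mix the two dimensional views so each complements the other.
        return self.mix(torch.cat([s_out, c_out], dim=-1))


if __name__ == "__main__":
    x = torch.randn(1, 64, 32)                        # 64 tokens, 32 channels
    print(BiDimAttentionSketch(32)(x).shape)          # torch.Size([1, 64, 32])

The split into a spatial branch (local detail) and a channel branch (global statistics) reflects the complementary drawbacks mentioned in the abstract; how the paper actually allocates heads and fuses the branches is specified in the full text.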

Related Material


@InProceedings{Choi_2024_CVPR,
  author    = {Choi, Haram and Na, Cheolwoong and Oh, Jihyeon and Lee, Seungjae and Kim, Jinseop and Choe, Subeen and Lee, Jeongmin and Kim, Taehoon and Yang, Jihoon},
  title     = {Reciprocal Attention Mixing Transformer for Lightweight Image Restoration},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2024},
  pages     = {5992-6002}
}