[pdf]
[bibtex]
@InProceedings{Gu_2025_CVPR,
    author    = {Gu, Yubin and Meng, Yuan and Ji, Jiayi and Sun, Xiaoshuai},
    title     = {ACL: Activating Capability of Linear Attention for Image Restoration},
    booktitle = {Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR)},
    month     = {June},
    year      = {2025},
    pages     = {17913-17923}
}
ACL: Activating Capability of Linear Attention for Image Restoration
Abstract
Image restoration (IR), a key area in computer vision, has entered a new era with deep learning. Recent research has shifted toward Selective State Space Models (Mamba) to overcome CNNs' limited receptive fields and Transformers' computational inefficiency. However, due to Mamba's inherent one-dimensional scanning limitation, recent approaches have introduced multi-directional scanning to bolster inter-sequence correlations. Despite these enhancements, such methods still struggle to manage local pixel correlations across the various scan directions. Moreover, the recursive computation in Mamba's SSM reduces efficiency. To resolve these issues, we exploit the mathematical congruence between linear attention and the SSM within Mamba to propose a novel model, ACL, which leverages new designs to Activate the Capability of Linear attention for IR. ACL integrates linear attention blocks in place of the SSM within Mamba, serving as the core component of its encoders/decoders, and aims to preserve a global perspective while boosting computational efficiency. Furthermore, we have designed a simple yet robust local enhancement module with multi-scale dilated convolutions that extracts both coarse and fine features to improve local detail recovery. Experimental results confirm that our ACL model excels in classical IR tasks such as de-raining and de-blurring, while maintaining relatively low parameter counts and FLOPs.
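The efficiency argument in the abstract rests on a standard property of linear attention: by replacing the softmax with a kernel feature map, the attention can be computed as Q(KᵀV) in O(N·d²) instead of (QKᵀ)V in O(N²·d), avoiding the N×N attention matrix entirely. The sketch below is a minimal NumPy illustration of that reordering, using the common elu(x)+1 feature map; it is a generic linear-attention example under these assumptions, not the authors' ACL block, and the function names are hypothetical.

```python
import numpy as np

def elu1(x):
    # Feature map phi(x) = elu(x) + 1, a common choice that keeps features positive
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(Q, K, V):
    """O(N d^2) attention: associate as Q (K^T V) so the N x N matrix never appears."""
    Qf, Kf = elu1(Q), elu1(K)
    KV = Kf.T @ V                 # (d, d_v): summarize keys/values once
    Z = Qf @ Kf.sum(axis=0)       # (N,): per-query normalizer
    return (Qf @ KV) / Z[:, None]

# Tiny usage example: each output row is a convex combination of the rows of V,
# so with V = all-ones the output is all-ones regardless of Q and K.
rng = np.random.default_rng(0)
N, d = 8, 4
Q, K = rng.standard_normal((N, d)), rng.standard_normal((N, d))
out = linear_attention(Q, K, np.ones((N, d)))
```

The key design point, which the paper exploits via its congruence with Mamba's SSM, is that `KV` and the normalizer are fixed-size summaries of the whole sequence, so cost grows linearly in N.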