E2CANet: An Efficient and Effective Convolutional Attention Network for Semantic Segmentation

Yuerong Mu, Qiang Guo; Proceedings of the Asian Conference on Computer Vision (ACCV) Workshops, 2024, pp. 466-481

Abstract


Many semantic segmentation methods employ various attention mechanisms to improve segmentation accuracy. However, gains in accuracy typically come at a high computational cost, which is unfavorable for many practical applications. To solve this problem, this paper presents an efficient and effective convolutional attention network (E2CANet), which is designed to achieve a good trade-off between segmentation accuracy and computational efficiency. E2CANet adopts an encoder-decoder architecture with skip connections to preserve both details and semantic information. For the encoder, we use cheap convolutional operations to introduce two different attentions, i.e., global attention and multi-scale attention, which significantly reduce the computational cost while highlighting important features and suppressing unnecessary ones. A lightweight All-MLP decoder, which consists of only six linear layers, is used to aggregate features from the encoder. The simple design of this decoder is also key to reducing computational complexity. Extensive experiments are performed on the ADE20K, Cityscapes, and COCO-Stuff datasets. The proposed E2CANet delivers very competitive results on all datasets. In particular, E2CANet-Tiny (a lightweight version of E2CANet) achieves 41.92% mIoU on the ADE20K dataset with fewer than 4.4M parameters, which demonstrates the efficiency and effectiveness of our method. Code is available at https://github.com/muyuerong/E2CANet.
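To illustrate the kind of decoder the abstract describes, the following is a minimal NumPy sketch of an All-MLP decoder built from six linear layers: one per-stage channel projection for each of four encoder stages, one fusion layer, and one classifier. The stage resolutions, channel widths, ReLU activation, and nearest-neighbor upsampling are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(x, w, b):
    # Linear layer applied along the channel (last) axis.
    return x @ w + b

def upsample(x, size):
    # Nearest-neighbor upsampling of an (h, w, c) map to (H, W, c).
    h, w = x.shape[:2]
    H, W = size
    rows = np.arange(H) * h // H
    cols = np.arange(W) * w // W
    return x[rows][:, cols]

def all_mlp_decoder(feats, embed_dim=64, num_classes=19):
    """Aggregate multi-scale encoder features with six linear layers:
    four per-stage channel projections, one fusion layer, one classifier.
    (Hypothetical dimensions; weights are random for the sketch.)"""
    H, W = feats[0].shape[:2]  # highest-resolution stage sets output size
    projected = []
    for f in feats:
        c = f.shape[-1]
        w = rng.standard_normal((c, embed_dim)) * 0.02
        p = linear(f, w, np.zeros(embed_dim))   # unify channel dimension
        projected.append(upsample(p, (H, W)))   # unify spatial resolution
    fused = np.concatenate(projected, axis=-1)  # (H, W, 4 * embed_dim)
    w_fuse = rng.standard_normal((4 * embed_dim, embed_dim)) * 0.02
    fused = np.maximum(linear(fused, w_fuse, np.zeros(embed_dim)), 0)  # ReLU
    w_cls = rng.standard_normal((embed_dim, num_classes)) * 0.02
    return linear(fused, w_cls, np.zeros(num_classes))  # per-pixel logits

# Four encoder stages at decreasing resolution with growing channel counts
# (hypothetical shapes, mimicking strides 4/8/16/32 on a larger input).
feats = [rng.standard_normal((64 // s, 64 // s, c))
         for s, c in [(1, 32), (2, 64), (4, 160), (8, 256)]]
out = all_mlp_decoder(feats)
print(out.shape)  # (64, 64, 19)
```

Because every layer is a plain per-pixel linear map, the decoder's cost grows only linearly with image area, which is consistent with the abstract's emphasis on the decoder's simplicity as a source of efficiency.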

Related Material


[pdf]
[bibtex]
@InProceedings{Mu_2024_ACCV,
  author    = {Mu, Yuerong and Guo, Qiang},
  title     = {E2CANet: An Efficient and Effective Convolutional Attention Network for Semantic Segmentation},
  booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV) Workshops},
  month     = {December},
  year      = {2024},
  pages     = {466-481}
}