Lightweight Image Matting via Efficient Non-Local Guidance
Zhaoxiang Kang, Zonglin Li, Qinglin Liu, Yuhe Zhu, Hongfei Zhou, Shengping Zhang; Proceedings of the Asian Conference on Computer Vision (ACCV), 2022, pp. 2884-2900
Abstract
Natural image matting aims to estimate the opacity of foreground
objects. Most existing approaches involve prohibitively many parameters,
daunting computational complexity, and redundant dependencies. In this
paper, we propose a lightweight matting method, termed LiteMatting,
which learns the local smoothness of the color space and the affinities between
neighboring pixels to estimate the alpha mattes. Specifically, a modified
mobile block is adopted to construct an encoder-decoder framework,
which reduces parameters while retaining sufficient spatial and channel
information. In addition, a Long-Short Range Pyramid Pooling Module
(LSRPPM) is introduced to enlarge the receptive field by capturing
long-range dependencies between discretely distributed regions. Finally,
an Efficient Non-Local Block (ENB) is presented, which guides the
propagation of high-level semantics with low-level detail features to refine the alpha
mattes. Extensive experiments demonstrate that our method achieves a
favorable trade-off between accuracy and efficiency. Compared with most
state-of-the-art approaches, our method cuts parameters and FLOPs to
30% and 13% of theirs, respectively, while improving the SAD metric
by over 15%. Code and models are
available at https://github.com/kzx2018/LiteMatting.
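The abstract does not detail the "modified mobile block". As a point of reference, below is a minimal PyTorch sketch of the standard inverted-residual mobile block (MobileNetV2 style) that such an encoder-decoder is typically built from; the expansion ratio, kernel size, and layer layout here are assumptions, not the paper's actual modification.

```python
import torch
import torch.nn as nn

class MobileBlock(nn.Module):
    """Inverted-residual mobile block sketch (MobileNetV2 style)."""
    def __init__(self, in_ch, out_ch, stride=1, expand=4):
        super().__init__()
        mid = in_ch * expand
        self.use_skip = stride == 1 and in_ch == out_ch
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, mid, 1, bias=False),   # pointwise expansion
            nn.BatchNorm2d(mid), nn.ReLU6(inplace=True),
            # depthwise conv: spatial mixing at a fraction of the parameters
            nn.Conv2d(mid, mid, 3, stride, 1, groups=mid, bias=False),
            nn.BatchNorm2d(mid), nn.ReLU6(inplace=True),
            nn.Conv2d(mid, out_ch, 1, bias=False),   # pointwise projection
            nn.BatchNorm2d(out_ch),
        )

    def forward(self, x):
        y = self.body(x)
        return x + y if self.use_skip else y  # residual only when shapes match

x = torch.randn(1, 32, 64, 64)
print(MobileBlock(32, 32)(x).shape)  # torch.Size([1, 32, 64, 64])
```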
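Likewise, the LSRPPM is described only as capturing long-range dependencies to enlarge the receptive field. A PSPNet-style pyramid pooling layer is the closest standard building block, so the sketch below uses that as a stand-in; the bin sizes and channel counts are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PyramidPooling(nn.Module):
    """Pyramid pooling sketch: pooled global contexts at several scales."""
    def __init__(self, ch, bins=(1, 2, 3, 6)):
        super().__init__()
        self.stages = nn.ModuleList(
            nn.Sequential(nn.AdaptiveAvgPool2d(b),                # b x b context grid
                          nn.Conv2d(ch, ch // len(bins), 1, bias=False))
            for b in bins)
        self.fuse = nn.Conv2d(ch + ch, ch, 3, padding=1, bias=False)

    def forward(self, x):
        h, w = x.shape[2:]
        # upsample each pooled context back to the input resolution
        ctx = [F.interpolate(s(x), (h, w), mode="bilinear", align_corners=False)
               for s in self.stages]
        return self.fuse(torch.cat([x, *ctx], dim=1))

x = torch.randn(1, 64, 32, 32)
print(PyramidPooling(64)(x).shape)  # torch.Size([1, 64, 32, 32])
```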
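Finally, the internals of the ENB are not given in the abstract. One established way to make cross-feature non-local attention cheap is the linear-complexity "efficient attention" factorization of Shen et al.; the sketch below applies it with queries drawn from low-level detail features and keys/values from high-level semantics. This pairing and the channel sizes are assumptions, not the paper's confirmed design.

```python
import torch
import torch.nn as nn

class EfficientNonLocal(nn.Module):
    """Efficient-attention sketch: semantics propagated into detail features."""
    def __init__(self, low_ch, high_ch, key_ch=64):
        super().__init__()
        self.q = nn.Conv2d(low_ch, key_ch, 1)   # queries from low-level detail
        self.k = nn.Conv2d(high_ch, key_ch, 1)  # keys from high-level semantics
        self.v = nn.Conv2d(high_ch, low_ch, 1)  # values carry the semantics

    def forward(self, low, high):
        b, c, h, w = low.shape
        # separate softmaxes over queries and keys make the cost linear
        # in the number of pixels instead of quadratic
        q = self.q(low).flatten(2).softmax(dim=1)    # (B, Ck, HWq)
        k = self.k(high).flatten(2).softmax(dim=2)   # (B, Ck, HWk)
        v = self.v(high).flatten(2)                  # (B, Cl, HWk)
        ctx = v @ k.transpose(1, 2)                  # (B, Cl, Ck) global context
        out = (ctx @ q).view(b, c, h, w)             # redistribute per query
        return low + out                             # residual refinement

low = torch.randn(1, 32, 128, 128)   # detail features
high = torch.randn(1, 128, 32, 32)   # semantic features
print(EfficientNonLocal(32, 128)(low, high).shape)  # torch.Size([1, 32, 128, 128])
```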
Related Material
@InProceedings{Kang_2022_ACCV,
    author    = {Kang, Zhaoxiang and Li, Zonglin and Liu, Qinglin and Zhu, Yuhe and Zhou, Hongfei and Zhang, Shengping},
    title     = {Lightweight Image Matting via Efficient Non-Local Guidance},
    booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
    month     = {December},
    year      = {2022},
    pages     = {2884-2900}
}