GM-MoE: Low-Light Enhancement with Gated-Mechanism Mixture-of-Experts

Minwen Liao, Haobo Dong, Xinyi Wang, Kurban Ubul, Yihua Shao, Ziyang Yan; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2025, pp. 8766-8776

Abstract


Low-light enhancement has wide applications in autonomous driving, 3D reconstruction, remote sensing, and surveillance, where it can significantly improve information utilization. However, most existing methods lack generalization and are limited to specific tasks such as image recovery. To address these issues, we propose Gated-Mechanism Mixture-of-Experts (GM-MoE), the first framework to introduce a mixture-of-experts network for low-light image enhancement. GM-MoE comprises a dynamic gated weight-conditioning network and three sub-expert networks, each specializing in a distinct enhancement task, combined with a self-designed gating mechanism that dynamically adjusts the weights of the sub-expert networks for different data domains. Additionally, we integrate local and global feature fusion within the sub-expert networks to capture multi-scale features and further enhance image quality. Experimental results demonstrate that GM-MoE achieves superior generalization compared to over 20 existing approaches, reaching state-of-the-art PSNR on 5 benchmarks and state-of-the-art SSIM on 4 benchmarks. Code is available at: https://github.com/Sameenok/gm-moe-lowlight-enhancement.git
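The gating idea described above — a conditioning network that produces per-input weights over a fixed set of expert sub-networks, whose outputs are then combined — can be sketched in a few lines. This is a minimal, generic mixture-of-experts sketch assuming toy MLP experts and softmax gating; the dimensions (`D`, `H`), the expert architectures, and all function names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical sizes; the paper's real layer dimensions are not stated here.
D, H, NUM_EXPERTS = 16, 32, 3

# Gating network: maps input features to one weight per expert.
W_gate = rng.normal(scale=0.1, size=(D, NUM_EXPERTS))

# Each "expert" is a toy two-layer ReLU MLP standing in for the paper's
# specialized enhancement sub-networks.
experts = [
    (rng.normal(scale=0.1, size=(D, H)), rng.normal(scale=0.1, size=(H, D)))
    for _ in range(NUM_EXPERTS)
]

def expert_forward(x, params):
    w1, w2 = params
    return np.maximum(x @ w1, 0.0) @ w2

def moe_forward(x):
    """Combine expert outputs with input-conditioned softmax gate weights."""
    weights = softmax(x @ W_gate)                 # (batch, NUM_EXPERTS)
    outs = np.stack([expert_forward(x, p) for p in experts], axis=1)
    return (weights[..., None] * outs).sum(axis=1)  # (batch, D)

x = rng.normal(size=(4, D))
y = moe_forward(x)
print(y.shape)
```

Because the gate weights are a function of the input, different inputs (e.g. different degradation types) can route to different blends of experts, which is the property the abstract attributes to the gated mechanism.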

Related Material


[pdf] [supp]
[bibtex]
@InProceedings{Liao_2025_ICCV,
    author    = {Liao, Minwen and Dong, Haobo and Wang, Xinyi and Ubul, Kurban and Shao, Yihua and Yan, Ziyang},
    title     = {GM-MoE: Low-Light Enhancement with Gated-Mechanism Mixture-of-Experts},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2025},
    pages     = {8766-8776}
}