Event-based Image Enhancement Under High Dynamic Range Scenarios
Abstract
Event cameras, as bio-inspired vision sensors with a high dynamic range, are capable of addressing the local overexposure and underexposure that conventional frame-based cameras suffer in scenarios with high dynamic range or fluctuating lighting conditions. Because of the modality gap between the two types of cameras, simple direct fusion is not feasible. In addition, ghosting artifacts caused by deviations in camera position and frame rate also degrade the quality of the final fused image. To address these problems, this paper proposes a joint framework that combines locally poorly exposed frames with event streams captured by the event camera to produce enhanced images with detailed textures in high dynamic range scenarios. Specifically, a lightweight multi-scale receptive field block is employed for rapid modality conversion from event streams to frames. In addition, a dual-branch fusion module is proposed to align features and remove ghosting artifacts. Experimental results demonstrate that the proposed method effectively mitigates information loss in both the brightest and darkest regions of images across a range of extreme lighting conditions, generating images that are both realistic and natural.
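Since only the abstract is reproduced on this page, the PyTorch sketch below is merely an illustration of the two components it names: a lightweight multi-scale receptive field block that converts an event representation into frame-like features, and a dual-branch fusion module that combines frame and event features while suppressing ghost-prone mismatches. The voxel-grid event encoding, module layouts, channel counts, and gating scheme are all assumptions made for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): event voxelization, a lightweight
# multi-scale receptive field block, and a dual-branch fusion module.
# All names, shapes, and hyperparameters are assumptions.
import torch
import torch.nn as nn


def events_to_voxel(xs, ys, ts, ps, num_bins, height, width):
    """Accumulate an event stream (x, y, t, polarity) into a voxel grid so it
    can be consumed by frame-style convolutions (a common event representation;
    the paper's exact encoding may differ)."""
    voxel = torch.zeros(num_bins, height, width)
    t_norm = (ts - ts.min()) / (ts.max() - ts.min() + 1e-9) * (num_bins - 1)
    bins = t_norm.long().clamp(0, num_bins - 1)
    voxel.index_put_((bins, ys.long(), xs.long()), ps.float(), accumulate=True)
    return voxel


class MultiScaleRFB(nn.Module):
    """Multi-scale receptive field block: parallel dilated 3x3 convolutions
    whose outputs are concatenated and fused with a 1x1 convolution."""
    def __init__(self, in_ch, out_ch, dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 3, padding=d, dilation=d),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        ])
        self.fuse = nn.Conv2d(out_ch * len(dilations), out_ch, 1)

    def forward(self, x):
        return self.fuse(torch.cat([b(x) for b in self.branches], dim=1))


class DualBranchFusion(nn.Module):
    """Two-branch fusion: frame and event features are processed separately,
    then a learned gate decides, per pixel, how much of the event branch to
    use, so poorly exposed regions are filled in while mismatched (ghost-prone)
    features are down-weighted."""
    def __init__(self, ch):
        super().__init__()
        self.frame_branch = nn.Conv2d(ch, ch, 3, padding=1)
        self.event_branch = nn.Conv2d(ch, ch, 3, padding=1)
        self.gate = nn.Sequential(nn.Conv2d(2 * ch, ch, 1), nn.Sigmoid())
        self.out = nn.Conv2d(ch, 3, 3, padding=1)

    def forward(self, frame_feat, event_feat):
        f = self.frame_branch(frame_feat)
        e = self.event_branch(event_feat)
        g = self.gate(torch.cat([f, e], dim=1))
        return self.out(g * e + (1 - g) * f)


if __name__ == "__main__":
    # Toy usage with random data, only to show the tensor shapes involved.
    n = 1000
    xs, ys = torch.randint(0, 64, (n,)), torch.randint(0, 64, (n,))
    ts, ps = torch.rand(n), torch.randint(0, 2, (n,)) * 2 - 1
    voxel = events_to_voxel(xs, ys, ts, ps, num_bins=5, height=64, width=64)

    event_enc = MultiScaleRFB(5, 16)
    frame_enc = MultiScaleRFB(3, 16)
    fusion = DualBranchFusion(16)

    frame = torch.rand(1, 3, 64, 64)          # locally poorly exposed frame
    e_feat = event_enc(voxel.unsqueeze(0))    # event-branch features
    f_feat = frame_enc(frame)                 # frame-branch features
    out = fusion(f_feat, e_feat)              # fused, enhanced image
    print(out.shape)                          # torch.Size([1, 3, 64, 64])
```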
Related Material
[pdf] [bibtex]
@InProceedings{Weng_2024_ACCV,
    author    = {Weng, Jingchong and Li, Boyang and Huang, Kai},
    title     = {Event-based Image Enhancement Under High Dynamic Range Scenarios},
    booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
    month     = {December},
    year      = {2024},
    pages     = {2456-2470}
}