Toward Real-World Light Field Super-Resolution
Abstract
Deep learning has opened up new possibilities for light field super-resolution (SR), but existing methods trained on synthetic datasets with simple degradations (e.g., bicubic downsampling) suffer from poor performance when applied to complex real-world scenarios. To address this problem, we introduce LytroZoom, the first real-world light field SR dataset capturing paired low- and high-resolution light fields of diverse indoor and outdoor scenes using a Lytro ILLUM camera. Additionally, we propose the Omni-Frequency Projection Network (OFPNet), which decomposes the omni-frequency components and iteratively enhances them through frequency projection operations to address spatially variant degradation processes present in all frequency components. Experiments demonstrate that models trained on LytroZoom outperform those trained on synthetic datasets and are generalizable to diverse content and devices. Quantitative and qualitative evaluations verify the superiority of OFPNet. We believe this work will inspire future research in real-world light field SR. Code and dataset are available at https://github.com/zeyuxiao1997/RealLFSR.
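The abstract only names the two core ideas behind OFPNet, omni-frequency decomposition and iterative frequency projection; the released code at the repository linked above is the authoritative reference. Purely as a hypothetical illustration of those two ideas, the PyTorch-style sketch below splits an upsampled view into low-, mid-, and high-frequency components with simple average-pool filters and refines each component with residual "projection" blocks. The module names, the filtering choice, and the per-view (rather than full 4D light field) processing are assumptions for illustration, not the paper's architecture.

# A minimal, hypothetical sketch (not the authors' OFPNet implementation) of
# omni-frequency decomposition plus iterative projection-style refinement.
# Shapes, module names, and the decomposition strategy are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F


def omni_frequency_split(x, small_kernel=3, large_kernel=7):
    """Split images into low/mid/high-frequency parts using average pooling
    as a crude low-pass filter (an assumption, not the paper's filter)."""
    low = F.avg_pool2d(x, large_kernel, stride=1, padding=large_kernel // 2)
    mid = F.avg_pool2d(x, small_kernel, stride=1, padding=small_kernel // 2) - low
    high = x - low - mid
    return low, mid, high


class ProjectionBlock(nn.Module):
    """One projection-style refinement step: predict a residual for a frequency
    component conditioned on the current reconstruction, then update it."""

    def __init__(self, channels):
        super().__init__()
        self.refine = nn.Sequential(
            nn.Conv2d(2 * channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, component, reconstruction):
        residual = self.refine(torch.cat([component, reconstruction], dim=1))
        return component + residual


class FrequencyProjectionSR(nn.Module):
    """Toy SR network: upsample, split into frequency components, iteratively
    refine each component, and sum them back together."""

    def __init__(self, channels=3, num_iters=3, scale=2):
        super().__init__()
        self.scale = scale
        self.blocks = nn.ModuleList(
            [nn.ModuleList([ProjectionBlock(channels) for _ in range(3)])
             for _ in range(num_iters)]
        )

    def forward(self, lr):
        sr = F.interpolate(lr, scale_factor=self.scale, mode="bicubic",
                           align_corners=False)
        low, mid, high = omni_frequency_split(sr)
        for low_block, mid_block, high_block in self.blocks:
            recon = low + mid + high
            low = low_block(low, recon)
            mid = mid_block(mid, recon)
            high = high_block(high, recon)
        return low + mid + high


if __name__ == "__main__":
    model = FrequencyProjectionSR()
    lr_view = torch.randn(1, 3, 32, 32)  # one low-resolution sub-aperture view
    print(model(lr_view).shape)          # torch.Size([1, 3, 64, 64])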
Related Material

[pdf] [supp] [arXiv] [bibtex]

@InProceedings{Xiao_2023_CVPR,
    author    = {Xiao, Zeyu and Gao, Ruisheng and Liu, Yutong and Zhang, Yueyi and Xiong, Zhiwei},
    title     = {Toward Real-World Light Field Super-Resolution},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2023},
    pages     = {3408-3418}
}