Auto-Encoded Supervision for Perceptual Image Super-Resolution

MinKyu Lee, Sangeek Hyun, Woojin Jun, Jae-Pil Heo; Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR), 2025, pp. 17958-17968

Abstract

This work tackles the fidelity objective in the perceptual super-resolution (SR) task. Specifically, we address the shortcomings of the pixel-level $\ell_p$ loss ($\mathcal{L}_{\mathrm{pix}}$) in the GAN-based SR framework. Since $\mathcal{L}_{\mathrm{pix}}$ is known to trade off against perceptual quality, prior methods often multiply it by a small scale factor or apply low-pass filters. However, this work shows that these circumventions fail to address the fundamental factor that induces blurring. Accordingly, we focus on two points: 1) precisely discriminating the subcomponent of $\mathcal{L}_{\mathrm{pix}}$ that contributes to blurring, and 2) guiding reconstruction based only on the factor that is free from this trade-off. We show that this can be achieved in a surprisingly simple manner, with an Auto-Encoder (AE) pretrained using $\mathcal{L}_{\mathrm{pix}}$. Based on this insight, we propose the Auto-Encoded Supervision for Optimal Penalization loss ($\mathcal{L}_{\mathrm{AESOP}}$), a novel loss function that measures distance in the AE space (the space after the decoder, not the bottleneck), rather than in the raw pixel space. By simply substituting $\mathcal{L}_{\mathrm{pix}}$ with $\mathcal{L}_{\mathrm{AESOP}}$, we can provide effective reconstruction guidance without compromising perceptual quality. Designed for simplicity, our method enables easy integration into existing SR frameworks. Extensive experiments demonstrate the effectiveness of AESOP.
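The core idea of measuring reconstruction error in AE space rather than pixel space can be illustrated with a minimal numpy sketch. This is not the paper's implementation: the rank-limited linear projection `ae` below is a hypothetical stand-in for the pretrained neural Auto-Encoder, chosen only because a projection cleanly separates a preserved "fidelity" subspace from a discarded "detail" component.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the pretrained Auto-Encoder (hypothetical; the paper's AE
# is a neural network pretrained with the pixel loss L_pix). A rank-K linear
# projection plays the role of encode -> decode: it keeps a low-dimensional
# subspace of the signal and discards the orthogonal remainder.
D, K = 16, 4                                   # signal dim, bottleneck dim
W = rng.standard_normal((K, D)) / np.sqrt(D)   # encoder weights
P = np.linalg.pinv(W) @ W                      # decode(encode(x)): projection onto row space of W

def ae(x):
    """Auto-encode x: map it into the AE's decoder-output space."""
    return x @ P.T

def l_pix(sr, hr):
    """Plain pixel-space L2 loss."""
    return np.mean((sr - hr) ** 2)

def l_aesop(sr, hr):
    """AESOP-style loss: compare images AFTER auto-encoding, not in raw pixel space."""
    return np.mean((ae(sr) - ae(hr)) ** 2)

hr = rng.standard_normal(D)
detail = hr - ae(hr)           # component the AE discards (high-frequency-like)
sr = ae(hr) + 0.5 * detail     # SR output: faithful AE component, altered detail

# The pixel loss penalizes the detail mismatch, while the AE-space loss does
# not, because both images map to the same point after auto-encoding.
pixel_gap = l_pix(sr, hr)      # strictly positive
ae_gap = l_aesop(sr, hr)       # numerically zero
```

In this toy setup the GAN branch would be left free to shape the discarded detail component, while the AE-space term supervises only the preserved subspace.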

Related Material


@InProceedings{Lee_2025_CVPR,
    author    = {Lee, MinKyu and Hyun, Sangeek and Jun, Woojin and Heo, Jae-Pil},
    title     = {Auto-Encoded Supervision for Perceptual Image Super-Resolution},
    booktitle = {Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR)},
    month     = {June},
    year      = {2025},
    pages     = {17958-17968}
}