G-Buffer Supported Neural Screen-space Refraction Baking for Real-Time Global Illumination
Ziyang Zhang, Edgar Simo-Serra
Abstract
We propose a neural screen-space refraction baking method for global illumination rendering that is applicable to real-time 3D games. Existing neural global illumination methods often struggle with refractive objects because G-buffers lack texture information for surfaces seen through refraction. Some approaches extend neural global illumination to refractive objects by predicting texture (UV) maps, but they are limited to objects with simple geometry and simple UV parameterizations. In contrast, our method bakes refracted textures without these assumptions by directly encoding the world coordinates of the refracted surfaces into the neural network instead of their UV coordinates. Our experiments demonstrate that our approach outperforms previous methods on refraction rendering. We also investigate how the choice of baked coordinate space (world space, screen space, or UV space) affects network performance, finding that world-space coordinates yield the best results.
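The core idea, baking radiance against world-space coordinates rather than UV coordinates, can be illustrated with a minimal sketch. The snippet below is an illustrative reconstruction, not the authors' implementation: it assumes a small PyTorch MLP with a frequency (positional) encoding, trained to map the world-space positions hit by refracted rays to the RGB radiance observed there. All names (RefractionBakingMLP, positional_encoding) and the random placeholder data are hypothetical.

import torch
import torch.nn as nn

def positional_encoding(x, num_freqs=6):
    # Map 3-D world coordinates to sin/cos features at increasing
    # frequencies, a common trick that lets small MLPs fit
    # high-frequency signals.
    feats = [x]
    for i in range(num_freqs):
        feats.append(torch.sin((2.0 ** i) * x))
        feats.append(torch.cos((2.0 ** i) * x))
    return torch.cat(feats, dim=-1)

class RefractionBakingMLP(nn.Module):
    # Hypothetical baking network: world-space position -> refracted RGB.
    def __init__(self, num_freqs=6, hidden=64):
        super().__init__()
        in_dim = 3 * (1 + 2 * num_freqs)
        self.num_freqs = num_freqs
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),  # RGB radiance seen through the refractive surface
        )

    def forward(self, world_pos):
        return self.net(positional_encoding(world_pos, self.num_freqs))

# Training-loop sketch: supervise with offline ground-truth radiance at
# the world positions hit by refracted rays (both assumed precomputed;
# random tensors stand in for real data here).
model = RefractionBakingMLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
world_pos = torch.rand(4096, 3)   # placeholder refracted hit points
target_rgb = torch.rand(4096, 3)  # placeholder ground-truth radiance
for _ in range(100):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(world_pos), target_rgb)
    loss.backward()
    opt.step()

Because such a network is conditioned on world positions rather than a UV parameterization, it places no requirement on the refracted object's geometry or unwrapping, which is the property the abstract highlights.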
Related Material
[pdf]
[bibtex]
@InProceedings{Zhang_2025_CVPR,
    author    = {Zhang, Ziyang and Simo-Serra, Edgar},
    title     = {G-Buffer Supported Neural Screen-space Refraction Baking for Real-Time Global Illumination},
    booktitle = {Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR) Workshops},
    month     = {June},
    year      = {2025},
    pages     = {604-611}
}