Skyeyes: Ground Roaming using Aerial View Images
Abstract
Integrating aerial imagery-based scene generation into applications like autonomous driving and gaming enhances realism in 3D environments, but challenges remain in creating detailed content for occluded areas and ensuring real-time, consistent rendering. In this paper, we introduce Skyeyes, a novel framework that can generate photorealistic sequences of ground view images using only aerial view inputs, thereby creating a ground roaming experience. More specifically, we combine a 3D representation with a view-consistent generation model, which ensures coherence between generated images. This method allows for the creation of geometrically consistent ground view images even across large view gaps. The images maintain improved spatial-temporal coherence and realism, enhancing scene comprehension and visualization from aerial perspectives. To the best of our knowledge, there are no publicly available datasets that contain pairwise geo-aligned aerial and ground view imagery. Therefore, we build a large synthetic and geo-aligned dataset using Unreal Engine. Both qualitative and quantitative analyses on this synthetic dataset show superior results compared to other leading synthesis approaches. See the project page for more results: https://chaoren2357.github.io/website-skyeyes/
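The abstract describes a two-stage pipeline: build a 3D representation from aerial views, render a coarse ground-level view along a camera path, then refine each frame with a view-consistent generative model conditioned on previous frames. The sketch below is only an illustration of that flow under stated assumptions; every function name is a hypothetical stub, not the authors' implementation or API.

    # Hypothetical sketch of the pipeline the abstract describes.
    # All functions are illustrative placeholders, not the paper's code.
    import numpy as np

    def build_3d_representation(aerial_images: list) -> np.ndarray:
        """Stub: fuse aerial views into a 3D scene proxy (e.g., a point cloud)."""
        return np.concatenate([img.reshape(-1, 3) for img in aerial_images], axis=0)

    def render_ground_view(scene: np.ndarray, pose: np.ndarray, hw=(256, 256)) -> np.ndarray:
        """Stub: project the 3D proxy into a coarse ground-view image for one pose."""
        return np.zeros((*hw, 3), dtype=np.float32)

    def refine_view_consistent(coarse: np.ndarray, previous) -> np.ndarray:
        """Stub: generative refinement conditioned on the previous frame for coherence."""
        return coarse if previous is None else 0.5 * (coarse + previous)

    def ground_roam(aerial_images, camera_path):
        """Generate a ground-view image sequence from aerial inputs only."""
        scene = build_3d_representation(aerial_images)
        frames, prev = [], None
        for pose in camera_path:
            coarse = render_ground_view(scene, pose)
            frame = refine_view_consistent(coarse, prev)
            frames.append(frame)
            prev = frame  # condition the next frame on this one
        return frames

    # Usage: two fake aerial images, three poses along a camera path.
    aerials = [np.random.rand(64, 64, 3).astype(np.float32) for _ in range(2)]
    path = [np.eye(4) for _ in range(3)]
    video = ground_roam(aerials, path)
    print(len(video), video[0].shape)  # 3 (256, 256, 3)

Chaining each refined frame into the next refinement step is one simple way to read the paper's "coherence between generated images" claim; the actual conditioning mechanism is not specified in the abstract.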
Related Material

@InProceedings{Gao_2025_WACV,
    author    = {Gao, Zhiyuan and Teng, Wenbin and Chen, Gonglin and Wu, Jinsen and Xu, Ningli and Qin, Rongjun and Feng, Andrew and Zhao, Yajie},
    title     = {Skyeyes: Ground Roaming using Aerial View Images},
    booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV)},
    month     = {February},
    year      = {2025},
    pages     = {3045-3054}
}