HybridGS: Decoupling Transients and Statics with 2D and 3D Gaussian Splatting

Jingyu Lin, Jiaqi Gu, Lubin Fan, Bojian Wu, Yujing Lou, Renjie Chen, Ligang Liu, Jieping Ye; Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR), 2025, pp. 788-797

Abstract


Generating high-quality novel view renderings with 3D Gaussian Splatting (3DGS) in scenes containing transient objects is challenging. We propose a novel hybrid representation, termed HybridGS, which uses 2D Gaussians for the transient objects of each image while maintaining traditional 3D Gaussians for the entire static scene. 3DGS is well suited for modeling static scenes under the assumption of multi-view consistency, whereas transient objects appear only occasionally and violate this assumption; we therefore model them as planar objects observed from a single view using 2D Gaussians. Our representation thus decomposes the scene according to this fundamental notion of viewpoint consistency. Additionally, we present a multi-view supervision method for 3DGS that leverages information from co-visible regions, further enhancing the distinction between transients and statics. We also propose a straightforward yet effective multi-stage training strategy to ensure robust training and view synthesis. Experiments on benchmark datasets show state-of-the-art novel view synthesis performance in both indoor and outdoor scenes, even in the presence of distracting elements. Project page: https://gujiaqivadin.github.io/hybridgs/
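
To illustrate the decoupling described above, the sketch below shows one plausible way to composite a per-image transient rendering over the shared static 3DGS rendering and supervise the result with a photometric loss. This is a minimal illustration, not the authors' implementation: the two renderers are stood in for by random tensors, and the function name composite is a hypothetical placeholder.

    import torch
    import torch.nn.functional as F

    def composite(static_rgb: torch.Tensor,
                  transient_rgb: torch.Tensor,
                  transient_alpha: torch.Tensor) -> torch.Tensor:
        """Alpha-composite the per-image transient layer over the static layer.

        static_rgb, transient_rgb: (3, H, W) tensors in [0, 1]
        transient_alpha:           (1, H, W) accumulated opacity of the 2D Gaussians
        """
        return transient_alpha * transient_rgb + (1.0 - transient_alpha) * static_rgb

    # Toy usage: random tensors stand in for the actual splatting outputs.
    H, W = 64, 64
    static_rgb = torch.rand(3, H, W)       # render of the shared static 3D Gaussians
    transient_rgb = torch.rand(3, H, W)    # render of this image's own 2D Gaussians
    transient_alpha = torch.rand(1, H, W)  # per-pixel opacity of the transient layer

    pred = composite(static_rgb, transient_rgb, transient_alpha)
    gt = torch.rand(3, H, W)               # ground-truth training image (placeholder)
    loss = F.l1_loss(pred, gt)             # photometric loss on the composited image

Because the 2D Gaussians are tied to a single image, only the static 3D Gaussians would carry gradients that are shared across views; the transient layer simply explains whatever the static model cannot reproduce consistently, which matches the decomposition the abstract describes.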

Related Material


[pdf] [supp] [arXiv]
[bibtex]
@InProceedings{Lin_2025_CVPR,
    author    = {Lin, Jingyu and Gu, Jiaqi and Fan, Lubin and Wu, Bojian and Lou, Yujing and Chen, Renjie and Liu, Ligang and Ye, Jieping},
    title     = {HybridGS: Decoupling Transients and Statics with 2D and 3D Gaussian Splatting},
    booktitle = {Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR)},
    month     = {June},
    year      = {2025},
    pages     = {788-797}
}