Low Latency Point Cloud Rendering with Learned Splatting

Yueyu Hu, Ran Gong, Qi Sun, Yao Wang; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2024, pp. 5752-5761

Abstract


Point clouds are a critical 3D representation with many emerging applications. Because of point sparsity and irregularity, high-quality rendering of point clouds is challenging and often requires complex computations to recover a continuous surface representation. On the other hand, to avoid visual discomfort, the motion-to-photon latency has to be very short (under 10 ms). Existing rendering solutions lack either quality or speed. To tackle these challenges, we present a framework that unlocks interactive, free-viewing and high-fidelity point cloud rendering. We pre-train a neural network to estimate 3D elliptical Gaussians from arbitrary point clouds and use differentiable surface splatting to render smooth texture and surface normals for arbitrary views. Our approach does not require per-scene optimization and enables real-time rendering of dynamic point clouds. Experimental results demonstrate that the proposed solution enjoys superior visual quality and speed, as well as generalizability to different scene content and robustness to compression artifacts.
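
To make the described pipeline concrete, below is a minimal, hypothetical sketch (in PyTorch) of the first stage summarized above: a per-point network that maps an arbitrary point cloud to elliptical Gaussian parameters (scale, rotation, opacity), whose output would then be handed to a differentiable splatting rasterizer. The network architecture, input features, and parameter layout are illustrative assumptions, not the authors' released implementation.

# Hypothetical sketch of the point-to-Gaussian stage described in the abstract.
# A per-point MLP predicts the parameters of a 3D elliptical Gaussian; a
# differentiable surface/Gaussian splatting renderer (not shown) would then
# rasterize these Gaussians for arbitrary views. All names below are assumptions.

import torch
import torch.nn as nn


class PointToGaussianNet(nn.Module):
    """Per-point MLP predicting elliptical Gaussian parameters."""

    def __init__(self, in_dim: int = 6, hidden_dim: int = 64):
        super().__init__()
        # Input: point position (3) + color (3); output per point:
        # scale (3), rotation quaternion (4), opacity (1).
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 3 + 4 + 1),
        )

    def forward(self, points: torch.Tensor, colors: torch.Tensor):
        feats = torch.cat([points, colors], dim=-1)                   # (N, 6)
        out = self.mlp(feats)                                         # (N, 8)
        scales = torch.exp(out[:, :3])                                # positive axis lengths
        quats = nn.functional.normalize(out[:, 3:7], dim=-1)          # unit quaternions
        opacity = torch.sigmoid(out[:, 7:8])                          # in (0, 1)
        return scales, quats, opacity


if __name__ == "__main__":
    # Toy point cloud: 1024 random points with RGB colors.
    pts = torch.rand(1024, 3)
    rgb = torch.rand(1024, 3)
    net = PointToGaussianNet()
    scales, quats, opacity = net(pts, rgb)
    # The predicted Gaussians (centered at `pts`, colored by `rgb`) would be
    # passed to a differentiable splatting rasterizer to render novel views.
    print(scales.shape, quats.shape, opacity.shape)  # (1024, 3) (1024, 4) (1024, 1)

Because the network is pre-trained on generic point clouds rather than optimized per scene, such a stage can in principle run as a single forward pass per frame, which is what makes low-latency rendering of dynamic point clouds plausible.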

Related Material


[bibtex]
@InProceedings{Hu_2024_CVPR,
    author    = {Hu, Yueyu and Gong, Ran and Sun, Qi and Wang, Yao},
    title     = {Low Latency Point Cloud Rendering with Learned Splatting},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2024},
    pages     = {5752-5761}
}