ParticleNeRF: A Particle-Based Encoding for Online Neural Radiance Fields

Jad Abou-Chakra, Feras Dayoub, Niko Sünderhauf; Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2024, pp. 5975-5984

Abstract


While existing Neural Radiance Fields (NeRFs) for dynamic scenes are offline methods with an emphasis on visual fidelity, our paper addresses the online use case that prioritises real-time adaptability. We present ParticleNeRF, a new approach that dynamically adapts to changes in scene geometry by learning an up-to-date representation online, every 200ms. ParticleNeRF achieves this using a novel particle-based parametric encoding. We couple features to particles in space and backpropagate the photometric reconstruction loss to the particles' positions, interpreting the resulting position gradients as velocity vectors. A lightweight physics system handles collisions, letting the features move freely with the changing scene geometry. We demonstrate ParticleNeRF on various dynamic scenes containing translating, rotating, articulated, and deformable objects. ParticleNeRF is the first online dynamic NeRF and achieves fast adaptability with better visual fidelity than brute-force online InstantNGP and other baseline approaches on dynamic scenes with online constraints.
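The gradient-as-velocity idea can be sketched with a toy example. This is a minimal illustration, not the paper's implementation: the quadratic loss stands in for the photometric reconstruction loss, all names are hypothetical, and the collision-handling physics system is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
positions = rng.uniform(-1.0, 1.0, size=(16, 3))  # 16 particles in 3D space
target = np.array([0.5, 0.0, -0.5])               # stand-in for scene geometry

def loss_grad(pos):
    # Gradient of a toy quadratic loss ||pos - target||^2 per particle;
    # in ParticleNeRF this gradient comes from backpropagating the
    # photometric reconstruction loss to the particle positions.
    return 2.0 * (pos - target)

dt, lr = 0.1, 0.5
for _ in range(100):
    velocity = -lr * loss_grad(positions)  # position gradient read as a velocity
    positions = positions + dt * velocity  # explicit Euler integration step
```

After the loop, every particle has drifted toward the target region, mimicking how the encoding's features follow moving scene geometry.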

Related Material


[bibtex]
@InProceedings{Abou-Chakra_2024_WACV,
  author    = {Abou-Chakra, Jad and Dayoub, Feras and S\"underhauf, Niko},
  title     = {ParticleNeRF: A Particle-Based Encoding for Online Neural Radiance Fields},
  booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
  month     = {January},
  year      = {2024},
  pages     = {5975-5984}
}