Accelerating Neural Field Training via Soft Mining

Shakiba Kheradmand, Daniel Rebain, Gopal Sharma, Hossam Isack, Abhishek Kar, Andrea Tagliasacchi, Kwang Moo Yi; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024, pp. 20071-20080

Abstract


We present an approach to accelerate Neural Field training by efficiently selecting sampling locations. While Neural Fields have recently become popular, they are often trained by uniformly sampling the training domain or through handcrafted heuristics. We show that improved convergence and final training quality can be achieved by a soft mining technique based on importance sampling: rather than either fully considering or ignoring a pixel, we weigh the corresponding loss by a scalar. To implement our idea, we use Langevin Monte-Carlo sampling. We show that, by doing so, regions with higher error are selected more frequently, leading to a more than 2x improvement in convergence speed. The code and related resources for this study are publicly available at https://ubc-vision.github.io/nf-soft-mining/.
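To make the idea concrete, below is a minimal PyTorch sketch of soft mining on a 2D image-fitting neural field, written under stated assumptions rather than taken from the authors' released code: the `target` signal, the MLP architecture, the Langevin step size `a`, and the exact importance weighting are illustrative placeholders. The sketch keeps a persistent set of sample locations, drifts them toward high-error regions with a Langevin Monte-Carlo step, and corrects the resulting sampling bias by weighting each per-sample loss instead of hard-selecting or discarding samples.

```python
import torch

def target(x):                           # stand-in ground-truth signal on [0,1]^2 (illustrative)
    return torch.sin(10 * x[:, :1]) * torch.cos(10 * x[:, 1:2])

field = torch.nn.Sequential(             # tiny neural field f_theta: R^2 -> R (illustrative)
    torch.nn.Linear(2, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 1))
opt = torch.optim.Adam(field.parameters(), lr=1e-3)

N, a = 4096, 1e-4                        # sample count and Langevin step size (assumed values)
x = torch.rand(N, 2)                     # sample locations, persisted across iterations

for it in range(1000):
    x = x.detach().requires_grad_(True)
    err = (field(x) - target(x)).pow(2).squeeze(-1)   # per-sample squared error

    # Langevin Monte-Carlo drift toward high-error regions: taking the target
    # density proportional to the error gives a drift of grad_x log err.
    grad_x, = torch.autograd.grad(torch.log(err + 1e-8).sum(), x, retain_graph=True)

    # Soft mining: instead of keeping or discarding samples, weight each loss
    # term. q approximates the density the samples were drawn from (proportional
    # to error), and 1/(N q) is a self-normalized importance-sampling correction.
    q = (err / err.sum()).detach()
    loss = (err / (N * q + 1e-8)).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

    with torch.no_grad():                # LMC update, clamped to the training domain
        x = (x + a * grad_x + (2.0 * a) ** 0.5 * torch.randn_like(x)).clamp(0.0, 1.0)
```

The importance weights keep the objective an unbiased estimate of the uniform-domain loss even though high-error pixels are visited more often, which is the "soft" counterpart to hard example mining described in the abstract.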

Related Material


[bibtex]
@InProceedings{Kheradmand_2024_CVPR,
    author    = {Kheradmand, Shakiba and Rebain, Daniel and Sharma, Gopal and Isack, Hossam and Kar, Abhishek and Tagliasacchi, Andrea and Yi, Kwang Moo},
    title     = {Accelerating Neural Field Training via Soft Mining},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2024},
    pages     = {20071-20080}
}