On Quantizing Implicit Neural Representations

Cameron Gordon, Shin-Fang Chng, Lachlan MacDonald, Simon Lucey; Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2023, pp. 341-350

Abstract


The role of quantization within implicit/coordinate neural networks is still not fully understood. We note that using a canonical fixed quantization scheme during training produces poor performance at low bit rates, because the network weight distributions change over the course of training. In this work, we show that a non-uniform quantization of neural weights can lead to significant improvements. Specifically, we demonstrate that clustered quantization enables improved reconstruction. Finally, by characterising a trade-off between quantization and network capacity, we show that it is possible (albeit memory-inefficient) to reconstruct signals using binary neural networks. We validate our findings experimentally on 2D image reconstruction and 3D radiance fields, and show that simple quantization methods and architecture search can compress NeRF to less than 16kb with minimal loss in performance (323x smaller than the original NeRF).
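The clustered (non-uniform) quantization the abstract refers to can be illustrated by fitting a small codebook to a network's weights, e.g. via k-means, so that quantization levels track the weight distribution rather than a fixed uniform grid. The sketch below is a hypothetical illustration of this general idea, not the authors' implementation; the function name and parameters are our own.

```python
import numpy as np

def kmeans_quantize(weights, n_clusters=4, n_iters=25, seed=0):
    """Illustrative non-uniform (clustered) quantization of a weight array.

    Fits a small codebook to the weights with 1-D k-means, then snaps every
    weight to its nearest codebook entry. With n_clusters = 2**b, each weight
    can be stored as a b-bit index plus the shared codebook.
    """
    rng = np.random.default_rng(seed)
    flat = weights.ravel()
    # Initialise centroids from randomly sampled weights.
    centroids = rng.choice(flat, size=n_clusters, replace=False)
    for _ in range(n_iters):
        # Assign each weight to its nearest centroid.
        assign = np.abs(flat[:, None] - centroids[None, :]).argmin(axis=1)
        # Move each centroid to the mean of its assigned weights.
        for k in range(n_clusters):
            members = flat[assign == k]
            if members.size:
                centroids[k] = members.mean()
    assign = np.abs(flat[:, None] - centroids[None, :]).argmin(axis=1)
    return centroids[assign].reshape(weights.shape), centroids

# Demo: quantize a Gaussian weight matrix to a 4-entry codebook (2 bits/weight).
rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64)).astype(np.float32)
w_q, codebook = kmeans_quantize(w, n_clusters=4)
```

Because the codebook adapts to the weight distribution, the quantization error is concentrated where weights are dense, in contrast to a fixed uniform grid whose levels may fall where few weights lie.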

Related Material


[bibtex]
@InProceedings{Gordon_2023_WACV,
    author    = {Gordon, Cameron and Chng, Shin-Fang and MacDonald, Lachlan and Lucey, Simon},
    title     = {On Quantizing Implicit Neural Representations},
    booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
    month     = {January},
    year      = {2023},
    pages     = {341-350}
}