Quantized Generative Models for Solving Inverse Problems

Nareddy Kartheek Kumar Reddy, Vinayak Killedar, Chandra Sekhar Seelamantula; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2023, pp. 1528-1533

Abstract


Generative priors have been shown to be highly successful in solving inverse problems. In this paper, we consider quantized generative models, i.e., generator networks whose weights come from a learnt finite alphabet. Quantized neural networks are efficient in terms of memory and computation and are ideally suited for deployment on low-precision hardware. We solve non-linear inverse problems using quantized generative models by introducing a new meta-learning framework that makes use of proximal operators and jointly optimizes the quantized weights of the generative model, the parameters of the sensing network, and the latent-space representation. Experimental validation is carried out on standard datasets: MNIST, CIFAR10, SVHN, and STL10. The results show that 4-bit networks match the performance of 32-bit networks. The performance of 1-bit networks is about 0.7 to 2 dB inferior, while reducing the model size by a factor of 32.

Related Material


@InProceedings{Reddy_2023_ICCV,
    author    = {Reddy, Nareddy Kartheek Kumar and Killedar, Vinayak and Seelamantula, Chandra Sekhar},
    title     = {Quantized Generative Models for Solving Inverse Problems},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
    month     = {October},
    year      = {2023},
    pages     = {1528-1533}
}