Adaptive Bitrate Quantization Scheme Without Codebook for Learned Image Compression

Jonas Löhdefink, Jonas Sitzmann, Andreas Bär, Tim Fingscheidt; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2022, pp. 1732-1737

Abstract


We propose one-hot max (OHM) quantization, a generic codebook-free approach to quantization in learned image compression. It reorganizes the feature space, introducing an additional dimension along which vector quantization yields one-hot vectors by comparing activations. Furthermore, we show how to integrate OHM quantization into a compression system with bitrate adaptation, i.e., full control over the bitrate during inference. We perform experiments on both MNIST and Kodak, reporting rate-distortion trade-offs against an integer-rounding reference. For low bitrates (< 0.4 bpp), our proposed quantizer yields better performance while also exhibiting other advantageous training and inference properties. Code is available at https://github.com/ifnspaml/OHMQ.
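The abstract describes reorganizing the feature space into an additional dimension and, along it, selecting one-hot vectors by comparing activations. A minimal NumPy sketch of that idea follows; the group size `k`, the channel-wise reshape, and the function name `ohm_quantize` are illustrative assumptions, not the paper's exact formulation (see the linked code for the authors' implementation).

```python
import numpy as np

def ohm_quantize(features, k):
    """Sketch of one-hot max (OHM) style quantization.

    Assumption: a (C, H, W) feature map is regrouped into (C//k, k, H, W),
    and each length-k vector along the new axis is replaced by the one-hot
    vector marking its maximum activation. Each one-hot vector can then be
    transmitted as an index costing log2(k) bits, with no learned codebook.
    """
    c, h, w = features.shape
    assert c % k == 0, "channel count must be divisible by k"
    groups = features.reshape(c // k, k, h, w)
    idx = groups.argmax(axis=1)            # (C//k, H, W) symbol indices
    one_hot = np.eye(k)[idx]               # (C//k, H, W, k) one-hot vectors
    one_hot = np.moveaxis(one_hot, -1, 1)  # back to (C//k, k, H, W)
    return one_hot, idx

# Toy usage: quantize a random latent with groups of k = 4 activations
rng = np.random.default_rng(0)
z = rng.standard_normal((8, 4, 4))
q, idx = ohm_quantize(z, k=4)
print(q.shape, idx.shape)  # (2, 4, 4, 4) (2, 4, 4)
```

Note that the forward pass (argmax) is not differentiable; training such a quantizer end-to-end typically requires a surrogate such as a straight-through estimator or a softmax relaxation, which this sketch omits.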

Related Material


@InProceedings{Lohdefink_2022_CVPR,
  author    = {L\"ohdefink, Jonas and Sitzmann, Jonas and B\"ar, Andreas and Fingscheidt, Tim},
  title     = {Adaptive Bitrate Quantization Scheme Without Codebook for Learned Image Compression},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2022},
  pages     = {1732-1737}
}