Entropy Coding-Based Lossless Compression of Asynchronous Event Sequences

Ionut Schiopu, Radu Ciprian Bilcu; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2023, pp. 3923-3930

Abstract


The event sensor acquires large amounts of data as events are triggered at microsecond time resolution. In this paper, a novel entropy coding-based method is proposed for encoding asynchronous event sequences. The proposed method employs the event coding framework, where: (i) the input sequence is rearranged as a set of same-timestamp subsequences, where each subsequence is represented by a set of data structures (DSs); and (ii) each DS is encoded by a specific version of the triple threshold partition (TTP) algorithm, where a bitstream collects the binary representation of a set of data elements. The first contribution consists in improving the low-complexity algorithm LLC-ARES by modifying the TTP algorithm to employ entropy coding-based techniques to efficiently encode the set of data elements. An adaptive Markov model encodes each data element by modelling its symbol probability distribution. Six different types of data elements are distinguished, each having a different support symbol alphabet. Another contribution consists in exploring novel prediction strategies for the unsorted spatial dimension and parameter initializations for the new error distributions. The experimental evaluation demonstrates that the proposed method achieves an improved average coding performance of 28.03%, 35.27%, and 64.54% compared with the state-of-the-art data compression codecs Bzip2, LZMA, and ZLIB, respectively, and 21.4% compared with LLC-ARES.
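To make the abstract's pipeline more concrete, the sketch below illustrates two of the ideas it mentions: rearranging an asynchronous event stream into same-timestamp subsequences, and maintaining an adaptive symbol-frequency model over a fixed alphabet of the kind an adaptive Markov model would feed to an entropy coder. This is a minimal, hypothetical Python illustration and not the paper's TTP algorithm or its exact model; the event tuple layout (t, x, y, p), the alphabet size, and the Laplace-style count initialization are assumptions for the example.

```python
from collections import defaultdict


def group_by_timestamp(events):
    """Rearrange a stream of (t, x, y, p) events into same-timestamp
    subsequences, returned in increasing timestamp order.
    Assumed event layout: timestamp, x-coordinate, y-coordinate, polarity."""
    groups = defaultdict(list)
    for t, x, y, p in events:
        groups[t].append((x, y, p))
    return [groups[t] for t in sorted(groups)]


class AdaptiveFrequencyModel:
    """Adaptive per-symbol frequency model over a fixed alphabet.

    Counts are updated after each coded symbol, so the probability
    estimate adapts to the local statistics of the data element being
    encoded; an entropy coder (e.g. arithmetic coding) would consume
    these probabilities. Only a sketch of the adaptive-modelling idea."""

    def __init__(self, alphabet_size):
        # Start every symbol with count 1 so no symbol has zero probability.
        self.counts = [1] * alphabet_size
        self.total = alphabet_size

    def probability(self, symbol):
        return self.counts[symbol] / self.total

    def update(self, symbol):
        self.counts[symbol] += 1
        self.total += 1


if __name__ == "__main__":
    events = [(10, 3, 7, 1), (10, 4, 7, 0), (12, 3, 8, 1)]
    print(group_by_timestamp(events))  # two subsequences: t=10 and t=12

    model = AdaptiveFrequencyModel(alphabet_size=4)
    for s in [1, 1, 2, 1]:
        model.update(s)
    print(model.probability(1))  # probability of symbol 1 after the updates
```

In the paper's setting, a separate model of this kind would be kept for each of the six data-element types, since each has a different support symbol alphabet.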

Related Material


[pdf] [supp]
[bibtex]
@InProceedings{Schiopu_2023_CVPR,
    author    = {Schiopu, Ionut and Bilcu, Radu Ciprian},
    title     = {Entropy Coding-Based Lossless Compression of Asynchronous Event Sequences},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2023},
    pages     = {3923-3930}
}