Live Demonstration: Real-time event-data processing with Graph Convolutional Neural Networks and SoC FPGA
Abstract
With growing demand for robust, low-power and low-latency computer vision, event cameras are becoming increasingly common in mobile robotics and autonomous vehicles due to their high temporal resolution, their ability to operate in adverse lighting conditions and their reduction of redundant information. However, fully exploiting the benefits of these sensors requires algorithms capable of processing sparse data efficiently and optimised for a specialised target platform. To address this demand, we present a hardware-aware model of Graph Convolutional Neural Networks (GCNNs) implemented on an embedded heterogeneous SoC FPGA. Leveraging the temporal sparsity of event data, our approach integrates partially asynchronous processing and 3D MaxPool layers to achieve real-time, high-accuracy object classification on embedded platforms. A key strength of our architecture is its scalability: customisable hardware modules enable flexible configurations that balance resource utilisation and latency. In this real-time demonstration we will showcase the capabilities of our GCNN model, address its limitations, and discuss future development directions. We will use the system operating on a heterogeneous SoC FPGA to classify event data captured directly by the integrated camera or read from an SD card with simulated time intervals between events. Our work has been shared in an open-source repository.
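As a rough illustration of the processing idea described in the abstract, the following minimal PyTorch sketch builds a spatio-temporal graph from raw events, applies one simplified graph-convolution layer, and coarsens the result with a 3D MaxPool analogue over an (x, y, t) grid. This is a conceptual sketch only, not the authors' hardware implementation: the function names, radius, grid resolution and feature sizes are illustrative assumptions, and the dense O(N^2) neighbour search stands in for the event-driven graph construction used on the FPGA.

import torch

def build_edges(coords: torch.Tensor, radius: float) -> torch.Tensor:
    # coords: (N, 3) events with normalised (x, y, t) columns. A dense
    # O(N^2) distance check is used here for clarity; an event-driven
    # implementation would update neighbourhoods incrementally as events arrive.
    dist = torch.cdist(coords, coords)
    src, dst = torch.nonzero((dist < radius) & (dist > 0), as_tuple=True)
    return torch.stack([src, dst])                # shape (2, E)

def graph_conv(x: torch.Tensor, edges: torch.Tensor,
               lin: torch.nn.Linear) -> torch.Tensor:
    # Simplified graph convolution: mean over neighbour features,
    # followed by a shared linear transform and ReLU.
    src, dst = edges
    agg = torch.zeros_like(x).index_add_(0, dst, x[src])
    deg = torch.zeros(x.size(0), 1).index_add_(0, dst, torch.ones(dst.numel(), 1))
    return torch.relu(lin(agg / deg.clamp(min=1)))

def graph_maxpool3d(coords: torch.Tensor, x: torch.Tensor, grid=(8, 8, 4)):
    # 3D MaxPool analogue for graphs: events falling into the same (x, y, t)
    # cell are merged, keeping the feature-wise maximum and the mean
    # coordinate of the cell's events as the new node position.
    g = torch.tensor(grid)
    cell = (coords * g).long().clamp_max(g - 1)
    key = (cell[:, 0] * grid[1] + cell[:, 1]) * grid[2] + cell[:, 2]
    uniq, inv = key.unique(return_inverse=True)
    idx = inv.unsqueeze(1).expand(-1, x.size(1))
    pooled = torch.full((uniq.numel(), x.size(1)), float("-inf")).scatter_reduce_(
        0, idx, x, reduce="amax")
    centres = torch.zeros(uniq.numel(), 3).index_reduce_(
        0, inv, coords, reduce="mean", include_self=False)
    return centres, pooled

# Toy usage: 200 random events, polarity as a 1-D input feature.
coords = torch.rand(200, 3)
feats = torch.randint(0, 2, (200, 1)).float()
edges = build_edges(coords, radius=0.1)
h = graph_conv(feats, edges, torch.nn.Linear(1, 16))
coords2, h2 = graph_maxpool3d(coords, h)
print(coords2.shape, h2.shape)                    # coarsened graph

Stacking such convolution and pooling stages progressively shrinks the graph, which is what makes fixed-resource FPGA implementations of later layers tractable; the actual layer structure and quantisation are described in the paper.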
Related Material

@InProceedings{Wzorek_2025_CVPR,
    author    = {Wzorek, Piotr and Blachut, Krzysztof and Jeziorek, Kamil and Kryjak, Tomasz},
    title     = {Live Demonstration: Real-time event-data processing with Graph Convolutional Neural Networks and SoC FPGA},
    booktitle = {Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR) Workshops},
    month     = {June},
    year      = {2025},
    pages     = {5103-5104}
}