A Cortically-Inspired Architecture for Event-Based Visual Motion Processing: From Design Principles to Real-World Applications
We present and evaluate the architecture of a bio-inspired Spiking Neural Network for motion estimation. The computation performed by the retina is emulated by the neuromorphic event-based image sensor DAVIS346, which provides the input to our network. Neurons highly tuned to the spatial frequency and orientation of the stimulus are obtained through a combination of feed-forward excitatory connections, modeled as an elongated Gaussian kernel, and recurrent inhibitory connections from two clusters of neurons within the same cortical layer. Sums over adjacent nodes, weighted by time-variable synapses, yield Gabor-like spatio-temporal V1 receptive fields selective to the stimulus' motion. To gain invariance to the stimulus phase, we exploit the two polarities of the events provided by the neuromorphic sensor, which allows us to build two pairs of quadrature filters from which we obtain Motion Energy detectors as described in . Finally, a decoding stage computes optic flow from the Motion Detector layers. We tested the proposed approach on both synthetic and natural stimuli.
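The phase-invariance step described above can be illustrated with a minimal sketch. The snippet below builds a purely spatial quadrature pair of Gabor filters (even/odd carriers under a shared Gaussian envelope) and sums their squared responses; the filters in the actual architecture are spatio-temporal and implemented with spiking dynamics, but the principle — squared quadrature responses yield an energy that does not depend on stimulus phase — is the same. All names and parameter values here are illustrative, not taken from the paper.

```python
import numpy as np

def gabor_pair(size=21, sigma=3.0, freq=0.15, theta=0.0):
    """Quadrature pair of Gabor kernels: a Gaussian envelope
    multiplied by cosine (even) and sine (odd) carriers at
    orientation theta and spatial frequency freq (cycles/pixel)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    env = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    even = env * np.cos(2.0 * np.pi * freq * xr)
    odd = env * np.sin(2.0 * np.pi * freq * xr)
    return even, odd

def motion_energy(frame, even, odd):
    """Phase-invariant energy: pointwise sum of the squared
    responses of the quadrature pair (valid-region correlation,
    written out explicitly for clarity)."""
    def corr(img, k):
        kh, kw = k.shape
        out = np.zeros((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
        return out
    return corr(frame, even) ** 2 + corr(frame, odd) ** 2

# A drifting grating changes phase over time; the energy response
# to a matched-frequency grating stays nearly constant across phases.
X = np.tile(np.arange(41), (41, 1))
even, odd = gabor_pair()
e_a = motion_energy(np.cos(2 * np.pi * 0.15 * X), even, odd)
e_b = motion_energy(np.cos(2 * np.pi * 0.15 * X + 1.3), even, odd)
```

Comparing `e_a` and `e_b` shows the summed energies agree to within a few percent, which is the invariance the quadrature construction is meant to provide.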