Rethinking Spiking Self-Attention Mechanism: Implementing a-XNOR Similarity Calculation in Spiking Transformers

Yichen Xiao, Shuai Wang, Dehao Zhang, Wenjie Wei, Yimeng Shan, Xiaoli Liu, Yulin Jiang, Malu Zhang; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2025, pp. 5444-5454

Abstract


Transformers have significantly raised performance ceilings across a wide range of tasks, spurring research into integrating them into spiking neural networks. However, a notable performance gap remains between existing spiking Transformers and their artificial neural network counterparts. Here, we first analyze the cause of this gap and attribute it to the dot product's ineffectiveness in measuring similarity between spiking queries and keys, owing to the large number of non-spiking events. To address this, we propose a novel a-XNOR similarity measure tailored for spike trains. It redefines the correlation between non-spike pairs as a specific value a, effectively overcoming the limitations of dot-product similarity. Furthermore, because spike trains are sparse and spikes carry more information than non-spikes, the a-XNOR similarity correspondingly assigns greater importance to spikes than to non-spikes. Extensive experiments demonstrate that a-XNOR similarity significantly improves performance across different spiking Transformer architectures on various static and neuromorphic datasets, further revealing the potential of spiking Transformers.
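To make the idea concrete, here is a minimal sketch of one plausible reading of the a-XNOR measure described above: matching spike pairs (1, 1) contribute 1 (as in the dot product), matching non-spike pairs (0, 0) contribute a small positive value a instead of 0, and mismatched pairs contribute nothing. The function name and the exact closed form qk + a(1-q)(1-k) are illustrative assumptions based only on the abstract, not the paper's implementation.

```python
def a_xnor_similarity(q, k, a=0.5):
    """Sketch of a-XNOR similarity between two equal-length binary spike trains.

    Per-element correlation (assumed from the abstract's description):
      (1, 1) -> 1   matching spikes: full correlation, as in the dot product
      (0, 0) -> a   matching non-spikes: small positive correlation, 0 < a < 1
      (1, 0) -> 0   mismatched pair: no correlation
    """
    assert len(q) == len(k), "spike trains must have equal length"
    # qk covers the spike-spike match; a*(1-q)(1-k) covers the non-spike match.
    return sum(qi * ki + a * (1 - qi) * (1 - ki) for qi, ki in zip(q, k))


# Two spike trains agreeing at a spike (t=0) and at two non-spikes (t=1, t=3).
# The plain dot product would score only 1, ignoring the non-spike agreement;
# a-XNOR also credits the (0, 0) positions: 1 + 0.5 + 0 + 0.5 = 2.0.
print(a_xnor_similarity([1, 0, 1, 0], [1, 0, 0, 0], a=0.5))  # -> 2.0
```

Note how two all-zero trains, which the dot product scores as 0 (indistinguishable from total disagreement), receive a positive a-XNOR score, which is the limitation of dot-product similarity the abstract highlights.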

Related Material


BibTeX:

@InProceedings{Xiao_2025_CVPR,
    author    = {Xiao, Yichen and Wang, Shuai and Zhang, Dehao and Wei, Wenjie and Shan, Yimeng and Liu, Xiaoli and Jiang, Yulin and Zhang, Malu},
    title     = {Rethinking Spiking Self-Attention Mechanism: Implementing a-XNOR Similarity Calculation in Spiking Transformers},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2025},
    pages     = {5444-5454}
}