MONAS-ESNN: Multi-Objective Neural Architecture Search for Efficient Spiking Neural Networks
Esmat Ghasemi Saghand, Susana K. Lai-Yuen; Proceedings of the Winter Conference on Applications of Computer Vision (WACV), 2025, pp. 9178-9187
Abstract
Spiking Neural Networks (SNNs) have emerged as a compelling alternative to traditional Artificial Neural Networks (ANNs) due to their energy efficiency and biological plausibility. However, current SNN models often rely on ANN architectures that may not fully exploit the unique properties of SNNs. Neural Architecture Search (NAS) approaches have been shown to automate the identification of suitable architectures for various applications. Nevertheless, very few works have been presented on NAS for SNNs, particularly for identifying architectures that achieve high accuracy while capitalizing on the energy-efficiency property of SNNs. In this paper, we present a Multi-Objective Neural Architecture Search for Efficient SNNs (MONAS-ESNN) approach that uses training-free NAS to discover SNN architectures optimized for both accuracy and energy efficiency. The proposed MONAS-ESNN uses the NSGA-II evolutionary algorithm to optimize both objective functions while leveraging the unique temporal dynamics of SNNs. We also introduce a new Adjusted Sparsity-Aware Hamming Distance (ASAHD) that improves the evaluation of candidate architectures by representing the diverse spike activation patterns of different types of spiking neurons. Experimental results on the CIFAR-10, CIFAR-100, and Tiny-ImageNet-200 datasets demonstrate that MONAS-ESNN identifies SNN architectures that achieve higher accuracy and are more energy efficient, as measured by the number of generated spikes, than those found by existing methods. The proposed MONAS-ESNN can therefore automate the search and discovery of SNN architectures with higher accuracy, fewer generated spikes, and faster convergence, paving the way for more energy-efficient neural networks.
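To make the two ideas in the abstract concrete, below is a minimal Python sketch, not the authors' implementation: a sparsity-aware Hamming distance between binary spike activation patterns used as a training-free architecture score, and a Pareto-dominance check over the two objectives (score vs. spike count) of the kind NSGA-II sorts by. The function names and the particular sparsity adjustment are illustrative assumptions; the paper's exact ASAHD formula and search loop may differ.

import numpy as np

def sparsity_aware_hamming(p: np.ndarray, q: np.ndarray) -> float:
    """Hamming distance between two binary spike patterns, scaled by a
    sparsity term (hypothetical adjustment; ASAHD's formula may differ)."""
    p, q = p.astype(bool), q.astype(bool)
    hamming = np.count_nonzero(p ^ q)
    # Down-weight distances that come from very dense (spike-heavy) patterns,
    # so architectures whose activations are sparse yet distinct score well.
    sparsity = 1.0 - 0.5 * (p.mean() + q.mean())
    return hamming * sparsity

def proxy_score(spike_patterns: np.ndarray) -> float:
    """Training-free score for one candidate architecture: the sum of
    pairwise adjusted distances over a minibatch of flattened spike
    patterns with shape (n_samples, n_units)."""
    n = len(spike_patterns)
    return sum(
        sparsity_aware_hamming(spike_patterns[i], spike_patterns[j])
        for i in range(n) for j in range(i + 1, n)
    )

def dominates(a: tuple, b: tuple) -> bool:
    """True if candidate a Pareto-dominates b, where each candidate is
    (score_to_maximize, spike_count_to_minimize) -- the bi-objective
    relation underlying NSGA-II's non-dominated sorting."""
    return (a[0] >= b[0] and a[1] <= b[1]) and (a[0] > b[0] or a[1] < b[1])

In a search of this kind, every candidate in the population would be scored with proxy_score on the same minibatch, paired with its estimated spike count, and ranked by repeated application of dominates; no candidate is trained during the search itself.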
Related Material

[pdf]
[bibtex]
@InProceedings{Saghand_2025_WACV,
    author    = {Saghand, Esmat Ghasemi and Lai-Yuen, Susana K.},
    title     = {MONAS-ESNN: Multi-Objective Neural Architecture Search for Efficient Spiking Neural Networks},
    booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV)},
    month     = {February},
    year      = {2025},
    pages     = {9178-9187}
}