High-accuracy deep ANN-to-SNN conversion using quantization-aware training framework and calcium-gated bipolar leaky integrate and fire neuron

Front Neurosci. 2023 Mar 8;17:1141701. doi: 10.3389/fnins.2023.1141701. eCollection 2023.

Abstract

Spiking neural networks (SNNs) have attracted intensive attention due to their efficient event-driven computing paradigm. Among SNN training methods, ANN-to-SNN conversion is usually regarded as achieving state-of-the-art recognition accuracy. However, many existing ANN-to-SNN techniques impose lengthy post-conversion steps, such as threshold balancing and weight renormalization, to compensate for the inherent behavioral discrepancy between artificial and spiking neurons. In addition, they require a long temporal window to encode and process as many spikes as possible so as to better approximate the real-valued ANN neurons, leading to high inference latency. To overcome these challenges, we propose a calcium-gated bipolar leaky integrate-and-fire (Ca-LIF) spiking neuron model that better approximates the function of the ReLU neurons widely adopted in ANNs. We also propose a quantization-aware training (QAT)-based framework that leverages an off-the-shelf QAT toolkit for easy ANN-to-SNN conversion and directly exports the learned ANN weights to SNNs, requiring no post-conversion processing. We benchmarked our method on typical deep network structures with time-step lengths varying from 8 to 128. Compared with other work, our converted SNNs achieved competitively high accuracy while enjoying relatively short inference time steps.
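The core intuition behind rate-based ANN-to-SNN conversion can be sketched with a plain integrate-and-fire neuron (not the authors' Ca-LIF model): over T time steps, the neuron's firing rate takes one of T+1 discrete values, which is why quantizing ANN activations during training (QAT) aligns them with what the converted SNN can actually represent. The function and parameter names below are illustrative, not from the paper.

```python
def if_neuron_rate(x, threshold=1.0, T=32):
    """Simulate an integrate-and-fire neuron with soft reset for T steps.

    A constant input current x is integrated at each step; a spike fires
    when the membrane potential reaches the threshold, which is then
    subtracted (soft reset). The resulting firing rate spikes/T
    approximates the ANN activation x/threshold, clipped to [0, 1].
    """
    v = 0.0       # membrane potential
    spikes = 0    # spike count over the window
    for _ in range(T):
        v += x
        if v >= threshold:
            spikes += 1
            v -= threshold
    return spikes / T

# The rate is quantized into T+1 levels and saturates at 1.0, mirroring
# the discrete, bounded activations that QAT exposes to the ANN.
rate = if_neuron_rate(0.5, threshold=1.0, T=32)  # -> 0.5
```

In this picture, a longer window T gives finer rate resolution but higher latency; the paper's contribution is to keep T short (8 to 128 steps) by training the ANN against the same quantized activation levels the SNN can emit.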

Keywords: ANN-to-SNN conversion; deep SNNs; neuromorphic computing; quantization-aware training; spiking neural network.

Grants and funding

This study was funded in part by the National Key Research and Development Program of China (Grant No. 2019YFB2204303), in part by the National Natural Science Foundation of China (Grant No. U20A20205), in part by the Key Project of Chongqing Science and Technology Foundation (Grant Nos. cstc2019jcyj-zdxmX0017 and cstc2021ycjh-bgzxm0031), in part by the Pilot Research Project (Grant No. H20201100) from Chongqing Xianfeng Electronic Institute Co., Ltd., in part by the Open Research Funding from the State Key Laboratory of Computer Architecture, ICT, CAS (Grant No. CARCH201908), in part by the innovation funding from the Chongqing Social Security Bureau and Human Resources Dept. (Grant No. cx2020018), and in part by the Chongqing Science and Technology Foundation (Postdoctoral Foundation) (Grant No. cstc2021jcyj-bsh0126).