Spike Attention Coding for Spiking Neural Networks

IEEE Trans Neural Netw Learn Syst. 2023 Sep 11:PP. doi: 10.1109/TNNLS.2023.3310263. Online ahead of print.

Abstract

Spiking neural networks (SNNs), an important family of neuroscience-oriented intelligent models, play an essential role in the neuromorphic computing community. Spike rate coding and temporal coding are the mainstream coding schemes in current SNN modeling. However, rate coding suffers from limited representation resolution and long latency, while temporal coding under-utilizes spike activity. To address this, we propose spike attention coding (SAC) for SNNs. By introducing a learnable attention coefficient for each time step, our coding scheme naturally unifies rate coding and temporal coding, and can flexibly learn optimal coefficients for better performance. Several normalization and regularization techniques are further incorporated to control the range and distribution of the learned attention coefficients. Extensive experiments on classification, generation, and regression tasks demonstrate the superiority of the proposed coding scheme. This work provides a flexible coding scheme that enhances the representation power of SNNs and extends their application scope beyond the mainstream classification scenario.
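The core idea of decoding a spike train with per-time-step attention coefficients can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's implementation: the function name `attention_readout` and the specific coefficient choices are hypothetical, and in the paper the coefficients are learned (with normalization and regularization), whereas here they are fixed by hand to show how uniform weights recover rate coding and time-skewed weights mimic temporal coding.

```python
import numpy as np

def attention_readout(spikes, alpha):
    """Decode a spike train as a weighted sum over time steps.

    spikes: (T, N) binary array, one row of spikes per time step.
    alpha:  (T,) per-time-step attention coefficients
            (learnable in the paper; fixed here for illustration).
    Returns an (N,) vector of attention-weighted spike counts.
    """
    return alpha @ spikes

T, N = 8, 4
rng = np.random.default_rng(0)
spikes = (rng.random((T, N)) < 0.3).astype(float)  # toy Bernoulli spike train

# Rate coding as a special case: uniform coefficients 1/T
# reduce the readout to the mean firing rate per neuron.
rate_out = attention_readout(spikes, np.full(T, 1.0 / T))

# Temporal-coding-like emphasis: coefficients decaying over time
# weight early spikes more heavily (normalized to sum to 1).
decay = np.exp(-np.arange(T) / 2.0)
temporal_out = attention_readout(spikes, decay / decay.sum())
```

With uniform coefficients the readout equals the per-neuron firing rate, which is why the scheme subsumes rate coding; any other coefficient profile trades that off toward timing sensitivity.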