Going Deeper in Spiking Neural Networks: VGG and Residual Architectures

Front Neurosci. 2019 Mar 7;13:95. doi: 10.3389/fnins.2019.00095. eCollection 2019.

Abstract

Over the past few years, Spiking Neural Networks (SNNs) have become popular as a possible pathway to enable low-power event-driven neuromorphic hardware. However, their application in machine learning has largely been limited to very shallow neural network architectures for simple problems. In this paper, we propose a novel algorithmic technique for generating an SNN with a deep architecture, and demonstrate its effectiveness on complex visual recognition problems such as CIFAR-10 and ImageNet. Our technique applies to both VGG and Residual network architectures, with significantly better accuracy than the state of the art. Finally, we present an analysis of the sparse event-driven computations to demonstrate reduced hardware overhead when operating in the spiking domain.
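To make the notion of sparse, event-driven computation in the spiking domain concrete, the following is a minimal sketch (not the paper's implementation) of an integrate-and-fire layer of the kind typically used when a trained ANN such as VGG or ResNet is converted to an SNN: inputs arrive as binary spike vectors over many timesteps, membrane potentials accumulate weighted input, and a neuron emits a spike only when its potential crosses a threshold. All names, the soft-reset choice, and the threshold value are illustrative assumptions.

```python
import numpy as np

class IFLayer:
    """Sketch of an integrate-and-fire layer for rate-coded SNN inference."""

    def __init__(self, weights, threshold=1.0):
        self.w = weights                      # (n_out, n_in) weight matrix
        self.v = np.zeros(weights.shape[0])   # membrane potentials
        self.threshold = threshold            # assumed firing threshold

    def step(self, in_spikes):
        # Integrate weighted presynaptic spikes into the membrane potential.
        self.v += self.w @ in_spikes
        # Emit a spike wherever the potential crosses the threshold.
        out_spikes = (self.v >= self.threshold).astype(float)
        # Soft reset: subtract the threshold from neurons that fired.
        self.v -= out_spikes * self.threshold
        return out_spikes

# Usage: feed a Poisson-encoded input over many timesteps and read the
# output spike counts as an estimate of the layer's activation.
rng = np.random.default_rng(0)
layer = IFLayer(rng.normal(scale=0.5, size=(4, 8)))
counts = np.zeros(4)
for _ in range(100):
    counts += layer.step((rng.random(8) < 0.2).astype(float))
print(counts / 100)  # approximate firing rates
```

Because each timestep only propagates the (usually few) neurons that actually spiked, computation scales with event sparsity rather than with dense activations, which is the source of the hardware-overhead reduction analyzed in the paper.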

Keywords: event-driven neural networks; neuromorphic computing; sparsity; spiking neural networks; visual recognition.