Sign backpropagation: An on-chip learning algorithm for analog RRAM neuromorphic computing systems

Neural Netw. 2018 Dec;108:217-223. doi: 10.1016/j.neunet.2018.08.012. Epub 2018 Sep 1.

Abstract

Currently, powerful deep learning models usually require substantial processor and memory resources, which leads to very high energy consumption. The emerging resistive random access memory (RRAM) has shown great potential for constructing scalable and energy-efficient neural networks. However, it is hard to port a high-precision neural network from a conventional digital CMOS hardware system to an analog RRAM system owing to the variability of RRAM devices, so a suitable on-chip learning algorithm is needed to retrain the network or recover its performance. In addition, how to integrate the peripheral digital computation with the analog RRAM crossbar remains a challenge. Here, we propose an on-chip learning algorithm, named sign backpropagation (SBP), for RRAM-based multilayer perceptrons (MLPs); it uses binary (0, 1) interfaces in the forward pass and 2-bit (±1, 0) interfaces in the backward pass. Simulation results show that the proposed method and architecture achieve classification accuracy on the MNIST dataset comparable to that of a standard MLP, while saving the area and energy otherwise spent computing and storing intermediate results and exploiting the potential of the RRAM crossbar for neuromorphic computing.
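The interface scheme the abstract describes, binary (0, 1) activations in the forward pass and 2-bit sign-quantized (±1, 0) errors in the backward pass, can be illustrated with a small software simulation. The sketch below is a minimal NumPy interpretation of that scheme for one training step of a two-layer MLP; the layer sizes, zero threshold, learning rate, and helper names (binarize, sign_quantize, sbp_step) are assumptions for illustration, not details taken from the paper.

    # Minimal sketch of one SBP-style training step, assuming binary
    # {0, 1} forward activations and sign-quantized {-1, 0, +1} backward
    # errors. All dimensions and hyperparameters below are illustrative
    # assumptions, not values from the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    n_in, n_hid, n_out = 784, 100, 10         # MNIST-like sizes (assumed)
    W1 = rng.normal(0.0, 0.1, (n_in, n_hid))  # would map to RRAM conductances
    W2 = rng.normal(0.0, 0.1, (n_hid, n_out))
    lr = 0.01                                 # assumed learning rate

    def binarize(x):
        # Forward interface: threshold activations to {0, 1}.
        return (x > 0).astype(x.dtype)

    def sign_quantize(e):
        # Backward interface: quantize errors to 2-bit {-1, 0, +1}.
        return np.sign(e)

    def sbp_step(x, target_onehot):
        global W1, W2
        # Forward pass with binary interfaces between layers.
        h = binarize(x @ W1)                  # hidden activations in {0, 1}
        y = binarize(h @ W2)                  # binary output
        # Backward pass with sign-quantized error signals.
        e_out = sign_quantize(target_onehot - y)
        e_hid = sign_quantize(e_out @ W2.T) * h   # gate by binary activations
        # Each update term is a product of a {0, 1} activation and a
        # {-1, 0, +1} error, i.e., a fixed-size increment or decrement.
        W2 += lr * np.outer(h, e_out)
        W1 += lr * np.outer(x, e_hid)

    # Toy usage: one random input labeled as class 3.
    x = rng.random(n_in)
    sbp_step(x, np.eye(n_out)[3])

Because every weight change in this sketch reduces to a uniform increment or decrement, such updates plausibly map onto incremental conductance programming pulses on an RRAM crossbar without high-precision peripheral arithmetic or storage of intermediate results, consistent with the area and energy savings the abstract claims.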

Keywords: Multilayer perceptron (MLP); Neural network; Neuromorphic computing; On-chip learning; Resistive random-access memory (RRAM).

MeSH terms

  • Algorithms
  • Deep Learning* / trends
  • Memory
  • Neural Networks, Computer*