Complementary Memtransistor-Based Multilayer Neural Networks for Online Supervised Learning Through (Anti-)Spike-Timing-Dependent Plasticity

IEEE Trans Neural Netw Learn Syst. 2022 Nov;33(11):6640-6651. doi: 10.1109/TNNLS.2021.3082911. Epub 2022 Oct 27.

Abstract

We propose a complete hardware-based architecture of multilayer neural networks (MNNs), including electronic synapses, neurons, and peripheral circuitry, to implement the supervised learning (SL) algorithm of the extended remote supervised method (ReSuMe). In this system, a complementary pair of n- and p-type memtransistors (C-MTs) serves as the electrical synapse. By applying the spike-timing-dependent plasticity (STDP) learning rule to the memtransistor connecting the presynaptic neuron to the output neuron, and the opposite anti-STDP rule to the other memtransistor connecting the presynaptic neuron to the teacher neuron, extended ReSuMe with multiple layers is realized without the complicated supervising modules required in previous approaches. As a result, both the chip area of the C-MT-based design and the power consumption of the learning circuit for the weight-updating operation are drastically reduced compared with conventional single-memtransistor (S-MT)-based designs. Two typical benchmarks, the linearly nonseparable XOR problem and recognition on the Modified National Institute of Standards and Technology (MNIST) database, have been successfully tackled with the proposed MNN system, and the impact of nonideal factors of realistic devices has been evaluated.
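For context, the ReSuMe-style weight update that the complementary memtransistor pair realizes in hardware can be summarized in software as in the minimal sketch below. The exponential plasticity window, its parameter values, and the direct summation over spike pairs are illustrative assumptions rather than device-measured behavior, and the mapping of the two plasticity windows onto the individual n- and p-type conductances follows the paper's circuit and is not reproduced here.

```python
import numpy as np

# Illustrative STDP window parameters (assumed values, not device data).
A_PLUS, A_MINUS = 0.01, 0.012     # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # time constants (ms)

def stdp_window(dt):
    """STDP kernel W(dt): potentiation if the postsynaptic spike follows
    the presynaptic one (dt > 0), depression otherwise."""
    if dt > 0:
        return A_PLUS * np.exp(-dt / TAU_PLUS)
    return -A_MINUS * np.exp(dt / TAU_MINUS)

def resume_delta_w(pre_spikes, output_spikes, teacher_spikes):
    """ReSuMe-style net weight change for one synapse: teacher (desired)
    spikes drive an STDP-shaped update, while actual output spikes drive
    the opposite (anti-STDP) update, so the weight moves until the output
    spike train matches the teacher's."""
    dw = 0.0
    for t_pre in pre_spikes:
        # Potentiate toward desired (teacher) spikes.
        dw += sum(stdp_window(t_d - t_pre) for t_d in teacher_spikes)
        # Depress for spikes the output neuron actually emits.
        dw -= sum(stdp_window(t_o - t_pre) for t_o in output_spikes)
    return dw

# Example: the teacher fires 5 ms after the presynaptic spike while the
# output neuron fires 30 ms later, so the net weight change is positive.
print(resume_delta_w(pre_spikes=[10.0],
                     output_spikes=[40.0],
                     teacher_spikes=[15.0]))
```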

MeSH terms

  • Models, Neurological*
  • Neural Networks, Computer*
  • Neuronal Plasticity / physiology
  • Supervised Machine Learning
  • Synapses / physiology