Spike-Timing-Dependent Plasticity With Activation-Dependent Scaling for Receptive Fields Development

IEEE Trans Neural Netw Learn Syst. 2022 Oct;33(10):5215-5228. doi: 10.1109/TNNLS.2021.3069683. Epub 2022 Oct 5.

Abstract

Spike-timing-dependent plasticity (STDP) is one of the most popular and most deeply biologically motivated forms of unsupervised Hebbian-type learning. In this article, we propose a variant of STDP extended by an additional activation-dependent scale factor. The resulting learning rule is an efficient algorithm that is simple to implement and applicable to spiking neural networks (SNNs). It is demonstrated that the proposed plasticity mechanism, combined with competitive learning, can serve as an effective mechanism for the unsupervised development of receptive fields (RFs). Furthermore, the relationship between synaptic scaling and lateral inhibition is explored in the context of the successful development of RFs. Specifically, we demonstrate that maintaining a high level of synaptic scaling followed by its rapid increase is crucial for the development of neuronal mechanisms of selectivity. The strength of the proposed solution is assessed in classification tasks performed on the Modified National Institute of Standards and Technology (MNIST) data set, with an accuracy of 94.65% (a single network) and 95.17% (a network committee), comparable to the state-of-the-art results of single-layer SNN architectures trained in an unsupervised manner. Furthermore, the training process leads to sparse data representation, and the developed RFs have the potential to serve as local feature detectors in multilayered spiking networks. We also prove theoretically that, when applied to linear Poisson neurons, our rule conserves total synaptic strength, guaranteeing the convergence of the learning process.
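The abstract's core idea, a trace-based STDP update multiplied by an activation-dependent scale factor, can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact rule: the specific scale function (a target-rate ratio), the parameter names (`a_plus`, `a_minus`, `target_rate`), and the trace formulation are all assumptions chosen for clarity.

```python
import numpy as np

def scaled_stdp_step(w, pre_trace, post_trace, pre_spikes, post_spike,
                     post_rate, a_plus=0.01, a_minus=0.012,
                     target_rate=0.1):
    """One weight update for a single postsynaptic neuron.

    Standard trace-based STDP (potentiate on post spike in proportion to
    the presynaptic trace; depress on pre spike in proportion to the
    postsynaptic trace) is multiplied by an activation-dependent scale.
    The scale used here, target_rate / post_rate, is a hypothetical
    choice: weight changes shrink for highly active neurons and grow for
    quiet ones, pushing firing rates toward target_rate.

    w           : (n_synapses,) weight vector
    pre_trace   : (n_synapses,) low-pass-filtered presynaptic spikes
    post_trace  : scalar low-pass-filtered postsynaptic spikes
    pre_spikes  : (n_synapses,) 0/1 presynaptic spikes this step
    post_spike  : 0/1 postsynaptic spike this step
    post_rate   : recent average firing rate of this neuron
    """
    scale = target_rate / (post_rate + 1e-9)  # activation-dependent factor
    dw = scale * (a_plus * pre_trace * post_spike
                  - a_minus * post_trace * pre_spikes)
    # Keep weights bounded, as is common for SNN feature learning.
    return np.clip(w + dw, 0.0, 1.0)

# Potentiation example: the post neuron fires right after synapse 0.
w = scaled_stdp_step(w=np.array([0.5, 0.5]),
                     pre_trace=np.array([1.0, 0.0]),
                     post_trace=0.0,
                     pre_spikes=np.array([0.0, 0.0]),
                     post_spike=1,
                     post_rate=0.1)
```

In a full network, this per-neuron update would run alongside a competitive (winner-take-all / lateral-inhibition) mechanism, which the abstract identifies as the other ingredient needed for RF development.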

MeSH terms

  • Learning / physiology
  • Models, Neurological*
  • Neural Networks, Computer*
  • Neuronal Plasticity / physiology
  • Neurons / physiology