Nonsynaptic Error Backpropagation in Long-Term Cognitive Networks

IEEE Trans Neural Netw Learn Syst. 2020 Mar;31(3):865-875. doi: 10.1109/TNNLS.2019.2910555. Epub 2019 May 6.

Abstract

We introduce a neural cognitive mapping technique named long-term cognitive network (LTCN) that is able to memorize long-term dependencies between a sequence of input and output vectors, especially in scenarios that require predicting the values of multiple dependent variables simultaneously. The proposed technique extends a recently proposed method named short-term cognitive network, which aims to preserve the expert knowledge encoded in the weight matrix while optimizing the nonlinear mapping provided by each neuron's transfer function. A nonsynaptic, backpropagation-based learning algorithm powered by stochastic gradient descent is put forward to iteratively optimize four parameters of the generalized sigmoid transfer function associated with each neuron. Numerical simulations over 35 multivariate regression and pattern completion data sets confirm that the proposed LTCN algorithm attains statistically significant performance differences with respect to other well-known state-of-the-art methods.
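The core idea of nonsynaptic learning can be illustrated with a small sketch: the weight matrix is frozen (it encodes expert knowledge), and stochastic gradient descent adjusts only the four parameters of a generalized sigmoid attached to each neuron. The sketch below is a minimal toy illustration, not the paper's implementation; the parameter names (lower asymptote `q`, upper asymptote `v`, slope `lam`, offset `h`) and the single-layer setup are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def gen_sigmoid(z, q, v, lam, h):
    """Generalized sigmoid with four tunable parameters per neuron:
    q (lower asymptote), v (upper asymptote), lam (slope), h (offset).
    Names are illustrative, not the paper's notation."""
    s = 1.0 / (1.0 + np.exp(-lam * (z - h)))
    return q + (v - q) * s, s

def param_grads(z, q, v, lam, h, s):
    # Partial derivatives of the generalized sigmoid w.r.t. its parameters.
    ds = s * (1.0 - s)
    return (1.0 - s,                  # d f / d q
            s,                        # d f / d v
            (v - q) * ds * (z - h),   # d f / d lam
            -(v - q) * ds * lam)      # d f / d h

# Toy setup: a fixed weight matrix that is never updated (nonsynaptic learning).
n_in, n_out, n_samples = 4, 3, 64
W = rng.normal(size=(n_out, n_in))                 # frozen "knowledge" weights
X = rng.normal(size=(n_samples, n_in))
Y = np.tanh(X @ rng.normal(size=(n_in, n_out)))    # synthetic targets

# One set of four transfer-function parameters per output neuron.
q = np.full(n_out, -1.0)
v = np.full(n_out, 1.0)
lam = np.ones(n_out)
h = np.zeros(n_out)

def mse():
    y_hat, _ = gen_sigmoid(X @ W.T, q, v, lam, h)
    return np.mean((y_hat - Y) ** 2)

loss_before = mse()
lr = 0.02
for _ in range(500):
    i = rng.integers(n_samples)            # stochastic gradient descent: one sample
    z = W @ X[i]
    y_hat, s = gen_sigmoid(z, q, v, lam, h)
    err = 2.0 * (y_hat - Y[i]) / n_out     # d MSE / d y_hat for this sample
    gq, gv, glam, gh = param_grads(z, q, v, lam, h, s)
    q -= lr * err * gq                     # only sigmoid parameters move;
    v -= lr * err * gv                     # W stays fixed throughout
    lam -= lr * err * glam
    h -= lr * err * gh
loss_after = mse()
```

Because only the transfer-function parameters are trained, the error signal backpropagates through the (fixed) weights into the four parameter gradients of each neuron, which is what distinguishes this scheme from conventional synaptic weight learning.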