Biologically-inspired neuronal adaptation improves learning in neural networks

Commun Integr Biol. 2023 Jan 17;16(1):2163131. doi: 10.1080/19420889.2022.2163131. eCollection 2023.

Abstract

Since humans still outperform artificial neural networks on many tasks, drawing inspiration from the brain may help to improve current machine learning algorithms. Contrastive Hebbian learning (CHL) and equilibrium propagation (EP) are biologically plausible algorithms that update weights using only local information (without explicitly calculating gradients) and still achieve performance comparable to that of conventional backpropagation. In this study, we augmented CHL and EP with Adjusted Adaptation, inspired by the adaptation effect observed in neurons, in which a neuron's response to a given stimulus is adjusted shortly after stimulus onset. We added this adaptation feature to multilayer perceptrons and convolutional neural networks trained on MNIST and CIFAR-10. Surprisingly, adaptation improved the performance of these networks. We discuss the biological inspiration for this idea and investigate why neuronal adaptation could be an important brain mechanism to improve the stability and accuracy of learning.
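To make the local-update idea concrete, the sketch below (not the authors' code) shows a contrastive-Hebbian-style weight update on a tiny one-hidden-layer network: activities settle in a "free" phase and a "clamped" phase, and weights change in proportion to the difference between clamped and free correlations, using only locally available activity. The per-unit `gain` variable and its update rule are a hypothetical stand-in for an adaptation mechanism that reduces a unit's responsiveness after strong recent activity; it is not the paper's Adjusted Adaptation rule.

```python
# Minimal NumPy sketch of a CHL-style local update with a hypothetical
# adaptation gain on hidden units (illustrative only, not the paper's method).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny one-hidden-layer network: input -> hidden -> output.
n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(0, 0.1, (n_in, n_hid))
W2 = rng.normal(0, 0.1, (n_hid, n_out))

# Hypothetical adaptation state: a per-unit gain that shrinks when a unit
# has been strongly active recently (its response is "adjusted" over time).
gain = np.ones(n_hid)
adapt_rate = 0.1

def relax(x, y_clamped=None, steps=20):
    """Settle hidden/output activities by repeated bottom-up/top-down passes."""
    h = np.zeros(n_hid)
    y = np.zeros(n_out)
    for _ in range(steps):
        # Hidden units combine bottom-up and top-down input; gain applies adaptation.
        h = sigmoid(gain * (x @ W1 + y @ W2.T))
        y = y_clamped if y_clamped is not None else sigmoid(h @ W2)
    return h, y

def chl_step(x, target, lr=0.05):
    global W1, W2, gain
    # Free phase: output units evolve freely.
    h_free, y_free = relax(x)
    # Clamped phase: output units are fixed to the target.
    h_clamp, y_clamp = relax(x, y_clamped=target)
    # Local, Hebbian-style updates: clamped correlations minus free correlations.
    W1 += lr * np.outer(x, h_clamp - h_free)
    W2 += lr * (np.outer(h_clamp, y_clamp) - np.outer(h_free, y_free))
    # Illustrative adaptation rule: lower the gain of recently active units.
    gain = (1 - adapt_rate) * gain + adapt_rate * (1.0 - h_free)

# Toy usage: learn to map a fixed input pattern to a one-hot target.
x = np.array([1.0, 0.0, 1.0, 0.0])
target = np.array([1.0, 0.0])
for _ in range(100):
    chl_step(x, target)
print("free-phase output after training:", relax(x)[1])
```

The key point of the sketch is that each weight change depends only on the pre- and postsynaptic activities of the connected units in the two phases, with no explicit gradient computation; the adaptation gain simply modulates how strongly each hidden unit responds on subsequent presentations.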

Keywords: Bio-plausible neural networks; contrastive Hebbian learning; equilibrium propagation; neuronal adaptation.

Grants and funding

This study is supported by NSERC DG, Compute Canada, and CIHR Project grants to AL.