Synaptic balancing: A biologically plausible local learning rule that provably increases neural network noise robustness without sacrificing task performance

PLoS Comput Biol. 2022 Sep 19;18(9):e1010418. doi: 10.1371/journal.pcbi.1010418. eCollection 2022 Sep.

Abstract

We introduce a novel, biologically plausible local learning rule that provably increases the robustness of neural dynamics to noise in nonlinear recurrent neural networks with homogeneous nonlinearities. Our learning rule achieves higher noise robustness without sacrificing task performance and without requiring any knowledge of the particular task. The plasticity dynamics, an integrable dynamical system operating on the weights of the network, maintains a multiplicity of conserved quantities, most notably the network's entire temporal map from input to output trajectories. The outcome of our learning rule is a synaptic balancing between the incoming and outgoing synapses of every neuron. This synaptic balancing rule is consistent with many known aspects of experimentally observed heterosynaptic plasticity, and moreover makes new, experimentally testable predictions relating plasticity at the incoming and outgoing synapses of individual neurons. Overall, this work provides a practical local learning rule that exactly preserves overall network function and, in doing so, builds new conceptual bridges between the disparate worlds of the neurobiology of heterosynaptic plasticity, the engineering of regularized noise-robust networks, and the mathematics of integrable Lax dynamical systems.
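
The sketch below illustrates the kind of function-preserving rescaling the abstract describes, not the authors' actual implementation: for a recurrent network with a homogeneous nonlinearity (e.g., ReLU), multiplying neuron i's incoming weights by exp(g_i) and its outgoing weights by exp(-g_i) leaves every input-to-output trajectory unchanged, so a local update that nudges g_i toward equal incoming and outgoing synaptic strength "balances" each neuron without altering network function. All names, the network architecture, and the particular squared-strength update are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 8, 50                                   # neurons, timesteps
W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))    # recurrent weights (W[i, j]: j -> i)
w_in = rng.normal(size=N)                      # input weights
w_out = rng.normal(size=N)                     # readout weights

def relu(x):
    # Homogeneous nonlinearity: relu(a * x) = a * relu(x) for a > 0
    return np.maximum(x, 0.0)

def run(W, w_in, w_out, u):
    """Simulate a discrete-time rate RNN and return the readout trajectory."""
    x = np.zeros(N)
    ys = []
    for t in range(T):
        x = relu(W @ x + w_in * u[t])
        ys.append(w_out @ x)
    return np.array(ys)

def balance(W, w_in, w_out, lr=0.05, steps=500):
    """Illustrative local balancing: each neuron adjusts a gain g_i so that its
    total (squared) outgoing synaptic strength matches its incoming strength.
    The induced rescaling W_ij -> exp(g_i) * W_ij * exp(-g_j) preserves the
    network's input-output map for positively homogeneous nonlinearities."""
    g = np.zeros(N)
    for _ in range(steps):
        Wb = np.exp(g)[:, None] * W * np.exp(-g)[None, :]
        w_in_b = np.exp(g) * w_in
        w_out_b = w_out * np.exp(-g)
        incoming = (Wb ** 2).sum(axis=1) + w_in_b ** 2   # strength flowing into each neuron
        outgoing = (Wb ** 2).sum(axis=0) + w_out_b ** 2  # strength flowing out of each neuron
        g += lr * (outgoing - incoming)   # local: uses only each neuron's own synapses
    return (np.exp(g)[:, None] * W * np.exp(-g)[None, :],
            np.exp(g) * w_in,
            w_out * np.exp(-g))

u = rng.normal(size=T)                          # arbitrary input stream
y_before = run(W, w_in, w_out, u)
Wb, w_in_b, w_out_b = balance(W, w_in, w_out)
y_after = run(Wb, w_in_b, w_out_b, u)
print("max trajectory change:", np.abs(y_before - y_after).max())  # ~ machine precision
```

The preserved output trajectory reflects the conserved quantities mentioned in the abstract; what the balancing changes is the network's sensitivity to perturbations of individual units, which is where the claimed noise-robustness gain would come from.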

MeSH terms

  • Action Potentials / physiology
  • Learning / physiology
  • Models, Neurological*
  • Neural Networks, Computer
  • Neuronal Plasticity / physiology
  • Synapses / physiology
  • Task Performance and Analysis*

Grants and funding

The authors received no specific funding for this work.