Beyond spiking networks: The computational advantages of dendritic amplification and input segregation

Proc Natl Acad Sci U S A. 2023 Dec 5;120(49):e2220743120. doi: 10.1073/pnas.2220743120. Epub 2023 Nov 29.

Abstract

The brain can efficiently learn a wide range of tasks, motivating the search for biologically inspired learning rules that could improve current artificial intelligence technology. Most biological models are composed of point neurons and cannot achieve state-of-the-art performance in machine learning. Recent works have proposed that input segregation (neurons receiving sensory information and higher-order feedback in segregated compartments) and nonlinear dendritic computation would support error backpropagation in biological neurons. However, these approaches require propagating errors with a fine spatiotemporal structure to all the neurons, which is unlikely to be feasible in a biological network. To relax this assumption, we suggest that bursts and dendritic input segregation provide natural support for target-based learning, which propagates targets rather than errors. A coincidence mechanism between the basal and the apical compartments allows for generating high-frequency bursts of spikes. This architecture supports a burst-dependent learning rule, based on the comparison between the target bursting activity triggered by the teaching signal and the bursting caused by the recurrent connections alone, thereby providing support for target-based learning. We show that this framework can be used to efficiently solve spatiotemporal tasks, such as the context-dependent storage and recall of three-dimensional trajectories, and navigation tasks. Finally, we suggest that this neuronal architecture naturally allows for orchestrating "hierarchical imitation learning", enabling the decomposition of challenging long-horizon decision-making tasks into simpler subtasks. We show a possible implementation of this in a two-level network, where the high-level network produces the contextual signal for the low-level network.
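
To make the coincidence/burst mechanism and the target-based update more concrete, the short Python sketch below models a population of two-compartment units. All names, sizes, thresholds, the learning rate, and the exact form of the weight update are illustrative assumptions, not the implementation described in the paper; the sketch only conveys the two ideas stated in the abstract: a burst requires coincident basal and apical drive, and recurrent weights are adjusted so that recurrently driven bursting matches the teacher-driven (target) bursting.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical sizes and constants -- not taken from the paper.
    N_IN, N_REC = 20, 50   # input and recurrent population sizes
    ETA = 1e-3             # learning rate
    THR = 1.0              # threshold for "strong" compartmental drive

    W_in  = rng.normal(0.0, 0.1, (N_REC, N_IN))   # basal (sensory) weights, fixed here
    W_rec = rng.normal(0.0, 0.1, (N_REC, N_REC))  # recurrent weights, learned
    W_top = rng.normal(0.0, 0.1, (N_REC, N_REC))  # apical (teaching-signal) weights, fixed

    def bursts(basal_drive, apical_drive):
        """Coincidence mechanism: a unit emits a high-frequency burst
        (abstracted here as a 0/1 flag) only when its basal and apical
        drives are simultaneously above threshold."""
        return ((basal_drive > THR) & (apical_drive > THR)).astype(float)

    def train_step(W_rec, x, r_prev, teacher):
        """One illustrative burst-dependent, target-based update."""
        basal = W_in @ x + W_rec @ r_prev     # basal drive: sensory + recurrent input
        apical_teach = W_top @ teacher        # apical drive from the teaching signal

        target_burst = bursts(basal, apical_teach)   # bursting imposed by the teacher
        free_burst = bursts(basal, W_rec @ r_prev)   # bursting from recurrence alone

        # Nudge recurrent weights so that recurrently driven bursting
        # reproduces the teacher-driven (target) bursting.
        W_rec = W_rec + ETA * np.outer(target_burst - free_burst, r_prev)
        return W_rec, target_burst

    # Toy usage: random sensory input and a random teaching signal.
    r = np.zeros(N_REC)
    for _ in range(100):
        x = rng.normal(size=N_IN)
        teacher = rng.normal(size=N_REC)
        W_rec, r = train_step(W_rec, x, r, teacher)

In the actual model, bursts are high-frequency spike events in spiking neurons and plasticity acts on specific compartmental synapses; the 0/1 burst flags and the single outer-product update above are deliberate simplifications chosen to keep the example self-contained.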

Keywords: dendritic amplification; hierarchical imitation learning; pyramidal neuron; target-based learning.

MeSH terms

  • Action Potentials / physiology
  • Artificial Intelligence*
  • Brain / physiology
  • Machine Learning
  • Models, Neurological
  • Neurons* / physiology