Signatures of task learning in neural representations

Curr Opin Neurobiol. 2023 Dec;83:102759. doi: 10.1016/j.conb.2023.102759. Epub 2023 Sep 12.

Abstract

While neural plasticity has long been studied as the basis of learning, the growth of large-scale neural recording techniques provides a unique opportunity to study how learning-induced activity changes are coordinated across neurons within the same circuit. These distributed changes can be understood through the evolution of the geometry of neural manifolds and of the latent dynamics underlying new computations. In parallel, studies of multi-task and continual learning in artificial neural networks hint at a tradeoff between non-interference and compositionality as guiding principles for understanding how neural circuits flexibly support multiple behaviors. In this review, we highlight recent findings from both biological and artificial circuits that together form a new framework for understanding task learning at the population level.
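To make the notion of an evolving neural manifold concrete, the sketch below (not taken from the review; data, dimensionalities, and the alignment metric are illustrative assumptions) fits a low-dimensional principal subspace to simulated population activity from an "early" and a "late" learning session and asks how much late-session variance the early-session manifold still captures, one common way such geometric reorganization is quantified.

```python
# Illustrative sketch (assumptions, not the review's method): compare the
# low-dimensional manifold of a neural population before vs. after learning
# by measuring how much late-session variance lies in the early subspace.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_neurons, n_trials, n_latent = 80, 500, 5

def simulate_population(mixing):
    """Latent dynamics projected into neural space, plus private noise."""
    latents = rng.standard_normal((n_trials, n_latent))
    return latents @ mixing + 0.1 * rng.standard_normal((n_trials, n_neurons))

# "Early" and "late" sessions share part of their latent structure,
# mimicking a partial reorganization of the manifold with learning.
mixing_early = rng.standard_normal((n_latent, n_neurons))
mixing_late = 0.7 * mixing_early + 0.3 * rng.standard_normal((n_latent, n_neurons))

early = simulate_population(mixing_early)
late = simulate_population(mixing_late)

# Fit a low-dimensional "manifold" (principal subspace) to each session.
pca_early = PCA(n_components=n_latent).fit(early)
pca_late = PCA(n_components=n_latent).fit(late)

# Alignment index: fraction of late-session variance captured when
# projecting onto the early-session subspace (1 = manifold unchanged).
U_early = pca_early.components_.T              # (neurons x latent) basis
late_centered = late - late.mean(axis=0)
var_total = late_centered.var(axis=0).sum()
var_in_early = (late_centered @ U_early).var(axis=0).sum()
print(f"late-session variance captured by early manifold: {var_in_early / var_total:.2f}")
```

A value near 1 would indicate that learning reuses the pre-existing manifold, whereas lower values would indicate that activity has moved into new dimensions; related subspace-overlap measures appear throughout the population-geometry literature.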

Publication types

  • Review

MeSH terms

  • Learning* / physiology
  • Neural Networks, Computer*
  • Neuronal Plasticity / physiology
  • Neurons / physiology