Contrastive learning through non-equilibrium memory

ArXiv [Preprint]. 2023 Dec 29:arXiv:2312.17723v1.

Abstract

Learning algorithms based on backpropagation have enabled transformative technological advances, but alternatives based on local energy-based rules offer benefits in terms of biological plausibility and decentralized training. A broad class of such local learning rules involves contrasting a clamped configuration with the free, spontaneous behavior of the system. However, comparing clamped and free configurations requires explicit memory or switching between Hebbian and anti-Hebbian modes. Here, we show how a simple form of implicit non-equilibrium memory in the update dynamics of each "synapse" of a network naturally allows for contrastive learning. During training, free and clamped behaviors are shown in sequence over time using a sawtooth-like temporal protocol that breaks the symmetry between the two behaviors when combined with non-equilibrium update dynamics at each synapse. We show that the needed dynamics are implicit in integral feedback control, broadening the range of physical and biological systems naturally capable of contrastive learning. Finally, we show that non-equilibrium dissipation improves learning quality, and we determine the Landauer energy cost of contrastive learning through physical dynamics.
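The general mechanism described in the abstract can be illustrated with a minimal toy simulation. This sketch is my own construction, not the paper's actual model: a single "synapse" sees a Hebbian drive h(t) driven by a sawtooth protocol (a fast jump from the free value to the clamped value, then a slow ramp back), keeps a slow low-pass memory m of h in the style of integral feedback, and applies a thresholded weight update on the mismatch h - m. The function name `run_protocol`, the threshold nonlinearity, and all constants are illustrative assumptions.

```python
def run_protocol(h_free=0.2, h_clamp=1.0, cycles=50,
                 hold_steps=500, ramp_steps=2000,
                 dt=1e-3, tau=0.05, theta=0.1, eta=1.0):
    """Toy sawtooth protocol: fast switch to the clamped drive, slow ramp
    back to the free drive. Returns the accumulated weight change."""
    w = 0.0       # synaptic weight
    m = h_free    # slow memory of the drive (integral-feedback style)
    for _ in range(cycles):
        # Clamped phase: h jumps instantly, m lags behind, so the
        # mismatch h - m is large and the update fires.
        for _ in range(hold_steps):
            h = h_clamp
            if abs(h - m) > theta:         # thresholded, non-equilibrium update
                w += eta * (h - m) * dt
            m += (h - m) * dt / tau        # m slowly relaxes toward h
        # Free phase: h ramps back slowly enough that m tracks it and
        # the mismatch stays below threshold -- no update.
        for k in range(ramp_steps):
            h = h_clamp + (h_free - h_clamp) * (k + 1) / ramp_steps
            if abs(h - m) > theta:
                w += eta * (h - m) * dt
            m += (h - m) * dt / tau
    return w

# The slow ramp is invisible to the synapse, while the fast jump is not,
# so the net weight change has the sign of h_clamp - h_free: a contrastive
# update obtained without explicit memory of the free configuration.
dw_pos = run_protocol(h_free=0.2, h_clamp=1.0)
dw_neg = run_protocol(h_free=1.0, h_clamp=0.2)
```

Because only the fast side of the sawtooth exceeds the threshold, the time asymmetry of the protocol (not a Hebbian/anti-Hebbian mode switch) determines which phase is learned from, which is the symmetry-breaking role the abstract assigns to the non-equilibrium update dynamics.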

Publication types

  • Preprint