Fluctuation-dissipation-type theorem in stochastic linear learning

Phys Rev E. 2021 Sep;104(3-1):034126. doi: 10.1103/PhysRevE.104.034126.

Abstract

The fluctuation-dissipation theorem (FDT) is a simple yet powerful consequence of the first-order differential equation governing the dynamics of systems subject simultaneously to dissipative and stochastic forces. Linear learning dynamics, in which an input vector is mapped to an output vector by a matrix whose elements are the subject of learning, has a stochastic version that closely mimics Langevin dynamics when the full-batch gradient descent scheme is replaced by stochastic gradient descent. We derive a generalized FDT for the stochastic linear learning dynamics and verify its validity on well-known machine learning data sets such as MNIST, CIFAR-10, and EMNIST.
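To make the stochastic linear learning setting concrete, the sketch below trains a linear map with minibatch SGD on synthetic data; it is not the paper's actual procedure, and the dataset, loss normalization, and hyperparameters are assumptions for illustration only. The point is that replacing the full-batch gradient with a minibatch estimate turns the deterministic gradient flow into a noisy, Langevin-like dynamics.

```python
import numpy as np

# Minimal sketch (assumptions, not the paper's setup): minibatch SGD on a
# linear map y = W x with mean-squared-error loss. The gap between the
# minibatch gradient and the full-batch gradient acts as the stochastic
# force, while the full-batch gradient provides the dissipative drift.

rng = np.random.default_rng(0)

# Synthetic data standing in for a real dataset such as MNIST (assumption).
n_samples, d_in, d_out = 1000, 20, 5
X = rng.normal(size=(n_samples, d_in))
W_true = rng.normal(size=(d_out, d_in))
Y = X @ W_true.T + 0.1 * rng.normal(size=(n_samples, d_out))

W = np.zeros((d_out, d_in))            # matrix elements being learned
lr, batch_size, n_steps = 0.01, 32, 5000

for step in range(n_steps):
    # Minibatch gradient of the loss 0.5 * ||X_b W^T - Y_b||^2 / batch_size
    idx = rng.choice(n_samples, size=batch_size, replace=False)
    Xb, Yb = X[idx], Y[idx]
    grad = (Xb @ W.T - Yb).T @ Xb / batch_size
    # SGD update: drift toward the loss minimum plus minibatch noise.
    W -= lr * grad

print("residual weight error:", np.linalg.norm(W - W_true))
```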