Machine Learning Memory Kernels as Closure for Non-Markovian Stochastic Processes

IEEE Trans Neural Netw Learn Syst. 2022 Nov 14:PP. doi: 10.1109/TNNLS.2022.3210695. Online ahead of print.

Abstract

Finding the dynamical law of observable quantities lies at the core of physics. Within statistical mechanics, the generalized Langevin equation (GLE) provides a general model for the evolution of observables, covering a wide range of physical systems with many degrees of freedom and an inherently stochastic nature. Although formally exact, the GLE brings its own challenges: it depends on the complete history of the observables under scrutiny, as well as on the microscopic degrees of freedom, all of which are often inaccessible. We show that these drawbacks can be overcome by adopting elements of machine learning, in particular by coupling a multilayer perceptron (MLP) to the formal structure of the GLE and calibrating the MLP with empirical data. This yields a powerful computational tool capable of describing noisy complex systems beyond the realm of statistical mechanics. The approach is exemplified with representative examples from different fields, from a single colloidal particle and particle chains in a thermal bath to climatology and finance, showing in all cases excellent agreement with the actual observable dynamics. The new framework offers an alternative perspective for the study of nonequilibrium processes and opens a new route for stochastic modeling.
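
For orientation, the GLE referred to above is commonly written in its Mori-Zwanzig form (quoted here as the standard textbook expression, not reproduced from the paper): for an observable A(t),

    \dot{A}(t) = \Omega\, A(t) - \int_0^t K(t-\tau)\, A(\tau)\, \mathrm{d}\tau + F(t),

where K is the memory kernel that encodes the history dependence and F(t) is the fluctuating force. A minimal sketch of the kind of closure the abstract describes, with an MLP standing in for K, might look as follows; the names (KernelMLP, gle_memory_term), the PyTorch parameterization, and the rectangle-rule discretization are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn

    # Illustrative sketch only: an MLP mapping a lag time to a memory-kernel
    # value K(t), used as the closure of a discretized GLE memory integral.
    class KernelMLP(nn.Module):
        def __init__(self, hidden=64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(1, hidden), nn.Tanh(),
                nn.Linear(hidden, hidden), nn.Tanh(),
                nn.Linear(hidden, 1),
            )

        def forward(self, lag):
            # lag: (n, 1) tensor of lag times; returns K(lag) with shape (n, 1)
            return self.net(lag)

    def gle_memory_term(kernel, history, dt):
        """Discretized memory integral: -sum_j K(j*dt) * A(t - j*dt) * dt."""
        n = history.shape[0]
        lags = dt * torch.arange(n, dtype=history.dtype).unsqueeze(1)
        return -(kernel(lags) * history).sum() * dt

    # The kernel's parameters would be calibrated against observed
    # trajectories, e.g. by matching predicted and measured increments of A(t).
    kernel = KernelMLP()
    history = torch.randn(200, 1)   # placeholder past values A(t - j*dt)
    memory_force = gle_memory_term(kernel, history, dt=0.01)

In such a sketch the MLP replaces an analytically unknown kernel, so the history dependence of the GLE can be evaluated from data without access to the microscopic degrees of freedom.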