Machine learning of independent conservation laws through neural deflation

Phys Rev E. 2023 Aug;108(2):L022301. doi: 10.1103/PhysRevE.108.L022301.

Abstract

We introduce "neural deflation," a methodology for seeking conservation laws within a Hamiltonian dynamical system. Inspired by deflation methods for steady states of dynamical systems, we iteratively train a sequence of neural networks to minimize a regularized loss function that requires the learned conserved quantities to be in involution and enforces their functional independence, consistently in the infinite-sample limit. The method is applied to a series of integrable and nonintegrable lattice differential-difference equations. In the former, the predicted number of conservation laws grows extensively with the number of degrees of freedom, while in the latter it generically saturates at a threshold related to the number of conserved quantities in the system. This data-driven tool could prove valuable in assessing a model's conserved quantities and its potential integrability.
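The abstract describes the method only at a high level. A minimal illustrative sketch of the two loss ingredients it names — an involution residual built from the canonical Poisson bracket, and a deflation-style penalty enforcing functional independence from previously found quantities — might look like the following. This is our own construction, not the authors' code: it uses finite-difference gradients in place of neural-network autodifferentiation, a simple callable in place of a trained network, and an illustrative choice of deflation penalty.

```python
import numpy as np

def grad(f, z, eps=1e-6):
    """Central-difference gradient of a scalar function f at phase-space point z."""
    g = np.zeros_like(z)
    for i in range(len(z)):
        dz = np.zeros_like(z)
        dz[i] = eps
        g[i] = (f(z + dz) - f(z - dz)) / (2.0 * eps)
    return g

def poisson_bracket(f, g, z, J):
    """Canonical Poisson bracket {f, g}(z) = (grad f)^T J (grad g)."""
    return grad(f, z) @ J @ grad(g, z)

def deflated_loss(Q_new, H, found, samples, J, alpha=1.0):
    """Regularized loss for a candidate conserved quantity Q_new.

    Involution terms push {H, Q_new} and {Q_j, Q_new} toward zero at each
    sample; the deflation term blows up when grad Q_new falls into the span
    of the gradients of previously found quantities, enforcing functional
    independence. (Hypothetical penalty form, not the paper's exact one.)"""
    inv, dep = 0.0, 0.0
    for z in samples:
        g_new = grad(Q_new, z)
        inv += poisson_bracket(H, Q_new, z, J) ** 2
        for Qj in found:
            inv += poisson_bracket(Qj, Q_new, z, J) ** 2
        if found:
            G = np.stack([grad(Qj, z) for Qj in found])  # rows: prior gradients
            coef, *_ = np.linalg.lstsq(G.T, g_new, rcond=None)
            resid = g_new - G.T @ coef                   # component outside the span
            dep += alpha / (np.linalg.norm(resid) ** 2 + 1e-8)
    n = len(samples)
    return inv / n + dep / n

# Toy example: single harmonic oscillator, z = (q, p), H = (q^2 + p^2) / 2.
J = np.array([[0.0, 1.0], [-1.0, 0.0]])  # canonical symplectic matrix
H = lambda z: 0.5 * (z[0] ** 2 + z[1] ** 2)
z0 = np.array([1.0, 2.0])
print(poisson_bracket(H, H, z0, J))      # ~0: H Poisson-commutes with itself
```

In the actual method, `Q_new` would be a neural network trained by gradient descent on a loss of this type over sampled phase-space points, and the outer loop would repeat, appending each converged network to `found`, until the loss can no longer be driven to zero — giving the predicted count of independent conserved quantities.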