Theory of overparametrization in quantum neural networks

Nat Comput Sci. 2023 Jun;3(6):542-551. doi: 10.1038/s43588-023-00467-6. Epub 2023 Jun 26.

Abstract

The prospect of achieving quantum advantage with quantum neural networks (QNNs) is exciting. Understanding how QNN properties (for example, the number of parameters M) affect the loss landscape is crucial to designing scalable QNN architectures. Here we rigorously analyze the overparametrization phenomenon in QNNs, defining overparametrization as the regime where the QNN has more than a critical number of parameters Mc, allowing it to explore all relevant directions in state space. Our main results show that the dimension of the Lie algebra obtained from the generators of the QNN is an upper bound for Mc, and for the maximal rank that the quantum Fisher information and Hessian matrices can reach. Underparametrized QNNs have spurious local minima in the loss landscape that start disappearing when M ≥ Mc. Thus, the onset of overparametrization corresponds to a computational phase transition where the QNN trainability is greatly improved. We then connect the notion of overparametrization to the QNN capacity, showing that when a QNN is overparametrized, its capacity attains its maximum possible value.
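The central quantities in the abstract can be illustrated numerically. The sketch below (not from the paper; the generator choice, a transverse-field-Ising-style ansatz on two qubits, and all function names are assumptions for illustration) computes the dimension of the Lie algebra obtained by closing the QNN generators under commutators, then builds the quantum Fisher information matrix for a deliberately long (M = 12 parameter) circuit and checks that its rank never exceeds that dimension, as the stated bound predicts.

```python
import numpy as np

# Single-qubit Paulis
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron_all(ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def tfim_generators(n):
    """Assumed generator set: X on each qubit, ZZ on neighbouring pairs."""
    gens = [kron_all([X if j == i else I2 for j in range(n)]) for i in range(n)]
    gens += [kron_all([Z if j in (i, i + 1) else I2 for j in range(n)])
             for i in range(n - 1)]
    return gens

def dla_dimension(gens, tol=1e-10):
    """Dimension of the Lie algebra generated by `gens` (closure under [.,.])."""
    vecs, ops = [], []  # orthonormal vectorized basis and matching operators

    def add(op):
        v = op.reshape(-1).astype(complex)
        for b in vecs:              # Gram-Schmidt against the current span
            v = v - np.vdot(b, v) * b
        nrm = np.linalg.norm(v)
        if nrm > tol:
            vecs.append(v / nrm)
            ops.append(op)
            return True
        return False

    new = [g for g in gens if add(g)]
    while new:                      # keep commuting until the span stops growing
        snapshot, nxt = list(ops), []
        for a in new:
            for b in snapshot:
                c = a @ b - b @ a
                if add(c):
                    nxt.append(c)
        new = nxt
    return len(ops)

def u_of(H, theta):
    """exp(-i * theta * H) for Hermitian H, via eigendecomposition."""
    w, V = np.linalg.eigh(H)
    return (V * np.exp(-1j * theta * w)) @ V.conj().T

def qfi_matrix(gen_seq, thetas):
    """QFI of |psi> = prod_k exp(-i theta_k G_k)|0>, derivatives propagated forward."""
    d = gen_seq[0].shape[0]
    psi = np.zeros(d, dtype=complex)
    psi[0] = 1.0
    derivs = []
    for H, t in zip(gen_seq, thetas):
        U = u_of(H, t)
        derivs = [U @ dv for dv in derivs]  # push earlier derivatives through gate
        psi = U @ psi
        derivs.append(-1j * H @ psi)        # d/dtheta_k exp(-i theta_k H) = -iH exp(...)
    M = len(thetas)
    F = np.zeros((M, M))
    for j in range(M):
        for k in range(M):
            F[j, k] = 4 * np.real(np.vdot(derivs[j], derivs[k])
                                  - np.vdot(derivs[j], psi) * np.vdot(psi, derivs[k]))
    return F

dim_g = dla_dimension(tfim_generators(2))
rng = np.random.default_rng(0)
gen_seq = tfim_generators(2) * 4            # 4 layers -> M = 12 > dim_g
F = qfi_matrix(gen_seq, rng.uniform(0, 2 * np.pi, len(gen_seq)))
rank = np.linalg.matrix_rank(F, tol=1e-8)
print(f"dim(Lie algebra) = {dim_g}, rank(QFI) = {rank}")
```

Even though the circuit carries 12 parameters, the QFI rank saturates at the Lie-algebra dimension rather than at M, which is the overparametrization mechanism the abstract describes: adding parameters beyond that point adds no new independent directions in state space.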