Statistical guarantees for regularized neural networks

Neural Netw. 2021 Oct;142:148-161. doi: 10.1016/j.neunet.2021.04.034. Epub 2021 Apr 30.

Abstract

Neural networks have become standard tools in data analysis, but they still lack a comprehensive mathematical theory. For example, there are very few statistical guarantees for learning neural networks from data, especially for classes of estimators that are used in practice or at least resemble them. In this paper, we develop a general statistical guarantee for estimators that consist of a least-squares term and a regularizer. We then exemplify this guarantee with ℓ1-regularization, showing that the corresponding prediction error increases at most logarithmically in the total number of parameters and can even decrease in the number of layers. Our results establish a mathematical basis for the regularized estimation of neural networks, and they deepen our mathematical understanding of neural networks and deep learning more generally.
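To make the estimator class concrete, here is a minimal sketch of the regularized least-squares estimators the abstract describes, specialized to the ℓ1 case. The notation (f_θ for a network with parameter vector θ, λ for the tuning parameter, n for the sample size) is ours for illustration, not taken from the paper:

\[
\hat{\theta} \in \operatorname*{arg\,min}_{\theta} \; \sum_{i=1}^{n} \bigl( y_i - f_{\theta}(x_i) \bigr)^2 + \lambda \lVert \theta \rVert_1
\]

Here (x_1, y_1), …, (x_n, y_n) are the observed data, f_θ denotes the neural network indexed by its parameters θ, and λ ≥ 0 trades off the least-squares fit against the ℓ1 regularizer; the paper's prediction bound, logarithmic in the total number of parameters, concerns estimators of this form.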

Keywords: Deep learning; Neural networks; Prediction guarantees; Regularization.

MeSH terms

  • Least-Squares Analysis
  • Mathematics
  • Neural Networks, Computer*