A Geometric Interpretation of Stochastic Gradient Descent Using Diffusion Metrics

Entropy (Basel). 2020 Jan 15;22(1):101. doi: 10.3390/e22010101.

Abstract

This paper is a step toward developing a geometric understanding of stochastic gradient descent (SGD), a popular algorithm for training deep neural networks. We build on a recent result observing that the noise in SGD, when training typical networks, is highly non-isotropic. This motivates a deterministic model in which the trajectories of the dynamical system are described by geodesics of a family of metrics arising from a certain diffusion matrix, namely the covariance of the stochastic gradients in SGD. Our model is analogous to models in general relativity: the role played by the electromagnetic field in the latter is played, in the former, by the gradient of the loss function of the deep network.
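For concreteness, the following is a minimal sketch of the diffusion matrix mentioned above, under the standard assumption that the training loss is an average of per-sample losses; the symbols D, f_k, w, and N are our notation, not quoted from the paper:

```latex
% Assumption: the loss decomposes as an average over N per-sample losses,
%   f(w) = \frac{1}{N} \sum_{k=1}^{N} f_k(w).
% Then the diffusion matrix is the covariance of the stochastic gradients:
D(w) \;=\; \frac{1}{N} \sum_{k=1}^{N} \nabla f_k(w)\, \nabla f_k(w)^{\top}
\;-\; \nabla f(w)\, \nabla f(w)^{\top}.
```

Under this definition D(w) is positive semidefinite, and the non-isotropy of the SGD noise corresponds to D(w) being far from a scalar multiple of the identity.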

Keywords: deep learning; general relativity; stochastic gradient descent.