Extended Hamiltonian learning on Riemannian manifolds: theoretical aspects

IEEE Trans Neural Netw. 2011 May;22(5):687-700. doi: 10.1109/TNN.2011.2109395. Epub 2011 Mar 22.

Abstract

This paper introduces a general theory of extended Hamiltonian (second-order) learning on Riemannian manifolds, as an instance of learning by constrained criterion optimization. The dynamical learning equations are derived within the general framework of the extended-Hamiltonian stationary-action principle and are expressed in a coordinate-free fashion. A theoretical analysis is carried out to compare the features of the dynamical learning theory with those exhibited by gradient-based theories. In particular, gradient-based learning is shown to be an instance of dynamical learning, and classical gradient-based learning augmented with a "momentum" term is shown to resemble discrete-time dynamical learning. Moreover, the convergence features of gradient-based and dynamical learning are compared on a theoretical basis. This paper discusses cases of learning by dynamical systems on manifolds of interest in the scientific literature, namely, the Stiefel manifold, the special orthogonal group, the Grassmann manifold, the group of symmetric positive definite matrices, the generalized flag manifold, and the real symplectic group of matrices.
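The link between momentum-augmented gradient learning and discrete-time second-order dynamics can be illustrated with a minimal Euclidean sketch (the paper works on Riemannian manifolds; the criterion `f`, the damping coefficient `mu`, and all parameter values below are illustrative assumptions, not the paper's algorithm). Discretizing the damped second-order flow x'' + mu x' = -grad f(x) with a forward-Euler step recovers the familiar "heavy-ball" momentum update:

```python
import numpy as np

# Hypothetical quadratic criterion f(x) = 0.5 x^T A x - b^T x (illustrative only).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad_f(x):
    return A @ x - b

def momentum_descent(x0, step=0.1, momentum=0.8, iters=300):
    """Gradient descent with a momentum term, i.e. a forward-Euler
    discretization of the second-order dynamics x'' + mu x' = -grad f(x),
    where momentum = 1 - mu * dt and step = dt**2 (up to rescaling)."""
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(iters):
        v = momentum * v - step * grad_f(x)  # discrete-time velocity (momentum) update
        x = x + v                            # position update
    return x

x_star = np.linalg.solve(A, b)               # exact minimizer, for comparison
x_hat = momentum_descent(np.zeros(2))
print(np.allclose(x_hat, x_star, atol=1e-6))
```

With momentum set to 0 the update reduces to plain gradient descent, matching the paper's observation that gradient-based learning is a special (first-order) instance of the dynamical scheme.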

Publication types

  • Comparative Study

MeSH terms

  • Algorithms*
  • Artificial Intelligence*
  • Computer Simulation
  • Mathematical Concepts
  • Models, Theoretical*
  • Neural Networks, Computer*