A stochastic variational framework for Recurrent Gaussian Processes models

Neural Netw. 2019 Apr;112:54-72. doi: 10.1016/j.neunet.2019.01.005. Epub 2019 Feb 1.

Abstract

Gaussian Process (GP) models have been successfully applied to the problem of learning from sequential observations. In this context, the family of Recurrent Gaussian Processes (RGPs) has recently been introduced, with a structure specifically designed to handle dynamical data. However, RGPs share a limitation with most GP approaches: they become computationally infeasible for very large datasets. In the present work, with the aim of improving scalability, we modify the original variational approach used with RGPs to enable inference via stochastic mini-batch optimization, giving rise to the Stochastic Recurrent Variational Bayes (S-REVARB) framework. We review the recent related literature and situate our approach within it. Moreover, we propose two learning procedures, the Local and Global S-REVARB algorithms, which prevent the computational cost from scaling with the number of training samples. The global variant permits even greater scalability by also preventing the number of variational parameters from growing with the training set, through the use of neural networks as sequential recognition models. The proposed framework is evaluated on the task of dynamical system identification for large-scale datasets, a scenario not readily supported by standard batch inference for RGPs. The promising results indicate that S-REVARB opens up the possibility of applying powerful hierarchical recurrent GP-based models to massive sequential data.
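To make the abstract's central idea concrete, below is a minimal, hypothetical sketch (in PyTorch) of stochastic variational inference with mini-batches and a neural recognition model for the latent states, in the spirit of the Global S-REVARB variant described above. Everything here is an illustrative assumption rather than the authors' implementation: the names (RecognitionNet, recog, decoder), the toy autoregressive data, and the small decoder network standing in for the GP layers (which in the actual framework would be sparse GPs with variational inducing points).

    # Minimal sketch: mini-batch stochastic variational inference with an
    # amortized (neural) recognition model. NOT the authors' S-REVARB code.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Toy sequential data: y_t depends nonlinearly on y_{t-1} and y_{t-2}.
    T = 2000
    y = torch.zeros(T)
    for t in range(2, T):
        y[t] = 0.8 * torch.sin(y[t - 1]) - 0.3 * y[t - 2] \
               + 0.05 * torch.randn(1).item()

    L = 2                                              # autoregressive window
    X = torch.stack([y[t - L:t] for t in range(L, T)]) # (T-L, L) past windows
    Y = y[L:].unsqueeze(-1)                            # (T-L, 1) targets

    class RecognitionNet(nn.Module):
        # Maps an observed window to the mean and log-variance of q(h_t),
        # so the variational parameter count stays fixed as the dataset grows.
        def __init__(self, window, hidden=32):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(window, hidden), nn.Tanh(),
                                     nn.Linear(hidden, 2))

        def forward(self, x):
            mu, logvar = self.net(x).chunk(2, dim=-1)
            return mu, logvar

    recog = RecognitionNet(L)
    # A small neural net stands in for the GP transition/emission layers,
    # purely to keep the sketch short and self-contained.
    decoder = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
    opt = torch.optim.Adam(list(recog.parameters()) +
                           list(decoder.parameters()), lr=1e-2)

    N, batch = X.shape[0], 64
    for step in range(500):
        idx = torch.randint(0, N, (batch,))
        xb, yb = X[idx], Y[idx]
        mu, logvar = recog(xb)
        # Reparameterization trick: sample latent states differentiably.
        h = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        pred = decoder(h)
        # Squared-error reconstruction term (Gaussian likelihood up to
        # constants) and KL(q(h) || N(0, I)) for the sampled mini-batch.
        recon = -((yb - pred) ** 2).sum()
        kl = -0.5 * (1 + logvar - mu ** 2 - logvar.exp()).sum()
        # Rescale by N / batch so the mini-batch objective is an unbiased
        # estimate of the full-data evidence lower bound.
        loss = (N / batch) * (kl - recon)
        opt.zero_grad()
        loss.backward()
        opt.step()

The two design points mirrored from the abstract are the N / batch rescaling, which keeps the stochastic gradient unbiased with respect to the full-data bound, and the amortization via the recognition network, which is what allows the number of variational parameters to stay constant as the training set grows.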

Keywords: Dynamical modeling; Gaussian Processes; Stochastic learning; Variational inference.

MeSH terms

  • Algorithms
  • Bayes Theorem
  • Learning*
  • Neural Networks, Computer*
  • Normal Distribution
  • Stochastic Processes*