Two-Stage Bayesian Optimization for Scalable Inference in State-Space Models

IEEE Trans Neural Netw Learn Syst. 2022 Oct;33(10):5138-5149. doi: 10.1109/TNNLS.2021.3069172. Epub 2022 Oct 5.

Abstract

State-space models (SSMs) are a rich class of dynamical models with a wide range of applications in economics, healthcare, computational biology, robotics, and more. Proper analysis, control, learning, and decision-making in dynamical systems modeled by SSMs depend on the accuracy of the inferred/learned model. Most existing inference techniques for SSMs can handle only very small systems and cannot be applied to most large-scale practical problems. To address this, this article introduces a two-stage Bayesian optimization (BO) framework for scalable and efficient inference in SSMs. The proposed framework maps the original large parameter space to a reduced space consisting of a small number of linear combinations of the original parameters. This reduced space, which captures most of the variability in the inference function (e.g., the log-likelihood or log-posterior), is obtained by eigenvalue decomposition of the covariance of the gradients of the inference function, approximated by a particle filtering scheme. An exponential reduction of the parameter search space during inference is then achieved through the proposed two-stage BO policy, where the solution of the first-stage BO in the reduced space specifies the search space of the second-stage BO in the original space. The proposed framework's accuracy and speed are demonstrated through several experiments, including real metagenomics data from a gut microbial community.
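The subspace-reduction step described above can be sketched in a few lines. The following is a minimal, illustrative sketch (not the authors' implementation): it uses a toy quadratic objective in place of the particle-filter-approximated log-likelihood, forms the empirical covariance of its gradients over sampled parameters, and keeps the leading eigenvectors as the reduced space in which the first-stage BO would search. All names and the toy objective are assumptions of this sketch.

```python
import numpy as np

# Toy stand-in for the inference function (e.g., log-likelihood); in the
# paper its gradients are approximated by a particle filtering scheme --
# here a simple quadratic that varies mostly along one direction.
rng = np.random.default_rng(0)
d = 10                      # original parameter dimension (illustrative)
a = rng.normal(size=d)      # dominant direction of variability (assumed)

def grad_log_lik(theta):
    # Gradient of -(a . theta)^2 - 0.01 * ||theta||^2
    return -2.0 * (a @ theta) * a - 0.02 * theta

# Estimate the covariance of gradients over sampled parameter points and
# eigendecompose it; the leading eigenvectors span the reduced space.
samples = rng.normal(size=(200, d))
G = np.stack([grad_log_lik(t) for t in samples])
C = G.T @ G / len(G)
eigvals, eigvecs = np.linalg.eigh(C)        # ascending order
order = np.argsort(eigvals)[::-1]           # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 2                                       # reduced dimension (a choice)
W = eigvecs[:, :k]                          # columns span the reduced space

# First-stage BO would search over z in R^k; a candidate z is lifted back
# to the original space as theta = W z, and the second-stage BO refines
# the search in the original space around that solution.
z = rng.normal(size=k)
theta = W @ z
```

Because the toy gradient is dominated by the rank-one term along `a`, the leading eigenvector of `C` aligns closely with `a`, which is exactly the behavior the reduction relies on: directions along which the inference function barely changes receive near-zero eigenvalues and are dropped.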

Publication types

  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Bayes Theorem
  • Computational Biology* / methods
  • Neural Networks, Computer*
  • Probability
  • Space Simulation