Markov State Models: To Optimize or Not to Optimize

J Chem Theory Comput. 2024 Jan 23;20(2):977-988. doi: 10.1021/acs.jctc.3c01134. Epub 2024 Jan 1.

Abstract

Markov state models (MSMs) are a popular statistical method for analyzing the conformational dynamics of proteins, including protein folding. As with all statistical and machine learning (ML) models, choices must be made about the modeling pipeline that cannot be learned directly from the data. These choices, or hyperparameters, are often evaluated by expert judgment or, in the case of MSMs, by maximizing variational scores such as the VAMP-2 score. Modern ML and statistical pipelines often use automatic hyperparameter selection techniques, ranging from the simple (choosing the best score among randomly sampled hyperparameters) to the complex (optimization via, e.g., Bayesian optimization). In this work, we ask whether it is possible to select MSMs automatically in this way by estimating and analyzing over 16,000,000 observations from over 280,000 estimated MSMs. We find that differences in hyperparameters can change the physical interpretation of the optimization objective, making automatic selection difficult. In addition, we find that enforcing equilibrium conditions in the VAMP scores can result in inconsistent model selection. However, other parameters that specify the VAMP-2 score (the lag time and the number of relaxation processes scored) have only a negligible influence on model selection. We suggest that model observables and variational scores should serve only as a guide to model selection and that a full investigation of MSM properties should be undertaken when selecting hyperparameters.
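To make the scoring procedure discussed above concrete, the following is a minimal numpy sketch of VAMP-2 scoring for an MSM estimated from a discrete state trajectory. It does not reproduce the paper's actual pipeline (featurization, dimension reduction, clustering, or the libraries used); the trajectory is synthetic toy data, and the function name and parameters are illustrative assumptions. It relies on the fact that, for a reversible MSM, the VAMP-2 score reduces to the sum of the squared top-k transition-matrix eigenvalues (including the stationary eigenvalue 1).

```python
import numpy as np

def vamp2_score(dtraj, n_states, lag, k):
    """VAMP-2 score of a reversible MSM estimated at the given lag time.

    For a reversible MSM the VAMP-2 score equals the sum of the squared
    top-k eigenvalues of the transition matrix (the stationary process,
    eigenvalue 1, is included in the count k).
    """
    # Count transitions observed at the chosen lag time.
    C = np.zeros((n_states, n_states))
    for i, j in zip(dtraj[:-lag], dtraj[lag:]):
        C[i, j] += 1.0
    # Symmetrize counts: a naive way to enforce detailed balance (equilibrium).
    C = 0.5 * (C + C.T)
    # Row-normalize into a transition probability matrix.
    T = C / C.sum(axis=1, keepdims=True)
    # Score = sum of squared top-k eigenvalue magnitudes.
    eigvals = np.sort(np.abs(np.linalg.eigvals(T)))[::-1]
    return float(np.sum(eigvals[:k] ** 2))

# Toy two-well trajectory (hypothetical data for illustration only):
# stay in the current state with high probability, rarely hop.
rng = np.random.default_rng(0)
dtraj = [0]
for _ in range(5000):
    s = dtraj[-1]
    dtraj.append(s if rng.random() < 0.95 else 1 - s)
dtraj = np.array(dtraj)

# Compare candidate lag times, as one would when selecting hyperparameters.
for lag in (1, 5, 10):
    print(lag, round(vamp2_score(dtraj, n_states=2, lag=lag, k=2), 3))
```

Note that the lag time changes the quantity being scored, not just the model: the slow eigenvalue decays as roughly 0.9^lag here, so scores at different lag times are not directly comparable, which is one facet of the interpretation problem the abstract raises.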

MeSH terms

  • Bayes Theorem
  • Machine Learning
  • Markov Chains
  • Protein Folding
  • Proteins*
  • Vesicle-Associated Membrane Protein 2*

Substances

  • Vesicle-Associated Membrane Protein 2
  • Proteins