Large Deviations Properties of Maximum Entropy Markov Chains from Spike Trains

Entropy (Basel). 2018 Aug 3;20(8):573. doi: 10.3390/e20080573.

Abstract

We consider the maximum entropy Markov chain inference approach to characterize the collective statistics of neuronal spike trains, focusing on the statistical properties of the inferred model. To find the maximum entropy Markov chain, we use the thermodynamic formalism, which provides insightful connections with statistical physics and thermodynamics, from which large deviations properties arise naturally. We provide an accessible introduction to the maximum entropy Markov chain inference problem and to large deviations theory for the computational neuroscience community, avoiding some technicalities while preserving the core ideas and intuitions. We review large deviations techniques useful in spike train statistics for describing accuracy and convergence properties as a function of sample size. We use these results to study the statistical fluctuations of correlations, the distinguishability, and the irreversibility of maximum entropy Markov chains. We illustrate these applications with simple examples in which the large deviation rate function is obtained explicitly for maximum entropy models of relevance in this field.
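For orientation, the two central objects mentioned above admit standard formulations; the notation used here (observables $f_k$, parameters $\beta_k$, empirical average $A_T$) is illustrative and not necessarily the paper's own. In the thermodynamic formalism, the maximum entropy Markov chain solves the Gibbs variational problem

\[
\mathcal{F}(\beta) = \sup_{\mu} \Big\{ h(\mu) + \sum_k \beta_k \, \mu(f_k) \Big\},
\]

where the supremum runs over stationary Markov measures, $h(\mu)$ is the entropy rate, and $\mathcal{F}$ plays the role of a free energy (topological pressure). For an empirical average $A_T = \frac{1}{T}\sum_{t=1}^{T} f(x_t)$ over a sample of length $T$, a large deviation principle states that $\Pr(A_T \approx a)$ decays roughly as $e^{-T I(a)}$, with rate function given by the Legendre transform of the scaled cumulant generating function,

\[
I(a) = \sup_{k} \big\{ k a - \lambda(k) \big\}, \qquad \lambda(k) = \lim_{T\to\infty} \tfrac{1}{T} \log \mathbb{E}\big[ e^{T k A_T} \big],
\]

which is the mechanism by which accuracy and convergence can be quantified in terms of the sample size $T$.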

Keywords: computational neuroscience; entropy production; large deviation theory; maximum entropy principle; out-of-equilibrium statistical mechanics; spike train statistics; thermodynamic formalism.