Nonparametric estimation of Kullback-Leibler divergence

Neural Comput. 2014 Nov;26(11):2570-93. doi: 10.1162/NECO_a_00646. Epub 2014 Jul 24.

Abstract

In this letter, we introduce an estimator of Kullback-Leibler divergence based on two independent samples. We show that on any finite alphabet, this estimator has an exponentially decaying bias and that it is consistent and asymptotically normal. To explain the importance of this estimator, we provide a thorough analysis of the more standard plug-in estimator. We show that it is consistent and asymptotically normal, but with an infinite bias. Moreover, if we modify the plug-in estimator to remove the rare events that cause the bias to become infinite, the bias still decays at a rate no faster than O(1/n). Further, we extend our results to estimating the symmetrized Kullback-Leibler divergence. We conclude by providing simulation results, which show that the asymptotic properties of these estimators hold even for relatively small sample sizes.
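For context, the sketch below illustrates the standard plug-in estimator that the abstract analyzes: empirical frequencies from the two samples are substituted into the definition D(P‖Q) = Σ_x p(x) log(p(x)/q(x)). It is not the authors' proposed estimator, and the function name, alphabet, and sample sizes are illustrative assumptions; it simply makes concrete why the plug-in bias is infinite (a symbol observed under P but never under Q drives the estimate to +∞).

```python
# Illustrative sketch of the standard plug-in KL estimator discussed in the
# abstract (not the authors' proposed estimator; details are assumptions).
import numpy as np

def plugin_kl(x_sample, y_sample, alphabet):
    """Plug-in KL divergence estimate on a finite alphabet.

    Returns +inf when some symbol appears in x_sample but never in
    y_sample -- the rare event that makes the plug-in bias infinite.
    """
    n, m = len(x_sample), len(y_sample)
    p_hat = np.array([np.sum(x_sample == a) / n for a in alphabet])
    q_hat = np.array([np.sum(y_sample == a) / m for a in alphabet])
    d = 0.0
    for p, q in zip(p_hat, q_hat):
        if p == 0.0:
            continue            # 0 * log(0/q) is taken to be 0
        if q == 0.0:
            return np.inf       # symbol unobserved under Q: estimate blows up
        d += p * np.log(p / q)
    return d

# Example: two samples from slightly different distributions on {0, 1, 2}
rng = np.random.default_rng(0)
alphabet = np.array([0, 1, 2])
x = rng.choice(alphabet, size=1000, p=[0.5, 0.3, 0.2])
y = rng.choice(alphabet, size=1000, p=[0.4, 0.4, 0.2])
print(plugin_kl(x, y, alphabet))
```

The modified plug-in estimator mentioned in the abstract can be thought of as excluding or truncating the zero-count terms that trigger the infinite return value above; as the abstract notes, even then the bias decays no faster than O(1/n).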

MeSH terms

  • Artificial Intelligence*
  • Humans
  • Models, Theoretical*
  • Statistics, Nonparametric*
  • Stochastic Processes