A Generic Formula and Some Special Cases for the Kullback-Leibler Divergence between Central Multivariate Cauchy Distributions

Entropy (Basel). 2022 Jun 17;24(6):838. doi: 10.3390/e24060838.

Abstract

This paper introduces a closed-form expression for the Kullback-Leibler divergence (KLD) between two central multivariate Cauchy distributions (MCDs), which have recently been used in various signal and image processing applications where non-Gaussian models are needed. In this overview, the MCDs are surveyed, and some new results and properties of the KLD are derived and discussed. In addition, the KLD for MCDs is shown to be expressible as a function of the Lauricella D-hypergeometric series FD(p). Finally, the Monte Carlo sampling approximation of the KLD is compared with the numerical value of its closed-form expression. The Monte Carlo approximation of the KLD is shown to converge to the theoretical value as the number of samples goes to infinity.

Keywords: Kullback–Leibler divergence (KLD); Lauricella D-hypergeometric series; Multivariate Cauchy distribution (MCD); multiple power series.
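The Monte Carlo approximation discussed in the abstract can be sketched as follows. A central MCD is a multivariate Student-t distribution with one degree of freedom and zero location, so samples and log-densities can be obtained from SciPy's `multivariate_t`. The KLD estimate averages the log-density ratio over samples drawn from the first distribution; the scatter matrices `sigma1` and `sigma2` below are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.stats import multivariate_t

rng = np.random.default_rng(0)
d = 2  # dimension of the MCDs (illustrative)

# Scatter matrices of the two central MCDs (illustrative values)
sigma1 = np.array([[2.0, 0.5],
                   [0.5, 1.0]])
sigma2 = np.eye(d)

# A central multivariate Cauchy is a multivariate t with df = 1, loc = 0
p = multivariate_t(loc=np.zeros(d), shape=sigma1, df=1)
q = multivariate_t(loc=np.zeros(d), shape=sigma2, df=1)

# Monte Carlo estimate: KLD(p || q) ≈ (1/n) * sum log(p(x_i) / q(x_i)),
# with x_i drawn from p; the estimate converges as n grows
n = 200_000
x = p.rvs(size=n, random_state=rng)
kld_mc = np.mean(p.logpdf(x) - q.logpdf(x))
print(kld_mc)
```

Since the KLD is non-negative and finite for central MCDs, the estimate should stabilize at a positive value as `n` increases, which is the convergence behavior the paper compares against its closed-form expression.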

Grants and funding

This research received no external funding.