On a Variational Definition for the Jensen–Shannon Symmetrization of Distances Based on the Information Radius

Entropy (Basel). 2021 Apr 14;23(4):464. doi: 10.3390/e23040464.

Abstract

We generalize the Jensen–Shannon divergence and the Jensen–Shannon diversity index by considering a variational definition with respect to a generic mean, thereby extending the notion of Sibson's information radius. The variational definition applies to an arbitrary distance and yields a new way to define a Jensen–Shannon symmetrization of distances. When the variational optimization is further constrained so that the optimized measure belongs to prescribed families of probability measures, we obtain relative Jensen–Shannon divergences and their corresponding Jensen–Shannon symmetrizations of distances, which generalize the concept of information projections. Finally, we touch upon applications of these variational Jensen–Shannon divergences and diversity indices to clustering and quantization tasks of probability measures, including statistical mixtures.
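For intuition, here is a minimal sketch of the variational identity underlying this construction, stated for the classical Kullback–Leibler case (the symbols $M$, $D$, and $\mathrm{vJS}$ below are illustrative notation, not necessarily the paper's exact symbols): the Jensen–Shannon divergence admits the variational form

\[
\mathrm{JS}(p, q) \;=\; \min_{c} \; \frac{1}{2}\Big(\mathrm{KL}(p : c) + \mathrm{KL}(q : c)\Big),
\]

since $\frac{1}{2}\mathrm{KL}(p : c) + \frac{1}{2}\mathrm{KL}(q : c) = \mathrm{JS}(p, q) + \mathrm{KL}\big(\tfrac{p+q}{2} : c\big) \ge \mathrm{JS}(p, q)$, so the unique minimizer is the arithmetic mixture $c^\star = \frac{p+q}{2}$; this is Sibson's information radius for two points. Replacing the arithmetic mean by a generic mean $M$ and the Kullback–Leibler divergence by an arbitrary distance $D$ yields a variational Jensen–Shannon symmetrization of the form

\[
\mathrm{vJS}_{D}^{M}(p, q) \;:=\; \min_{c} \; M\big(D(p : c),\, D(q : c)\big),
\]

and restricting the minimization to $c$ ranging over a prescribed family $\mathcal{Q}$ of probability measures gives the relative (information-projection) variants mentioned above.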

Keywords: Bhattacharyya distance; Bregman divergence; Bregman information; Fenchel–Young divergence; Jensen–Shannon divergence; Rényi entropy; centroid; clustering; diversity index; exponential family; information projection; information radius; q-divergence; q-exponential family.