Information Geometric Approach on Most Informative Boolean Function Conjecture

Entropy (Basel). 2018 Sep 10;20(9):688. doi: 10.3390/e20090688.

Abstract

Let $X^n$ be a memoryless uniform Bernoulli source and let $Y^n$ be its output through a binary symmetric channel. Courtade and Kumar conjectured that the Boolean function $f:\{0,1\}^n \to \{0,1\}$ that maximizes the mutual information $I(f(X^n);Y^n)$ is a dictator function, i.e., $f(x^n)=x_i$ for some $i$. We propose a clustering problem that is equivalent to the above problem, emphasizing the information-geometric aspect of this equivalent formulation. Moreover, we define a normalized geometric mean of measures and present some of its interesting properties. We also show that the conjecture is true when the arithmetic and geometric means coincide on a specific set of measures.

Keywords: Boolean function; Bregman divergence; Jensen–Shannon divergence; clustering; geometric mean.
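
To make the conjectured quantity concrete, the following is a minimal sketch (not from the paper) that computes $I(f(X^n);Y^n)$ by brute-force enumeration for small $n$, assuming $X^n$ uniform on $\{0,1\}^n$ and a memoryless BSC with a hypothetical crossover probability `p`; the function names `dictator` and `majority` are illustrative choices for comparison, not definitions taken from the article.

```python
import itertools
import math

def mutual_information(f, n, p):
    """I(f(X^n); Y^n) in bits, with X^n uniform on {0,1}^n and Y^n the
    output of a memoryless BSC(p) applied to X^n componentwise.
    f maps a length-n 0/1 tuple to 0 or 1."""
    xs = list(itertools.product((0, 1), repeat=n))
    # Joint distribution of (B, Y^n) with B = f(X^n).
    joint = {}
    for x in xs:
        b = f(x)
        px = 2.0 ** (-n)  # uniform source
        for y in xs:
            d = sum(xi != yi for xi, yi in zip(x, y))  # Hamming distance
            pyx = (p ** d) * ((1 - p) ** (n - d))      # BSC transition prob.
            joint[(b, y)] = joint.get((b, y), 0.0) + px * pyx
    # Marginals of B and Y^n.
    pb, py = {}, {}
    for (b, y), v in joint.items():
        pb[b] = pb.get(b, 0.0) + v
        py[y] = py.get(y, 0.0) + v
    # I(B; Y^n) = sum_{b,y} p(b,y) log2( p(b,y) / (p(b) p(y)) ).
    return sum(v * math.log2(v / (pb[b] * py[y]))
               for (b, y), v in joint.items() if v > 0)

if __name__ == "__main__":
    n, p = 3, 0.1  # illustrative parameters
    dictator = lambda x: x[0]                      # f(x^n) = x_1
    majority = lambda x: int(sum(x) > len(x) / 2)  # majority of the bits
    print("dictator:", mutual_information(dictator, n, p))
    print("majority:", mutual_information(majority, n, p))
```

For the dictator function the mutual information equals $1 - H(p)$ (about 0.531 bits at $p = 0.1$), which the conjecture asserts is the maximum over all Boolean functions; functions such as majority should evaluate to a strictly smaller value in this sketch.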