Gaussian Optimality for Derivatives of Differential Entropy Using Linear Matrix Inequalities

Entropy (Basel). 2018 Mar 9;20(3):182. doi: 10.3390/e20030182.

Abstract

Let Z be a standard Gaussian random variable, X be independent of Z, and t be a strictly positive scalar. For the derivatives in t of the differential entropy of X + √t Z, McKean noticed that Gaussian X attains the extremum for the first and second derivatives, among distributions with a fixed variance, and he conjectured that this holds for derivatives of all orders. This conjecture implies that the signs of the derivatives alternate. Recently, Cheng and Geng proved that this alternation holds for the first four orders. In this work, we employ the technique of linear matrix inequalities to show that: firstly, Cheng and Geng's method may not generalize to higher orders; secondly, when the probability density function of X + √t Z is log-concave, McKean's conjecture holds for orders up to at least five. As a corollary, we recover Toscani's result on the sign of the third derivative of the entropy power of X + √t Z, using a much simpler argument.

Keywords: Gaussian optimality; differential entropy; entropy power; linear matrix inequality; log-concavity.
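For readers who prefer symbols, the following is a minimal LaTeX sketch of the setting and the conjectured sign alternation described in the abstract; the shorthand Y_t is an assumption introduced here for readability, not notation taken from the paper itself.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Sketch of the conjecture summarized in the abstract; Y_t is a hypothetical
% shorthand introduced here, not notation from the source.
Let $Z \sim \mathcal{N}(0,1)$ be independent of $X$ and, for $t > 0$, set
$Y_t = X + \sqrt{t}\,Z$ with differential entropy $h(Y_t)$.
McKean's conjecture, as described in the abstract, is that among all $X$ with
a fixed variance, Gaussian $X$ is extremal for every derivative
$\tfrac{\mathrm{d}^n}{\mathrm{d}t^n}\, h(Y_t)$; an implication is the sign
alternation
\[
  (-1)^{n+1}\,\frac{\mathrm{d}^n}{\mathrm{d}t^n}\, h(Y_t) \;\ge\; 0,
  \qquad n = 1, 2, \ldots,
\]
which, per the abstract, Cheng and Geng verified for $n \le 4$.
\end{document}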