A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications

Entropy (Basel). 2018 Mar 9;20(3):185. doi: 10.3390/e20030185.

Abstract

We derive a lower bound on the differential entropy of a log-concave random variable X in terms of the p-th absolute moment of X. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources and distortion measure d(x, x̂) = |x − x̂|^r, with r ≥ 1, and we establish that the difference between the rate-distortion function and the Shannon lower bound is at most log(√(πe)) ≈ 1.5 bits, independently of r and the target distortion d. For mean-square error distortion, the difference is at most log(√(πe/2)) ≈ 1 bit, regardless of d. We also provide bounds on the capacity of memoryless additive noise channels when the noise is log-concave. We show that the difference between the capacity of such channels and the capacity of the Gaussian channel with the same noise power is at most log(√(πe/2)) ≈ 1 bit. Our results generalize to the case of a random vector X with possibly dependent coordinates. Our proof technique leverages tools from convex geometry.
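As a quick sanity check on the two numerical constants (a sketch added here, not part of the original abstract), take logarithms to base 2: log(√(πe)) = ½ log₂(πe) ≈ 1.55 bits and log(√(πe/2)) = ½ log₂(πe/2) ≈ 1.05 bits. A minimal Python verification using only the standard library:

    import math

    # Gap between the rate-distortion function and the Shannon lower bound
    # for distortion d(x, x_hat) = |x - x_hat|^r, r >= 1.
    print(0.5 * math.log2(math.pi * math.e))      # ~1.547 bits

    # Gap for mean-square error distortion, and likewise the capacity gap
    # between a log-concave-noise channel and the Gaussian channel
    # with the same noise power.
    print(0.5 * math.log2(math.pi * math.e / 2))  # ~1.047 bits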

Keywords: Shannon lower bound; channel capacity; differential entropy; hyperplane conjecture; log-concave distribution; rate-distortion function; reverse entropy power inequality.