Maximum entropy and Bayesian data analysis: Entropic prior distributions

Phys Rev E Stat Nonlin Soft Matter Phys. 2004 Oct;70(4 Pt 2):046127. doi: 10.1103/PhysRevE.70.046127. Epub 2004 Oct 29.

Abstract

The problem of assigning probability distributions that reflect the prior information available about experiments is one of the major stumbling blocks in the use of Bayesian methods of data analysis. In this paper the method of maximum (relative) entropy (ME) is used to translate the information contained in the known form of the likelihood into a prior distribution for Bayesian inference. The argument is inspired and guided by intuition gained from the successful use of ME methods in statistical mechanics. For experiments that cannot be repeated, the resulting "entropic prior" is formally identical with the Einstein fluctuation formula. For repeatable experiments, however, the expected value of the entropy of the likelihood turns out to be relevant information that must be included in the analysis. The important case of a Gaussian likelihood is treated in detail.
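As a rough sketch of the construction described above (the symbols alpha and mu below are illustrative and need not match the paper's own notation): writing S(theta) for the relative entropy of the likelihood p(x|theta) with respect to an underlying measure mu(x),

    S(\theta) = -\int dx \, p(x \mid \theta) \, \log \frac{p(x \mid \theta)}{\mu(x)},

an entropic prior of this type takes the exponential form

    \pi(\theta) \propto e^{\alpha S(\theta)},

where alpha would be a multiplier enforcing a constraint on the expected entropy of the likelihood, the additional information the abstract says must be included for repeatable experiments. Setting alpha = 1 would recover the form of the Einstein fluctuation formula, P \propto e^{S}, which the abstract identifies with the non-repeatable case.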