Generalized information entropies depending only on the probability distribution

Phys Rev E Stat Nonlin Soft Matter Phys. 2013 Dec;88(6):062146. doi: 10.1103/PhysRevE.88.062146. Epub 2013 Dec 27.

Abstract

In the framework of superstatistics it has been shown that one can calculate the entropy of nonextensive statistical mechanics. We follow a similar procedure: we assume a Γ(χ²) distribution depending on β that also depends on a parameter p_l. From it we calculate the Boltzmann factor and show that it is possible to obtain the information entropy S = k ∑_{l=1}^{Ω} s(p_l), where s(p_l) = 1 − p_l^{p_l}. By maximizing this information measure, p_l is calculated as a function of βE_l and, at this stage of the procedure, p_l can be identified with the probability distribution. We show the validity of the saddle-point approximation and briefly discuss the generalization of one of the four Khinchin axioms; the modified axioms are then in accordance with the proposed entropy. As further possibilities, we also propose other entropies depending on p_l that resemble the Kaniadakis entropy and two possible Sharma-Mittal entropies. Expanding all the entropies in this work in series yields the Shannon entropy as the first term.
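The closing claim can be illustrated with a quick numerical check that is not part of the paper: a minimal sketch, assuming only the entropy S = k ∑ (1 − p_l^{p_l}) quoted above with k set to 1. Since p^p = exp(p ln p) = 1 + p ln p + ..., the leading term of s(p_l) = 1 − p_l^{p_l} is −p_l ln p_l, so S should approach the Shannon entropy whenever each |p_l ln p_l| is small, for example for a uniform distribution over many states.

```python
import numpy as np

# Sketch (not from the paper's numerics): compare the proposed entropy
# S = k * sum_l (1 - p_l**p_l) with the Shannon entropy -k * sum_l p_l*ln(p_l).
# Because p**p = exp(p*ln p) = 1 + p*ln p + (p*ln p)**2/2 + ..., the first-order
# term of 1 - p**p is -p*ln p, so the two measures agree to leading order
# when every |p_l*ln p_l| is small (e.g. many states, each of low probability).

k = 1.0  # Boltzmann constant set to unity for the comparison


def s_proposed(p):
    """Proposed entropy S = k * sum(1 - p_l**p_l) for a probability vector p."""
    p = np.asarray(p, dtype=float)
    return k * np.sum(1.0 - p**p)


def s_shannon(p):
    """Shannon entropy -k * sum(p_l * ln p_l), taking 0*ln 0 = 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -k * np.sum(p[nz] * np.log(p[nz]))


for omega in (4, 100, 10_000):
    p = np.full(omega, 1.0 / omega)  # uniform distribution over omega states
    rel_diff = abs(s_proposed(p) - s_shannon(p)) / s_shannon(p)
    print(f"Omega={omega:6d}  S_proposed={s_proposed(p):9.4f}  "
          f"S_Shannon={s_shannon(p):9.4f}  rel. diff={rel_diff:.3e}")
```

For the uniform case the relative difference shrinks as the number of states Ω grows (roughly as ln Ω / (2Ω)), consistent with the Shannon entropy appearing as the first term of the series expansion.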

Publication types

  • Research Support, Non-U.S. Gov't