Distinguishing between Clausius, Boltzmann and Pauling Entropies of Frozen Non-Equilibrium States

Entropy (Basel). 2019 Aug 15;21(8):799. doi: 10.3390/e21080799.

Abstract

In conventional textbook thermodynamics, entropy is a quantity that may be calculated by different methods, for example experimentally from heat capacities (following Clausius) or statistically from the number of microscopic quantum states (following Boltzmann and Planck). It has turned out, however, that these methods do not necessarily provide mutually consistent results, and for equilibrium systems their difference was explained by introducing a residual zero-point entropy (following Pauling), apparently violating the Nernst theorem. At finite temperatures, the associated statistical entropies, which count microstates that do not contribute to a body's heat capacity, differ systematically from the Clausius entropy; they are of particular relevance as measures of metastable, frozen-in non-equilibrium structures and of symbolic information processing (following Shannon). In this paper, it is suggested that Clausius, Boltzmann, Pauling and Shannon entropies be regarded as distinct, though related, physical quantities with different key properties, in order to avoid the confusion that arises from speaking loosely of just "entropy" while actually referring to different kinds of it. For instance, zero-point entropy belongs exclusively to the Boltzmann rather than the Clausius entropy, whereas the Nernst theorem holds rigorously for the Clausius rather than the Boltzmann entropy. The discussion of these terms is underpinned by a brief historical review of the emergence of the corresponding fundamental thermodynamic concepts.
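For orientation, the calorimetric and statistical routes contrasted above take the familiar textbook forms given below; the numerical value quoted for ice is Pauling's classic estimate of the residual zero-point entropy, cited here only to illustrate the kind of discrepancy under discussion, not as a result of this paper:

\[ S_{\mathrm{Clausius}}(T) = \int_{0}^{T} \frac{C_p(T')}{T'}\,\mathrm{d}T' \quad \text{(from measured heat capacities),} \]
\[ S_{\mathrm{Boltzmann}} = k_{\mathrm{B}} \ln W \quad \text{(from the number of microstates } W\text{),} \]
\[ S_{0}(\mathrm{ice}) \approx R \ln\tfrac{3}{2} \approx 3.4\ \mathrm{J\,mol^{-1}\,K^{-1}} \quad \text{(Pauling's residual entropy of hexagonal ice).} \]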

Keywords: Nernst theorem; Pauling entropy; Shannon entropy; empirical entropy; frozen states; metastable states; non-equilibrium; residual entropy; statistical entropy; symbolic information.