Entropy, Economics, and Criticality

Entropy (Basel). 2022 Jan 28;24(2):210. doi: 10.3390/e24020210.

Abstract

Information theory is a well-established framework for the study of many phenomena, and more than 70 years after Claude Shannon first described it in A Mathematical Theory of Communication it has been extended well beyond Shannon's initial vision. It is now an interdisciplinary tool whose applications range from 'causal' information flow to the inference of complex computational processes, and it plays an important role in fields as diverse as neuroscience, artificial intelligence, quantum mechanics, and astrophysics. In this article, I provide a selective review of one application of information theory that has received less attention than many others: as a tool for understanding, modelling, and detecting non-linear phenomena in finance and economics. Although some progress has been made here, this remains an under-developed area that, I argue, has considerable scope for further development.

Keywords: criticality; economics; financial markets; network theory; non-linear dynamics; non-stationary processes; phase transitions; tipping points.