Methods of information theory and algorithmic complexity for network biology

Semin Cell Dev Biol. 2016 Mar:51:32-43. doi: 10.1016/j.semcdb.2016.01.011. Epub 2016 Jan 21.

Abstract

We survey and introduce concepts and tools located at the intersection of information theory and network biology. We show that Shannon's information entropy, compressibility and algorithmic complexity quantify different local and global aspects of synthetic and biological data. We illustrate these measures with examples such as the emergence of giant components in Erdős-Rényi random graphs and the recovery of topological properties from numerical kinetic properties simulating gene expression data. We provide exact theoretical calculations, numerical approximations and error estimations of entropy, algorithmic probability and Kolmogorov complexity for different types of graphs, characterizing their variant and invariant properties. We introduce formal definitions of complexity for both labeled and unlabeled graphs and prove that the Kolmogorov complexity of a labeled graph is a good approximation of its unlabeled Kolmogorov complexity, and thus a robust definition of graph complexity.
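
As a rough illustration of how such measures are computed in practice, the sketch below (not taken from the paper, which also works with algorithmic probability and exact and numerical estimates rather than compression alone) compares the Shannon entropy of an adjacency-matrix bit string with a lossless-compression proxy for Kolmogorov complexity, contrasting an Erdős-Rényi random graph with a maximally regular (complete) graph of the same size. All function names and parameters are illustrative choices, not the authors'.

```python
# Minimal sketch (assumed, illustrative): entropy vs. a compression-based
# complexity proxy for two extreme graph types. Uses only the standard library.
import math
import random
import zlib

def er_adjacency_bits(n, p, seed=0):
    """Upper-triangular adjacency bits of an Erdos-Renyi G(n, p) graph."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0
            for i in range(n) for j in range(i + 1, n)]

def complete_adjacency_bits(n):
    """Upper-triangular adjacency bits of the complete graph K_n (all ones)."""
    return [1] * (n * (n - 1) // 2)

def shannon_entropy(bits):
    """Shannon entropy (bits per symbol) of the 0/1 distribution in the string."""
    if not bits:
        return 0.0
    p1 = sum(bits) / len(bits)
    h = 0.0
    for p in (p1, 1 - p1):
        if p > 0:
            h -= p * math.log2(p)
    return h

def compressed_size(bits):
    """Length in bytes of the zlib-compressed bit string: a crude upper
    bound on Kolmogorov complexity in the spirit of compression-based methods."""
    return len(zlib.compress(bytes(bits), 9))

if __name__ == "__main__":
    n = 200
    for name, bits in [("Erdos-Renyi G(n, 0.5)", er_adjacency_bits(n, 0.5)),
                       ("Complete graph K_n", complete_adjacency_bits(n))]:
        print(f"{name}: entropy = {shannon_entropy(bits):.3f} bits/symbol, "
              f"compressed size = {compressed_size(bits)} bytes")
```

zlib is used here only as a convenient built-in compressor; any lossless compressor yields an upper bound on Kolmogorov complexity in the same spirit, whereas the paper's algorithmic-probability-based estimates are aimed at cases where entropy and compressibility alone are less informative.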

Keywords: Algorithmic probability; Algorithmic randomness; Biological networks; Complex networks; Information theory; Kolmogorov complexity.

Publication types

  • Review

MeSH terms

  • Algorithms
  • Animals
  • Entropy
  • Humans
  • Information Theory*
  • Metabolic Networks and Pathways*