Synergy, redundancy, and multivariate information measures: an experimentalist's perspective

J Comput Neurosci. 2014 Apr;36(2):119-40. doi: 10.1007/s10827-013-0458-4. Epub 2013 Jul 3.

Abstract

Information theory has long been used to quantify interactions between two variables. With the rise of complex systems research, multivariate information measures have been increasingly used to investigate interactions between groups of three or more variables, often with an emphasis on so-called synergistic and redundant interactions. While bivariate information measures are commonly agreed upon, the multivariate information measures in use today have been developed by many different groups and differ in subtle yet significant ways. Here, we will review these multivariate information measures with special emphasis on their relationship to synergy and redundancy, and examine the differences between these measures by applying them to several simple model systems. In addition to these systems, we will illustrate the usefulness of the information measures by analyzing neural spiking data from a dissociated culture through early stages of its development. Our aim is that this work will aid other researchers as they seek the best multivariate information measure for their specific research goals and system. Finally, we have made software available online that allows the user to calculate all of the information measures discussed within this paper.
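For readers new to the distinction the abstract draws between bivariate and multivariate measures, the following minimal Python sketch (not the authors' online software; all function names here are illustrative) estimates the standard bivariate mutual information from samples and applies it to a hypothetical XOR system, the textbook example of a purely synergistic interaction: neither input alone carries information about the output, but the pair does.

```python
import numpy as np
from collections import Counter

def entropy(samples):
    """Shannon entropy (in bits) of a sequence of hashable outcomes."""
    counts = Counter(samples)
    n = len(samples)
    probs = np.array([c / n for c in counts.values()])
    return -np.sum(probs * np.log2(probs))

def mutual_info(x, y):
    """Bivariate mutual information I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(list(x)) + entropy(list(y)) - entropy(list(zip(x, y)))

# Hypothetical synergistic system: Z = X XOR Y with independent binary inputs.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10000)
y = rng.integers(0, 2, 10000)
z = x ^ y

print(mutual_info(x, z))                # ~0 bits: X alone says nothing about Z
print(mutual_info(y, z))                # ~0 bits: Y alone says nothing about Z
print(mutual_info(list(zip(x, y)), z))  # ~1 bit: X and Y jointly determine Z
```

The bivariate quantity computed here is uncontroversial; the paper's subject is how differently the various multivariate extensions (e.g., interaction information, partial information decomposition) attribute that 1 bit to synergy and redundancy.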

Publication types

  • Review

MeSH terms

  • Action Potentials / physiology
  • Animals
  • Electronic Data Processing
  • Entropy
  • Humans
  • Information Theory*
  • Models, Neurological*
  • Neurons / physiology*
  • Probability