Do Brain Networks Evolve by Maximizing Their Information Flow Capacity?

PLoS Comput Biol. 2015 Aug 28;11(8):e1004372. doi: 10.1371/journal.pcbi.1004372. eCollection 2015 Aug.

Abstract

We propose a working hypothesis, supported by numerical simulations, that brain networks evolve based on the principle of maximization of their internal information flow capacity. We find that the synchronous behavior and the information flow capacity of the evolved networks reproduce well the behaviors observed in the brain dynamical networks of Caenorhabditis elegans and humans, i.e., networks of Hindmarsh-Rose neurons whose graphs are given by these brain networks. We make a strong case for our hypothesis by showing that the neural networks with the closest graph distance to the brain networks of Caenorhabditis elegans and humans are the Hindmarsh-Rose neural networks evolved with coupling strengths that maximize information flow capacity. Surprisingly, we find that global neural synchronization levels decrease during brain evolution, reflecting an underlying global non-Hebbian-like evolution process that is driven by non-Hebbian-like learning behaviors for some clusters during evolution and by Hebbian-like learning rules for clusters where neurons increase their synchronization.
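For readers who want to experiment with the setup described in the abstract, the sketch below shows one possible way to simulate a small network of coupled Hindmarsh-Rose neurons on a given graph and to compute a crude synchronization index. The ring graph, electrical (diffusive) coupling with strength g, parameter values, integration scheme, and correlation-based synchronization measure are illustrative assumptions, not the authors' methods (which also involve chemical synapses and an information flow capacity measure).

```python
# Minimal sketch (not the paper's code): electrically coupled Hindmarsh-Rose
# neurons on an assumed adjacency matrix, integrated with classical RK4.
import numpy as np

# Standard Hindmarsh-Rose parameters in a bursting regime (assumed values)
a, b, c, d = 1.0, 3.0, 1.0, 5.0
s, x_rest, r, I_ext = 4.0, -1.6, 0.006, 3.25

def hr_network_deriv(state, adj, g):
    """Time derivative of N coupled Hindmarsh-Rose neurons.

    state: array of shape (3, N) holding (x, y, z) for each neuron.
    adj:   N x N adjacency matrix of the (assumed) graph.
    g:     electrical (diffusive) coupling strength.
    """
    x, y, z = state
    # Diffusive coupling acts on the membrane potential x
    coupling = g * (adj @ x - adj.sum(axis=1) * x)
    dx = y + b * x**2 - a * x**3 - z + I_ext + coupling
    dy = c - d * x**2 - y
    dz = r * (s * (x - x_rest) - z)
    return np.array([dx, dy, dz])

def simulate(adj, g, t_max=500.0, dt=0.01, seed=0):
    """Integrate the network and return the membrane potential traces."""
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    state = np.array([rng.uniform(-1.5, 1.5, n),
                      rng.uniform(-10.0, 0.0, n),
                      rng.uniform(2.0, 4.0, n)])
    xs = []
    for _ in range(int(t_max / dt)):
        k1 = hr_network_deriv(state, adj, g)
        k2 = hr_network_deriv(state + 0.5 * dt * k1, adj, g)
        k3 = hr_network_deriv(state + 0.5 * dt * k2, adj, g)
        k4 = hr_network_deriv(state + dt * k3, adj, g)
        state = state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        xs.append(state[0].copy())
    return np.array(xs)  # shape (steps, N)

if __name__ == "__main__":
    # Toy ring of 6 neurons as a stand-in for a connectome graph
    N = 6
    adj = np.zeros((N, N))
    for i in range(N):
        adj[i, (i + 1) % N] = adj[(i + 1) % N, i] = 1.0
    traces = simulate(adj, g=0.1)
    # Crude synchronization index: mean pairwise correlation of the x traces
    corr = np.corrcoef(traces.T)
    sync = (corr.sum() - N) / (N * (N - 1))
    print(f"mean pairwise correlation: {sync:.3f}")
```

Varying g and the underlying graph in this sketch is one simple way to explore how coupling strength and topology affect synchronization, which is the kind of dependence the paper studies with connectome-derived graphs.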

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Animals
  • Brain / physiology*
  • Caenorhabditis elegans
  • Computational Biology
  • Humans
  • Learning / physiology*
  • Male
  • Models, Neurological*
  • Nerve Net / physiology*
  • Neural Networks, Computer
  • Neurons / physiology*
  • Young Adult

Grants and funding

This work was funded by the Engineering and Physical Sciences Research Council (http://www.epsrc.ac.uk/), grant EP/I032606/1 (CGA, SS, MSB). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.