Information in a Network of Neuronal Cells: Effect of Cell Density and Short-Term Depression

Biomed Res Int. 2016;2016:2769698. doi: 10.1155/2016/2769698. Epub 2016 Jun 14.

Abstract

Neurons are specialized, electrically excitable cells that use electrical and chemical signals to transmit and process information. Understanding how the cooperation of a great many neurons in a network may modify and perhaps improve the quality of information, in contrast to a few neurons in isolation, is critical for the rational design of cell-material interfaces for applications in regenerative medicine, tissue engineering, and personalized lab-on-a-chip devices. In the present paper, we couple an integrate-and-fire model with information theory variables to analyse the extent of information in a network of nerve cells. We provide an estimate of the information in the network, in bits, as a function of cell density and short-term depression time. In the model, neurons are connected through a Delaunay triangulation of non-intersecting edges; in doing so, the number of connecting synapses per neuron is approximately constant, reproducing the early stages of network development in planar neural cell cultures. In simulations where the number of nodes is varied, we observe an optimal value of cell density for which the information in the grid is maximized. In simulations in which the post-transmission latency time is varied, we observe that information increases as the latency time decreases and, for specific configurations of the grid, is greatly enhanced through a resonance effect.
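
The sketch below is a minimal illustration of the type of model the abstract describes, not the authors' code: cells are scattered in a plane, connected along the edges of a Delaunay triangulation, driven by a simple leaky integrate-and-fire rule with a short-term-depression recovery time, and the resulting activity is summarized as a Shannon entropy in bits. All parameter values, the uniform synaptic weight, and the binary-word entropy estimator are illustrative assumptions rather than the published model.

```python
# Minimal sketch (assumed parameters, not the authors' implementation):
# Delaunay-connected integrate-and-fire network with short-term depression,
# summarized by the Shannon entropy of population spike words in bits.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)

# --- build the network: N cells in a unit square, Delaunay edges ------------
N = 60                                   # cell count (proxy for cell density)
pts = rng.random((N, 2))
tri = Delaunay(pts)
adj = np.zeros((N, N))                   # symmetric adjacency matrix
for simplex in tri.simplices:            # each triangle contributes 3 edges
    for i in range(3):
        a, b = simplex[i], simplex[(i + 1) % 3]
        adj[a, b] = adj[b, a] = 1.0

# --- leaky integrate-and-fire dynamics with a depression recovery time ------
T = 5000                                 # number of time steps
v_th, v_reset = 1.0, 0.0                 # firing threshold and reset potential
leak = 0.95                              # membrane leak per step (assumed)
w_syn = 0.25                             # uniform synaptic weight (assumed)
tau_dep = 20                             # depression recovery time, in steps

v = np.zeros(N)                          # membrane potentials
recover = np.zeros(N, dtype=int)         # steps left until synapses recover
spikes = np.zeros((T, N), dtype=np.uint8)
transmit = np.zeros((T, N), dtype=np.uint8)

for t in range(T):
    ext = (rng.random(N) < 0.02) * 0.6   # sparse random external drive
    syn_in = w_syn * (adj @ transmit[t - 1]) if t > 0 else 0.0
    v = leak * v + ext + syn_in
    fired = v >= v_th
    v[fired] = v_reset
    spikes[t] = fired
    can_transmit = fired & (recover == 0)   # depressed synapses pass nothing on
    transmit[t] = can_transmit
    recover[can_transmit] = tau_dep
    recover[recover > 0] -= 1

# --- information estimate: entropy of population spike "words" in bits ------
def word_entropy_bits(spike_matrix, word_cells=8):
    """Shannon entropy of binary activity words over the first `word_cells` cells."""
    words = spike_matrix[:, :word_cells]
    codes = words @ (1 << np.arange(word_cells))   # pack each word into an integer
    _, counts = np.unique(codes, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

print(f"mean firing rate: {spikes.mean():.3f} spikes/cell/step")
print(f"population word entropy: {word_entropy_bits(spikes):.2f} bits")
```

Sweeping `N` (cell density) and `tau_dep` (the post-transmission latency) in a script of this kind mirrors the two simulation experiments described in the abstract, although the quantitative results of the paper depend on the authors' specific model and parameters.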

MeSH terms

  • Cell Count
  • Entropy
  • Models, Neurological*
  • Nerve Net / cytology*
  • Neuronal Plasticity / physiology*
  • Neurons / cytology*
  • Reaction Time