Boundary conditions and phase transitions in neural networks. Theoretical results

Neural Netw. 2008 Sep;21(7):971-9. doi: 10.1016/j.neunet.2008.04.003. Epub 2008 May 5.

Abstract

The purpose of this paper is to present some relevant theoretical results on the asymptotic behaviour of finite neural networks (on lattices) when they are subjected to fixed boundary conditions. This work focuses on two topics of interest from the biological point of view. First, it establishes a link between the possible updating iteration modes of these networks, regardless of the number of dimensions. It proves that the effects of boundary conditions on neural networks do not depend on the updating iteration mode under the hypothesis of synaptic weight symmetry. Thus, if the asymptotic behaviour admits phase transitions, these phase transitions are observable for many updating iteration modes (from synchrony to asynchrony). Second, it shows that boundaries have no significant impact on the asymptotic behaviour of one-dimensional neural networks. To prove this property, we present a new general mathematical approach based on a projectivity matrix that simplifies the problem. This approach enables the theoretical study of asymptotic dynamics and of boundary influence in neural networks. We also introduce numerical tools that generalise the method to the study of phase transitions in more complex cases.
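To make the abstract's setting concrete, here is a minimal illustrative sketch (not the paper's formal model or proof): a finite one-dimensional threshold network on a lattice with fixed boundary states and symmetric nearest-neighbour weights, iterated either synchronously (all cells updated in parallel) or asynchronously (cells updated sequentially). All parameter values below (weights, threshold, boundary states, network size) are assumptions chosen for illustration; in this toy case both updating modes reach the same limit state, echoing the independence result stated for symmetric weights.

```python
# Minimal sketch of a finite 1-D threshold network with fixed boundary
# conditions, compared under synchronous and asynchronous updating.
# Weights, threshold and boundary values are illustrative assumptions.

import itertools

N = 8                # number of interior cells (assumed)
W = 1.0              # symmetric coupling to each neighbour (assumed)
THETA = 0.5          # firing threshold (assumed)
LEFT, RIGHT = 1, 1   # fixed boundary conditions (assumed)

def local_rule(left, right):
    """Heaviside threshold rule with symmetric neighbour weights."""
    return 1 if W * left + W * right > THETA else 0

def step_sync(state):
    """Synchronous (parallel) update of all interior cells."""
    padded = (LEFT,) + state + (RIGHT,)
    return tuple(local_rule(padded[i - 1], padded[i + 1])
                 for i in range(1, N + 1))

def step_async(state):
    """Asynchronous update: one left-to-right sweep, in place."""
    cells = [LEFT] + list(state) + [RIGHT]
    for i in range(1, N + 1):
        cells[i] = local_rule(cells[i - 1], cells[i + 1])
    return tuple(cells[1:-1])

def limit_state(step, state, max_iter=200):
    """Iterate until a previously seen state (or the cap) is reached."""
    seen = set()
    for _ in range(max_iter):
        if state in seen:
            return state
        seen.add(state)
        state = step(state)
    return state

if __name__ == "__main__":
    init = tuple([0] * N)
    print("synchronous  limit:", limit_state(step_sync, init))
    print("asynchronous limit:", limit_state(step_async, init))
```

With these assumed parameters, the fixed boundary state propagates inward and both updating modes converge to the all-ones fixed point; the paper's results concern the general, rigorously proved version of this kind of boundary-driven behaviour.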

MeSH terms

  • Algorithms
  • Animals
  • Behavior / physiology*
  • Decision Support Techniques*
  • Humans
  • Neural Networks, Computer*
  • Nonlinear Dynamics*