Complexity and non-commutativity of learning operations on graphs

Biosystems. 2006 Jul;85(1):84-93. doi: 10.1016/j.biosystems.2006.03.001. Epub 2006 May 9.

Abstract

We present results from numerical studies of supervised learning operations in small recurrent networks, considered as graphs, that lead from a given set of input conditions to predetermined outputs. Graphs whose outputs have been optimized for particular inputs with respect to those predetermined outputs are asymptotically stable and can be characterized by attractors, which form a representation space for an associative multiplicative structure of input operations. Since the mapping from a series of inputs onto a series of such attractors generally depends on the sequence of inputs, this structure is in general non-commutative. Moreover, the size of the set of attractors, which indicates the complexity of learning, behaves non-monotonically as learning proceeds. A tentative relation between this complexity and the notion of pragmatic information is indicated.
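The sequence-dependence of attractors described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual model: the two-unit network, the gain of 1.5, and the input vectors `uA` and `uB` are illustrative assumptions chosen so that the dynamics are bistable. Each "input operation" drives the network to a fixed point, and composing two operations means starting the second from the attractor reached by the first.

```python
import numpy as np

def attractor(W, u, x0, steps=500):
    """Iterate x <- tanh(W x + u) long enough to settle onto a fixed point."""
    x = x0.copy()
    for _ in range(steps):
        x = np.tanh(W @ x + u)
    return x

# Hypothetical 2-unit recurrent net; gain 1.5 makes each unit bistable,
# so the attractor reached depends on the starting state.
W = 1.5 * np.eye(2)
uA = np.array([0.2, -0.2])   # input operation A (assumed for illustration)
uB = np.array([-0.2, 0.2])   # input operation B (assumed for illustration)
x0 = np.zeros(2)

# Compose the operations in both orders: each operation starts from the
# attractor reached by the previous one.
ab = attractor(W, uB, attractor(W, uA, x0))  # A then B
ba = attractor(W, uA, attractor(W, uB, x0))  # B then A

print(ab, ba)  # the two orders land in different attractors
```

Because the first operation selects a basin of attraction that the second cannot escape, `ab` and `ba` end up in different attractors, which is the sense in which the multiplicative structure of input operations fails to commute.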

MeSH terms

  • Artificial Intelligence*
  • Biometry
  • Computer Graphics
  • Neural Networks, Computer
  • Systems Biology