Comparison of universal approximators incorporating partial monotonicity by structure

Neural Netw. 2010 May;23(4):471-5. doi: 10.1016/j.neunet.2009.09.002. Epub 2009 Sep 17.

Abstract

Neural networks applied in control loops and safety-critical domains must satisfy requirements beyond achieving the lowest overall approximation error. On the one hand, a small approximation error is required; on the other hand, the smoothness and monotonicity of selected input-output relations have to be guaranteed, since otherwise the stability of most control laws is lost. In this article we compare two neural network approaches that incorporate partial monotonicity by structure, namely the Monotonic Multi-Layer Perceptron (MONMLP) network and the Monotonic MIN-MAX (MONMM) network. We show the universal approximation capabilities of both types of network for partially monotone functions. On a number of datasets, we investigate the advantages and disadvantages of these approaches with respect to approximation performance, model training, and convergence.
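To make "partial monotonicity by structure" concrete, below is a minimal, hypothetical NumPy sketch of the two constraint mechanisms the abstract names: an MLP whose weights on the designated monotone inputs (and on the hidden-to-output path) are forced non-negative, and a min-max network that takes the minimum over groups of maxima of linear units with the same sign constraint. All function names, layer sizes, and the use of exp() to enforce non-negativity are illustrative assumptions, not the authors' implementation; in particular, the one-hidden-layer MLP shown here demonstrates only the constraint, not the universal approximation property established in the paper.

```python
import numpy as np

def monmlp_forward(x, W1_raw, b1, w2_raw, b2, monotone_idx):
    """One-hidden-layer MONMLP-style sketch (illustrative assumption).

    exp() makes the weights attached to the inputs in monotone_idx and all
    hidden-to-output weights non-negative; combined with the non-decreasing
    tanh activation, the output is non-decreasing in those inputs."""
    W1 = W1_raw.copy()
    W1[:, monotone_idx] = np.exp(W1_raw[:, monotone_idx])  # >= 0 by construction
    h = np.tanh(x @ W1.T + b1)
    return h @ np.exp(w2_raw) + b2

def monmm_forward(x, V_raw, c, monotone_idx, n_groups, units_per_group):
    """MONMM-style sketch (illustrative assumption): minimum over groups of
    the maximum over linear units within each group.

    Non-negative weights on the monotone inputs keep every linear unit, and
    hence the whole min-max composition, non-decreasing in those inputs."""
    V = V_raw.copy()
    V[:, monotone_idx] = np.exp(V_raw[:, monotone_idx])    # enforce w >= 0
    z = (x @ V.T + c).reshape(len(x), n_groups, units_per_group)
    return z.max(axis=2).min(axis=1)                       # min over group maxima

# Quick check: increasing the monotone input (index 0) never lowers the output.
rng = np.random.default_rng(0)
d, H, G, J = 3, 8, 2, 4
W1, b1 = rng.normal(size=(H, d)), rng.normal(size=H)
w2, b2 = rng.normal(size=H), 0.0
V, c = rng.normal(size=(G * J, d)), rng.normal(size=G * J)

x = rng.normal(size=(1, d))
x_up = x.copy()
x_up[0, 0] += 1.0
assert monmlp_forward(x_up, W1, b1, w2, b2, [0]) >= monmlp_forward(x, W1, b1, w2, b2, [0])
assert monmm_forward(x_up, V, c, [0], G, J) >= monmm_forward(x, V, c, [0], G, J)
```

Note that monotonicity holds here for any parameter values, which is the point of structural constraints: no penalty term or post-hoc check is needed during training.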

MeSH terms

  • Algorithms
  • Artificial Intelligence*
  • Computational Biology
  • Computer Simulation
  • Neural Networks, Computer*
  • Pattern Recognition, Automated*