On Training Efficiency and Computational Costs of a Feed Forward Neural Network: A Review

Comput Intell Neurosci. 2015;2015:818243. doi: 10.1155/2015/818243. Epub 2015 Aug 31.

Abstract

The problem of choosing a suitable activation function for the hidden layer of a feed forward neural network has been widely investigated, and this paper offers a comprehensive review of it. Since the nonlinear component of a neural network is the main contributor to the network's mapping capabilities, the different choices that may lead to enhanced performance, in terms of training, generalization, or computational cost, are analyzed, both in general-purpose and in embedded computing environments. Finally, a strategy to convert a network configuration between different activation functions without altering the network mapping capabilities is presented.
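The conversion strategy mentioned in the abstract can be illustrated with a minimal sketch. The example below is an assumption for illustration only, not the paper's exact procedure: it converts a single-hidden-layer network with tanh hidden units into an equivalent one with logistic-sigmoid hidden units, using the identity tanh(x) = 2*sigmoid(2x) - 1; all layer sizes and variable names are hypothetical.

import numpy as np

# Illustrative sketch (assumed, not taken from the paper): rescale the hidden-layer
# parameters and compensate in the output layer so that the overall mapping x -> y
# is unchanged when tanh hidden units are replaced by logistic-sigmoid units.

rng = np.random.default_rng(0)

# Original tanh network: y = W2 @ tanh(W1 @ x + b1) + b2
n_in, n_hidden, n_out = 3, 5, 2
W1 = rng.normal(size=(n_hidden, n_in))
b1 = rng.normal(size=n_hidden)
W2 = rng.normal(size=(n_out, n_hidden))
b2 = rng.normal(size=n_out)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_tanh(x):
    return W2 @ np.tanh(W1 @ x + b1) + b2

# Since sigmoid(2*(W1 @ x + b1)) = (tanh(W1 @ x + b1) + 1) / 2,
# doubling the hidden-layer parameters and adjusting the output layer
# reproduces the original mapping exactly.
W1_s, b1_s = 2.0 * W1, 2.0 * b1           # hidden layer now uses sigmoid
W2_s = 2.0 * W2                           # undo the 1/2 scaling
b2_s = b2 - W2 @ np.ones(n_hidden)        # undo the +1/2 offset per hidden unit

def forward_sigmoid(x):
    return W2_s @ sigmoid(W1_s @ x + b1_s) + b2_s

x = rng.normal(size=n_in)
assert np.allclose(forward_tanh(x), forward_sigmoid(x))  # identical input-output mapping

The same rescaling idea applies in the other direction (sigmoid to tanh); whether the paper's strategy follows exactly this parameterization is not stated in the abstract.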

Publication types

  • Review

MeSH terms

  • Artificial Intelligence* / economics
  • Feedback*
  • Humans
  • Neural Networks, Computer*
  • Teaching*