Backpropagation With N-D Vector-Valued Neurons Using Arbitrary Bilinear Products

IEEE Trans Neural Netw Learn Syst. 2020 Jul;31(7):2638-2652. doi: 10.1109/TNNLS.2019.2933882. Epub 2019 Sep 6.

Abstract

Vector-valued neural learning has recently emerged as a promising direction in deep learning. Traditionally, training data for neural networks (NNs) are formulated as a vector of scalars; however, performance may be suboptimal because associations among adjacent scalars are not modeled. In this article, we propose a new vector neural architecture called the Arbitrary BIlinear Product NN (ABIPNN), which processes information as vectors in each neuron and defines the feedforward projections using arbitrary bilinear products. Such bilinear products can include circular convolution, the 7-D vector product, skew circular convolution, reversed-time circular convolution, or other new products not seen in previous work. As a proof of concept, we apply the proposed network to multispectral image denoising and singing voice separation. Experimental results show that ABIPNN obtains substantial improvements over conventional NNs, suggesting that such associations are learned during training.
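To make the idea concrete, the following is a minimal NumPy sketch, not the authors' implementation, of one vector-valued layer in which the weight acting on each input vector is applied through a bilinear product; circular convolution is used here because it is one of the products named in the abstract. The layer sizes, the tanh nonlinearity, and all function names are illustrative assumptions.

```python
import numpy as np

def circular_conv(w, x):
    """Circular convolution of two n-D vectors via the FFT
    (convolution theorem): ifft(fft(w) * fft(x))."""
    return np.real(np.fft.ifft(np.fft.fft(w) * np.fft.fft(x)))

def vector_layer_forward(W, X, b):
    """Forward pass for one vector-valued layer (a sketch).

    W : (out_units, in_units, n)  weight vectors
    X : (in_units, n)             input vectors
    b : (out_units, n)            bias vectors

    Each output vector is the sum over inputs of
    circular_conv(W[j, i], X[i]), plus a bias, followed by a
    pointwise nonlinearity (tanh here, an assumption).
    """
    out_units, in_units, n = W.shape
    Y = np.empty((out_units, n))
    for j in range(out_units):
        acc = b[j].copy()
        for i in range(in_units):
            acc += circular_conv(W[j, i], X[i])
        Y[j] = np.tanh(acc)
    return Y

# Toy usage: 3 input vector-neurons -> 2 output vector-neurons, n = 4.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
W = rng.standard_normal((2, 3, 4)) * 0.1
b = np.zeros((2, 4))
print(vector_layer_forward(W, X, b).shape)  # (2, 4)
```

Because circular convolution is bilinear, swapping in another bilinear product (e.g., skew circular convolution or the 7-D vector product) only changes the body of circular_conv; the surrounding layer structure and backpropagation machinery stay the same.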

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Deep Learning*
  • Humans
  • Neural Networks, Computer*
  • Neurons* / physiology
  • Support Vector Machine*