Beta Hebbian Learning as a New Method for Exploratory Projection Pursuit

Int J Neural Syst. 2017 Sep;27(6):1750024. doi: 10.1142/S0129065717500241. Epub 2017 Mar 16.

Abstract

In this research, a novel family of learning rules called Beta Hebbian Learning (BHL) is thoroughly investigated as a means of extracting information from high-dimensional datasets by projecting the data onto low-dimensional (typically two-dimensional) subspaces, improving on existing exploratory methods by providing a clearer representation of the data's internal structure. BHL applies a family of learning rules derived from the Probability Density Function (PDF) of the residual, based on the beta distribution. These rules may be called Hebbian in that they all use a simple multiplication of the output of the neural network by some function of the residuals after feedback. The derived learning rules can be linked to an adaptive form of Exploratory Projection Pursuit, and on artificial distributions the networks perform as the theory suggests they should: using different learning rules derived from different PDFs allows the identification of "interesting" dimensions (those as far from the Gaussian distribution as possible) in high-dimensional datasets. BHL has been tested on seven artificial datasets to study the behavior of its parameters, and was later applied successfully to four real datasets, comparing its performance with that of other well-known exploratory projection models such as Maximum Likelihood Hebbian Learning (MLHL), Locally Linear Embedding (LLE), Curvilinear Component Analysis (CCA), Isomap and Neural Principal Component Analysis (Neural PCA).
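For orientation, the sketch below illustrates the generic negative-feedback Hebbian scheme the abstract describes: the output is a linear projection of the input, the residual is the input minus the fed-back reconstruction, and the weight update multiplies the output by a function of that residual. The specific function derived from the beta-distribution PDF in the paper is not reproduced here; `residual_fn` is an illustrative placeholder (a signed power nonlinearity in the style of MLHL), and all names and parameter values are assumptions, not the authors' implementation.

```python
import numpy as np

def train_projection(X, n_components=2, lr=0.01, epochs=50,
                     residual_fn=None, seed=0):
    """Minimal negative-feedback Hebbian learner (sketch).

    Feedforward:  y = W x
    Feedback:     e = x - W^T y                 (residual after feedback)
    Update:       W += lr * outer(y, f(e))      (output times a function
                                                 of the residual)
    """
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    W = rng.normal(scale=0.1, size=(n_components, n_features))

    if residual_fn is None:
        # Placeholder nonlinearity (MLHL-style signed power); BHL instead
        # derives this function from the beta-distribution PDF of the residual.
        residual_fn = lambda e, p=2.5: np.sign(e) * np.abs(e) ** (p - 1)

    for _ in range(epochs):
        for x in X[rng.permutation(n_samples)]:
            y = W @ x                 # projection onto the subspace
            e = x - W.T @ y           # residual after negative feedback
            W += lr * np.outer(y, residual_fn(e))
    return W

# Usage: project data onto the learned two-dimensional subspace.
X = np.random.default_rng(1).normal(size=(500, 10))
W = train_projection(X)
Y = X @ W.T   # low-dimensional representation for visualization
```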

Keywords: Exploratory projection pursuit; Hebbian learning; beta distribution; neural networks; unsupervised learning.

MeSH terms

  • Algorithms*
  • Databases, Factual / statistics & numerical data*
  • Machine Learning / statistics & numerical data*
  • Neural Networks, Computer