Global Negative Correlation Learning: A Unified Framework for Global Optimization of Ensemble Models

IEEE Trans Neural Netw Learn Syst. 2022 Aug;33(8):4031-4042. doi: 10.1109/TNNLS.2021.3055734. Epub 2022 Aug 3.

Abstract

Ensembles are a widely used approach in the machine learning community, and their success is traditionally attributed to the diversity within the ensemble. Most of these approaches foster diversity by data sampling or by modifying the structure of the constituent models. In contrast, there is a family of ensemble models in which diversity is explicitly promoted in the error function of the individual members. The negative correlation learning (NCL) ensemble framework is probably the best-known algorithm in this group of methods. This article analyzes NCL and reveals that the framework actually minimizes a combination of the errors of the individual members of the ensemble rather than the residuals of the final ensemble. We propose a novel ensemble framework, named global negative correlation learning (GNCL), which focuses on the optimization of the global ensemble instead of the individual fitness of its components. Under the assumption of fixed basis functions, we also derive an analytical solution for the parameters of the base regressors, both for the NCL framework and for the proposed global error function (the general framework can also be instantiated for neural networks with nonfixed basis functions). The proposed framework is evaluated through extensive experiments on regression and classification data sets. Comparisons with other state-of-the-art ensemble methods confirm that GNCL yields the best overall performance.
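To make the contrast concrete, it helps to recall the classical NCL loss in Liu and Yao's formulation; the notation below follows the standard presentation in the literature rather than the article itself. Each member i of an M-member ensemble minimizes its own squared error plus a correlation penalty,

\[
e_i = \tfrac{1}{2}\bigl(f_i(\mathbf{x}) - y\bigr)^2 + \lambda\, p_i(\mathbf{x}),
\qquad
p_i(\mathbf{x}) = \bigl(f_i(\mathbf{x}) - \bar{f}(\mathbf{x})\bigr)\sum_{j \neq i}\bigl(f_j(\mathbf{x}) - \bar{f}(\mathbf{x})\bigr),
\qquad
\bar{f}(\mathbf{x}) = \frac{1}{M}\sum_{j=1}^{M} f_j(\mathbf{x}).
\]

Since \(\sum_{j \neq i}\bigl(f_j(\mathbf{x}) - \bar{f}(\mathbf{x})\bigr) = -\bigl(f_i(\mathbf{x}) - \bar{f}(\mathbf{x})\bigr)\), the penalty reduces to \(p_i = -\bigl(f_i - \bar{f}\bigr)^2\). Summing \(e_i\) over the members therefore yields a combination of individual errors, which is not the same objective as the single global residual of the ensemble,

\[
E = \tfrac{1}{2}\bigl(\bar{f}(\mathbf{x}) - y\bigr)^2,
\]

the quantity that GNCL optimizes directly.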
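As an informal illustration of the fixed-basis-function setting mentioned in the abstract, the sketch below fits the ensemble average directly to the targets through a single ridge-regularized least-squares problem. The randomized tanh features, the ensemble size M, and the regularizer lam are assumptions chosen for the example, not details taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (assumed for illustration only).
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

M = 5          # ensemble size (assumption)
n_hidden = 30  # random-feature width per member (assumption)
lam = 1e-2     # ridge-style regularizer (assumption)

# Fixed (randomized) basis functions, one set per base regressor.
def features(X, W, b):
    return np.tanh(X @ W + b)

Ws = [rng.standard_normal((1, n_hidden)) for _ in range(M)]
bs = [rng.standard_normal(n_hidden) for _ in range(M)]
Hs = [features(X, W, b) for W, b in zip(Ws, bs)]

# Global objective: fit the *ensemble average* to y in one problem.
# Stacking the scaled feature blocks makes
#   f_bar(x) = (1/M) * sum_i h_i(x)^T beta_i
# linear in the concatenated beta, so ridge regression has a closed form.
H = np.hstack(Hs) / M  # shape: (n_samples, M * n_hidden)
beta = np.linalg.solve(H.T @ H + lam * np.eye(H.shape[1]), H.T @ y)

f_bar = H @ beta  # ensemble prediction
print("global-ensemble MSE:", np.mean((y - f_bar) ** 2))
```

Because the averaged prediction is linear in the concatenated output weights, the global objective admits a single closed-form solution. This is the structural point the abstract makes: the ensemble is optimized as a whole, rather than each member being trained toward its own fitness.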

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms*
  • Humans
  • Machine Learning
  • Neural Networks, Computer*