Tree Broad Learning System for Small Data Modeling

IEEE Trans Neural Netw Learn Syst. 2022 Nov 3:PP. doi: 10.1109/TNNLS.2022.3216788. Online ahead of print.

Abstract

The neural-network-based broad learning system (BLS-NN) performs poorly when modeling small data of various dimensions. The tree-based BLS (TBLS) is designed for small-data modeling by introducing nondifferentiable modules and an ensemble strategy into the traditional broad learning system (BLS). TBLS replaces the neurons of BLS with tree modules to map the input data. Moreover, we present three new TBLS variants and their incremental learning implementations, motivated by deep, broad, and ensemble learning. They differ mainly in their incremental learning strategies, which are based on: 1) the mean square error (MSE); 2) the pseudo-inverse; and 3) pseudo-inverse theory together with a stacked representation. This study therefore further explores the BLS framework built on nondifferentiable modules. Simulations compare the proposed methods with state-of-the-art (SOTA) BLS-NN and tree-based methods on high-, medium-, and low-dimensional benchmark datasets. Results show that the proposed methods outperform BLS-NN and that TBLS markedly improves modeling accuracy with small training data.
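To make the architectural idea concrete, below is a minimal illustrative sketch (not the authors' TBLS implementation) of a tree-based broad learner in the style the abstract describes: decision-tree modules stand in for the mapped-feature neurons, their outputs are concatenated into a broad feature matrix, and the output weights are obtained with a ridge-regularized pseudo-inverse as in the standard BLS. The class name `TreeBroadLearner` and all hyperparameters are assumptions made for illustration only.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor


class TreeBroadLearner:
    """Illustrative tree-based broad learner (a sketch, not the paper's code).

    Tree modules are fit on bootstrap samples of the training data; their
    predictions are concatenated into a broad feature matrix H, and the
    output weights W are solved by a ridge-regularized pseudo-inverse,
    W = (H^T H + reg * I)^{-1} H^T y, as in the standard BLS readout.
    """

    def __init__(self, n_trees=50, max_depth=3, reg=1e-3, random_state=0):
        self.n_trees = n_trees          # number of tree modules (assumed)
        self.max_depth = max_depth      # depth of each tree module (assumed)
        self.reg = reg                  # ridge regularization strength
        self.rng = np.random.default_rng(random_state)
        self.trees = []

    def _features(self, X):
        # Map inputs through every tree module and stack the outputs broadly.
        return np.column_stack([t.predict(X) for t in self.trees])

    def fit(self, X, y):
        n = X.shape[0]
        for _ in range(self.n_trees):
            idx = self.rng.integers(0, n, n)   # bootstrap sample per module
            tree = DecisionTreeRegressor(
                max_depth=self.max_depth,
                random_state=int(self.rng.integers(1 << 31)),
            )
            tree.fit(X[idx], y[idx])
            self.trees.append(tree)
        H = self._features(X)
        # Ridge-regularized pseudo-inverse for the output weights.
        A = H.T @ H + self.reg * np.eye(H.shape[1])
        self.W = np.linalg.solve(A, H.T @ y)
        return self

    def predict(self, X):
        return self._features(X) @ self.W
```

An incremental variant in the spirit of the strategies listed above could append new tree modules (widening H) and update the pseudo-inverse solution rather than refitting the readout from scratch; the exact update rules are specific to the paper and are not reproduced here.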