SNR: Symbolic network-based rectifiable learning framework for symbolic regression

Neural Netw. 2023 Aug:165:1021-1034. doi: 10.1016/j.neunet.2023.06.046. Epub 2023 Jul 4.

Abstract

Symbolic regression (SR) can be utilized to unveil the underlying mathematical expressions that describe a given set of observed data. At present, SR methods fall into two categories: learning-from-scratch and learning-with-experience. Compared to learning-from-scratch, learning-with-experience yields results comparable to those of several benchmarks while incurring significantly lower time costs for obtaining expressions. However, learning-with-experience models perform poorly on unseen data distributions and lack a rectification tool beyond constant optimization, which exhibits limited performance. In this study, we propose a Symbolic Network-based Rectifiable Learning Framework (SNR) that possesses the ability to correct errors. SNR adopts a Symbolic Network (SymNet) to represent an expression, and the encoding of SymNet is designed to provide supervised information, with numerous self-generated expressions, to train a policy net (PolicyNet). The trained PolicyNet offers prior knowledge to guide effective searches. Subsequently, incorrectly predicted expressions are revised via a rectification mechanism, which endows SNR with broader applicability. Experimental results demonstrate that our proposed method achieves the highest average coefficient of determination on self-generated datasets when compared with other state-of-the-art methods and yields more accurate results on public datasets.
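The abstract evaluates candidate expressions by the coefficient of determination (R²). As a minimal sketch (not the paper's SNR implementation), symbolic regression can be framed as scoring candidate expressions against observed data and keeping the best-fitting one; the candidate set and data here are purely illustrative:

```python
# Minimal illustration of symbolic-regression scoring with R^2.
# This is NOT the paper's method; candidates and data are hypothetical.

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Observed data generated by a hidden ground truth y = x^2 + x.
xs = [0.5 * i for i in range(1, 9)]
ys = [x ** 2 + x for x in xs]

# Candidate expressions that a search (e.g. one guided by a policy
# network) might propose; these names are for illustration only.
candidates = {
    "x**2 + x": lambda x: x ** 2 + x,
    "2*x":      lambda x: 2 * x,
    "x**2":     lambda x: x ** 2,
}

scores = {name: r_squared(ys, [f(x) for x in xs])
          for name, f in candidates.items()}
best = max(scores, key=scores.get)
print(best)  # the exact expression attains R^2 = 1.0
```

In a full SR system the candidate set is not fixed but generated and refined by the search procedure; the scoring step shown here is what "highest average coefficient of determination" refers to.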

Keywords: Learning-from-scratch; Learning-with-experience; Symbolic network; Symbolic regression.

MeSH terms

  • Benchmarking*
  • Knowledge
  • Learning*
  • Policy