Predicting the Critical Number of Layers for Hierarchical Support Vector Regression

Entropy (Basel). 2020 Dec 29;23(1):37. doi: 10.3390/e23010037.

Abstract

Hierarchical support vector regression (HSVR) models a function from data as a linear combination of SVR models at a range of scales, starting at a coarse scale and moving to finer scales as the hierarchy continues. In the original formulation of HSVR, there were no rules for choosing the depth of the model. In this paper, we observe in a number of models a phase transition in the training error: the error remains relatively constant as layers are added until a critical scale is passed, at which point the training error drops close to zero and remains nearly constant as further layers are added. We introduce a method to predict this critical scale a priori, with the prediction based on the support of either a Fourier transform of the data or the Dynamic Mode Decomposition (DMD) spectrum. This allows the required number of layers to be determined before any models are trained.
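The Fourier-based prediction can be sketched as follows: estimate the highest significant frequency in the data from the support of its FFT, then count how many scale halvings are needed for a layer to resolve that frequency. This is a minimal illustrative sketch, not the paper's exact prescription; the magnitude threshold `thresh`, the initial scale `sigma0`, the halving-per-layer rule, and the resolution criterion are all assumptions introduced here for illustration.

```python
import numpy as np

def significant_bandwidth(y, dt, thresh=0.01):
    """Highest frequency whose FFT magnitude exceeds `thresh` times the peak.

    `thresh` is a hypothetical cutoff, not a value taken from the paper.
    """
    freqs = np.fft.rfftfreq(len(y), d=dt)
    mags = np.abs(np.fft.rfft(y))
    return freqs[mags >= thresh * mags.max()].max()

def predict_num_layers(y, dt, sigma0=1.0, thresh=0.01):
    """Predict HSVR depth from the support of the data's Fourier transform.

    Assumes (illustratively) that each layer halves the kernel scale,
    starting at sigma0, and that a layer of scale sigma resolves features
    of wavelength ~sigma, so we need sigma0 / 2**(L-1) <= 1 / f_max.
    """
    f_max = significant_bandwidth(y, dt, thresh)
    return max(1, int(np.ceil(np.log2(sigma0 * f_max))) + 1)

# Example: a signal whose finest significant oscillation is at 8 Hz.
t = np.linspace(0, 1, 512, endpoint=False)
y = np.sin(2 * np.pi * t) + 0.3 * np.sin(2 * np.pi * 8 * t)
print(predict_num_layers(y, dt=t[1] - t[0]))  # → 4
```

With frequencies sampled on exact FFT bins, the spectral support is {1 Hz, 8 Hz}, so the predicted depth comes from the 8 Hz component alone: log2(8) = 3 halvings beyond the coarse layer.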

Keywords: dynamic mode decomposition; Fourier transform; Koopman operator; support vector regression.
