Error analysis for l_q-coefficient regularized moving least-square regression

J Inequal Appl. 2018;2018(1):262. doi: 10.1186/s13660-018-1856-y. Epub 2018 Sep 25.

Abstract

We consider the moving least-square (MLS) method in the coefficient-based regression framework with an l_q-regularizer (1 ≤ q ≤ 2) and sample-dependent hypothesis spaces. The data-dependent character of the new algorithm provides flexibility and adaptivity for MLS. We carry out a rigorous error analysis using the stepping-stone technique in the error decomposition. A concentration technique based on the l_2-empirical covering number is also employed to improve the sample error bound. We derive a satisfactory learning rate that can be arbitrarily close to the best rate O(m^{-1}) under more natural and much simpler conditions.
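To fix notation, the display below sketches the general shape of an l_q-coefficient regularized estimator over a sample-dependent hypothesis space built from the sample z = {(x_i, y_i)}_{i=1}^m. The kernel K, the moving-window weight W, and the regularization parameter λ are placeholders standing in for the paper's precise MLS formulation, which the abstract does not spell out; read it as an illustrative sketch rather than the authors' exact scheme.

\[
  f_{\mathbf z}(x) \;=\; \sum_{i=1}^{m} \alpha_i^{\mathbf z}(x)\, K(x_i, x),
  \qquad
  \boldsymbol{\alpha}^{\mathbf z}(x) \;=\; \operatorname*{arg\,min}_{\boldsymbol{\alpha}\in\mathbb R^{m}}
  \Biggl\{ \frac{1}{m}\sum_{j=1}^{m} W(x, x_j)\,
  \Bigl(\sum_{i=1}^{m}\alpha_i K(x_i, x_j) - y_j\Bigr)^{2}
  \;+\; \lambda \sum_{i=1}^{m} \lvert\alpha_i\rvert^{q} \Biggr\},
  \quad 1 \le q \le 2 .
\]

In a scheme of this form, the q = 1 end of the range promotes sparse coefficient vectors while q = 2 yields a ridge-type penalty, which illustrates the kind of flexibility and adaptivity a coefficient-based, data-dependent formulation can provide.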

Keywords: Data dependent hypothesis space; Learning rate; Moving least-square method; Regularization function; Uniform concentration inequality.