We study the moving least-squares (MLS) method within a coefficient-based regression framework with an ℓ1-regularizer and sample-dependent hypothesis spaces. The data-dependent nature of the new algorithm provides flexibility and adaptivity for MLS. We carry out a rigorous error analysis, applying the stepping-stone technique in the error decomposition. A concentration technique based on the ℓ2-empirical covering number is also employed to sharpen the sample-error estimate. We derive a satisfactory learning rate that can be arbitrarily close to the best rate, under conditions that are more natural and much simpler than those used previously.
Keywords: Data-dependent hypothesis space; Learning rate; Moving least-squares method; Regularization function; Uniform concentration inequality.
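The coefficient-based, regularized MLS scheme summarized above can be illustrated with a minimal numerical sketch. This is not the paper's algorithm, only a plausible instance under stated assumptions: a local linear basis, Gaussian localizing weights with a hypothetical bandwidth `h`, and an ℓ1 penalty on the local coefficients solved by ISTA (iterative soft-thresholding); all names and parameter values are illustrative.

```python
import numpy as np

def mls_fit(x_query, X, y, h=0.1, lam=1e-3, n_iter=200):
    """Sketch of an l1-regularized, coefficient-based MLS estimate at x_query.

    Assumptions (not from the source): Gaussian weights with bandwidth h
    localize the least-squares fit, a local linear basis [1, x - x_query]
    is used, and ISTA solves the penalized weighted problem.
    """
    # Gaussian weights concentrating the fit near the query point
    w = np.exp(-((X - x_query) ** 2) / (2 * h ** 2))
    # Local linear basis evaluated at the sample points
    B = np.stack([np.ones_like(X), X - x_query], axis=1)
    # Normal-equation pieces of the weighted least-squares objective
    A = B.T @ (w[:, None] * B)
    b = B.T @ (w * y)
    # Step size 1/L, where L bounds the gradient's Lipschitz constant
    L = np.linalg.eigvalsh(A).max()
    c = np.zeros(2)
    for _ in range(n_iter):
        z = c - (A @ c - b) / L          # gradient step
        c = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return c[0]  # fitted local value, i.e. the MLS estimate at x_query

# Usage: recover a smooth target from its samples at one query point
X = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * X)
estimate = mls_fit(0.5, X, y)  # true value sin(pi) = 0
```

The ℓ1 penalty acts on the expansion coefficients rather than on a fixed function-space norm, which is what makes the hypothesis space sample-dependent in the sense the abstract describes.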