Variable Selection for Nonparametric Learning with Power Series Kernels

Neural Comput. 2019 Aug;31(8):1718-1750. doi: 10.1162/neco_a_01212. Epub 2019 Jul 1.

Abstract

In this letter, we propose a variable selection method for general nonparametric kernel-based estimation. The proposed method is a two-stage estimation procedure: (1) construct a consistent estimator of the target function, and (2) approximate that estimator using a few variables by ℓ1-type penalized estimation. We see that the proposed method can be applied to various kernel-based nonparametric estimators, such as kernel ridge regression and kernel-based density and density-ratio estimation. We prove that the proposed method has the property of variable selection consistency when the power series kernel is used. Here, the power series kernel is a certain class of kernels containing polynomial and exponential kernels. This result is regarded as an extension of the variable selection consistency for the nonnegative garrote (NNG), a special case of the adaptive lasso, to kernel-based estimators. Several experiments, including simulation studies and real data applications, show the effectiveness of the proposed method.
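The two-stage procedure described in the abstract can be sketched on toy data as follows. This is a simplified illustration, not the authors' exact algorithm: the cubic polynomial kernel (a member of the power series class), the projected-gradient solver for the nonnegative scaling factors, and all tuning constants (`lam`, `gamma`, `step`) are assumptions made for the sketch. Stage 1 fits a kernel ridge regressor; stage 2 rescales each input variable by a factor c_j ≥ 0, nonnegative-garrote style, and penalizes the sum of the factors so that scalings of irrelevant variables are shrunk toward zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the response depends only on the first 2 of d = 6 variables.
n, d = 200, 6
X = rng.normal(size=(n, d))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=n)

# Stage 1: kernel ridge regression with the polynomial kernel
# K(x, z) = (1 + <x, z>)^3, one member of the power series kernel class.
def poly_kernel(A, B, degree=3):
    return (1.0 + A @ B.T) ** degree

lam = 1e-2
K = poly_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(n), y)  # dual coefficients
f_hat = K @ alpha                                # stage-1 fitted values

# Stage 2: NNG-style approximation.  Rescale each input variable by
# c_j >= 0, re-evaluate the stage-1 estimator on the scaled inputs, and
# penalize sum(c); variables whose c_j is driven near 0 are dropped.
def objective(c, gamma):
    resid = f_hat - poly_kernel(X * c, X) @ alpha
    return resid @ resid / n + gamma * c.sum()

gamma, step, eps = 0.1, 5e-3, 1e-5
c = np.ones(d)
for _ in range(1500):
    base = objective(c, gamma)
    grad = np.zeros(d)
    for j in range(d):  # forward-difference gradient (fine at toy scale)
        c_eps = c.copy()
        c_eps[j] += eps
        grad[j] = (objective(c_eps, gamma) - base) / eps
    c = np.maximum(c - step * grad, 0.0)  # projected step keeps c >= 0

# Small c_j indicate variables the sparse approximation can do without.
print("scaling factors c:", np.round(c, 3))
```

In practice the penalty level `gamma` (and the solver) would be chosen by cross-validation; the point of the sketch is only the division of labor between the consistent stage-1 fit and the sparse stage-2 reapproximation.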

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Computer Simulation
  • Diabetes Mellitus / classification
  • Female
  • Humans
  • Logistic Models
  • Machine Learning*
  • Neoplasms / classification
  • Post-Cardiac Arrest Syndrome / classification
  • Renal Insufficiency, Chronic / classification
  • Statistics, Nonparametric