A local Vapnik-Chervonenkis complexity

Neural Netw. 2016 Oct:82:62-75. doi: 10.1016/j.neunet.2016.07.002. Epub 2016 Jul 18.

Abstract

In this work we define a new localized version of a Vapnik-Chervonenkis (VC) complexity, namely the Local VC-Entropy, and, building on this new complexity, we derive a new generalization bound for binary classifiers. The Local VC-Entropy-based bound improves on Vapnik's original results because it is able to discard those functions that, most likely, will not be selected during the learning phase. The result is achieved by applying the localization principle to the original global complexity measure, in the same spirit as the Local Rademacher Complexity. By exploiting and improving a recently developed geometrical framework, we show that it is also possible to relate the Local VC-Entropy to the Local Rademacher Complexity by finding an admissible range for one given the other. In addition, the Local VC-Entropy allows one to reduce the computational requirements that arise when dealing with the Local Rademacher Complexity in binary classification problems.
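To make the two global complexity measures named above concrete, the following is a minimal illustrative sketch (not the paper's method or its localized variants): it computes the empirical VC-entropy of a toy finite hypothesis class on a fixed sample, i.e. the log of the number of distinct dichotomies the class induces, and a Monte Carlo estimate of the empirical Rademacher complexity. The threshold class, sample points, and trial count are all illustrative assumptions.

```python
import math
import random

# Toy hypothesis class: 1-D threshold classifiers f_t(x) = +1 if x >= t else -1.
# (An illustrative choice, not taken from the paper.)
def make_threshold(t):
    return lambda x: 1 if x >= t else -1

sample = [0.1, 0.25, 0.4, 0.6, 0.8]
hypotheses = [make_threshold(t) for t in [0.0, 0.2, 0.3, 0.5, 0.7, 0.9]]

# Empirical VC-entropy: log of the number of distinct labelings
# ("dichotomies") the class induces on this particular sample.
dichotomies = {tuple(f(x) for x in sample) for f in hypotheses}
vc_entropy = math.log(len(dichotomies))

# Monte Carlo estimate of the empirical Rademacher complexity:
# E_sigma[ sup_f (1/n) sum_i sigma_i f(x_i) ], sigma_i uniform in {-1, +1}.
def empirical_rademacher(fs, xs, trials=2000, seed=0):
    rng = random.Random(seed)
    n = len(xs)
    total = 0.0
    for _ in range(trials):
        sigma = [rng.choice((-1, 1)) for _ in range(n)]
        total += max(sum(s * f(x) for s, x in zip(sigma, xs)) / n
                     for f in fs)
    return total / trials

print(len(dichotomies), round(vc_entropy, 3))   # → 6 1.792
print(round(empirical_rademacher(hypotheses, sample), 3))
```

Localization, as described in the abstract, would restrict the supremum and the dichotomy count to a data-dependent subset of the class; the sketch above only shows the global quantities being localized.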

Keywords: Complexity measures; Generalization error bounds; Local Rademacher Complexity; Local Vapnik–Chervonenkis entropy; Statistical Learning Theory.

MeSH terms

  • Entropy*
  • Machine Learning* / trends