An improved analysis of the Rademacher data-dependent bound using its self bounding property

Neural Netw. 2013 Aug:44:107-11. doi: 10.1016/j.neunet.2013.03.017. Epub 2013 Apr 3.

Abstract

The problem of assessing the performance of a classifier in the finite-sample setting was addressed by Vapnik in his seminal work using data-independent measures of complexity. Recently, several authors have addressed the same problem by proposing data-dependent measures, which tighten previous results by taking into account the actual data distribution. In this framework, we derive data-dependent bounds on the generalization ability of a classifier by exploiting the Rademacher complexity and recent concentration results: in addition to being appealing for practical purposes, as they exploit empirical quantities only, these bounds improve on previously known results.

MeSH terms

  • Artificial Intelligence*
  • Statistics as Topic / methods
  • Statistics as Topic / trends*