Active Batch Selection via Convex Relaxations with Guaranteed Solution Bounds

IEEE Trans Pattern Anal Mach Intell. 2015 Oct;37(10):1945-58. doi: 10.1109/TPAMI.2015.2389848.

Abstract

Active learning techniques have gained popularity as a means of reducing the human effort required to label data instances for inducing a classifier. When faced with large amounts of unlabeled data, such algorithms automatically identify exemplar instances for manual annotation. More recently, there have been attempts toward a batch mode form of active learning, where a batch of data points is selected simultaneously from an unlabeled set. In this paper, we propose two novel batch mode active learning (BMAL) algorithms: BatchRank and BatchRand. We first formulate the batch selection task as an NP-hard optimization problem; we then propose two convex relaxations, one based on linear programming and the other on semi-definite programming, to solve the batch selection problem. Finally, a deterministic bound on the solution quality is derived for the first relaxation and a probabilistic bound for the second. To the best of our knowledge, this is the first research effort to derive mathematical guarantees on the solution quality of the BMAL problem. Our extensive empirical studies on 15 challenging binary, multi-class, and multi-label datasets corroborate that the proposed algorithms perform on par with state-of-the-art techniques, deliver high-quality solutions, and are robust to real-world issues such as label noise and class imbalance.
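The following is a minimal, illustrative sketch of the generic recipe the abstract describes: pose batch selection as a 0/1 optimization problem, relax the binary selection indicators to the interval [0, 1], solve the resulting linear program, and round the fractional solution back to a batch of size k. It is not the paper's actual BatchRank or BatchRand formulation (the abstract does not specify the objective); the function name select_batch_lp and the per-instance "scores" vector are hypothetical stand-ins for an informativeness measure.

    # Illustrative LP relaxation of a batch selection problem (assumed toy setup,
    # not the authors' formulation).
    import numpy as np
    from scipy.optimize import linprog

    def select_batch_lp(scores, k):
        """Relax x_i in {0,1} to 0 <= x_i <= 1, maximize scores @ x s.t. sum(x) = k."""
        n = len(scores)
        # linprog minimizes, so negate the (hypothetical) informativeness scores.
        res = linprog(
            c=-np.asarray(scores, dtype=float),
            A_eq=np.ones((1, n)),
            b_eq=np.array([float(k)]),
            bounds=[(0.0, 1.0)] * n,
            method="highs",
        )
        # Deterministic rounding: keep the k largest fractional values.
        return np.argsort(res.x)[::-1][:k]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        scores = rng.random(20)  # stand-in for per-instance uncertainty scores
        print(select_batch_lp(scores, k=5))

With only a cardinality constraint this LP is trivial (top-k by score); the paper's problem is NP-hard because the objective couples selections across the batch, which is what motivates the LP- and SDP-based relaxations and the accompanying deterministic and probabilistic bounds.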

MeSH terms

  • Algorithms*
  • Animals
  • Artificial Intelligence*
  • Computational Biology
  • Data Mining
  • Databases, Factual
  • Humans
  • Pattern Recognition, Automated / methods*