Discrete Box-Constrained Minimax Classifier for Uncertain and Imbalanced Class Proportions

IEEE Trans Pattern Anal Mach Intell. 2022 Jun;44(6):2923-2937. doi: 10.1109/TPAMI.2020.3046439. Epub 2022 May 5.

Abstract

This paper aims to build a supervised classifier for dealing with imbalanced datasets, uncertain class proportions, dependencies between features, the presence of both numeric and categorical features, and arbitrary loss functions. The Bayes classifier suffers when prior probability shifts occur between the training and testing sets. A solution is to look for an equalizer decision rule whose class-conditional risks are equal. Such a classifier corresponds to a minimax classifier when it maximizes the Bayes risk. We develop a novel box-constrained minimax classifier which takes into account constraints on the priors to control the risk maximization. We analyze the empirical Bayes risk with respect to the box-constrained priors for discrete inputs. We show that this risk is a concave, non-differentiable, multivariate piecewise affine function. A projected subgradient algorithm is derived to maximize this empirical Bayes risk over the box-constrained simplex. Its convergence is established and its convergence speed is bounded. The optimization algorithm is scalable when the number of classes is large. The robustness of our classifier is studied on diverse databases. Our classifier, jointly applied with a clustering algorithm to process mixed attributes, tends to equalize the class-conditional risks while not being overly pessimistic.
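
The optimization step described in the abstract, projected subgradient ascent of a concave, piecewise-affine empirical Bayes risk over a box-constrained simplex, can be illustrated with a short sketch. The NumPy code below is not the authors' implementation: the function names (project_box_simplex, least_favorable_priors), the 0-1 loss, the 1/sqrt(n) step size, and the toy data are illustrative assumptions, and the bisection projection assumes the box is compatible with the simplex (sum of lower bounds at most 1, sum of upper bounds at least 1).

    import numpy as np

    def project_box_simplex(v, lo, hi, tol=1e-10):
        # Euclidean projection of v onto {p : sum(p) = 1, lo <= p <= hi},
        # by bisection on the shift tau (assumes sum(lo) <= 1 <= sum(hi)).
        t_low, t_high = (v - hi).min(), (v - lo).max()
        while t_high - t_low > tol:
            t_mid = 0.5 * (t_low + t_high)
            if np.clip(v - t_mid, lo, hi).sum() > 1.0:
                t_low = t_mid
            else:
                t_high = t_mid
        return np.clip(v - 0.5 * (t_low + t_high), lo, hi)

    def risk_and_subgradient(pi, cond_pmf, loss):
        # Empirical Bayes risk for discrete inputs,
        #   r(pi) = sum_t min_l sum_k loss[l, k] * pi[k] * cond_pmf[k, t],
        # which is concave and piecewise affine in pi; the gradient of
        # the active affine piece is a subgradient.
        risk_per_label = loss @ (pi[:, None] * cond_pmf)    # shape (labels, profiles)
        best = risk_per_label.argmin(axis=0)                # Bayes decision per discrete profile
        risk = risk_per_label[best, np.arange(cond_pmf.shape[1])].sum()
        subgrad = (loss[best, :] * cond_pmf.T).sum(axis=0)  # d r / d pi_k on the active piece
        return risk, subgrad

    def least_favorable_priors(cond_pmf, loss, lo, hi, n_iter=1000):
        # Projected subgradient ascent of the concave piecewise-affine risk
        # over the box-constrained simplex, with a diminishing 1/sqrt(n) step.
        K = cond_pmf.shape[0]
        pi = project_box_simplex(np.full(K, 1.0 / K), lo, hi)
        best_pi, best_risk = pi, -np.inf
        for n in range(1, n_iter + 1):
            risk, g = risk_and_subgradient(pi, cond_pmf, loss)
            if risk > best_risk:
                best_risk, best_pi = risk, pi
            step = 1.0 / (np.sqrt(n) * (np.linalg.norm(g) + 1e-12))
            pi = project_box_simplex(pi + step * g, lo, hi)
        return best_pi, best_risk

    # Toy usage: 3 classes, 4 discrete feature profiles, 0-1 loss,
    # priors constrained to lie between 10% and 60% per class.
    rng = np.random.default_rng(0)
    cond_pmf = rng.dirichlet(np.ones(4), size=3)   # cond_pmf[k, t] ~ P_hat(X = t | Y = k)
    loss = 1.0 - np.eye(3)
    pi_star, r_star = least_favorable_priors(cond_pmf, loss, np.full(3, 0.1), np.full(3, 0.6))
    print(pi_star, r_star)

Under these assumptions, the least-favorable priors returned by the sketch would define the corresponding Bayes decision rule, which is the role the box-constrained minimax classifier plays in the paper.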

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms*
  • Bayes Theorem
  • Cluster Analysis
  • Databases, Factual