Enhancing and improving the performance of imbalanced class data using novel GBO and SSG: A comparative analysis

Neural Netw. 2024 May:173:106157. doi: 10.1016/j.neunet.2024.106157. Epub 2024 Feb 2.

Abstract

The class imbalance problem (CIP) in a dataset is a major challenge that significantly affects the performance of Machine Learning (ML) models, resulting in biased predictions. Numerous techniques have been proposed to address CIP, including, but not limited to, oversampling, undersampling, and cost-sensitive approaches. Owing to their ability to generate synthetic data, oversampling techniques such as the Synthetic Minority Oversampling Technique (SMOTE) are the most widely used by researchers. However, one potential disadvantage of SMOTE is that newly created minority samples can overlap with majority samples, increasing the probability that ML models perform in a manner biased toward the majority classes. Generative adversarial networks (GANs) have recently garnered much attention due to their ability to generate realistic samples; despite this potential, however, GANs are difficult to train. Considering these opportunities, this work proposes two novel techniques, GAN-based Oversampling (GBO) and Support Vector Machine-SMOTE-GAN (SSG), to overcome the limitations of the existing approaches. Preliminary results show that SSG and GBO outperformed several existing SMOTE-based approaches on nine imbalanced benchmark datasets. Additionally, the proposed SSG and GBO methods classify the minority class with more than 90% accuracy when evaluated with 20%, 30%, and 40% of the data held out for testing. The study also revealed that the minority samples generated by SSG follow Gaussian distributions, which is often difficult to achieve with the original SMOTE and SVM-SMOTE.
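The SMOTE interpolation step that the abstract contrasts against can be sketched in a few lines of NumPy. This is an illustrative helper (not the paper's GBO/SSG code): it creates each synthetic minority sample by interpolating between a randomly chosen minority sample and one of its k nearest minority-class neighbours, which is exactly the mechanism that can place new points in regions overlapping the majority class.

```python
import numpy as np

def smote_sample(X_min, n_new, k=5, rng=None):
    """SMOTE-style oversampling sketch: interpolate between each chosen
    minority sample and one of its k nearest minority neighbours.
    (Hypothetical helper for illustration, not the paper's method.)"""
    rng = np.random.default_rng(rng)
    n = len(X_min)
    # Pairwise Euclidean distances within the minority class only.
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # exclude each point as its own neighbour
    nn = np.argsort(d, axis=1)[:, :k]    # indices of the k nearest neighbours
    base = rng.integers(0, n, size=n_new)                 # random base samples
    nbr = nn[base, rng.integers(0, k, size=n_new)]        # one random neighbour each
    gap = rng.random((n_new, 1))         # interpolation factor in [0, 1)
    # New point lies on the segment between the base sample and its neighbour.
    return X_min[base] + gap * (X_min[nbr] - X_min[base])
```

Because every synthetic point is a convex combination of two minority samples, it always lies inside the minority class's bounding region, yet nothing prevents that segment from crossing territory occupied by the majority class, which is the overlap problem the proposed GBO and SSG methods aim to avoid.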

Keywords: GAN; Imbalanced class data; Minor sample; Neural network; Oversampling; SMOTE.

MeSH terms

  • Algorithms*
  • Machine Learning*
  • Probability
  • Support Vector Machine