BrainGAN: Brain MRI Image Generation and Classification Framework Using GAN Architectures and CNN Models

Sensors (Basel). 2022 Jun 6;22(11):4297. doi: 10.3390/s22114297.

Abstract

Deep learning models have been applied in many domains, but they still require adaptation before they can be used in sensitive areas such as medical imaging. Although technology is needed in the medical domain to save time, a high level of accuracy is also required to ensure trustworthiness. Because of privacy concerns, machine learning applications in the medical field often cannot access sufficient medical data. For example, the scarcity of brain MRI images makes it difficult to classify brain tumors using image-based classification. This challenge can be addressed with Generative Adversarial Network (GAN)-based augmentation techniques; Deep Convolutional GAN (DCGAN) and Vanilla GAN are two GAN architectures used for image generation. This paper proposes BrainGAN, a framework for generating and classifying brain MRI images using GAN architectures and deep learning models, together with an automatic way to verify that the generated images are satisfactory. The framework uses three models: CNN, MobileNetV2, and ResNet152V2. The deep transfer models were trained on images produced by Vanilla GAN and DCGAN and then evaluated on a test set of real brain MRI images. The experimental results show that ResNet152V2 outperformed the other two models, achieving 99.09% accuracy, 99.12% precision, 99.08% recall, 99.51% area under the curve (AUC), and a loss of 0.196 on the brain MRI images generated by the DCGAN architecture.
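As a point of reference for the metrics reported above, the sketch below shows how accuracy and macro-averaged precision and recall are computed from predicted and true labels. The labels and counts are illustrative only, not data from the paper; any real evaluation would use the classifier's predictions on the held-out set of real brain MRI images.

```python
def classification_metrics(y_true, y_pred):
    """Accuracy plus macro-averaged precision and recall,
    the metrics reported for the BrainGAN classifiers."""
    classes = sorted(set(y_true) | set(y_pred))
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    precisions, recalls = [], []
    for c in classes:
        tp = sum(p == c and t == c for t, p in zip(y_true, y_pred))
        fp = sum(p == c and t != c for t, p in zip(y_true, y_pred))
        fn = sum(p != c and t == c for t, p in zip(y_true, y_pred))
        # Per-class precision = TP / (TP + FP); recall = TP / (TP + FN).
        precisions.append(tp / (tp + fp) if tp + fp else 0.0)
        recalls.append(tp / (tp + fn) if tp + fn else 0.0)
    return (accuracy,
            sum(precisions) / len(precisions),
            sum(recalls) / len(recalls))

# Hypothetical binary tumor (1) / no-tumor (0) predictions on 8 test images.
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 0, 0, 0, 1, 1]
acc, prec, rec = classification_metrics(y_true, y_pred)
print(acc, prec, rec)  # → 0.75 0.75 0.75
```

Macro-averaging weights each class equally, which matters when tumor and non-tumor images are imbalanced in the test set.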

Keywords: DCGANs; brain MRI images; deep learning; image classification; image generation; vanilla GANs.

MeSH terms

  • Brain / diagnostic imaging
  • Brain Neoplasms*
  • Humans
  • Machine Learning
  • Magnetic Resonance Imaging* / methods
  • Neuroimaging

Grants and funding

This research received no external funding.