Mixture of Experts for EEG-Based Seizure Subtype Classification

IEEE Trans Neural Syst Rehabil Eng. 2023;31:4781-4789. doi: 10.1109/TNSRE.2023.3337802. Epub 2023 Dec 7.

Abstract

Epilepsy is a pervasive neurological disorder affecting approximately 50 million individuals worldwide. Electroencephalogram (EEG) based seizure subtype classification plays a crucial role in epilepsy diagnosis and treatment. However, automatic seizure subtype classification faces at least two challenges: 1) class imbalance, i.e., certain seizure types are considerably less common than others; and 2) lack of a priori knowledge integration, so that a large number of labeled EEG samples are needed to train a machine learning model, particularly a deep learning model. This paper proposes two novel Mixture of Experts (MoE) models, Seizure-MoE and Mix-MoE, for EEG-based seizure subtype classification. In particular, Mix-MoE addresses both challenges: 1) it introduces a novel imbalanced sampler to handle the significant class imbalance; and 2) it incorporates a priori knowledge from manual EEG features into the deep neural network to improve classification performance. Experiments on two public datasets demonstrated that the proposed Seizure-MoE and Mix-MoE outperformed multiple existing approaches in cross-subject EEG-based seizure subtype classification. The proposed MoE models may also be easily extended to other EEG classification problems with severe class imbalance, e.g., sleep stage classification.
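
The abstract describes two ideas: a gated mixture of experts that fuses learned deep features with a priori manual EEG features, and an imbalance-aware sampler. The snippet below is a minimal illustrative sketch of that general idea, not the authors' implementation; all layer sizes, feature dimensions, channel counts, and the use of PyTorch's WeightedRandomSampler as a stand-in for the paper's imbalanced sampler are assumptions made for illustration.

```python
# Sketch of a Mix-MoE-style classifier: a gating network mixes several experts
# over the concatenation of deep EEG features and hand-crafted (manual) features.
# All dimensions and the sampling scheme are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler


class Expert(nn.Module):
    """One expert: a small MLP over the fused feature vector."""
    def __init__(self, in_dim: int, n_classes: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, n_classes)
        )

    def forward(self, x):
        return self.net(x)


class MixMoESketch(nn.Module):
    """Gated mixture of experts over deep EEG features + manual features."""
    def __init__(self, n_channels=19, n_manual=32, n_experts=4, n_classes=4):
        super().__init__()
        # Simple 1-D CNN encoder for raw EEG epochs (channels x time).
        self.encoder = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )
        fused_dim = 32 + n_manual  # deep features + a priori manual features
        self.experts = nn.ModuleList(
            [Expert(fused_dim, n_classes) for _ in range(n_experts)]
        )
        self.gate = nn.Sequential(nn.Linear(fused_dim, n_experts), nn.Softmax(dim=-1))

    def forward(self, eeg, manual_feats):
        z = torch.cat([self.encoder(eeg), manual_feats], dim=-1)
        gate_w = self.gate(z)                                          # (B, E)
        expert_out = torch.stack([e(z) for e in self.experts], dim=1)  # (B, E, C)
        return (gate_w.unsqueeze(-1) * expert_out).sum(dim=1)          # (B, C) logits


# Imbalance-aware sampling: rarer seizure types are drawn more often.
labels = torch.randint(0, 4, (256,))            # dummy labels, 4 seizure subtypes
class_counts = torch.bincount(labels, minlength=4).float()
sample_weights = (1.0 / class_counts)[labels]
sampler = WeightedRandomSampler(sample_weights, num_samples=len(labels))

eeg = torch.randn(256, 19, 512)                 # dummy EEG epochs
manual = torch.randn(256, 32)                   # dummy hand-crafted features
loader = DataLoader(TensorDataset(eeg, manual, labels), batch_size=32, sampler=sampler)

model = MixMoESketch()
x, f, y = next(iter(loader))
logits = model(x, f)                            # (32, 4) class logits
```

In this sketch the gate produces per-sample mixing weights over the experts, so different experts can specialize on different seizure subtypes, while the weighted sampler rebalances the training batches toward the rarer classes.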

MeSH terms

  • Electroencephalography
  • Epilepsy* / diagnosis
  • Humans
  • Neural Networks, Computer
  • Seizures / diagnosis
  • Signal Processing, Computer-Assisted*