Self-Supervised EEG Representation Learning with Contrastive Predictive Coding for Post-Stroke Patients

Int J Neural Syst. 2023 Dec;33(12):2350066. doi: 10.1142/S0129065723500661.

Abstract

Stroke patients are prone to fatigue during EEG acquisition, and the experiments place high cognitive and physical demands on subjects, so learning effective feature representations from limited data is essential. Deep learning networks have been widely used in motor imagery (MI)-based brain-computer interfaces (BCIs). This paper proposes a contrastive predictive coding (CPC) framework based on the modified S-transform (MST) to generate MST-CPC feature representations. MST extracts time-frequency features to improve decoding performance for MI task recognition, and EEG2Image converts multi-channel one-dimensional EEG into two-dimensional EEG topography. High-level feature representations are then generated by CPC, which consists of an encoder and an autoregressive model. Finally, the effectiveness of the generated features is verified with the k-means clustering algorithm; the model produces features efficiently and with a good clustering effect. In the classification evaluation, the average accuracy on MI tasks is 89% across 40 subjects. The proposed method obtains effective feature representations and improves the performance of MI-BCI systems. In a comparison with several self-supervised methods on a public dataset, the MST-CPC model achieves the highest average accuracy. This represents a step forward in combining self-supervised learning with image-based processing of EEG signals, and it can support effective rehabilitation training for stroke patients to promote motor function recovery.
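To make the pipeline described above concrete, the sketch below shows a minimal CPC-style model in PyTorch: an encoder maps each 2D EEG frame (e.g., an MST time-frequency or topography map produced by EEG2Image) to a latent vector, a GRU acts as the autoregressive model over the latent sequence, and an InfoNCE-style loss contrasts predicted future latents against negatives drawn from the batch. All names, layer sizes, and hyperparameters here are illustrative assumptions, not the authors' actual architecture.

```python
# Minimal CPC sketch for sequences of 2D EEG frames (assumed shapes and sizes).
import torch
import torch.nn as nn
import torch.nn.functional as F

class EEGEncoder(nn.Module):
    """Maps one 2D EEG frame to a latent vector z_t (illustrative CNN)."""
    def __init__(self, z_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, z_dim),
        )

    def forward(self, x):
        return self.net(x)

class CPC(nn.Module):
    """Encoder + autoregressive GRU; predicts future latents, scored with InfoNCE."""
    def __init__(self, z_dim=64, c_dim=128, pred_steps=3):
        super().__init__()
        self.encoder = EEGEncoder(z_dim)
        self.ar = nn.GRU(z_dim, c_dim, batch_first=True)
        self.predictors = nn.ModuleList(nn.Linear(c_dim, z_dim) for _ in range(pred_steps))

    def forward(self, x):
        # x: (batch, time, 1, H, W) sequence of 2D EEG frames
        b, t = x.shape[:2]
        z = self.encoder(x.flatten(0, 1)).view(b, t, -1)  # per-frame latents (b, t, z_dim)
        c, _ = self.ar(z)                                  # context vectors (b, t, c_dim)
        loss = 0.0
        for k, head in enumerate(self.predictors, start=1):
            pred = head(c[:, :-k])   # predicted latents k steps ahead (b, t-k, z_dim)
            target = z[:, k:]        # true future latents                (b, t-k, z_dim)
            for i in range(pred.size(1)):
                # InfoNCE: the matching item in the batch is the positive,
                # the other batch items serve as negatives.
                logits = pred[:, i] @ target[:, i].T            # (b, b) similarity scores
                labels = torch.arange(b, device=x.device)
                loss = loss + F.cross_entropy(logits, labels)
        return loss

model = CPC()
frames = torch.randn(8, 10, 1, 32, 32)  # toy batch: 8 sequences of 10 frames
loss = model(frames)
loss.backward()
```

After self-supervised training, the per-frame latents (or the context vectors) could be pooled per trial and passed to k-means, as in the paper's clustering-based verification, or to a downstream classifier for MI task recognition.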

Keywords: Contrastive learning; EEG2Image; electroencephalogram (EEG); modified s-transform (MST); stroke.

MeSH terms

  • Algorithms
  • Brain-Computer Interfaces*
  • Cognition
  • Electroencephalography / methods
  • Humans
  • Imagination*