Bridging the BCI illiteracy gap: a subject-to-subject semantic style transfer for EEG-based motor imagery classification

Front Hum Neurosci. 2023 May 15;17:1194751. doi: 10.3389/fnhum.2023.1194751. eCollection 2023.

Abstract

Introduction: Brain-computer interfaces (BCIs) enable direct interaction between the human brain and computers, allowing individuals to control external devices through cognitive processes. Despite their potential, BCI illiteracy remains a major challenge: inter-subject EEG variability prevents many users from operating BCI systems effectively. In this study, we propose a feature-level subject-to-subject semantic style transfer network (SSSTN) to address the BCI illiteracy problem in electroencephalogram (EEG)-based motor imagery (MI) classification tasks.

Methods: Our approach uses the continuous wavelet transform to convert high-dimensional EEG signals into time-frequency images that serve as input data. The SSSTN then 1) trains a classifier for each subject, 2) transfers the distribution of class-discrimination styles from the source subject (the best-performing subject for the classifier, i.e., the BCI expert) to each subject in the target domain (all remaining subjects, notably BCI illiterates) through the proposed style loss, while applying a modified content loss to preserve the class-relevant semantic information of the target domain, and 3) merges the predictions of the source- and target-subject classifiers using an ensemble technique. Illustrative sketches of these three steps follow.
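A minimal sketch of the CWT conversion step, using PyWavelets. The wavelet family, scale range, and image layout are illustrative assumptions; the abstract specifies only that the continuous wavelet transform turns EEG data into image inputs.

```python
# Sketch: one EEG trial -> per-channel scalogram "image" via the CWT.
# The wavelet ('morl'), 64 scales, and stacking layout are assumptions.
import numpy as np
import pywt

def eeg_trial_to_image(trial, fs=250, n_scales=64, wavelet="morl"):
    """trial: (n_channels, n_samples) array for one motor imagery trial."""
    scales = np.arange(1, n_scales + 1)
    channel_maps = []
    for channel in trial:
        coeffs, _ = pywt.cwt(channel, scales, wavelet, sampling_period=1.0 / fs)
        channel_maps.append(np.abs(coeffs))  # magnitude scalogram per channel
    # Stack into an image-like (n_channels, n_scales, n_samples) tensor for a CNN.
    return np.stack(channel_maps, axis=0)

trial = np.random.randn(3, 1000)  # e.g., 3 channels, 4 s at 250 Hz (IV-2b-like)
print(eeg_trial_to_image(trial).shape)  # (3, 64, 1000)
```

The exact forms of the proposed style loss and modified content loss are not given in the abstract. As a stand-in, the PyTorch sketch below uses the classic Gram-matrix style loss with an MSE content loss over feature maps; read it as an illustration of feature-level style/content matching, not the paper's actual formulation.

```python
# Sketch: generic feature-level style/content losses (Gatys-style stand-in).
import torch
import torch.nn.functional as F

def gram_matrix(feat):
    """feat: (batch, channels, h, w) feature maps from a CNN layer."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return torch.bmm(f, f.transpose(1, 2)) / (c * h * w)

def style_loss(target_feat, source_feat):
    # Pull the target subject's feature statistics toward the expert (source) subject.
    return F.mse_loss(gram_matrix(target_feat), gram_matrix(source_feat))

def content_loss(transferred_feat, original_target_feat):
    # Keep the target subject's class-relevant content close to its original features.
    return F.mse_loss(transferred_feat, original_target_feat)
```

The abstract describes the last step only as merging the source- and target-subject classifier predictions with an ensemble technique; the sketch below assumes a simple soft-voting average with a tunable weight.

```python
# Sketch: soft-voting ensemble of the two subject-specific classifiers
# (the averaging rule and the weight are assumptions).
import torch

def ensemble_predict(source_logits, target_logits, weight=0.5):
    probs = weight * torch.softmax(source_logits, dim=1) + \
            (1.0 - weight) * torch.softmax(target_logits, dim=1)
    return probs.argmax(dim=1)
```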

Results and discussion: We evaluate the proposed method on the BCI Competition IV-2a and IV-2b datasets and demonstrate improved classification performance over existing methods, especially for BCI-illiterate users. Ablation experiments and t-SNE visualizations further highlight the effectiveness of the proposed method in achieving meaningful feature-level semantic style transfer.
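A generic t-SNE projection of learned features, of the kind used for the qualitative analysis, can be produced with scikit-learn as sketched below; the feature source (e.g., penultimate-layer activations), the perplexity, and the placeholder data are assumptions, not the paper's exact setup.

```python
# Sketch: t-SNE projection of learned features for qualitative inspection.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

features = np.random.randn(200, 128)        # placeholder: penultimate-layer features
labels = np.random.randint(0, 4, size=200)  # placeholder: four MI classes (IV-2a)

embedded = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(features)
plt.scatter(embedded[:, 0], embedded[:, 1], c=labels, s=10)
plt.title("t-SNE of transferred features")
plt.show()
```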

Keywords: BCI illiteracy; brain-computer interface; convolutional neural network; electroencephalogram; motor imagery; style transfer.

Grants and funding

This work was supported by an Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2019-0-00079, Artificial Intelligence Graduate School Program (Korea University); No. 2017-0-00451, Development of BCI based Brain and Cognitive Computing Technology for Recognizing User's Intentions Using Deep Learning) and by a National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. RS-2023-00212498).