Context-aware Multimodal Auditory BCI Classification through Graph Neural Networks

Annu Int Conf IEEE Eng Med Biol Soc. 2023 Jul:2023:1-4. doi: 10.1109/EMBC40787.2023.10339984.

Abstract

Most brain-computer interface (BCI) systems leave unexplored the potential of electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) combined with topological information about participants. Moreover, fusing these modalities in a multimodal analysis, so that multiple brain signals jointly improve BCI performance, has not been fully examined. This study first presents a multimodal data fusion framework that exploits and decodes the complementary, synergistic properties of multimodal neural signals. In addition, the relations among different subjects and their observations play a critical role in classifying unknown subjects. We developed a context-aware graph neural network (GNN) model that utilizes the pairwise relationships among participants to investigate performance on an auditory task classification. We explored standard and deviant auditory EEG and fNIRS data in which each subject performed an auditory oddball task, with the subject's multiple trials treated as context-aware nodes in our graph construction. In experiments, our multimodal data fusion strategy showed an improvement of up to 8.40% via SVM and 2.02% via GNN, compared to single-modal EEG or fNIRS. In addition, our context-aware GNN achieved 5.3%, 4.07% and 4.53% higher accuracy on the EEG-, fNIRS- and multimodal-data-based experiments, respectively, compared to the baseline models.
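The abstract's core idea, trials as context-aware nodes in a graph, modality features fused per node, and a GNN propagating information along pairwise relationships, can be illustrated with a minimal sketch. This is not the authors' implementation: the feature dimensions, the k-nearest-neighbor similarity graph, and the single untrained graph-convolution layer are all illustrative assumptions standing in for the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 12 trial nodes, 8 EEG features, 4 fNIRS features.
n_trials, d_eeg, d_fnirs = 12, 8, 4
eeg = rng.normal(size=(n_trials, d_eeg))
fnirs = rng.normal(size=(n_trials, d_fnirs))

# Early fusion: concatenate the two modalities' features for each trial node.
x = np.concatenate([eeg, fnirs], axis=1)

# Context-aware adjacency (assumed scheme): link each trial to its k most
# cosine-similar trials, symmetrize, and add self-loops.
k = 3
xn = x / np.linalg.norm(x, axis=1, keepdims=True)
sim = xn @ xn.T
adj = np.zeros((n_trials, n_trials))
for i in range(n_trials):
    nbrs = np.argsort(-sim[i])[1:k + 1]  # index 0 is the node itself
    adj[i, nbrs] = adj[nbrs, i] = 1.0
adj += np.eye(n_trials)

# One symmetric-normalized graph convolution, D^{-1/2} A D^{-1/2} X W,
# followed by a softmax over 2 classes (standard vs. deviant stimuli).
d_inv_sqrt = np.diag(1.0 / np.sqrt(adj.sum(axis=1)))
a_hat = d_inv_sqrt @ adj @ d_inv_sqrt
w = rng.normal(scale=0.1, size=(x.shape[1], 2))  # untrained toy weights
logits = a_hat @ x @ w
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
print(probs.shape)  # one class distribution per trial node
```

Each node's prediction here depends on its neighbors' features through `a_hat`, which is what makes the classification context-aware rather than per-trial.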

Publication types

  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Brain
  • Brain-Computer Interfaces*
  • Electroencephalography / methods
  • Humans
  • Neural Networks, Computer
  • Spectroscopy, Near-Infrared / methods