Investigating EEG-based functional connectivity patterns for multimodal emotion recognition

J Neural Eng. 2022 Jan 31;19(1). doi: 10.1088/1741-2552/ac49a7.

Abstract

Objective. Previous studies on emotion recognition from electroencephalography (EEG) mainly rely on single-channel-based feature extraction methods, which ignore the functional connectivity between brain regions. Hence, in this paper, we propose a novel emotion-relevant critical subnetwork selection algorithm and investigate three EEG functional connectivity network features: strength, clustering coefficient, and eigenvector centrality. Approach. After constructing the brain networks from the correlations between pairs of EEG signals, we calculated critical subnetworks by averaging the brain network matrices sharing the same emotion label, thereby eliminating weak associations. Then, the three network features were fed into a multimodal emotion recognition model using deep canonical correlation analysis along with eye movement features. The discrimination ability of the EEG connectivity features in emotion recognition is evaluated on three public datasets: SEED, SEED-V, and DEAP. Main results. The experimental results reveal that the strength feature outperforms state-of-the-art features based on single-channel analysis. The classification accuracies of multimodal emotion recognition are 95.08±6.42% on the SEED dataset, 84.51±5.11% on the SEED-V dataset, and 85.34±2.90% and 86.61±3.76% for arousal and valence on the DEAP dataset, respectively, all of which achieve the best performance. In addition, the brain networks constructed with 18 channels achieve performance comparable to that of the 62-channel networks and enable easier setups in real scenarios. Significance. The proposed EEG functional connectivity networks, combined with the emotion-relevant critical subnetwork selection algorithm, successfully exploit the information shared between channels.
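The network construction and feature extraction described in the abstract can be sketched as follows. The abstract does not specify implementation details, so this is a minimal illustration under assumptions: Pearson correlation for the connectivity matrix, Onnela's formula for the weighted clustering coefficient, and power iteration for eigenvector centrality; all function names and parameters are illustrative, not the authors' code.

```python
import numpy as np

def build_network(signals):
    """Assumed construction: absolute Pearson correlation between
    channel pairs, with self-loops removed.
    signals: array of shape (channels, samples)."""
    w = np.abs(np.corrcoef(signals))
    np.fill_diagonal(w, 0.0)
    return w

def strength(w):
    """Node strength: sum of each node's edge weights."""
    return w.sum(axis=1)

def clustering_coefficient(w):
    """Weighted clustering coefficient (Onnela et al. variant):
    geometric-mean triangle intensity around each node."""
    w_hat = w / w.max()                    # normalize weights to [0, 1]
    cube = np.cbrt(w_hat)
    num = np.diag(cube @ cube @ cube)      # sum of (w_ij * w_jh * w_hi)^(1/3)
    k = (w > 0).sum(axis=1)                # node degree
    return num / (k * (k - 1))

def eigenvector_centrality(w, iters=200):
    """Leading eigenvector of the weighted adjacency matrix,
    computed by power iteration and normalized to unit length."""
    x = np.ones(w.shape[0])
    for _ in range(iters):
        x = w @ x
        x /= np.linalg.norm(x)
    return x

# Illustrative run on synthetic data; 18 channels mirrors the
# reduced setup mentioned in the abstract.
rng = np.random.default_rng(0)
sig = rng.standard_normal((18, 1000))
w = build_network(sig)
print(strength(w).shape)  # → (18,)
```

In the paper's pipeline, such per-node features would then be concatenated across channels and fused with eye movement features via deep canonical correlation analysis; that fusion step is beyond the scope of this sketch.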

Keywords: EEG; affective brain-computer interface; brain functional connectivity network; eye movement; multimodal deep learning; multimodal emotion recognition.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Arousal
  • Brain
  • Electroencephalography*
  • Emotions
  • Neural Networks, Computer*