Spatial-temporal network for fine-grained-level emotion EEG recognition

J Neural Eng. 2022 May 27;19(3). doi: 10.1088/1741-2552/ac6d7d.

Abstract

Electroencephalogram (EEG)-based affective computing brain-computer interfaces give machines the capability to understand human intentions. In practice, people are often more concerned with the strength of an emotional state over a short period of time, which we refer to as fine-grained-level emotion in this paper. In this study, we built a fine-grained-level emotion EEG dataset that contains two coarse-grained emotions and four corresponding fine-grained-level emotions. To fully extract the features of the EEG signals, we propose a corresponding fine-grained emotion EEG network (FG-emotionNet) for spatial-temporal feature extraction. Each feature extraction layer is linked to the raw EEG signals to alleviate overfitting and ensure that the spatial features of each scale can be extracted directly from the raw signals. Moreover, all previous scale features are fused before each spatial-feature layer to enhance the scale features in the spatial block. Additionally, long short-term memory is adopted as the temporal block to extract temporal features from the spatial features and classify the fine-grained emotion category. Subject-dependent and cross-session experiments demonstrate that the proposed method outperforms both representative emotion-recognition methods and methods with similar structures.
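The dense-fusion idea described above — every spatial layer sees the raw EEG channels together with all previously extracted scale features, followed by an LSTM temporal block and a classifier — can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the layer widths, the use of per-timestep dense layers in place of the network's actual spatial convolutions, and all dimensions (62 channels, 128 time steps, 4 fine-grained classes) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, w, b):
    # one spatial-feature layer applied at every time step
    return np.tanh(x @ w + b)

class SpatialBlock:
    """Each scale layer takes the raw channel vector concatenated with
    all previously extracted scale features (dense-style fusion).
    Stand-in for the paper's spatial block; widths are assumptions."""
    def __init__(self, n_ch, widths):
        self.params = []
        in_dim = n_ch  # first layer sees only the raw channels
        for w_out in widths:
            self.params.append((rng.standard_normal((in_dim, w_out)) * 0.1,
                                np.zeros(w_out)))
            in_dim += w_out  # later layers also see this scale's output

    def __call__(self, x):  # x: (T, n_ch) raw EEG window
        feats = x
        for w, b in self.params:
            out = dense(feats, w, b)
            # fuse raw signals + all previous scales for the next layer
            feats = np.concatenate([feats, out], axis=-1)
        return out  # features of the last scale, shape (T, widths[-1])

def lstm_last_hidden(xs, Wx, Wh, b):
    """Minimal LSTM temporal block; returns the final hidden state."""
    H = Wh.shape[0]
    h, c = np.zeros(H), np.zeros(H)
    for x in xs:
        z = x @ Wx + h @ Wh + b
        i, f, o, g = np.split(z, 4)
        i, f, o = (1.0 / (1.0 + np.exp(-v)) for v in (i, f, o))
        g = np.tanh(g)
        c = f * c + i * g
        h = o * np.tanh(c)
    return h

# toy forward pass: 62-channel EEG, 128 time steps, 4 fine-grained classes
T, n_ch, H, n_cls = 128, 62, 16, 4
eeg = rng.standard_normal((T, n_ch))

spatial = SpatialBlock(n_ch, widths=[32, 32, 32])
feats = spatial(eeg)                                # (T, 32)

Wx = rng.standard_normal((32, 4 * H)) * 0.1
Wh = rng.standard_normal((H, 4 * H)) * 0.1
h = lstm_last_hidden(feats, Wx, Wh, np.zeros(4 * H))  # (H,)

Wc = rng.standard_normal((H, n_cls)) * 0.1
logits = h @ Wc
probs = np.exp(logits) / np.exp(logits).sum()       # 4-class softmax
```

Concatenating the raw input into every layer's input is what gives each scale direct access to the original signal while still reusing earlier features; a trained version would fit all the weight matrices, which are random here.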

Keywords: EEG-based emotion recognition; emotion strength; spatial-temporal network.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Brain-Computer Interfaces*
  • Electroencephalography* / methods
  • Emotions
  • Humans
  • Intention