ST-SCGNN: A Spatio-Temporal Self-Constructing Graph Neural Network for Cross-Subject EEG-Based Emotion Recognition and Consciousness Detection

IEEE J Biomed Health Inform. 2024 Feb;28(2):777-788. doi: 10.1109/JBHI.2023.3335854. Epub 2024 Feb 5.

Abstract

In this paper, a novel spatio-temporal self-constructing graph neural network (ST-SCGNN) is proposed for cross-subject emotion recognition and consciousness detection. For spatio-temporal feature generation, activation and connection pattern features are first extracted and then combined to leverage their complementary emotion-related information. Next, a self-constructing graph neural network with a spatio-temporal model is presented. Specifically, the graph structure of the network is dynamically updated by a self-constructing module driven by the input signal. Experiments on the SEED and SEED-IV datasets showed that the model achieved average accuracies of 85.90% and 76.37%, respectively, both exceeding state-of-the-art results under the same protocol. In clinical practice, moreover, patients with disorders of consciousness (DOC) have suffered severe brain injuries, and sufficient training data for EEG-based emotion recognition cannot be collected from them. Our proposed ST-SCGNN method for cross-subject emotion recognition was therefore first trained on ten healthy subjects and then tested on eight patients with DOC. Two patients obtained accuracies significantly above chance level and showed neural patterns similar to those of healthy subjects, demonstrating covert consciousness and emotion-related abilities in these two patients. Our proposed ST-SCGNN for cross-subject emotion recognition could thus be a promising tool for consciousness detection in DOC patients.
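
The abstract gives no implementation details for the self-constructing module. The sketch below is a minimal, hypothetical illustration of the general idea of learning the graph structure from the input itself: per-channel EEG features are projected, pairwise similarities form an input-dependent adjacency matrix, and one graph-convolution step propagates features over that learned graph. The layer name, dimensions, and similarity scheme are assumptions for illustration, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfConstructingGraphLayer(nn.Module):
    """Hypothetical sketch: build the adjacency matrix from the input signal,
    then apply one graph-convolution step over the learned graph."""

    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.query = nn.Linear(in_dim, hid_dim)  # projections used to score channel pairs
        self.key = nn.Linear(in_dim, hid_dim)
        self.gcn = nn.Linear(in_dim, hid_dim)    # feature transform for the convolution

    def forward(self, x):
        # x: (batch, channels, in_dim) -- per-channel EEG features (assumed layout)
        q = self.query(x)                          # (B, C, H)
        k = self.key(x)                            # (B, C, H)
        scores = torch.bmm(q, k.transpose(1, 2))   # pairwise channel similarity (B, C, C)
        adj = F.softmax(scores, dim=-1)            # self-constructed, input-dependent graph
        out = torch.bmm(adj, self.gcn(x))          # propagate transformed features over the graph
        return F.relu(out), adj


if __name__ == "__main__":
    # Toy usage: 62 EEG channels (as in SEED) with 5 per-band features per channel (assumption).
    layer = SelfConstructingGraphLayer(in_dim=5, hid_dim=16)
    x = torch.randn(8, 62, 5)
    h, adj = layer(x)
    print(h.shape, adj.shape)  # torch.Size([8, 62, 16]) torch.Size([8, 62, 62])
```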

MeSH terms

  • Benchmarking
  • Consciousness*
  • Electroencephalography
  • Emotions*
  • Humans
  • Neural Networks, Computer