A Saliency based Feature Fusion Model for EEG Emotion Estimation

Annu Int Conf IEEE Eng Med Biol Soc. 2022 Jul;2022:3170-3174. doi: 10.1109/EMBC48229.2022.9871720.

Abstract

Among the different modalities used to assess emotion, the electroencephalogram (EEG), which records the electrical activity of the brain, has achieved promising results over the last decade. Emotion estimation from EEG could help in the diagnosis or rehabilitation of certain diseases. In this paper, we propose a dual model that considers two different representations of EEG feature maps: 1) a sequence-based representation of EEG band power and 2) an image-based representation of the feature vectors. We also propose an innovative method for combining the two streams of information, based on a saliency analysis of the image-based model, to promote joint learning of both parts of the model. The model has been evaluated on four publicly available datasets: SEED-IV, SEED, DEAP and MPED. The results outperform state-of-the-art approaches on three of the four datasets, with a lower standard deviation that reflects higher stability. For the sake of reproducibility, the code and models proposed in this paper are available at https://github.com/VDelv/Emotion-EEG.
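To make the dual-branch idea more concrete, the following is a minimal, hypothetical PyTorch sketch of a model with a sequential branch over band-power features, an image branch over 2-D feature maps, and a saliency-gated fusion of the two. The branch designs, feature dimensions, and the specific saliency/gating rule are illustrative assumptions and are not taken from the paper; the authors' actual implementation is available in the linked repository.

```python
# Hypothetical sketch of a dual-branch EEG emotion model with saliency-weighted
# fusion. Branch architectures, input shapes and the saliency computation are
# assumptions for illustration only; see https://github.com/VDelv/Emotion-EEG
# for the authors' implementation.
import torch
import torch.nn as nn


class DualBranchEEGModel(nn.Module):
    def __init__(self, n_bands=5, n_channels=62, seq_len=10,
                 img_size=32, n_classes=4):
        super().__init__()
        # Sequential branch: GRU over per-segment band-power vectors
        # (assumed input shape: batch x seq_len x (n_channels * n_bands)).
        self.rnn = nn.GRU(input_size=n_channels * n_bands,
                          hidden_size=128, batch_first=True)
        # Image branch: small CNN over 2-D (topographic) feature maps
        # (assumed input shape: batch x n_bands x img_size x img_size).
        self.cnn = nn.Sequential(
            nn.Conv2d(n_bands, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Saliency head: one weight per image feature, used here as a simple
        # gate on the image branch before fusion (an assumed fusion rule).
        self.saliency = nn.Sequential(nn.Linear(64, 64), nn.Sigmoid())
        self.classifier = nn.Linear(128 + 64, n_classes)

    def forward(self, x_seq, x_img):
        _, h = self.rnn(x_seq)                 # h: 1 x batch x 128
        seq_feat = h.squeeze(0)                # batch x 128
        img_feat = self.cnn(x_img).flatten(1)  # batch x 64
        weights = self.saliency(img_feat)      # batch x 64, values in (0, 1)
        fused = torch.cat([seq_feat, img_feat * weights], dim=1)
        return self.classifier(fused)


if __name__ == "__main__":
    model = DualBranchEEGModel()
    x_seq = torch.randn(8, 10, 62 * 5)   # band-power sequences
    x_img = torch.randn(8, 5, 32, 32)    # image-based feature maps
    print(model(x_seq, x_img).shape)     # torch.Size([8, 4])
```

In this sketch the saliency head only rescales the image features before concatenation; the paper's actual fusion derives the weighting from a saliency analysis of the image-based model to drive joint training of both branches.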

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Electroencephalography* / methods
  • Emotions*
  • Reproducibility of Results