EEG-based emotion recognition in music listening

IEEE Trans Biomed Eng. 2010 Jul;57(7):1798-806. doi: 10.1109/TBME.2010.2048568. Epub 2010 May 3.

Abstract

Ongoing brain activity can be recorded as an electroencephalogram (EEG) to discover the links between emotional states and brain activity. This study applied machine-learning algorithms to categorize EEG dynamics according to subject self-reported emotional states during music listening. A framework was proposed to optimize EEG-based emotion recognition by systematically 1) seeking emotion-specific EEG features and 2) exploring the efficacy of the classifiers. A support vector machine was employed to classify four emotional states (joy, anger, sadness, and pleasure), obtaining an average classification accuracy of 82.29% ± 3.06% across 26 subjects. Further, this study identified 30 subject-independent features that were most relevant to emotional processing across subjects and explored the feasibility of using fewer electrodes to characterize the EEG dynamics during music listening. The identified features were primarily derived from electrodes placed near the frontal and parietal lobes, consistent with many findings in the literature. This study might lead to a system for noninvasive assessment of emotional states in everyday or clinical applications.
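The abstract does not specify the feature extraction, feature selection, or classifier settings; the following is a minimal sketch of the pipeline it describes, assuming log band-power EEG features, ANOVA-based selection of 30 features, and scikit-learn's SVC with an RBF kernel. All variable names, data shapes, and parameter values are hypothetical; the random data merely keeps the example self-contained.

    # Minimal sketch of the described pipeline: EEG features -> feature
    # selection -> SVM. Assumptions (not stated in the abstract): scikit-learn
    # SVC with RBF kernel, log band-power features per electrode/band, and
    # ANOVA F-score selection of 30 features. Data and names are hypothetical.
    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Hypothetical data: trials x (electrodes * frequency bands) features and
    # one of four emotion labels (joy, anger, sadness, pleasure) per trial.
    n_trials, n_features = 120, 32 * 5               # e.g., 32 electrodes, 5 bands
    X = rng.standard_normal((n_trials, n_features))  # stand-in for log band power
    y = rng.integers(0, 4, size=n_trials)            # 0..3 map to the four emotions

    # Standardize, keep the 30 most discriminative features (the abstract
    # reports 30 subject-independent features), then classify with an SVM.
    clf = make_pipeline(
        StandardScaler(),
        SelectKBest(f_classif, k=30),
        SVC(kernel="rbf", C=1.0, gamma="scale"),
    )

    # Cross-validated accuracy for one subject; the paper averages the
    # per-subject accuracies to report 82.29% +/- 3.06% over 26 subjects.
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"accuracy: {scores.mean():.2%} +/- {scores.std():.2%}")

With real recordings, X would come from band-limited power spectra computed per electrode, and the cross-validation would be run separately for each subject before averaging.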

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Algorithms
  • Artificial Intelligence
  • Bayes Theorem
  • Electrodes
  • Electroencephalography / methods*
  • Emotions / physiology*
  • Evoked Potentials, Auditory / physiology*
  • Female
  • Humans
  • Male
  • Music / psychology*
  • Pattern Recognition, Automated / methods*
  • Signal Processing, Computer-Assisted