Instance-Based Genre-Specific Music Emotion Prediction with An EEG Setup

Annu Int Conf IEEE Eng Med Biol Soc. 2018 Jul;2018:2092-2095. doi: 10.1109/EMBC.2018.8512630.

Abstract

This paper explores a novel direction in the analysis of music-induced emotion (music emotion): the effect of different genres on the prediction of music emotion. We compare the performance of various classifiers in predicting the emotion induced by music and investigate whether advanced features (such as asymmetries) improve classification accuracy. The study is supported by real-world experiments in which 10 subjects listened to 20 musical pieces from 5 genres (classical, heavy metal, electronic dance music, pop, and rap) while electroencephalogram (EEG) data were collected. For the classification of short instances of each song, maximum 10-fold cross-validation accuracies of 98.4% (subject-independent) and 99.0% (subject-dependent) were obtained. The emotion induced by pop music was predicted most accurately, with a classification accuracy of 99.6%. We further investigated the effect of music emotion on subjects' relaxation while listening.
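The abstract does not specify the feature extraction or classifier details beyond mentioning asymmetry features and 10-fold cross-validation, so the following is only a minimal illustrative sketch of an instance-based pipeline of this kind. The alpha band limits, the F3/F4 electrode pair, the sampling rate, and the SVM classifier are all assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only: the paper's exact features and classifiers are
# not given in the abstract. Assumed here: alpha-band (8-13 Hz) power
# asymmetry for one frontal electrode pair (e.g., F3/F4), an RBF-kernel SVM,
# and 10-fold cross-validation over short song instances.
import numpy as np
from scipy.signal import welch
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 128              # sampling rate in Hz (assumption)
ALPHA = (8.0, 13.0)   # alpha band limits in Hz (assumption)

def band_power(sig, fs, band):
    """Integrated Welch PSD of `sig` over the frequency `band`."""
    freqs, psd = welch(sig, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])

def asymmetry_feature(left, right, fs=FS, band=ALPHA):
    """Common hemispheric asymmetry index: ln(right power) - ln(left power)."""
    return np.log(band_power(right, fs, band)) - np.log(band_power(left, fs, band))

def features_per_instance(instances):
    """`instances`: iterable of (left_channel, right_channel) signal pairs,
    one pair of 1-D arrays per short song instance."""
    return np.array([[asymmetry_feature(l, r)] for l, r in instances])

# Placeholder data: replace with real EEG instances and emotion labels.
rng = np.random.default_rng(0)
instances = [(rng.standard_normal(FS * 5), rng.standard_normal(FS * 5))
             for _ in range(200)]
X = features_per_instance(instances)
y = rng.integers(0, 2, size=len(X))   # placeholder induced-emotion labels

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=10)   # 10-fold CV, as in the paper
print(f"mean 10-fold accuracy: {scores.mean():.3f}")
```

The ln(right) - ln(left) form is a standard frontal alpha asymmetry index; the paper's actual feature set, genre comparisons, and classifier benchmarks are described in the full text.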

MeSH terms

  • Auditory Perception
  • Electroencephalography
  • Emotions*
  • Music*
  • Relaxation