Real-time EEG-based emotion recognition for neurohumanities: perspectives from principal component analysis and tree-based algorithms

Front Hum Neurosci. 2024 Mar 13;18:1319574. doi: 10.3389/fnhum.2024.1319574. eCollection 2024.

Abstract

Within the field of Humanities, there is a recognized need for educational innovation, as there are currently no reported tools that enable individuals to interact with their environment to create an enhanced learning experience in the humanities (e.g., immersive spaces). This project proposes a solution to address this gap by integrating technology and promoting the development of teaching methodologies in the humanities, specifically by incorporating emotional monitoring during the learning process of a humanistic context inside an immersive space. To achieve this goal, a real-time, EEG-based emotion recognition system was developed to interpret and classify specific emotions. These emotions align with Descartes' early proposal of the passions: admiration, love, hate, desire, joy, and sadness. This system aims to integrate emotional data into the Neurohumanities Lab interactive platform, creating a comprehensive and immersive learning environment. This work developed a real-time, machine learning (ML) emotion recognition model that provides Valence, Arousal, and Dominance (VAD) estimations every 5 seconds. Using Principal Component Analysis (PCA), Power Spectral Density (PSD), Random Forest (RF), and Extra-Trees, the 8 best channels and their respective best band powers were extracted; furthermore, multiple models were evaluated using shift-based data division and cross-validation. After assessing their performance, Extra-Trees achieved a general accuracy of 94%, higher than that reported in the literature (88%). The proposed model provides real-time predictions of the VAD variables and was adapted to classify Descartes' six main passions. However, with the VAD values obtained, more than 15 emotions (as reported in VAD emotion mapping) can be classified, extending the range of this application.
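The pipeline the abstract describes (PSD band-power features per channel, tree-based importances for selecting the 8 best channels, and a cross-validated Extra-Trees model on VAD labels) could look roughly like the minimal Python sketch below. This is not the authors' exact implementation: the sampling rate, band definitions, 32-channel montage, synthetic data, and binarized valence labels are all illustrative assumptions.

```python
# Minimal sketch of a PSD + Extra-Trees pipeline for EEG-based VAD estimation.
# All constants and the synthetic data are assumptions, not values from the paper.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import cross_val_score

FS = 128                     # assumed EEG sampling rate (Hz)
EPOCH_S = 5                  # abstract: one VAD estimate every 5 seconds
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(epoch):
    """epoch: (n_channels, n_samples) -> flat vector of band powers per channel."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS * 2, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(trapezoid(psd[:, mask], freqs[mask], axis=-1))
    return np.concatenate(feats)        # length = n_bands * n_channels

# Synthetic stand-in for epoched EEG: 200 epochs, 32 channels, 5-s windows.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((200, 32, FS * EPOCH_S))
X = np.array([band_powers(e) for e in epochs])
y = rng.integers(0, 2, size=200)        # e.g., binarized valence labels

model = ExtraTreesClassifier(n_estimators=300, random_state=0)
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())

# Channel selection: rank impurity-based feature importances and keep the
# channels whose band powers contribute most across all bands.
model.fit(X, y)
importances = model.feature_importances_.reshape(len(BANDS), 32)  # (band, channel)
top_channels = np.argsort(importances.sum(axis=0))[::-1][:8]
print("Top 8 channel indices:", top_channels)
```

In a real-time setting, the same `band_powers` feature extraction would simply be applied to each incoming 5-second window before calling the trained model's `predict`.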

Keywords: Descartes; EEG; PCA; Random Forest; emotion recognition; humanities; neurohumanities; real-time.

Grants and funding

The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. The authors would like to acknowledge financial support from CONAHCYT through the awarded Ciencia de Frontera 2023 grant, Project ID: CF-2023-G-583, as well as Tecnologico de Monterrey through the Challenge-Based Research Funding Program 2022, Project ID: E061-EHE-GI02-D-T3-E.