Understanding the role of emotion in the decision-making process: using machine learning to analyze physiological responses to visual, auditory, and combined stimulation

Front Hum Neurosci. 2024 Jan 8;17:1286621. doi: 10.3389/fnhum.2023.1286621. eCollection 2023.

Abstract

Emotions significantly shape decision-making, and targeted emotional elicitation is an important factor in neuromarketing, where it affects advertising effectiveness by capturing potential customers' attention through emotional triggers. Analyzing biometric parameters after stimulus exposure may help characterize the underlying emotional states. This study investigates autonomic and central nervous system responses to emotional stimuli, namely images, auditory cues, and their combination, while recording physiological signals: the electrocardiogram, blood volume pulse, galvanic skin response, pupillometry, respiration, and the electroencephalogram. The primary goal of the proposed analysis is to compare emotional stimulation methods and to identify the approach that elicits the most distinct physiological patterns. A novel feature selection technique is applied to further optimize the separation of four emotional states, and basic machine learning approaches are used to discern the emotions elicited by the different kinds of stimulation. Electroencephalographic, galvanic skin response, and cardio-respiratory coupling-derived features were the most discriminative in distinguishing the four emotional states. The findings also highlight the crucial role of auditory stimuli in creating distinct physiological patterns that enhance classification in the four-class problem. Combining all three types of stimulation yielded a validation accuracy of 49%; the sound-only and image-only phases resulted in 52% and 44% accuracy, respectively, whereas the combined stimulation of images and sounds led to 51% accuracy. Isolated visual stimuli produced less distinct patterns, requiring more signals while still achieving lower performance than the other types of stimulation. This result is surprising given the limited exploration of auditory stimuli in the emotion recognition literature, particularly in contrast with the plethora of studies based on visual stimulation. In marketing, auditory components may therefore hold greater potential to significantly influence consumer choices.
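The abstract does not detail the classification pipeline, so the following minimal Python/scikit-learn sketch only illustrates the kind of four-class setup described: feature selection followed by a basic classifier evaluated with cross-validation. The ANOVA-based selector and linear SVM are stand-ins for the unspecified "novel feature selection technique" and "basic machine learning approaches", and the feature matrix is synthetic rather than real EEG/GSR/cardio-respiratory data.

```python
# Illustrative sketch only: the paper's feature selector and classifiers are not
# specified in the abstract; standard scikit-learn components stand in, and the
# feature matrix is randomly generated.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder feature matrix: 200 trials x 60 physiological features
# (e.g., EEG band powers, GSR peaks, cardio-respiratory coupling indices).
X = rng.normal(size=(200, 60))
y = rng.integers(0, 4, size=200)  # four emotional states

# Basic pipeline: standardize, keep the k most discriminative features,
# then classify with a linear SVM.
clf = make_pipeline(
    StandardScaler(),
    SelectKBest(score_func=f_classif, k=20),
    SVC(kernel="linear", C=1.0),
)

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)
print(f"Mean 4-class accuracy: {scores.mean():.2f}")  # ~chance (0.25) on random data
```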

Keywords: International Affective Digitized Sounds (IADS); International Affective Picture System (IAPS); biomedical signal processing; emotions; machine learning; physiological responses.

Grants and funding

The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. This research was supported in part by the PNRR Project PNRR-MAD-2022-12376716.