EEG-Based Estimation on the Reduction of Negative Emotions for Illustrated Surgical Images

Sensors (Basel). 2020 Dec 11;20(24):7103. doi: 10.3390/s20247103.

Abstract

Electroencephalogram (EEG) biosignals are widely used to measure human emotional reactions. Recent progress in deep learning-based classification models has improved the accuracy of emotion recognition from EEG signals. We apply a deep learning-based emotion recognition model to EEG biosignals to show that illustrated surgical images reduce the negative emotional reactions that photographic surgical images generate. The strong negative emotional reactions caused by surgical images, which show the internal structure of the human body (including blood, flesh, muscle, fatty tissue, and bone), act as an obstacle to explaining the images to patients or using them to communicate with non-professionals. We claim that the negative emotional reactions generated by illustrated surgical images are less severe than those caused by raw surgical photographs. To demonstrate this difference, we produce several illustrated surgical images from photographs and measure the emotional reactions they engender using EEG biosignals; a deep learning-based emotion recognition model is applied to extract the emotional reactions. Through this experiment, we show that the negative emotional reactions associated with photographic surgical images are much stronger than those caused by illustrated versions of the identical images. We further conduct a self-assessment user survey to confirm that the emotions recognized from the EEG signals effectively represent the user-annotated emotions.
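
The keywords below indicate a CNN applied to DEAP-style EEG data. As an illustrative sketch only, and not the authors' actual model, a minimal CNN classifier over preprocessed DEAP-format EEG windows (assumed here to be 32 channels at 128 samples per 1 s segment, with an assumed binary negative vs. non-negative label) might look like the following:

```python
# Minimal, hypothetical sketch of a CNN emotion classifier for DEAP-style EEG
# segments (32 channels, 128 samples per 1 s window). This is NOT the paper's
# model; layer sizes, window length, and the binary label set are assumptions.
import torch
import torch.nn as nn


class EEGEmotionCNN(nn.Module):
    def __init__(self, n_channels: int = 32, n_samples: int = 128, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution along each channel's time axis.
            nn.Conv2d(1, 16, kernel_size=(1, 9), padding=(0, 4)),
            nn.BatchNorm2d(16),
            nn.ReLU(),
            # Spatial convolution mixing all EEG channels.
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),
            nn.BatchNorm2d(32),
            nn.ReLU(),
            nn.AvgPool2d(kernel_size=(1, 4)),
            nn.Dropout(0.5),
        )
        self.classifier = nn.Linear(32 * (n_samples // 4), n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, n_channels, n_samples)
        h = self.features(x)
        return self.classifier(h.flatten(start_dim=1))


if __name__ == "__main__":
    model = EEGEmotionCNN()
    dummy = torch.randn(8, 1, 32, 128)  # batch of 8 one-second EEG windows
    logits = model(dummy)               # shape: (8, 2) class scores
    print(logits.shape)
```

Such a model would be trained on labeled EEG segments and then used to score the emotional reactions elicited by the photographic and illustrated stimuli; the specific architecture and training setup used in the paper are not given in this abstract.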

Keywords: CNN; DEAP; EEG; disgust; emotion; surgery image.

MeSH terms

  • Adult
  • Electroencephalography*
  • Emotions*
  • Female
  • General Surgery
  • Human Body
  • Humans
  • Male
  • Photography
  • Young Adult