Recognition of Empathy from Synchronization between Brain Activity and Eye Movement

Sensors (Basel). 2023 May 29;23(11):5162. doi: 10.3390/s23115162.

Abstract

In the era of user-generated content (UGC) and virtual interactions within the metaverse, empathic digital content has become increasingly important. This study aimed to quantify human empathy levels when exposed to digital media. To assess empathy, we analyzed brain wave activity and eye movements in response to emotional videos. Forty-seven participants watched eight emotional videos, and we collected their brain activity and eye movement data during viewing. After each video session, participants provided subjective evaluations. Our analysis focused on the relationship between brain activity and eye movement in recognizing empathy. The findings revealed the following: (1) Participants were more inclined to empathize with videos depicting pleasant-arousal and unpleasant-relaxed emotions. (2) Saccades and fixations, key components of eye movement, occurred simultaneously with activity in specific channels of the prefrontal and temporal lobes. (3) Eigenvalues of brain activity and pupil-size changes showed synchronization between the right pupil and certain channels in the prefrontal, parietal, and temporal lobes during empathic responses. These results suggest that eye movement characteristics can serve as an indicator of the cognitive empathic process when engaging with digital content. Furthermore, the observed changes in pupil size result from a combination of emotional and cognitive empathy elicited by the videos.
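The synchronization reported in finding (3) — between a pupil-size time series and the signal from an EEG channel — is commonly quantified with a lagged cross-correlation. The abstract does not specify the authors' exact method, so the sketch below is only an illustrative stand-in: it z-scores the two signals and scans a window of lags for the strongest Pearson correlation. The function name, signal names, and lag window are all hypothetical, not taken from the paper.

```python
import numpy as np

def max_lagged_correlation(eeg, pupil, max_lag):
    """Find the lag (in samples) within [-max_lag, max_lag] at which two
    z-scored 1-D signals are most strongly correlated.

    Illustrative only: a stand-in for the paper's unspecified
    synchronization measure. Returns (best_lag, best_corr).
    """
    eeg = (eeg - eeg.mean()) / eeg.std()
    pupil = (pupil - pupil.mean()) / pupil.std()
    best_lag, best_corr = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        # Align the two series at the given lag, dropping the overhang.
        if lag < 0:
            a, b = eeg[:lag], pupil[-lag:]
        elif lag > 0:
            a, b = eeg[lag:], pupil[:-lag]
        else:
            a, b = eeg, pupil
        # Mean product of z-scored segments approximates Pearson r.
        r = float(np.mean(a * b))
        if r > best_corr:
            best_lag, best_corr = lag, r
    return best_lag, best_corr

# Synthetic demo: a "pupil" signal that is a 5-sample-shifted copy of an
# oscillatory "EEG" signal should peak near lag 5 with high correlation.
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.05 * rng.standard_normal(2000)
eeg_sig, pupil_sig = x[:1900], x[5:1905]
lag, corr = max_lagged_correlation(eeg_sig, pupil_sig, 20)
```

In practice, EEG-pupil coupling analyses also band-limit the EEG (e.g. to an alpha or theta envelope) and correct for multiple comparisons across channels; this sketch omits those steps to keep the core idea visible.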

Keywords: EEG; brain activity; digital content; empathy; eye movement; physiological measures.

MeSH terms

  • Brain / physiology
  • Emotions / physiology
  • Empathy*
  • Eye Movements*
  • Humans
  • Internet
  • Temporal Lobe