Emotion Recognition for Human-Robot Interaction: Recent Advances and Future Perspectives

Front Robot AI. 2020 Dec 21;7:532279. doi: 10.3389/frobt.2020.532279. eCollection 2020.

Abstract

A fascinating challenge in the field of human-robot interaction is the possibility of endowing robots with emotional intelligence, making the interaction more intuitive, genuine, and natural. A critical requirement for achieving this is the robot's capability to infer and interpret human emotions. Emotion recognition has been widely explored in the broader fields of human-machine interaction and affective computing. Here, we report recent advances in emotion recognition, with particular regard to the human-robot interaction context. Our aim is to review the state of the art of currently adopted emotional models, interaction modalities, and classification strategies, and to offer our perspective on future developments and critical issues. We focus on facial expressions, body poses and kinematics, voice, brain activity, and peripheral physiological responses, and we also provide a list of available datasets containing data from these modalities.

Keywords: affective computing; emotion recognition (ER); human-robot interaction; machine learning; multimodal data.

Publication types

  • Review