Facial Emotion Recognition: A Survey and Real-World User Experiences in Mixed Reality

Sensors (Basel). 2018 Feb 1;18(2):416. doi: 10.3390/s18020416.

Abstract

Extensive application possibilities have made emotion recognition an unavoidable and challenging problem in computer science. Non-verbal cues such as gestures, body movement, and facial expressions convey a user's feelings and provide feedback to the system. This area of Human-Computer Interaction relies on algorithmic robustness and sensor sensitivity to improve recognition. Sensors play a significant role in accurate detection by providing high-quality input, thereby increasing the efficiency and reliability of the system. Automatic recognition of human emotions would help teach social intelligence to machines. This paper presents a brief study of the various approaches and techniques of emotion recognition. The survey includes a succinct review of the databases used as data sets by algorithms that detect emotions from facial expressions. The mixed reality device Microsoft HoloLens (MHL) is then introduced for observing emotion recognition in Augmented Reality (AR), together with a brief overview of its sensors, their application to emotion recognition, and some preliminary results of emotion recognition using the MHL. The paper concludes by comparing emotion recognition results from the MHL and a regular webcam.
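
For orientation only, the sketch below outlines a generic webcam-based facial emotion recognition loop of the kind the webcam comparison refers to. It is not the authors' implementation: OpenCV face detection is an assumed choice, and classify_emotion() is a hypothetical placeholder for whatever expression classifier (for example, one trained on the surveyed data sets) would be plugged in.

    # Minimal sketch of a webcam-based facial emotion recognition pipeline.
    # Assumptions: OpenCV for capture and face detection (not necessarily the
    # paper's toolchain); classify_emotion() is a hypothetical stand-in for a
    # trained expression classifier.
    import cv2

    # Haar cascade face detector shipped with OpenCV (an assumed choice).
    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    def classify_emotion(face_img):
        """Hypothetical placeholder for an expression classifier."""
        return "neutral"

    cap = cv2.VideoCapture(0)  # regular webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            label = classify_emotion(gray[y:y + h, x:x + w])
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, label, (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
        cv2.imshow("emotion", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()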

Keywords: Microsoft HoloLens; affect; augmented reality; emotion recognition; facial expressions; human–computer interaction; intelligence; sensors.

MeSH terms

  • Algorithms
  • Biometric Identification / standards*
  • Databases, Factual
  • Emotional Intelligence
  • Emotions*
  • Facial Expression*
  • Humans
  • Machine Learning
  • Reproducibility of Results
  • Surveys and Questionnaires