Automated emotion recognition: Current trends and future perspectives

Comput Methods Programs Biomed. 2022 Mar;215:106646. doi: 10.1016/j.cmpb.2022.106646. Epub 2022 Jan 19.

Abstract

Background: Human emotions strongly influence a person's actions. Automated emotion recognition has applications in multiple domains such as health care, e-learning, and surveillance. The development of computer-aided diagnosis (CAD) tools has enabled the automated recognition of human emotions.

Objective: This review provides insight into methods that employ electroencephalogram (EEG), facial, and speech signals, as well as multimodal emotion recognition techniques. We have reviewed most of the state-of-the-art papers published on this topic.

Method: This study considers emotion recognition (ER) models proposed between 2016 and 2021. The papers were analysed based on the methods employed, the classifiers used, and the performance obtained.

Results: There has been a significant rise in the application of deep learning techniques for ER. They have been widely applied to EEG, speech, facial expression, and multimodal features to develop accurate ER models.
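As a purely illustrative sketch (not taken from any of the reviewed papers), the snippet below shows what a minimal deep-learning-based multimodal ER model might look like: hypothetical EEG, speech, and facial feature vectors are each passed through a small encoder and fused at the feature level before classification. All layer sizes, feature dimensions, and the number of emotion classes are assumptions.

# Illustrative sketch only: a toy multimodal emotion-recognition model
# fusing hypothetical EEG, speech, and facial feature vectors.
# All dimensions, layer sizes, and class counts are assumptions.
import torch
import torch.nn as nn

class MultimodalERModel(nn.Module):
    def __init__(self, eeg_dim=128, speech_dim=64, face_dim=256, n_emotions=4):
        super().__init__()
        # One small encoder per modality (assumed architecture).
        self.eeg_enc = nn.Sequential(nn.Linear(eeg_dim, 64), nn.ReLU())
        self.speech_enc = nn.Sequential(nn.Linear(speech_dim, 64), nn.ReLU())
        self.face_enc = nn.Sequential(nn.Linear(face_dim, 64), nn.ReLU())
        # Feature-level fusion followed by a classification head.
        self.classifier = nn.Sequential(
            nn.Linear(64 * 3, 64), nn.ReLU(), nn.Linear(64, n_emotions)
        )

    def forward(self, eeg, speech, face):
        fused = torch.cat(
            [self.eeg_enc(eeg), self.speech_enc(speech), self.face_enc(face)], dim=1
        )
        return self.classifier(fused)  # logits over emotion classes

# Usage with random stand-in features; a real system would extract these
# from EEG recordings, audio, and face images.
model = MultimodalERModel()
logits = model(torch.randn(8, 128), torch.randn(8, 64), torch.randn(8, 256))
print(logits.shape)  # torch.Size([8, 4])

Feature-level (early) fusion is only one of the strategies covered by such models; decision-level (late) fusion of per-modality classifiers is an equally common alternative.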

Conclusion: Our study reveals that most of the proposed machine learning and deep learning-based systems have yielded good performance for automated ER in controlled environments. However, high performance still needs to be achieved for ER in uncontrolled environments.

Keywords: CAD; Electroencephalogram (EEG); Facial; Human emotions; Machine learning; Voice.

Publication types

  • Review

MeSH terms

  • Electroencephalography*
  • Emotions
  • Facial Expression*
  • Humans
  • Speech