Hybrid Model-Based Emotion Contextual Recognition for Cognitive Assistance Services

IEEE Trans Cybern. 2022 May;52(5):3567-3576. doi: 10.1109/TCYB.2020.3013112. Epub 2022 May 19.

Abstract

Endowing ubiquitous robots with the cognitive capability to recognize the emotions, sentiments, affects, and moods of humans in their context is an important challenge that requires sophisticated and novel emotion recognition approaches. Most studies explore data-driven pattern recognition techniques, which are generally highly dependent on the learning data and insufficiently effective for emotion contextual recognition. In this article, a hybrid model-based emotion contextual recognition approach for cognitive assistance services in ubiquitous environments is proposed. The model is based on: 1) a hybrid-level fusion that exploits a multilayer perceptron (MLP) neural-network model together with possibilistic logic and 2) an expressive emotional knowledge representation and reasoning model for recognizing emotions that are not directly observable; this model jointly exploits the emotion upper ontology (EmUO) and the n-ary ontology of events HTemp supported by the NKRL language. To validate the proposed approach, experiments were carried out on a YouTube dataset and in a real-world scenario dedicated to the cognitive assistance of visitors in a smart-devices showroom. The results demonstrate that the proposed multimodal emotion recognition model outperforms all baseline models, and the real-world scenario corroborates the effectiveness of the approach in terms of emotion contextual recognition and management and in the creation of emotion-based assistance services.
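
The hybrid-level fusion mentioned in the abstract can be pictured with a minimal sketch: per-modality MLP classifiers produce class probabilities, which are rescaled into possibility distributions and combined with a possibilistic rule before the final emotion label is chosen. All identifiers, the synthetic data, and the min-based combination rule below are illustrative assumptions for exposition only; they do not reproduce the paper's exact fusion or possibilistic-logic formulation.

```python
# Hypothetical sketch of hybrid-level fusion of MLP outputs with a possibilistic rule.
# The modality names, synthetic features, and the conjunctive (min) combination are
# assumptions, not the authors' method.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
EMOTIONS = ["anger", "joy", "sadness", "surprise"]

def make_modality_data(n=200, d=16):
    """Synthetic per-modality features with a weak class signal (placeholder data)."""
    y = rng.integers(0, len(EMOTIONS), size=n)
    X = rng.normal(size=(n, d)) + y[:, None] * 0.5
    return X, y

def to_possibility(proba):
    """Rescale a probability vector so its maximum equals 1 (a common possibilistic normalization)."""
    return proba / proba.max()

# Train one MLP per modality (text, audio, and visual stand-ins).
modalities = {}
for name in ["text", "audio", "visual"]:
    X, y = make_modality_data()
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0).fit(X, y)
    modalities[name] = (clf, X[:1])  # keep one sample per modality for the demo

# Hybrid fusion step: conjunctive (min) combination of per-modality possibility distributions.
possibilities = [to_possibility(clf.predict_proba(x)[0]) for clf, x in modalities.values()]
fused = np.minimum.reduce(possibilities)
print("fused possibilities:", dict(zip(EMOTIONS, fused.round(3))))
print("recognized emotion:", EMOTIONS[int(fused.argmax())])
```

In this sketch the min rule plays the role of a cautious, logic-style conjunction across modalities; other possibilistic combinations (e.g., max or weighted variants) could be substituted without changing the overall fusion structure.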

MeSH terms

  • Cognition
  • Emotions*
  • Humans
  • Learning
  • Neural Networks, Computer*