Deep Neural Network Approach for Pose, Illumination, and Occlusion Invariant Driver Emotion Detection

Int J Environ Res Public Health. 2022 Feb 18;19(4):2352. doi: 10.3390/ijerph19042352.

Abstract

Monitoring drivers' emotions is a key aspect of designing advanced driver assistance systems (ADAS) for intelligent vehicles. To improve safety and reduce the likelihood of road accidents, emotional monitoring plays a central role in assessing the driver's mental state while driving. However, pose variations, illumination conditions, and occlusions hinder reliable detection of driver emotions. To overcome these challenges, two novel approaches using machine learning methods and deep neural networks are proposed to recognize drivers' expressions under varying poses, illuminations, and occlusions. The first approach achieves accuracies of 93.41%, 83.68%, 98.47%, and 98.18% on the CK+, FER 2013, KDEF, and KMU-FED datasets, respectively, and the second approach improves these to 96.15%, 84.58%, 99.18%, and 99.09% on the same datasets, outperforming existing state-of-the-art methods.
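The abstract and keywords point to a pipeline of MTCNN face detection followed by a deep neural network ("DeepNet") expression classifier. As a rough illustration only, the Python sketch below wires such a pipeline together using the open-source mtcnn and TensorFlow/Keras libraries; the layer sizes, the 48x48 grayscale input (as in FER 2013), the seven emotion labels, and all function names are assumptions made for this sketch, not the authors' published architecture.

```python
# Minimal sketch: MTCNN face detection + a small CNN emotion classifier.
# The network and preprocessing choices here are illustrative assumptions,
# not the paper's exact "DeepNet" model.
import cv2
import numpy as np
from mtcnn import MTCNN
from tensorflow.keras import layers, models

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def build_emotion_cnn(input_shape=(48, 48, 1), num_classes=len(EMOTIONS)):
    """Small CNN classifier; a stand-in for the paper's deep network."""
    return models.Sequential([
        layers.Conv2D(32, 3, activation="relu", input_shape=input_shape),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])

def detect_and_classify(frame_bgr, model, detector):
    """Detect faces with MTCNN, then classify each cropped, resized face."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    results = []
    for face in detector.detect_faces(rgb):
        x, y, w, h = face["box"]
        x, y = max(0, x), max(0, y)  # MTCNN boxes can extend past the frame
        crop = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        crop = cv2.resize(crop, (48, 48)).astype("float32") / 255.0
        probs = model.predict(crop[np.newaxis, ..., np.newaxis], verbose=0)[0]
        results.append(((x, y, w, h), EMOTIONS[int(np.argmax(probs))]))
    return results

# Example usage (the model would need to be trained on a labeled dataset first):
# model = build_emotion_cnn()
# predictions = detect_and_classify(cv2.imread("driver_frame.jpg"), model, MTCNN())
```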

Keywords: DeepNet; K.L.T.; MTCNN; advanced driver assistance systems (ADAS); deep neural networks; driver emotion detection; face detection; facial expression recognition; machine learning.

MeSH terms

  • Accidents, Traffic
  • Automobile Driving*
  • Emotions
  • Lighting*
  • Machine Learning
  • Neural Networks, Computer