Facial Action Unit Detection using 3D Face Landmarks for Pain Detection

Annu Int Conf IEEE Eng Med Biol Soc. 2023 Jul;2023:1-5. doi: 10.1109/EMBC40787.2023.10340059.

Abstract

Automatic detection of facial action units (AUs) has recently gained attention for its applications in facial expression analysis. However, using AUs in research can be challenging because they are typically annotated manually, a process that is time-consuming, repetitive, and error-prone. Advances in automated AU detection can greatly reduce the time required for this task and improve the reliability of annotations for downstream tasks such as pain detection. In this study, we present an efficient method for detecting AUs using only 3D face landmarks. Using the detected AUs, we trained state-of-the-art deep learning models to detect pain, validating the effectiveness of the AU detection model. Our study also establishes a new benchmark for pain detection on the BP4D+ dataset, with a Transformer model improving on existing studies by 11.13% in F1-score and 3.09% in accuracy. Our results show that using only the eight predicted AUs remains competitive with using all 34 ground-truth AUs.
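
The abstract describes a two-stage pipeline: 3D face landmarks are mapped to AU predictions, and a Transformer classifier then detects pain from the predicted AUs. The sketch below is only an illustration of that idea, not the authors' implementation; the landmark count (478, as in a typical dense face mesh), the eight-AU subset, the layer sizes, and the mean-pooled Transformer head are all assumptions chosen for demonstration.

```python
# Illustrative sketch of a landmarks-to-AUs-to-pain pipeline (assumed architecture,
# not the paper's published model).
import torch
import torch.nn as nn

NUM_LANDMARKS = 478   # assumed number of 3D face-mesh landmarks
NUM_AUS = 8           # the paper reports competitive results with 8 predicted AUs


class LandmarkAUDetector(nn.Module):
    """Maps flattened 3D face landmarks (x, y, z) to per-frame AU probabilities."""
    def __init__(self, num_landmarks=NUM_LANDMARKS, num_aus=NUM_AUS):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_landmarks * 3, 256),
            nn.ReLU(),
            nn.Linear(256, num_aus),
        )

    def forward(self, landmarks):  # landmarks: (batch, num_landmarks, 3)
        return torch.sigmoid(self.net(landmarks.flatten(1)))  # multi-label AU scores


class AUPainTransformer(nn.Module):
    """Classifies pain vs. no pain from a temporal sequence of AU vectors."""
    def __init__(self, num_aus=NUM_AUS, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Linear(num_aus, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, au_seq):  # au_seq: (batch, frames, num_aus)
        h = self.encoder(self.embed(au_seq))
        return torch.sigmoid(self.head(h.mean(dim=1)))  # pooled binary pain score


if __name__ == "__main__":
    clips = torch.rand(2, 30, NUM_LANDMARKS, 3)  # 2 clips, 30 frames of landmarks each
    au_model, pain_model = LandmarkAUDetector(), AUPainTransformer()
    aus = au_model(clips.flatten(0, 1)).view(2, 30, NUM_AUS)  # per-frame AU predictions
    print(pain_model(aus).shape)  # torch.Size([2, 1]) per-clip pain probability
```

As in the study, the intermediate AU representation keeps the pain classifier's input small (eight values per frame rather than raw video or landmarks), which is what makes the comparison against the 34 ground-truth AUs meaningful.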

MeSH terms

  • Benchmarking
  • Face*
  • Facial Expression*
  • Humans
  • Pain / diagnosis
  • Reproducibility of Results