Head yaw estimation from asymmetry of facial appearance

IEEE Trans Syst Man Cybern B Cybern. 2008 Dec;38(6):1501-12. doi: 10.1109/TSMCB.2008.928231.

Abstract

This paper proposes a novel method to estimate head yaw rotation from the asymmetry of 2-D facial appearance. In traditional appearance-based pose estimation methods, features are typically extracted holistically by subspace analysis such as principal component analysis, linear discriminant analysis (LDA), etc., which are not designed to model pose variations directly. In this paper, we argue and show that the asymmetry in the intensities of each row of the face image is closely related to the yaw rotation of the head while being largely insensitive to the identity of the input face. Specifically, 1-D Gabor filters and the Fourier transform are used to extract the asymmetry information, and LDA is then applied to the asymmetry features to enhance their discriminative power. With a simple nearest-centroid classifier, experimental results on two multi-pose databases show that the proposed features outperform other features. In particular, the generalization ability of the proposed asymmetry features is verified by their strong performance when the training and testing data sets are heterogeneous.
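The sketch below illustrates the general pipeline the abstract describes (row-wise asymmetry via 1-D Gabor filtering and the Fourier transform, then LDA and a nearest-centroid classifier), not the authors' published implementation. The function names, Gabor parameters, number of retained Fourier coefficients, and the use of NumPy/scikit-learn are all assumptions made for illustration; inputs are assumed to be aligned grayscale face crops of a fixed size with discretized yaw labels.

```python
# Illustrative sketch of a row-asymmetry yaw estimator (not the paper's exact method).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import NearestCentroid


def gabor_1d(length=15, sigma=4.0, freq=0.15):
    """Complex 1-D Gabor kernel: Gaussian envelope times a complex sinusoid."""
    x = np.arange(length) - (length - 1) / 2.0
    return np.exp(-x**2 / (2 * sigma**2)) * np.exp(2j * np.pi * freq * x)


def row_asymmetry_features(img, kernel):
    """Per-row asymmetry descriptor for one aligned grayscale face crop."""
    feats = []
    for row in img.astype(np.float64):
        # 1-D Gabor magnitude response of the row.
        resp = np.abs(np.convolve(row, kernel, mode="same"))
        # Shift the row midline to index 0 and remove the mean; for a real
        # signal, the imaginary DFT coefficients then come from its odd
        # (left-right asymmetric) component, which is the asymmetry cue.
        centred = np.fft.ifftshift(resp - resp.mean())
        spectrum = np.fft.rfft(centred)
        feats.append(spectrum.imag[1:9])  # a few low-frequency odd components
    return np.concatenate(feats)


def fit_yaw_estimator(images, yaw_labels):
    """LDA on the asymmetry features, followed by a nearest-centroid classifier."""
    kernel = gabor_1d()
    X = np.stack([row_asymmetry_features(im, kernel) for im in images])
    lda = LinearDiscriminantAnalysis()
    Z = lda.fit_transform(X, yaw_labels)
    clf = NearestCentroid().fit(Z, yaw_labels)
    return kernel, lda, clf


def predict_yaw(img, kernel, lda, clf):
    """Predict the discretized yaw label of a new face crop."""
    z = lda.transform(row_asymmetry_features(img, kernel)[None, :])
    return clf.predict(z)[0]
```

As in the abstract, the per-row features are pooled into a single vector per image before LDA, so the classifier operates in a low-dimensional discriminative space and assigns each test image to the nearest yaw-class centroid.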

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Artificial Intelligence*
  • Biometry / methods*
  • Face / anatomy & histology*
  • Head / anatomy & histology
  • Head / physiology*
  • Humans
  • Image Enhancement / methods
  • Image Interpretation, Computer-Assisted / methods*
  • Imaging, Three-Dimensional / methods
  • Pattern Recognition, Automated / methods*
  • Posture / physiology*
  • Reproducibility of Results
  • Sensitivity and Specificity