Towards real-time and rotation-invariant American Sign Language alphabet recognition using a range camera

Sensors (Basel). 2012 Oct 29;12(11):14416-41. doi: 10.3390/s121114416.

Abstract

The automatic interpretation of human gestures enables natural interaction with computers without relying on mechanical devices such as keyboards and mice. To achieve this objective, the recognition of hand postures has been studied for many years. However, most of the literature in this area has considered 2D images, which cannot provide a full description of hand gestures. In addition, rotation-invariant identification remains an unsolved problem, even with the use of 2D images. The objective of the current study was to design a rotation-invariant recognition process based on a 3D signature for classifying hand postures. A heuristic, voxel-based signature has been designed and implemented. The hand motion is tracked with a Kalman filter. A single training image per posture is used in the supervised classification. The designed recognition process, the tracking procedure, and the segmentation algorithm have been successfully evaluated. This study demonstrates the efficiency of the proposed rotation-invariant 3D hand posture signature, which leads to a 93.88% recognition rate over 14,732 test samples of 12 postures taken from the American Sign Language alphabet.
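The abstract states that hand motion is tracked with a Kalman filter but gives no model details. The sketch below is a minimal illustration only, assuming a constant-velocity state model over the 3D hand centroid extracted from the range data; the class name, time step, and noise parameters are hypothetical choices, not the authors' implementation.

```python
import numpy as np

# Minimal constant-velocity Kalman filter for tracking a 3D hand centroid.
# The state and noise models here are assumptions for illustration; the paper
# only states that hand motion is tracked with a Kalman filter.

class CentroidKalmanTracker:
    def __init__(self, dt=1.0 / 30.0, process_var=1e-2, meas_var=1e-3):
        # State vector: [x, y, z, vx, vy, vz]
        self.x = np.zeros(6)
        self.P = np.eye(6)
        # Constant-velocity transition model (assumed)
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)
        # Only the 3D centroid position is observed
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])
        self.Q = process_var * np.eye(6)   # process noise covariance (assumed)
        self.R = meas_var * np.eye(3)      # measurement noise covariance (assumed)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:3]                  # predicted centroid

    def update(self, z):
        # z: measured hand centroid from the segmented range data
        y = z - self.H @ self.x                       # innovation
        S = self.H @ self.P @ self.H.T + self.R       # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)      # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]                  # filtered centroid


if __name__ == "__main__":
    tracker = CentroidKalmanTracker()
    for t in range(5):
        tracker.predict()
        # Synthetic measurement: hand moving along x at 0.3 m/s, sampled at 30 fps
        measurement = np.array([0.3 * t / 30.0, 0.0, 0.8])
        print(tracker.update(measurement))
```

In such a setup, the predicted centroid can seed the hand segmentation in the next range frame, while the update step smooths the measured centroid; this is one common way a Kalman filter is used for hand tracking, under the assumptions stated above.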

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Hand / physiology
  • Humans
  • Photography*
  • Sign Language*
  • United States