Application for Recognizing Sign Language Gestures Based on an Artificial Neural Network

Sensors (Basel). 2022 Dec 15;22(24):9864. doi: 10.3390/s22249864.

Abstract

This paper presents the development and implementation of an application that recognizes American Sign Language signs using deep learning algorithms based on convolutional neural network architectures. The project implementation includes the development of a training set, the preparation of a module that converts photos into a form readable by the artificial neural network, the selection of a suitable neural network architecture, and the development of the model. The neural network undergoes a learning process, and its results are verified accordingly. A web application that recognizes sign language from any photo taken by the user is implemented, and its results are analyzed. The network's accuracy reaches 99% on the training set. Nevertheless, conclusions and recommendations are formulated to improve the operation of the application.
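The photo-conversion step mentioned above can be sketched in a few lines. The abstract does not specify the input format, so the shapes here are assumptions: many ASL-alphabet CNNs (e.g. those trained on the Sign Language MNIST dataset) take 28×28 grayscale images normalized to [0, 1], batched as a 4-D tensor. The helper below is a hypothetical, dependency-light illustration of that preprocessing, not the authors' actual module.

```python
import numpy as np

def to_network_input(photo: np.ndarray, size: int = 28) -> np.ndarray:
    """Convert a grayscale photo (H x W, uint8) into a normalized
    tensor of shape (1, size, size, 1), the typical input form for a
    small CNN classifier. Nearest-neighbour resampling keeps this
    sketch free of image-library dependencies."""
    h, w = photo.shape
    rows = (np.arange(size) * h / size).astype(int)   # source row per output row
    cols = (np.arange(size) * w / size).astype(int)   # source col per output col
    resized = photo[np.ix_(rows, cols)].astype(np.float32)
    return (resized / 255.0).reshape(1, size, size, 1)

# Example: a dummy 64x48 "photo" stands in for a user-taken image.
photo = np.random.randint(0, 256, (64, 48), dtype=np.uint8)
x = to_network_input(photo)
print(x.shape)  # (1, 28, 28, 1)
```

The resulting tensor can then be passed directly to a convolutional model's prediction call; the argmax over the output scores gives the recognized sign class.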

Keywords: convolutional neural networks; deep learning algorithm; machine image recognition systems.

MeSH terms

  • Algorithms
  • Gestures*
  • Humans
  • Machine Learning*
  • Neural Networks, Computer
  • Sign Language

Grants and funding

This research received no external funding.