A Sign Language Recognition System Applied to Deaf-Mute Medical Consultation

Sensors (Basel). 2022 Nov 24;22(23):9107. doi: 10.3390/s22239107.

Abstract

Deaf-mute people face real difficulties when seeking medical treatment. Owing to the shortage of sign language interpreters, most hospitals in China are currently unable to provide sign language interpretation, and routine medical care remains out of reach for many deaf patients. In this paper, we propose a sign language recognition system, Heart-Speaker, designed for deaf-mute medical consultation scenarios. The system offers a low-cost solution to the difficult problem of treating deaf-mute patients. The doctor only needs to point Heart-Speaker at the deaf patient; the system automatically captures the sign language movements and translates their semantics. When the doctor issues a diagnosis or asks the patient a question, the system displays the corresponding sign language video and subtitles, meeting the need for two-way communication between doctor and patient. The system uses the MobileNet-YOLOv3 model to recognize sign language, which allows it to run on embedded terminals while providing favorable recognition accuracy. We performed experiments to verify the recognition accuracy. The experimental results show that Heart-Speaker achieves a sign language recognition accuracy of 90.77%.
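The abstract names MobileNet-YOLOv3 as the recognition model but gives no implementation details. The following is a minimal, illustrative PyTorch sketch of the kind of detector described: a MobileNet-style depthwise-separable backbone feeding a YOLOv3-style prediction head. The class count, anchor count, layer widths, and module names (MobileNetYOLOSketch, DepthwiseSeparableConv) are assumptions for illustration only, not the authors' implementation.

# Illustrative sketch (not the authors' code): a MobileNet-style backbone
# combined with a single-scale YOLOv3-style detection head, as one might
# assemble for lightweight gesture detection on an embedded device.
# Layer sizes, class count, and anchor count are assumed values.
import torch
import torch.nn as nn


class DepthwiseSeparableConv(nn.Module):
    """MobileNet building block: depthwise conv followed by pointwise conv."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, in_ch, 3, stride, 1, groups=in_ch, bias=False),
            nn.BatchNorm2d(in_ch),
            nn.ReLU6(inplace=True),
            nn.Conv2d(in_ch, out_ch, 1, 1, 0, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU6(inplace=True),
        )

    def forward(self, x):
        return self.block(x)


class MobileNetYOLOSketch(nn.Module):
    """Tiny MobileNet-like backbone plus a YOLO-style 1x1 prediction layer that
    outputs, per grid cell and anchor: box offsets (4), objectness (1), and
    class scores (num_classes)."""
    def __init__(self, num_classes=20, num_anchors=3):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1, bias=False),
            nn.BatchNorm2d(32),
            nn.ReLU6(inplace=True),
            DepthwiseSeparableConv(32, 64),
            DepthwiseSeparableConv(64, 128, stride=2),
            DepthwiseSeparableConv(128, 256, stride=2),
            DepthwiseSeparableConv(256, 512, stride=2),
        )
        self.head = nn.Conv2d(512, num_anchors * (5 + num_classes), 1)

    def forward(self, x):
        return self.head(self.backbone(x))


if __name__ == "__main__":
    model = MobileNetYOLOSketch(num_classes=20, num_anchors=3)
    dummy = torch.randn(1, 3, 416, 416)  # one 416x416 RGB frame
    out = model(dummy)
    print(out.shape)  # torch.Size([1, 75, 26, 26]) with the assumed settings

The design intent mirrored here is the one stated in the abstract: depthwise-separable convolutions keep the parameter count and compute low enough for embedded terminals, while the YOLO-style head performs the detection that underlies sign recognition.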

Keywords: deep learning; lightweight convolutional neural network; self-service for the deaf; sign language recognition.

MeSH terms

  • China
  • Communication*
  • Humans
  • Recognition, Psychology
  • Referral and Consultation
  • Sign Language*

Grants and funding

This research received no external funding.