Blink-To-Live eye-based communication system for users with speech impairments

Sci Rep. 2023 May 17;13(1):7961. doi: 10.1038/s41598-023-34310-9.

Abstract

Eye-based communication languages such as Blink-To-Speak play a key role in expressing the needs and emotions of patients with motor neuron disorders. Most existing eye-tracking systems are complex and unaffordable in low-income countries. Blink-To-Live is an eye-tracking system based on a modified Blink-To-Speak language and computer vision, designed for patients with speech impairments. A mobile phone camera tracks the patient's eyes, sending real-time video frames to computer vision modules for facial landmark detection, eye identification, and tracking. The Blink-To-Live eye-based communication language defines four key alphabets: Left, Right, Up, and Blink. These eye gestures encode more than 60 daily-life commands, each expressed as a sequence of three eye-movement states. Once the eye-gesture-encoded sentences are generated, the translation module displays the phrases in the patient's native language on the phone screen, and a synthesized voice reads them aloud. A prototype of the Blink-To-Live system was evaluated with unimpaired participants of varying demographic characteristics. Unlike other sensor-based eye-tracking systems, Blink-To-Live is simple, flexible, and cost-efficient, with no dependence on specific software or hardware. The software and its source code are available from the GitHub repository ( https://github.com/ZW01f/Blink-To-Live ).
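
The three-state command encoding described in the abstract is simple to sketch. The following minimal Python illustration decodes sequences of three eye states into phrases; the four state names follow the abstract, but the class name, method names, and the example phrase table are hypothetical stand-ins for the authors' full command set of more than 60 entries, which lives in their repository.

    # Minimal sketch of the gesture decoding described in the abstract:
    # four eye states (Left, Right, Up, Blink), each command encoded as
    # a sequence of three states. Phrase entries below are illustrative
    # assumptions, not the authors' actual table.

    from typing import Optional

    EYE_STATES = {"Left", "Right", "Up", "Blink"}

    # Hypothetical excerpt of the command table: 3-state sequences -> phrases.
    PHRASES: dict[tuple[str, str, str], str] = {
        ("Blink", "Up", "Blink"): "I am thirsty",
        ("Left", "Left", "Blink"): "I am in pain",
        ("Right", "Up", "Left"): "Please call the nurse",
    }

    class GestureDecoder:
        """Accumulates detected eye states; emits a phrase per three states."""

        def __init__(self) -> None:
            self._buffer: list[str] = []

        def feed(self, state: str) -> Optional[str]:
            """Add one detected eye state; return a phrase on a complete triple."""
            if state not in EYE_STATES:
                raise ValueError(f"unknown eye state: {state!r}")
            self._buffer.append(state)
            if len(self._buffer) < 3:
                return None
            key = (self._buffer[0], self._buffer[1], self._buffer[2])
            self._buffer.clear()
            return PHRASES.get(key, "<unrecognized command>")

    if __name__ == "__main__":
        decoder = GestureDecoder()
        for state in ["Blink", "Up", "Blink"]:
            phrase = decoder.feed(state)
            if phrase:
                print(phrase)  # -> "I am thirsty"

In the full system, the states fed to such a decoder would come from the facial-landmark and eye-tracking modules, and the returned phrase would be passed on to the translation and speech-synthesis components described in the abstract.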

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Blinking*
  • Eye
  • Eye Movements
  • Humans
  • Software
  • Speech Disorders
  • Speech*