Human-machine interface-based wheelchair control using piezoelectric sensors based on face and tongue movements

Heliyon. 2022 Nov 18;8(11):e11679. doi: 10.1016/j.heliyon.2022.e11679. eCollection 2022 Nov.

Abstract

Hands-free control of assistive mobility devices has been developed to serve people with movement disabilities at all levels. In this study, we demonstrate a human-machine interface (HMI) system that uses piezoelectric sensors to translate face and tongue movements into control commands. This study addresses two issues. First, we used six piezoelectric sensors to acquire facial muscle signals and examined sensor positions and signal features during winking and tongue movements. Second, we verified the proposed HMI through online control of a simulated wheelchair. Twelve volunteers participated in the experiment. A maximum classification accuracy of 98.0% was achieved using the maximum and mean features with the linear discriminant analysis (LDA) and K-nearest neighbors (KNN) classification algorithms. Using the proposed algorithm, command translation reached an average classification accuracy of more than 95% with a 0.5 s window for command creation. For online control of the simulated wheelchair, the results showed high efficiency with respect to completion time: the combination of winking and tongue actions yielded steering times of the same order of magnitude as joystick-based control, remaining below twice the joystick time. Hence, the proposed system can be further implemented in a powered wheelchair for quadriplegic patients who retain control of their face or tongue muscles.

Keywords: Assistive mobility devices; Face–machine interface; Human–machine interface; Piezoelectric sensor; Simulated wheelchair; Tongue–machine interface.
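The abstract describes a pipeline of windowed feature extraction (per-channel maximum and mean over 0.5 s segments) followed by LDA or KNN classification. The sketch below illustrates that general approach; it is not the authors' implementation. The sampling rate, number of channels, command labels, and the use of random placeholder data are assumptions made only to keep the script self-contained and runnable.

```python
# Minimal sketch of the feature-extraction and classification pipeline described
# in the abstract. Sampling rate, channel count, labels, and the random data are
# illustrative assumptions, not the study's actual recordings or code.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

FS = 1000            # assumed sampling rate (Hz)
WIN = int(0.5 * FS)  # 0.5 s command-creation window
N_CHANNELS = 6       # six piezoelectric sensors, as in the study

def window_features(segments):
    """segments: (n_windows, WIN, N_CHANNELS) piezoelectric signal windows.
    Returns per-channel maximum and mean features, shape (n_windows, 2*N_CHANNELS)."""
    return np.concatenate([segments.max(axis=1), segments.mean(axis=1)], axis=1)

# Placeholder data standing in for recorded wink/tongue-movement windows,
# with hypothetical labels for four commands (e.g., left/right wink, left/right tongue).
rng = np.random.default_rng(0)
X_raw = rng.normal(size=(200, WIN, N_CHANNELS))
y = rng.integers(0, 4, size=200)

X = window_features(X_raw)
for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name} mean cross-validation accuracy: {acc:.3f}")
```

With real labeled signal windows in place of the random arrays, the same two classifiers and feature set would reproduce the structure of the evaluation reported in the abstract.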