Deep learning-based framework for real-time upper limb motion intention classification using combined bio-signals

Front Neurorobot. 2023 Jul 27;17:1174613. doi: 10.3389/fnbot.2023.1174613. eCollection 2023.

Abstract

This study proposes a framework that takes surface electromyography (sEMG) and functional near-infrared spectroscopy (fNIRS) bio-signals as input and trains a convolutional neural network (CNN) on them. The framework implements a real-time neuro-machine interface that decodes human intention for upper limb motions. Bio-signals from the two modalities are recorded simultaneously for eight movements corresponding to prosthetic arm functions, with a focus on trans-humeral amputees. The fNIRS signals are acquired from the motor cortex, while sEMG is recorded from the biceps muscles. For classification and command generation, the selected fNIRS features are the peak, minimum, and mean ΔHbO and ΔHbR values within a 2-s moving window; for sEMG, waveform length, peak, and mean are extracted with a 150-ms moving window. This scheme classifies the eight motions with an average accuracy of 94.5%. The results validate the adopted methodology and its potential for future real-time neural-machine interfaces controlling prosthetic arms.
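The windowed feature extraction described above can be sketched as follows. This is a minimal illustration, not the authors' code: the helper names, the use of non-overlapping windows, and the NumPy-based waveform-length formula (sum of absolute first differences, a standard sEMG feature definition) are assumptions for demonstration.

```python
import numpy as np

def window_features(signal, fs, win_s):
    """Peak, minimum, and mean from consecutive windows of a 1-D signal.

    signal: 1-D array (e.g. an fNIRS ΔHbO/ΔHbR or sEMG channel)
    fs:     sampling rate in Hz
    win_s:  window length in seconds (e.g. 2.0 for fNIRS, 0.150 for sEMG)
    Returns an array of shape (n_windows, 3): [peak, min, mean] per window.
    """
    n = int(fs * win_s)
    feats = []
    for start in range(0, len(signal) - n + 1, n):
        w = signal[start:start + n]
        feats.append([w.max(), w.min(), w.mean()])
    return np.array(feats)

def waveform_length(window):
    """Standard sEMG waveform length: cumulative absolute first difference."""
    return np.abs(np.diff(window)).sum()
```

The resulting per-window feature vectors (fNIRS and sEMG features concatenated) would then form the input samples for the CNN classifier.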

Keywords: assistive robotics; disability; intelligent systems; machine learning; prosthesis; sEMG and fNIRS; trans-humeral amputation.

Grants and funding

This study received funding from King Saud University, Saudi Arabia, through Researchers Supporting Project number RSP2023R145, and from the Higher Education Commission (HEC) under NRPU project no. 10702.