Intent Prediction Based on Biomechanical Coordination of EMG and Vision-Filtered Gaze for End-Point Control of an Arm Prosthesis

IEEE Trans Neural Syst Rehabil Eng. 2020 Jun;28(6):1471-1480. doi: 10.1109/TNSRE.2020.2992885. Epub 2020 May 6.

Abstract

We propose a novel controller for powered prosthetic arms in which fused EMG and gaze data predict the desired end-point for a full arm prosthesis, which could drive the forward motion of individual joints. We recorded EMG, gaze, and motion-tracking data during pick-and-place trials with 7 able-bodied subjects. Subjects positioned an object above a randomly chosen target on a virtual interface, each completing around 600 trials. On average, across all trials and subjects, gaze preceded EMG and followed a repeatable pattern that allowed for prediction. A computer vision algorithm was used to extract the initial and target fixations and to estimate the target position in 2D space. Two support vector regression (SVR) models were trained on the EMG data to predict the x- and y-positions of the hand; results showed that the y-estimate was significantly better than the x-estimate. The EMG and gaze predictions were fused using a Kalman filter-based approach, and the positional error of the EMG-only prediction was significantly higher than that of the fused EMG-gaze prediction. The final target-position root mean squared error (RMSE) decreased from 9.28 cm with the EMG-only prediction to 6.94 cm with gaze-EMG fusion. This error also increased significantly when some or all arm muscle signals were removed. However, with fused EMG and gaze, there was no significant difference between predictors that included all muscles and those that included only a subset of muscles.
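The decoding pipeline summarized above (per-coordinate SVR on EMG features, a gaze-derived 2D target estimate, Kalman filter-based fusion, and an RMSE score) can be sketched in Python. This is a minimal illustration on synthetic data, not the authors' implementation: the feature dimension, RBF kernel, measurement variances, and the static-target measurement model are all assumptions made for the sketch.

    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for the recorded data (illustrative only):
    # 600 trials, a 16-dimensional EMG feature vector per trial, and
    # ground-truth 2D target positions in centimeters.
    n_trials, n_feats = 600, 16
    X = rng.normal(size=(n_trials, n_feats))
    xy = rng.uniform(0.0, 50.0, size=(n_trials, 2))

    tr, te = slice(0, 500), slice(500, None)

    # Two independent SVRs, one per coordinate, as in the abstract.
    svr_x = SVR(kernel="rbf", C=10.0).fit(X[tr], xy[tr, 0])
    svr_y = SVR(kernel="rbf", C=10.0).fit(X[tr], xy[tr, 1])
    z_emg = np.column_stack([svr_x.predict(X[te]), svr_y.predict(X[te])])

    # A gaze-derived target estimate, modeled here as ground truth plus
    # noise; the paper extracts it with a computer vision algorithm.
    z_gaze = xy[te] + rng.normal(scale=2.0, size=xy[te].shape)

    def fuse(z_e, z_g, r_emg=9.0, r_gaze=4.0):
        """Kalman-style update fusing two noisy measurements of a static
        2D target. r_emg and r_gaze are assumed measurement variances
        (cm^2), not the paper's actual filter parameters."""
        x, p = np.zeros(2), np.full(2, 1e3)   # diffuse prior
        for z, r in ((z_e, r_emg), (z_g, r_gaze)):
            k = p / (p + r)                   # Kalman gain (diagonal case)
            x = x + k * (z - x)               # posterior mean
            p = (1.0 - k) * p                 # posterior variance
        return x

    fused = np.array([fuse(e, g) for e, g in zip(z_emg, z_gaze)])

    def rmse(pred, truth):
        # Root mean squared Euclidean distance to the true target.
        return np.sqrt(np.mean(np.sum((pred - truth) ** 2, axis=1)))

    print(f"EMG-only RMSE: {rmse(z_emg, xy[te]):.2f} cm")
    print(f"Fused RMSE:    {rmse(fused, xy[te]):.2f} cm")

With a diffuse prior, the sequential update reduces to an inverse-variance weighted average of the two estimates, so whichever modality is modeled as less noisy (here, gaze) dominates the fused output; this mirrors the abstract's finding that fusion lowers the error of an EMG-only decoder.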

Publication types

  • Research Support, N.I.H., Extramural
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Arm
  • Artificial Limbs*
  • Electromyography
  • Hand
  • Humans