Context-dependent adaptation improves robustness of myoelectric control for upper-limb prostheses

J Neural Eng. 2017 Oct;14(5):056016. doi: 10.1088/1741-2552/aa7e82. Epub 2017 Jul 10.

Abstract

Objective: Dexterous upper-limb prostheses are available today to restore grasping, but an effective and reliable feed-forward control is still missing. The aim of this work was to improve the robustness and reliability of myoelectric control by using context information from sensors embedded within the prosthesis.

Approach: We developed a context-driven myoelectric control scheme (cxMYO) that incorporates the inference of context information from proprioceptive (inertial measurement unit) and exteroceptive (force and grip aperture) sensors to modulate the outputs of myoelectric control. Further, a realistic online evaluation of the cxMYO was performed in able-bodied subjects using three functional tasks, during which the cxMYO was compared to a purely machine-learning-based myoelectric control (MYO).
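
The following minimal sketch is not the authors' implementation; it only illustrates, under assumed sensor names, thresholds, and gating rules, how context inferred from on-board proprioceptive and exteroceptive sensors might modulate the output of a myoelectric classifier as described in the Approach.

```python
# Hypothetical sketch of context-driven gating of myoelectric commands.
# All sensor fields, thresholds, and rules below are illustrative assumptions,
# not the cxMYO scheme reported in the paper.

from dataclasses import dataclass


@dataclass
class SensorState:
    """Context inputs assumed to be available from the prosthesis."""
    grip_force: float   # N, from an embedded force sensor (exteroception)
    aperture: float     # normalized grip aperture, 0 (closed) .. 1 (open)
    arm_moving: bool    # inferred from the IMU (proprioception), e.g. transport phase


def infer_context(state: SensorState,
                  force_threshold: float = 1.0,
                  aperture_threshold: float = 0.9) -> str:
    """Coarse context inference (assumed rules, for illustration only)."""
    if state.grip_force > force_threshold:
        return "holding_object"
    if state.aperture > aperture_threshold:
        return "hand_open"
    return "neutral"


def context_gate(myo_command: str, state: SensorState) -> str:
    """Modulate a raw myoelectric command using the inferred context.

    Example rule (assumption): while an object is held and the arm is moving,
    suppress 'open' commands so a spurious classifier output cannot drop the object.
    """
    context = infer_context(state)
    if context == "holding_object" and state.arm_moving and myo_command == "open":
        return "no_action"   # block the unwanted command
    if context == "hand_open" and myo_command == "open":
        return "no_action"   # opening further has no effect
    return myo_command       # otherwise pass the command through unchanged


if __name__ == "__main__":
    # Raw classifier says 'open' while transporting a grasped object: suppressed.
    holding = SensorState(grip_force=2.5, aperture=0.3, arm_moving=True)
    print(context_gate("open", holding))   # -> 'no_action'

    # Same command with the hand empty and the arm at rest: passed through.
    idle = SensorState(grip_force=0.0, aperture=0.3, arm_moving=False)
    print(context_gate("open", idle))      # -> 'open'
```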

Main results: The results demonstrated that utilizing context information decreased the number of unwanted commands, improving performance (success rate and dropped objects) in all three functional tasks. Specifically, the median number of objects dropped per round with cxMYO was zero in all three tasks, and a significant increase in the number of successful transfers was seen in two of the three functional tasks. Additionally, the subjects reported a better user experience.

Significance: This is the first online evaluation of a method integrating information from multiple on-board prosthesis sensors to modulate the output of a machine-learning-based myoelectric controller. The proposed scheme is general and presents a simple, non-invasive and cost-effective approach for improving the robustness of myoelectric control.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adaptation, Physiological / physiology*
  • Arm / physiology
  • Artificial Limbs*
  • Hand / physiology*
  • Hand Strength / physiology*
  • Humans
  • Machine Learning*
  • Proprioception / physiology
  • Prosthesis Design / methods*