Upper-limb prosthetic control using wearable multichannel mechanomyography

IEEE Int Conf Rehabil Robot. 2017 Jul;2017:1293-1298. doi: 10.1109/ICORR.2017.8009427.

Abstract

In this paper, we introduce a robust multichannel wearable sensor system for capturing user intent to control robotic hands. The interface is based on a fusion of inertial measurement and mechanomyography (MMG), which measures the vibrations of muscle fibres during motion. MMG is immune to issues common to electromyography (EMG), such as sweat, skin impedance, and the need for a reference signal. The main contributions of this work are: 1) the hardware design of a fused inertial and MMG measurement system that can be worn on the arm, 2) a unified algorithm for the detection, segmentation, and classification of muscle movements corresponding to hand gestures, and 3) experiments demonstrating real-time control of a commercial prosthetic hand (Bebionic Version 2). Results show recognition of seven gestures with an offline classification accuracy of 83.5% across five healthy subjects and one transradial amputee. The gesture recognition was then tested in real time on subsets of two and five gestures, with average accuracies of 93.3% and 62.2%, respectively. To our knowledge, this is the first applied MMG-based control system for practical prosthetic control.
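
The unified detect/segment/classify pipeline described in the abstract suggests a structure like the sketch below. This is a minimal illustration of that kind of pipeline, not the authors' algorithm: the sampling rate, MMG band edges, energy threshold, feature set, and random-forest classifier are all assumptions chosen for the example.

```python
"""Minimal sketch of a detect/segment/classify pipeline for multichannel
MMG gesture recognition. All parameter values below are illustrative
assumptions, not the paper's actual configuration."""
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.ensemble import RandomForestClassifier

FS = 1000  # assumed sampling rate in Hz

def bandpass(x, low=5.0, high=100.0, fs=FS, order=4):
    # MMG energy is typically concentrated below ~100 Hz;
    # band-pass filter each channel (x has shape [n_samples, n_channels]).
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=0)

def detect_segments(x, fs=FS, win_s=0.1, k=3.0):
    # Energy-threshold activity detection: flag windows whose RMS exceeds
    # k times the resting baseline, then merge runs of active windows
    # into (start, stop) sample-index segments.
    win = int(win_s * fs)
    n_win = len(x) // win
    rms = np.array([np.sqrt(np.mean(x[i * win:(i + 1) * win] ** 2))
                    for i in range(n_win)])
    baseline = np.median(rms)  # crude resting-level estimate
    active = rms > k * baseline
    segments, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = i * win
        elif not a and start is not None:
            segments.append((start, i * win))
            start = None
    if start is not None:
        segments.append((start, n_win * win))
    return segments

def features(seg):
    # Simple time-domain features per channel: RMS, mean absolute value,
    # waveform length, and zero-crossing count (a common EMG/MMG feature
    # set, assumed here for illustration).
    return np.concatenate([
        np.sqrt(np.mean(seg ** 2, axis=0)),
        np.mean(np.abs(seg), axis=0),
        np.sum(np.abs(np.diff(seg, axis=0)), axis=0),
        np.sum(np.diff(np.sign(seg), axis=0) != 0, axis=0),
    ])

# Training: each row of X_feat is a feature vector from a labelled segment.
clf = RandomForestClassifier(n_estimators=100)
# clf.fit(X_feat, y_labels)

# Inference on a recording `raw` of shape (n_samples, n_channels):
# filtered = bandpass(raw)
# for start, stop in detect_segments(filtered):
#     gesture = clf.predict(features(filtered[start:stop])[None, :])
```

In a real-time setting, the same filter/detect/feature steps would run over a sliding buffer of incoming sensor samples, with each detected segment classified as soon as it closes, so a gesture command can be issued to the prosthetic hand with minimal latency.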

Publication types

  • Research Support, Non-U.S. Gov't
  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Adult
  • Amputees / rehabilitation
  • Arm / physiology*
  • Artificial Limbs*
  • Female
  • Gestures
  • Humans
  • Male
  • Middle Aged
  • Myography / instrumentation*
  • Myography / methods
  • Signal Processing, Computer-Assisted / instrumentation*
  • Young Adult