i-MYO: A multi-grasp prosthetic hand control system based on gaze movements, augmented reality, and myoelectric signals

Int J Med Robot. 2024 Feb;20(1):e2617. doi: 10.1002/rcs.2617.

Abstract

Background: Controlling a multi-grasp prosthetic hand remains a challenge. This study explores how merging gaze movements and augmented reality into bionics can improve prosthetic hand control.

Methods: A control system based on gaze movements, augmented reality, and myoelectric signals (i-MYO) was proposed. In the i-MYO, a GazeButton was introduced into the controller to detect grasp-type intention from eye-tracking signals, and a proportional velocity scheme driven by the myoelectric signals was used to control hand movement.
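For illustration, a minimal sketch of such a control loop is given below, assuming dwell-based selection on an augmented-reality GazeButton and two normalized EMG channels; the grasp list, function names, and thresholds are hypothetical and not taken from the paper.

```python
# Hypothetical sketch of an i-MYO-style control tick (names/values assumed, not from the paper).
# Gaze dwell on an AR GazeButton selects the grasp type; two EMG channels
# drive the hand open/close velocity proportionally.

GRASP_TYPES = ["power", "lateral", "tripod", "pinch", "hook", "spherical"]  # assumed set of six
DWELL_TIME_S = 0.8    # assumed gaze dwell threshold for a GazeButton
EMG_THRESHOLD = 0.1   # assumed activation dead-band (normalized 0..1)
MAX_VELOCITY = 1.0    # normalized hand aperture velocity limit


def select_grasp(gazed_button: str | None, dwell_s: float, current: str) -> str:
    """Switch grasp type when the gaze dwells long enough on a GazeButton."""
    if gazed_button in GRASP_TYPES and dwell_s >= DWELL_TIME_S:
        return gazed_button
    return current


def proportional_velocity(emg_close: float, emg_open: float) -> float:
    """Map two normalized EMG amplitudes to a signed closing velocity."""
    drive = emg_close - emg_open
    if abs(drive) < EMG_THRESHOLD:
        return 0.0  # dead-band: ignore weak activations
    return max(-MAX_VELOCITY, min(MAX_VELOCITY, drive))


# Example control tick: gaze picks the grasp, EMG sets how fast the hand closes.
grasp = select_grasp("tripod", dwell_s=0.9, current="power")
velocity = proportional_velocity(emg_close=0.75, emg_open=0.25)
print(grasp, velocity)  # -> tripod 0.5
```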

Results: Able-bodied subjects with no prior training successfully transferred objects in 91.6% of cases and switched to the optimal grasp type in 97.5%. The patient successfully triggered the EMG to make the hand hold objects in 98.7% of trials, taking about 3.2 s, and switched to the optimal grasp type in 99.2% of trials, taking about 1.3 s.

Conclusions: Merging gaze movements and augmented reality into bionics can widen the control bandwidth of a prosthetic hand. With the help of the i-MYO, subjects can control a prosthetic hand with six grasp types, provided they can manipulate two muscle signals and their gaze.

Keywords: augmented reality; gaze movement; human-machine interfaces; multi-grasp prosthetic hand; prosthetics; reconstruction; technical; virtual reality.

MeSH terms

  • Artificial Limbs*
  • Augmented Reality*
  • Electromyography
  • Hand / physiology
  • Hand Strength / physiology
  • Humans
  • Movement
  • Prosthesis Design