Observation-based training for neuroprosthetic control of grasping by amputees

Annu Int Conf IEEE Eng Med Biol Soc. 2014;2014:3989-92. doi: 10.1109/EMBC.2014.6944498.

Abstract

Current brain-machine interfaces (BMIs) allow upper limb amputees to position robotic arms with a high degree of accuracy, but lack the ability to control hand pre-shaping for grasping different objects. We have previously shown that low-frequency (0.1-1 Hz) time-domain cortical activity recorded at the scalp via electroencephalography (EEG) encodes information about grasp pre-shaping. To transfer this technology to clinical populations such as amputees, the challenge lies in constructing BMI models in the absence of overt training hand movements. Here we show that it is possible to train BMI models using observed grasping movements performed by a robotic hand attached to the amputees' residual limbs. Following the action-observation training phase, three transradial amputees controlled the grasping motion of the attached robotic hand via their EEG. Over multiple sessions, subjects successfully grasped the presented object (a bottle or a credit card) in 53 ± 16% of trials, demonstrating the validity of the BMI models. Importantly, the BMI models were validated through closed-loop performance, which demonstrates generalization to unseen data. These results suggest that 'mirror neuron system' properties captured by delta-band EEG allow the neural representation of action observation to be used for action control in an EEG-based BMI system.
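The abstract does not specify the decoding pipeline, so the following is only a minimal Python sketch of the general approach it describes: band-pass filtering EEG to the reported 0.1-1 Hz range and fitting a simple linear (ridge-regression) mapping from lagged EEG samples to the robot hand's grasp aperture, trained on observation-phase data and then applied in closed loop. All function names, the sampling rate, the lag window, and the choice of ridge regression are illustrative assumptions, not the authors' method.

```python
# Illustrative sketch only: the exact decoder used in the paper is not given in
# the abstract. Assumes a linear mapping from low-frequency (delta-band) EEG to
# grasp aperture, trained while the subject observes the robotic hand move.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.linear_model import Ridge

FS = 100.0  # assumed EEG sampling rate in Hz (not stated in the abstract)

def delta_band(eeg, low=0.1, high=1.0, fs=FS):
    """Zero-phase band-pass filter EEG (samples x channels) to 0.1-1 Hz."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=0)

def lagged_features(eeg, n_lags=10):
    """Stack the current sample and n_lags previous samples of every channel."""
    n, c = eeg.shape
    X = np.zeros((n - n_lags, c * (n_lags + 1)))
    for k in range(n_lags + 1):
        X[:, k * c:(k + 1) * c] = eeg[n_lags - k:n - k, :]
    return X

def train_decoder(eeg_obs, aperture_obs, n_lags=10):
    """Observation-based training: eeg_obs is EEG recorded while the attached
    robotic hand grasps under computer control; aperture_obs is the hand's
    grasp aperture during those same trials (hypothetical variables)."""
    X = lagged_features(delta_band(eeg_obs), n_lags)
    y = aperture_obs[n_lags:]
    return Ridge(alpha=1.0).fit(X, y)

def closed_loop_step(decoder, eeg_buffer, n_lags=10):
    """Decode a grasp-aperture command from a recent EEG buffer.
    The buffer should span at least a few seconds so filtering is stable."""
    X = lagged_features(delta_band(eeg_buffer), n_lags)
    return decoder.predict(X[-1:, :])[0]
```

In this sketch the decoder never sees the subject's own movements, mirroring the paper's premise that observation of the robot hand's grasping can substitute for overt training movements when labeled kinematics are otherwise unavailable in amputees.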

Publication types

  • Research Support, N.I.H., Extramural
  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Aged
  • Amputees / rehabilitation*
  • Biomechanical Phenomena
  • Brain-Computer Interfaces
  • Electroencephalography
  • Female
  • Hand / physiology
  • Hand Strength / physiology*
  • Humans
  • Male
  • Middle Aged
  • Signal Processing, Computer-Assisted