Prediction of Reach Goals in Depth and Direction from the Parietal Cortex

Cell Rep. 2018 Apr 17;23(3):725-732. doi: 10.1016/j.celrep.2018.03.090.

Abstract

The posterior parietal cortex is well known to mediate sensorimotor transformations during the generation of movement plans, but its ability to control prosthetic limbs in 3D environments has not yet been fully demonstrated. With this aim, we trained monkeys to perform reaches to targets located at various depths and directions and tested whether the reach goal position could be extracted from parietal signals. The reach goal location was reliably decoded with near-optimal accuracy (>90%), even well before movement onset. These results, together with recent work showing reliable decoding of hand grip in the same area, suggest that this is a suitable site for decoding the entire prehension action and a promising candidate for the development of brain-computer interfaces.
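The offline decoding described above can be illustrated with a minimal sketch: classify each trial's reach target from a population of spike counts. Everything below is hypothetical (the target count, neuron count, Poisson tuning model, and nearest-centroid classifier are illustrative assumptions, not the paper's actual recording setup or decoder), but it shows the general shape of goal decoding from neural activity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 6 reach targets (e.g., 2 depths x 3 directions)
# and 40 simulated "neurons" whose spike counts are Poisson-distributed
# around a target-specific mean rate. This is an illustrative sketch,
# not the study's actual V6A data or decoding pipeline.
n_targets, n_neurons = 6, 40

# Each simulated neuron gets a random mean firing rate per target.
tuning = rng.uniform(2.0, 20.0, size=(n_targets, n_neurons))

def simulate_trials(n_per_target):
    """Draw Poisson spike-count trials for every target."""
    X, y = [], []
    for t in range(n_targets):
        X.append(rng.poisson(tuning[t], size=(n_per_target, n_neurons)))
        y.append(np.full(n_per_target, t))
    return np.vstack(X), np.concatenate(y)

X_train, y_train = simulate_trials(60)
X_test, y_test = simulate_trials(20)

# Nearest-centroid decoder: assign each test trial to the target whose
# mean training population response is closest in Euclidean distance.
centroids = np.stack([X_train[y_train == t].mean(axis=0)
                      for t in range(n_targets)])
dists = np.linalg.norm(X_test[:, None, :] - centroids[None], axis=2)
pred = dists.argmin(axis=1)

accuracy = (pred == y_test).mean()
print(f"decoding accuracy: {accuracy:.2f} (chance = {1/n_targets:.2f})")
```

With well-separated tuning and enough neurons, such a decoder performs far above the 1/6 chance level, mirroring the kind of offline classification accuracy reported in the abstract.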

Keywords: V6A; hand guidance; machine learning; monkey; neuroprosthetics; offline neural decoding; prehension; reaching in depth; robotics; visuomotor transformations.

MeSH terms

  • Action Potentials
  • Animals
  • Hand Strength / physiology
  • Macaca fascicularis / physiology*
  • Movement
  • Parietal Lobe / physiology*
  • Photic Stimulation
  • Psychomotor Performance
  • Space Perception