The posterior parietal cortex is well known to mediate sensorimotor transformations during the generation of movement plans, but its ability to control prosthetic limbs in 3D environments has not yet been fully demonstrated. To this end, we trained monkeys to perform reaches to targets located at various depths and directions and tested whether the reach goal position could be extracted from parietal signals. The reach goal location was decoded reliably, with near-optimal accuracy (>90%), and this was possible well before movement onset. These results, together with recent work showing reliable decoding of hand grip in the same area, suggest that this area is a suitable site for decoding the entire prehension action, and should be considered in the development of brain-computer interfaces.
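The offline decoding step described above can be illustrated with a toy sketch. Everything here is an illustrative assumption rather than the study's actual pipeline: the target count, the simulated Poisson tuning of the neurons, and the choice of a Poisson naive-Bayes (ideal-observer) classifier are all hypothetical stand-ins for decoding a reach goal from population spike counts.

```python
import random
import math

random.seed(0)

# Hypothetical setup: 9 reach targets (e.g., 3 directions x 3 depths) and 40
# simulated neurons, each firing more for one preferred target.
N_TARGETS, N_NEURONS, TRIALS = 9, 40, 30

# Mean firing rate of each neuron for each target (Poisson assumption).
rates = [[2.0 + 8.0 * (1.0 if n % N_TARGETS == t else 0.2)
          for t in range(N_TARGETS)] for n in range(N_NEURONS)]

def poisson(lam):
    # Knuth's algorithm for drawing a Poisson sample (stdlib-only).
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def simulate(target):
    # One trial: a spike count per neuron for the given reach target.
    return [poisson(rates[n][target]) for n in range(N_NEURONS)]

def decode(counts):
    # Poisson naive Bayes with a uniform prior: pick the target that
    # maximizes the log-likelihood (constant log-factorial term dropped).
    def loglik(t):
        return sum(c * math.log(rates[n][t]) - rates[n][t]
                   for n, c in enumerate(counts))
    return max(range(N_TARGETS), key=loglik)

correct = sum(decode(simulate(t)) == t
              for t in range(N_TARGETS) for _ in range(TRIALS))
accuracy = correct / (N_TARGETS * TRIALS)
print(f"offline decoding accuracy: {accuracy:.2f}")
```

With well-separated simulated tuning curves, such a decoder classifies targets almost perfectly; real neural data are noisier, and the study's reported >90% accuracy refers to its own recorded V6A population, not to this toy model.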
Keywords: V6A; hand guidance; machine learning; monkey; neuroprosthetics; offline neural decoding; prehension; reaching in depth; robotics; visuomotor transformations.
Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.