Decoding Different Reach-and-Grasp Movements Using Noninvasive Electroencephalogram

Front Neurosci. 2021 Sep 28:15:684547. doi: 10.3389/fnins.2021.684547. eCollection 2021.

Abstract

Grasping is one of the most indispensable functions of humans. Decoding reach-and-grasp actions from electroencephalograms (EEGs) is of great significance for realizing intuitive and natural neuroprosthesis control and for recovering or reconstructing hand function in patients with motor disorders. In this paper, we investigated decoding five different reach-and-grasp movements closely related to daily life using movement-related cortical potentials (MRCPs). In the experiment, nine healthy subjects were asked to naturally execute five different reach-and-grasp movements on the designed experimental platform, namely palmar, pinch, push, twist, and plug grasps. A total of 480 trials per subject (80 trials per condition) were recorded. MRCP amplitudes extracted from low-frequency (0.3-3 Hz) EEG signals were used as decoding features for further offline analysis. The average binary classification accuracy for grasping vs. the no-movement condition peaked at 75.06 ± 6.8%. A peak average accuracy of 64.95 ± 7.4% was reached for grasping vs. grasping conditions. The grand average peak accuracy of the five-class grasping classification reached 36.7 ± 6.8% at 1.45 s after movement onset. The analysis of MRCPs indicated that all grasping conditions are more pronounced than the no-movement condition, and that there are also significant differences among the grasping conditions. These findings demonstrate the feasibility of decoding multiple reach-and-grasp actions from noninvasive EEG signals. This work is significant for natural and intuitive brain-computer interface (BCI) applications, particularly for neuroprosthesis control or for developing active human-machine interaction systems, such as rehabilitation robots.
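
The sketch below is a minimal, illustrative Python pipeline for the kind of analysis the abstract describes: band-pass filtering EEG to the MRCP band (0.3-3 Hz), using downsampled low-frequency amplitudes as features, and evaluating a classifier with cross-validation. It is not the authors' code; the sampling rate, channel count, epoch length, classifier choice (shrinkage LDA), and synthetic data are all assumptions for demonstration only.

```python
# Illustrative sketch only (not the authors' pipeline): MRCP-band filtering,
# amplitude features, and cross-validated classification on synthetic data.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

fs = 250                                             # assumed sampling rate (Hz)
n_trials, n_chan, n_samp = 480, 61, int(2.5 * fs)    # assumed epoch layout

# Synthetic stand-in for epoched EEG: (trials, channels, samples) plus labels
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((n_trials, n_chan, n_samp))
y = rng.integers(0, 6, size=n_trials)                # 5 grasps + no-movement

# Zero-phase band-pass filter in the MRCP band (0.3-3 Hz)
b, a = butter(4, [0.3, 3.0], btype="bandpass", fs=fs)
X_filt = filtfilt(b, a, X_raw, axis=-1)

# MRCP amplitude features: downsampled low-frequency time course per channel
step = fs // 10                                      # ~10 amplitude samples/s
features = X_filt[:, :, ::step].reshape(n_trials, -1)

# Shrinkage-regularized LDA evaluated with 5-fold cross-validation
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
scores = cross_val_score(clf, features, y, cv=5)
print(f"Mean CV accuracy: {scores.mean():.3f}")      # ~chance on random data
```

With random synthetic data this yields chance-level accuracy; the point is only to show where the 0.3-3 Hz filtering and amplitude-feature steps fit in a standard scikit-learn workflow.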

Keywords: brain-computer interface; electroencephalogram; movement-related cortical potential; neuroprosthesis; reach-and-grasp decoding.