Feature and Classification Analysis for Detection and Classification of Tongue Movements From Single-Trial Pre-Movement EEG

IEEE Trans Neural Syst Rehabil Eng. 2022;30:678-687. doi: 10.1109/TNSRE.2022.3157959. Epub 2022 Mar 22.

Abstract

Individuals with severe tetraplegia can benefit from brain-computer interfaces (BCIs). While most movement-related BCI systems focus on right/left hand and/or foot movements, very few studies have considered tongue movements for constructing a multiclass BCI. The aim of this study was to decode four movement directions of the tongue (left, right, up, and down) from single-trial pre-movement EEG and to provide a feature and classifier investigation. In offline analyses (on data from ten individuals without a disability), detection and classification were performed using temporal, spectral, entropy, and template features, classified with either a linear discriminant analysis, support vector machine, random forest, or multilayer perceptron classifier. Besides the 4-class classification scenario, all possible 3- and 2-class scenarios were tested to find the most discriminable movement types. Linear discriminant analysis achieved, on average, higher classification accuracies than the other classifiers for both movement detection and classification. The right and down tongue movements provided the highest and lowest detection accuracies (95.3±4.3% and 91.7±4.8%), respectively. The 4-class classification achieved an accuracy of 62.6±7.2%, while the best 3-class classification (left, right, and up movements) and 2-class classification (left and right movements) achieved accuracies of 75.6±8.4% and 87.7±8.0%, respectively. Using only a combination of the temporal and template feature groups provided further improvements in classification accuracy. Presumably, this is because these feature groups capture the movement-related cortical potentials, which differ noticeably between the left and right hemispheres for the different movements. This study shows that the cortical representation of the tongue is useful for extracting control signals for multiclass movement detection BCIs.
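
The abstract compares four classifiers (LDA, SVM, random forest, MLP) applied to feature vectors extracted from single-trial pre-movement EEG. The sketch below is not the authors' code; it only illustrates, with scikit-learn and synthetic placeholder data, how such an offline classifier comparison could be set up. The trial counts, feature dimensionality, class labels, and hyperparameters are assumptions for illustration.

```python
# Minimal sketch (assumed setup, not the paper's pipeline): compare LDA, SVM,
# random forest, and MLP classifiers on feature vectors derived from
# pre-movement EEG epochs, using cross-validated accuracy.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Placeholder data: 200 single-trial epochs, each summarized by a feature
# vector (e.g., temporal, spectral, entropy, and template features), with
# four tongue-movement classes (left, right, up, down).
n_trials, n_features = 200, 64
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 4, size=n_trials)  # 0=left, 1=right, 2=up, 3=down

classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "SVM": SVC(kernel="rbf", C=1.0),
    "Random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "MLP": MLPClassifier(hidden_layer_sizes=(50,), max_iter=1000, random_state=0),
}

# Cross-validated accuracy for each classifier on the same feature set,
# mirroring the kind of offline comparison reported in the abstract.
for name, clf in classifiers.items():
    pipe = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: {scores.mean():.2f} +/- {scores.std():.2f}")
```

With real data, X would be replaced by features computed from EEG epochs preceding each cued tongue movement, and the same loop could be restricted to subsets of classes (2- or 3-class scenarios) or feature groups (e.g., temporal plus template only) to reproduce the comparisons described above.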

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Brain-Computer Interfaces*
  • Electroencephalography*
  • Hand
  • Humans
  • Movement
  • Tongue