Impacts of Cortical Regions on EEG-based Classification of Lexical Tones and Vowels in Spoken Speech

Annu Int Conf IEEE Eng Med Biol Soc. 2023 Jul;2023:1-4. doi: 10.1109/EMBC40787.2023.10340428.

Abstract

Speech impairment is one of the most serious problems for patients with communication disorders, such as stroke survivors. Brain-computer interface (BCI) systems have shown potential as an alternative means of communication and as a tool for rehabilitating neurological damage affecting speech production. Studying the contributions of different cortical regions is therefore essential for improving the performance of speech-based BCI systems. This work explored the impacts of different speech-related cortical regions on electroencephalogram (EEG) based classification of seventy spoken Mandarin monosyllables carrying four vowels and four lexical tones. Seven audible-speech-production-related cortical regions were studied, including Broca's and Wernicke's areas, the auditory cortex, motor cortex, prefrontal cortex, and sensory cortex, as well as the left hemisphere, right hemisphere, and whole brain. Following previous studies in which EEG signals were collected from ten subjects during Mandarin speech production, EEG features were extracted with a Riemannian manifold method, and linear discriminant analysis (LDA) was used as the classifier for the vowels and lexical tones. The results showed that the classifier performed best when using electrodes from the whole brain, reaching 48.5% accuracy for lexical tones and 70.0% for vowels. Vowel classification accuracies for Broca's and Wernicke's areas, the auditory cortex, and the prefrontal cortex were higher than those for the motor cortex and sensory cortex; no such differences were observed in the lexical tone classification task.
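As a methodological illustration, the pipeline described in the abstract (Riemannian-manifold features followed by an LDA classifier, evaluated per electrode subset) can be sketched in Python with the pyriemann and scikit-learn libraries. This is a minimal sketch under assumed data shapes; the channel counts, epoch dimensions, electrode indices, and covariance estimator below are illustrative placeholders, not the authors' exact implementation.

    import numpy as np
    from pyriemann.estimation import Covariances
    from pyriemann.tangentspace import TangentSpace
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    # Placeholder data: 200 epochs x 62 channels x 500 samples,
    # with 4 class labels (e.g., the four vowels).
    X = rng.standard_normal((200, 62, 500))
    y = rng.integers(0, 4, size=200)

    # Hypothetical channel indices for one cortical region of interest;
    # the study compares subsets such as Broca's/Wernicke's areas,
    # auditory, motor, prefrontal, and sensory cortices.
    region_channels = np.arange(0, 12)
    X_region = X[:, region_channels, :]

    clf = make_pipeline(
        Covariances(estimator="oas"),    # spatial covariance matrix per epoch
        TangentSpace(metric="riemann"),  # map SPD matrices to tangent-space features
        LinearDiscriminantAnalysis(),    # LDA classifier, as in the abstract
    )
    scores = cross_val_score(clf, X_region, y, cv=5)
    print(f"mean accuracy: {scores.mean():.3f}")

Repeating the cross-validation with different region_channels subsets would reproduce the kind of region-by-region comparison reported in the paper, with the whole-brain montage corresponding to using all channels.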

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Auditory Cortex*
  • Brain
  • Brain Mapping
  • Electroencephalography
  • Humans
  • Speech*