Convolutional Neural Networks with 3D Input for P300 Identification in Auditory Brain-Computer Interfaces

Comput Intell Neurosci. 2017;2017:8163949. doi: 10.1155/2017/8163949. Epub 2017 Nov 7.

Abstract

From allowing basic communication to moving through an environment, several attempts are being made in the field of brain-computer interfaces (BCI) to assist people who find it difficult or impossible to perform certain activities. Focusing on these people as potential users of BCI, we obtained electroencephalogram (EEG) readings from nine healthy subjects who were presented with auditory stimuli via earphones from six different virtual directions. We presented the stimuli following the oddball paradigm to elicit P300 waves within the subjects' brain activity for later identification and classification using convolutional neural networks (CNN). The CNN models are given a novel single-trial three-dimensional (3D) representation of the EEG data as input, maintaining temporal and spatial information as close to the experimental setup as possible, a relevant characteristic since eliciting P300 has been shown to cause stronger activity in certain brain regions. Here, we present the results of CNN models using the proposed 3D input for three different stimulus presentation time intervals (500, 400, and 300 ms) and compare them to previous studies and other common classifiers. Our results show >80% accuracy for all the CNN models using the proposed 3D input in single-trial P300 classification.
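The core idea of the 3D input described above can be sketched as mapping each EEG channel onto a position in a 2D scalp grid and stacking the grids over time, yielding a (time, rows, cols) tensor for a single trial. The electrode names, grid layout, sampling rate, and epoch length below are illustrative assumptions, not the paper's actual montage:

```python
import numpy as np

# Hypothetical 8-channel montage mapped onto a 3x3 scalp grid
# (the paper's exact electrode layout and grid shape are assumptions).
GRID = {
    "F3": (0, 0), "Fz": (0, 1), "F4": (0, 2),
    "C3": (1, 0), "Cz": (1, 1), "C4": (1, 2),
    "P3": (2, 0),               "P4": (2, 2),
}

def to_3d(epoch, channel_names, grid=GRID, shape=(3, 3)):
    """Map a single-trial EEG epoch (channels x samples) into a
    3D tensor (samples x rows x cols), preserving scalp topology.
    Grid positions with no electrode are left as zeros."""
    n_samples = epoch.shape[1]
    out = np.zeros((n_samples,) + shape)
    for ch_signal, name in zip(epoch, channel_names):
        r, c = grid[name]
        out[:, r, c] = ch_signal
    return out

# Example: one simulated epoch of 179 samples from 8 channels
rng = np.random.default_rng(0)
epoch = rng.standard_normal((8, 179))
x = to_3d(epoch, list(GRID))
print(x.shape)  # (179, 3, 3)
```

A CNN can then convolve over the spatial grid at each time step (or over space and time jointly), which is what lets the model exploit region-specific P300 activity rather than treating channels as an unordered vector.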

MeSH terms

  • Analysis of Variance
  • Auditory Perception / physiology*
  • Brain / physiology*
  • Brain-Computer Interfaces*
  • Electroencephalography* / methods
  • Event-Related Potentials, P300*
  • Female
  • Humans
  • Male
  • Neural Networks, Computer*
  • Neuropsychological Tests
  • Space Perception / physiology
  • Time Factors
  • Time Perception / physiology
  • User-Computer Interface