Filter Bank Convolutional Neural Network for Short Time-Window Steady-State Visual Evoked Potential Classification

IEEE Trans Neural Syst Rehabil Eng. 2021;29:2615-2624. doi: 10.1109/TNSRE.2021.3132162. Epub 2021 Dec 23.

Abstract

Convolutional neural networks (CNNs) have gradually been applied to steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs). Either frequency-domain features extracted by the fast Fourier transform (FFT) or time-domain signals are used as the network input. In the frequency-domain representation, features at short time-windows are not obvious, and the phase information of each electrode channel may be ignored as well. Hence, we propose a time-domain-based CNN method (tCNN) that uses the time-domain signal as the network input, and we further propose a filter bank tCNN (FB-tCNN) to improve its performance at short time-windows. We compare FB-tCNN with canonical correlation analysis (CCA) methods and other CNN methods on our own dataset and a public dataset, and FB-tCNN shows superior performance at short time-windows in the intra-individual test. At the 0.2 s time-window, the accuracy of our method reaches 88.36 ± 4.89% on our dataset and 77.78 ± 2.16% and 79.21 ± 1.80% in the two sessions of the public dataset, respectively, which is higher than that of the other methods. The impacts of the number of training subjects and the data length on inter-individual (cross-individual) performance are also studied, and FB-tCNN shows potential for implementing inter-individual BCIs. Further analysis shows that the deep learning method is easier to apply to asynchronous BCI systems than training-data-driven CCA. The code is available for reproducibility at https://github.com/DingWenl/FB-tCNN.
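To make the filter-bank, time-domain idea concrete, the following is a minimal sketch of a filter-bank CNN operating on raw time-domain EEG, written in PyTorch with SciPy band-pass filtering. It is not the authors' exact FB-tCNN architecture (that is in the repository linked above): the sampling rate, sub-band cut-offs, channel and class counts, layer sizes, and fusion-by-summation rule below are illustrative assumptions only.

import numpy as np
import torch
import torch.nn as nn
from scipy.signal import butter, filtfilt

FS = 250                                    # assumed sampling rate (Hz)
SUB_BANDS = [(6, 50), (14, 50), (22, 50)]   # assumed filter-bank pass-bands (Hz)
N_CHANNELS = 9                              # assumed number of EEG electrodes
N_CLASSES = 4                               # assumed number of SSVEP stimulus frequencies


def filter_bank(eeg, fs=FS, bands=SUB_BANDS):
    """Band-pass the raw time-domain EEG into several sub-bands.

    eeg: array of shape (n_channels, n_samples)
    returns: array of shape (n_bands, n_channels, n_samples)
    """
    out = []
    for low, high in bands:
        b, a = butter(4, [low, high], btype="bandpass", fs=fs)
        out.append(filtfilt(b, a, eeg, axis=-1))
    return np.stack(out, axis=0)


class SubBandCNN(nn.Module):
    """Small CNN applied directly to one sub-band of the time-domain signal."""

    def __init__(self, n_channels=N_CHANNELS, n_classes=N_CLASSES):
        super().__init__()
        self.spatial = nn.Conv2d(1, 16, kernel_size=(n_channels, 1))          # spatial filtering
        self.temporal = nn.Conv2d(16, 16, kernel_size=(1, 15), padding=(0, 7))  # temporal filtering
        self.act = nn.ELU()
        self.pool = nn.AdaptiveAvgPool2d((1, 8))
        self.fc = nn.Linear(16 * 8, n_classes)

    def forward(self, x):                    # x: (batch, 1, n_channels, n_samples)
        x = self.act(self.spatial(x))
        x = self.act(self.temporal(x))
        x = self.pool(x).flatten(1)
        return self.fc(x)                    # per-sub-band class scores


class FilterBankCNN(nn.Module):
    """Fuses the sub-band branches by summing their class scores (an assumed fusion rule)."""

    def __init__(self, n_bands=len(SUB_BANDS)):
        super().__init__()
        self.branches = nn.ModuleList(SubBandCNN() for _ in range(n_bands))

    def forward(self, x):                    # x: (batch, n_bands, n_channels, n_samples)
        scores = [branch(x[:, i:i + 1]) for i, branch in enumerate(self.branches)]
        return torch.stack(scores, dim=0).sum(dim=0)


if __name__ == "__main__":
    window = int(0.2 * FS)                   # a 0.2 s time-window, as in the abstract
    raw = np.random.randn(N_CHANNELS, window)
    banded = filter_bank(raw)                # (n_bands, n_channels, n_samples)
    batch = torch.tensor(banded, dtype=torch.float32).unsqueeze(0)
    logits = FilterBankCNN()(batch)
    print(logits.shape)                      # -> torch.Size([1, 4])

Because the network consumes the raw (band-passed) time series rather than an FFT magnitude spectrum, per-channel phase information is retained, which is the motivation the abstract gives for the time-domain input at short time-windows.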

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Brain-Computer Interfaces*
  • Canonical Correlation Analysis
  • Electroencephalography
  • Evoked Potentials, Visual*
  • Humans
  • Neural Networks, Computer
  • Reproducibility of Results