An inter-subject model to reduce the calibration time for motion imagination-based brain-computer interface

Med Biol Eng Comput. 2019 Apr;57(4):939-952. doi: 10.1007/s11517-018-1917-x. Epub 2018 Nov 29.

Abstract

A major factor blocking the practical application of brain-computer interfaces (BCIs) is the long calibration time: to obtain enough training trials, participants must spend a long time in the calibration stage. In this paper, we propose a new framework that reduces the calibration time through knowledge transferred from the electroencephalogram (EEG) signals of other subjects. We trained the motor recognition model for the target subject using both the target's EEG signals and the EEG signals of other subjects. To reduce the individual variation across datasets, we proposed two data mapping methods; these separately diminish the variation caused by inter-subject differences in the brain activation region and in the strength of brain activation. After these data mapping stages, we adopted an ensemble method to aggregate the EEG signals from all subjects into a final model. We compared our method with other methods that reduce the calibration time. The results showed that our method achieves satisfactory recognition accuracy using very few training trials (32 samples), and substantially higher accuracy than existing methods that use few training trials.

Graphical abstract: The framework of the proposed method. The workflow has three steps: (1) process each subject's EEG signals according to the target subject's EEG signal; (2) generate a model from each subject's processed signals; (3) ensemble these models into a final model for the target subject.
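The three-step workflow above can be sketched in code. This is a minimal illustration, not the paper's actual pipeline: it uses synthetic two-class "EEG" trials, a simple nearest-centroid classifier on log-variance features in place of the paper's common spatial pattern models, omits the two data mapping steps, and weights each source-subject model by its accuracy on the target's few calibration trials (the weighting scheme here is an assumption for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)

def make_subject_data(n_trials=40, n_ch=8, n_samp=50, shift=0.0):
    """Synthetic two-class trials: class 1 has higher variance on half the channels.

    `shift` loosely simulates inter-subject variation in activation strength."""
    X, y = [], []
    for _ in range(n_trials):
        label = int(rng.integers(0, 2))
        scale = np.ones(n_ch)
        scale[: n_ch // 2] += (1.0 + shift) * label  # class-dependent variance
        X.append(rng.normal(0.0, scale[:, None], (n_ch, n_samp)))
        y.append(label)
    return np.array(X), np.array(y)

def log_var_features(X):
    """Log-variance per channel, a common feature after spatial filtering."""
    return np.log(X.var(axis=2))

class CentroidModel:
    """Toy per-subject classifier: nearest centroid on log-variance features."""
    def fit(self, X, y):
        F = log_var_features(X)
        self.c0 = F[y == 0].mean(axis=0)
        self.c1 = F[y == 1].mean(axis=0)
        return self
    def predict(self, X):
        F = log_var_features(X)
        d0 = np.linalg.norm(F - self.c0, axis=1)
        d1 = np.linalg.norm(F - self.c1, axis=1)
        return (d1 < d0).astype(int)

# Steps 1-2: one model per source subject (data mapping omitted in this sketch).
source_models = []
for shift in (0.2, 0.4, 0.6):  # simulated inter-subject variation
    Xs, ys = make_subject_data(shift=shift)
    source_models.append(CentroidModel().fit(Xs, ys))

# A few calibration trials from the target subject (analogous to the 32 samples).
X_cal, y_cal = make_subject_data(n_trials=32, shift=0.5)

# Step 3: weight each source model by its accuracy on the calibration trials,
# then combine the models' votes into a final prediction for the target subject.
weights = np.array([(m.predict(X_cal) == y_cal).mean() for m in source_models])
weights /= weights.sum()

def ensemble_predict(X):
    votes = np.stack([m.predict(X) for m in source_models])  # (n_models, n_trials)
    return (weights @ votes > 0.5).astype(int)

X_test, y_test = make_subject_data(n_trials=100, shift=0.5)
acc = (ensemble_predict(X_test) == y_test).mean()
print(f"ensemble accuracy on target subject: {acc:.2f}")
```

The key design point mirrored from the framework is that the target subject contributes only a small calibration set, used to weight models already trained on other subjects' data, rather than to train a full model from scratch.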

Keywords: Brain-computer interface (BCI); Common spatial pattern; Electroencephalogram (EEG); Inter-subject model; Machine learning; Movement imagination.

MeSH terms

  • Algorithms
  • Brain-Computer Interfaces*
  • Calibration
  • Humans
  • Imagination*
  • Models, Neurological*
  • Motion*
  • Time Factors