Mind wandering state detection during video-based learning via EEG

Front Hum Neurosci. 2023 May 30;17:1182319. doi: 10.3389/fnhum.2023.1182319. eCollection 2023.

Abstract

The aim of this study is to explore the potential of technology for detecting mind wandering, particularly during video-based distance learning, with the ultimate goal of improving learning outcomes. To address the limitations of previous mind wandering research in ecological validity, sample balance, and dataset size, this study used practical electroencephalography (EEG) recording hardware and designed a paradigm consisting of viewing short-duration video lectures under a focused learning condition and a future planning condition. Participants reported retrospective estimates of their attentional state at the end of each video, and we combined this rating-scale feedback with self-caught key press responses during video watching to obtain binary labels for classifier training. EEG was recorded using an 8-channel system, and spatial covariance features processed by Riemannian geometry were employed. The results demonstrate that a radial basis function kernel support vector machine classifier, using Riemannian-processed covariance features from the delta, theta, alpha, and beta bands, can detect mind wandering with a mean area under the receiver operating characteristic curve (AUC) of 0.876 for within-participant classification and 0.703 for cross-lecture classification. Furthermore, our results suggest that a short duration of training data is sufficient to train a classifier for online decoding, as cross-lecture classification remained at a mean AUC of 0.689 when only 70% of the training set (about 9 min) was used. These findings highlight the potential of practical EEG hardware for detecting mind wandering with high accuracy, which could be applied to improve learning outcomes during video-based distance learning.
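
To make the classification pipeline described above concrete, the sketch below reconstructs its main steps with open-source Python packages (MNE, pyRiemann, scikit-learn). This is not the authors' code: the sampling rate, epoch length, covariance estimator, band limits, and cross-validation scheme are illustrative assumptions, and the data are random placeholders standing in for labeled EEG epochs.

import numpy as np
from mne.filter import filter_data
from pyriemann.estimation import Covariances
from pyriemann.tangentspace import TangentSpace
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

sfreq = 250.0                                      # assumed sampling rate (Hz)
bands = {"delta": (1, 4), "theta": (4, 8),
         "alpha": (8, 13), "beta": (13, 30)}       # band limits as named in the abstract

# Placeholder data: epochs of shape (n_epochs, n_channels=8, n_times);
# labels are binary (1 = mind wandering, 0 = focused learning).
epochs = np.random.randn(60, 8, int(5 * sfreq))
labels = np.random.randint(0, 2, 60)

# For each band: band-pass filter the epochs, estimate a spatial covariance
# matrix per epoch, map it to the Riemannian tangent space, and concatenate
# the resulting band-wise feature vectors.
features = []
for l_freq, h_freq in bands.values():
    filtered = filter_data(epochs, sfreq, l_freq, h_freq, verbose=False)
    covs = Covariances(estimator="oas").fit_transform(filtered)
    features.append(TangentSpace(metric="riemann").fit_transform(covs))
X = np.concatenate(features, axis=1)

# RBF-kernel SVM scored by ROC AUC, analogous to the within-participant
# evaluation reported in the abstract (5-fold CV here is an assumption).
clf = make_pipeline(SVC(kernel="rbf", probability=True))
aucs = cross_val_score(clf, X, labels, cv=5, scoring="roc_auc")
print(f"mean AUC: {aucs.mean():.3f}")

On real data, the cross-lecture result quoted above would correspond to training on epochs from some lectures and testing on held-out lectures rather than the pooled cross-validation shown here.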

Keywords: Riemannian geometry; brain-computer interfaces; distance learning; electroencephalography (EEG); mind wandering; passive brain-computer interfaces (pBCI).

Grants and funding

This work was supported by the STI 2030-Major Projects of the Ministry of Science and Technology of China (2021ZD0200407), the National Key Research and Development Program of China (2020YFC0832402), and the Innovation Team Project of Guangdong Provincial Department of Education (2021KCXTD014).