EEG in game user analysis: A framework for expertise classification during gameplay

PLoS One. 2021 Jun 18;16(6):e0246913. doi: 10.1371/journal.pone.0246913. eCollection 2021.

Abstract

Video games have become a ubiquitous part of demographically diverse cultures. Numerous studies have focused on analyzing the cognitive aspects of game playing that could help provide an optimal gaming experience by improving video game design. To this end, we present a framework for classifying a game player's expertise level using a wearable electroencephalography (EEG) headset. We hypothesize that expert and novice players' brain activity differs and can be classified using frequency-domain features extracted from the player's EEG signals. A systematic channel-reduction approach is presented using a correlation-based attribute evaluation method. This approach led us to identify two significant EEG channels, i.e., AF3 and P7, among the fourteen channels available in the Emotiv EPOC headset. In particular, features extracted from these two EEG channels contributed the most to classifying the video game player's expertise level. This finding is validated by statistical analysis (t-test) of the extracted features. Moreover, among the multiple classifiers evaluated, K-nearest neighbor classified the game player's expertise level best, with a classification accuracy of up to 98.04% (without data balancing) and 98.33% (with data balancing).
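
The pipeline summarized above (frequency-domain feature extraction, correlation-based channel/feature ranking, a t-test check, and K-nearest-neighbor classification) can be illustrated with a minimal Python sketch. The channel list reflects the Emotiv EPOC's 14-electrode layout, but the band limits, sampling rate, epoch length, and k value are illustrative assumptions rather than the authors' exact settings, and the correlation-based attribute evaluation is approximated here by ranking features on their absolute Pearson correlation with the expertise label.

# Hedged sketch: band-power features -> correlation-based ranking -> t-test -> KNN.
# Parameters and synthetic data are assumptions for illustration only.
import numpy as np
from scipy.signal import welch
from scipy.stats import ttest_ind, pearsonr
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

FS = 128  # assumed sampling rate (Hz)
CHANNELS = ["AF3", "F7", "F3", "FC5", "T7", "P7", "O1",
            "O2", "P8", "T8", "FC6", "F4", "F8", "AF4"]
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed band limits

def band_powers(epoch):
    """Frequency-domain features: mean PSD per band for each channel.
    epoch: array of shape (n_channels, n_samples)."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))
    return np.concatenate(feats)  # length = n_channels * n_bands

# Synthetic stand-in data: 60 two-second epochs, labels 0 = novice, 1 = expert.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((60, len(CHANNELS), 2 * FS))
labels = rng.integers(0, 2, size=60)
X = np.array([band_powers(e) for e in epochs])
feature_names = [f"{ch}_{band}" for band in BANDS for ch in CHANNELS]

# Correlation-based attribute evaluation (simplified): rank each feature by the
# absolute Pearson correlation of its values with the expertise label.
scores = np.array([abs(pearsonr(X[:, j], labels)[0]) for j in range(X.shape[1])])
top = np.argsort(scores)[::-1][:5]
print("Top-ranked features:", [feature_names[j] for j in top])

# Statistical check: independent-samples t-test per selected feature.
for j in top:
    t, p = ttest_ind(X[labels == 1, j], X[labels == 0, j], equal_var=False)
    print(f"{feature_names[j]}: t={t:.2f}, p={p:.3f}")

# KNN classification on the reduced feature set (k = 3 is an assumed value).
knn = KNeighborsClassifier(n_neighbors=3)
acc = cross_val_score(knn, X[:, top], labels, cv=5).mean()
print(f"Cross-validated accuracy: {acc:.2%}")

With real recordings, the epochs array would be replaced by labeled EEG segments from expert and novice players, and the correlation ranking would be expected to surface features from channels such as AF3 and P7, as reported in the abstract.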

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Achievement*
  • Adult
  • Cognition*
  • Competitive Behavior*
  • Electroencephalography / methods*
  • Female
  • Humans
  • Male
  • Self Concept
  • Video Games / classification
  • Video Games / psychology*
  • Video Games / statistics & numerical data
  • Young Adult

Grants and funding

We declare that this project was funded by the Deanship of Scientific Research (DSR) at King Abdulaziz University, Jeddah, under grant no. D-064-611-1442. The authors therefore acknowledge with thanks the DSR's technical and financial support.