Decoding overt shifts of attention in depth through pupillary and cortical frequency tagging

J Neural Eng. 2021 Mar 8;18(3). doi: 10.1088/1741-2552/ab8e8f.

Abstract

Objective. We have recently developed a prototype of a novel human-computer interface for assistive communication based on voluntary shifts of attention (gaze) from a far target to a near target, associated with a decrease in pupil size (Pupillary Accommodative Response, PAR), an automatic vegetative response that can be easily recorded. Here we report an extension of that approach based on pupillary and cortical frequency tagging.

Approach. In 18 healthy volunteers, we investigated the possibility of decoding attention shifts in depth by exploiting the evoked oscillatory responses of the pupil (Pupillary Oscillatory Response, POR, recorded with a low-cost device) and of the visual cortex (Steady-State Visual Evoked Potentials, SSVEP, recorded from four scalp electrodes). Using a simple binary communication protocol (focusing on the far target meaning 'No', focusing on the near target meaning 'Yes'), we aimed to discriminate when the observer's overt attention (gaze) shifted from the far to the near target, the two targets flickering at different frequencies.

Main results. By applying a binary linear classifier (Support Vector Machine, SVM, with leave-one-out cross-validation) to the POR and SSVEP signals, we found that, with only twenty trials and no behavioural training of the subjects, the offline median decoding accuracy was 75% with POR and 80% with SSVEP signals. When the two signals were combined, accuracy reached 83%. The number of observers for whom accuracy exceeded 70% was 11/18, 12/18 and 14/18 with POR, SSVEP and combined features, respectively. A signal detection analysis confirmed these results.

Significance. The present findings suggest that exploiting frequency tagging of pupillary or cortical responses during an attention shift in the depth plane, either separately or combined, is a promising approach to realizing a device for communicating with Complete Locked-In Syndrome (CLIS) patients when oculomotor control is unreliable and traditional assistive communication, even PAR-based, is unsuccessful.
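The decoding pipeline described above lends itself to a compact implementation: spectral power at each tag frequency serves as a feature, and a linear SVM evaluated with leave-one-out cross-validation gives the per-observer accuracy. The sketch below (Python with NumPy and scikit-learn) is a minimal illustration under stated assumptions; the sampling rates, flicker frequencies, feature layout and placeholder data are illustrative, not values taken from the paper.

```python
# Minimal sketch of the POR/SSVEP frequency-tagging decoder.
# Sampling rates, tag frequencies and data below are assumptions
# for illustration only, not values from the study.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

FS_PUPIL, FS_EEG = 60.0, 256.0   # assumed sampling rates (Hz)
F_FAR, F_NEAR = 1.0, 1.5         # assumed flicker (tag) frequencies (Hz)

def tag_power(signal, fs, freq):
    """Spectral power of a 1-D signal at one tagged frequency."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    return spectrum[np.argmin(np.abs(freqs - freq))]

def trial_features(pupil_trace, eeg_channels):
    """Concatenate POR and SSVEP features for one trial.

    pupil_trace  : 1-D array of pupil diameter over the trial
    eeg_channels : (n_electrodes, n_samples) array, e.g. 4 scalp channels
    """
    por = [tag_power(pupil_trace, FS_PUPIL, f) for f in (F_FAR, F_NEAR)]
    ssvep = [tag_power(ch, FS_EEG, f)
             for ch in eeg_channels for f in (F_FAR, F_NEAR)]
    return np.array(por + ssvep)

# X: one feature row per trial; y: 0 = 'No' (far target), 1 = 'Yes' (near).
# With about twenty trials per observer, leave-one-out CV is appropriate.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 10))    # placeholder features (20 trials)
y = rng.integers(0, 2, size=20)  # placeholder labels

clf = SVC(kernel="linear")
accuracy = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(f"LOOCV decoding accuracy: {accuracy:.2f}")
```

Combining modalities, as in the abstract's third condition, amounts to concatenating the POR and SSVEP feature columns before fitting, which is what `trial_features` does here.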

Keywords: assistive communication; attention shifts in depth; brain-computer interface; frequency tagging; locked-in syndrome; pupil oscillations; steady-state visual evoked potentials.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Attention / physiology
  • Brain-Computer Interfaces*
  • Electroencephalography / methods
  • Evoked Potentials, Visual
  • Humans
  • Photic Stimulation
  • Pupil
  • User-Computer Interface
  • Visual Cortex* / physiology