Appearance-Based Gaze Estimation for ASD Diagnosis

IEEE Trans Cybern. 2022 Jul;52(7):6504-6517. doi: 10.1109/TCYB.2022.3165063. Epub 2022 Jul 4.

Abstract

Biomarkers, such as magnetic resonance imaging (MRI) and electroencephalography (EEG), have been used to help diagnose autism spectrum disorder (ASD). However, such diagnosis requires specialized medical equipment in a hospital or laboratory. To diagnose ASD in a more effective and convenient way, in this article, we propose an appearance-based gaze estimation algorithm, AttentionGazeNet, to accurately estimate the subject's 3-D gaze from raw video. The experimental results show its competitive performance on the MPIIGaze dataset and improvements of 14.7% for static head pose and 46.7% for moving head pose on the EYEDIAP dataset compared with state-of-the-art gaze estimation algorithms. After projecting the estimated gaze vector onto the screen coordinate system, we apply an accumulated histogram to take into account both the spatial and temporal information of the estimated gaze-point and head-pose sequences. Finally, classification is conducted on our self-collected autistic children video dataset (ACVD), which contains 405 videos from 135 children with ASD, 135 typically developing (TD) children in a primary school, and 135 TD children in a kindergarten. The classification results on ACVD show the effectiveness and efficiency of the proposed method, with an accuracy of 94.8%, a sensitivity of 91.1%, and a specificity of 96.7% for ASD.
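
The following is a minimal sketch (not the authors' code) of the post-estimation steps the abstract describes: intersecting an estimated 3-D gaze ray with the screen plane to obtain a gaze point, then accumulating per-frame gaze points into a spatial histogram that serves as a clip-level feature. The screen geometry, bin counts, and the downstream classifier mentioned in the comments are illustrative assumptions, not values reported in the paper.

```python
import numpy as np

def gaze_to_screen(eye_pos, gaze_dir, screen_origin, screen_normal, screen_x, screen_y):
    """Intersect the gaze ray (eye_pos + t * gaze_dir) with the screen plane and
    return 2-D coordinates in the screen's own (x, y) basis, in metres."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    denom = gaze_dir @ screen_normal
    if abs(denom) < 1e-8:            # gaze parallel to the screen: no intersection
        return None
    t = ((screen_origin - eye_pos) @ screen_normal) / denom
    if t <= 0:                       # intersection lies behind the subject
        return None
    point = eye_pos + t * gaze_dir   # 3-D point of gaze in camera coordinates
    rel = point - screen_origin
    return np.array([rel @ screen_x, rel @ screen_y])

def accumulated_histogram(points, width, height, bins=(8, 8)):
    """Accumulate per-frame gaze points into a normalized 2-D spatial histogram,
    flattened into a fixed-length feature vector for the clip."""
    pts = np.array([p for p in points if p is not None])
    if len(pts) == 0:
        return np.zeros(bins[0] * bins[1])
    hist, _, _ = np.histogram2d(
        pts[:, 0], pts[:, 1], bins=bins,
        range=[[0.0, width], [0.0, height]])
    return (hist / max(hist.sum(), 1.0)).ravel()

# Example with synthetic per-frame gaze estimates (camera coordinates, metres);
# head-pose angle sequences could be histogrammed in the same accumulated fashion.
rng = np.random.default_rng(0)
screen = dict(screen_origin=np.array([-0.26, -0.02, 0.0]),   # top-left corner (assumed)
              screen_normal=np.array([0.0, 0.0, 1.0]),
              screen_x=np.array([1.0, 0.0, 0.0]),
              screen_y=np.array([0.0, 1.0, 0.0]))
frames = [gaze_to_screen(np.array([0.0, 0.0, 0.55]),          # assumed eye position
                         np.array([rng.normal(0, 0.1), rng.normal(0, 0.1), -1.0]),
                         **screen)
          for _ in range(300)]
feature = accumulated_histogram(frames, width=0.52, height=0.32)
print(feature.shape)   # (64,) -> input to a downstream classifier (e.g., an SVM)
```

In this sketch the histogram is normalized so that clips of different lengths yield comparable features; concatenating histograms computed over successive temporal windows is one simple way to retain the temporal information the abstract refers to.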

MeSH terms

  • Algorithms
  • Autism Spectrum Disorder* / diagnostic imaging
  • Child
  • Fixation, Ocular
  • Humans