Integrating Gaze Tracking and Head-Motion Prediction for Mobile Device Authentication: A Proof of Concept

Sensors (Basel). 2018 Aug 31;18(9):2894. doi: 10.3390/s18092894.

Abstract

We introduce a two-stream model that uses reflexive eye movements for smart mobile device authentication. Our model builds on two pre-trained neural networks, iTracker and PredNet, which target two independent tasks: (i) gaze tracking and (ii) future frame prediction. We design a procedure that randomly generates a visual stimulus on the screen of the mobile device while the front-facing camera simultaneously captures the user's head motions as he or she watches it. iTracker then computes the gaze-coordinate error, which is treated as a static feature. To compensate for the imprecise gaze coordinates caused by the low resolution of the front-facing camera, we further use PredNet to extract dynamic features between consecutive frames. To resist traditional attacks (shoulder surfing and impersonation attacks) during mobile device authentication, we combine the static and dynamic features to train a two-class support vector machine (SVM) classifier. Experimental results show that the classifier achieves an accuracy of 98.6% in authenticating the identity of mobile device users.
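The feature-fusion and classification stage described above can be illustrated with a minimal sketch, assuming scikit-learn for the SVM and placeholder feature extractors standing in for the paper's iTracker and PredNet networks; all function names, feature dimensions, and the synthetic data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def extract_static_feature(frames, stimulus_coords):
    # Placeholder for iTracker: per-frame gaze-coordinate error, i.e. the
    # distance between the predicted gaze point and the known stimulus position.
    predicted_gaze = rng.random((len(frames), 2))
    return np.linalg.norm(predicted_gaze - stimulus_coords, axis=1)

def extract_dynamic_features(frames):
    # Placeholder for PredNet: a representation of head motion between
    # consecutive frames, reduced here to a 16-dimensional vector per frame.
    return rng.random((len(frames), 16))

def session_vector(frames, stimulus_coords):
    # Fuse static and dynamic features into one fixed-length vector per
    # authentication session (summary statistics over frames).
    static = extract_static_feature(frames, stimulus_coords)
    dynamic = extract_dynamic_features(frames)
    return np.concatenate([[static.mean(), static.std()],
                           dynamic.mean(axis=0), dynamic.std(axis=0)])

# Synthetic sessions: label 1 = legitimate user, 0 = impostor (e.g. a replayed
# or shoulder-surfed attempt). Each session is a short clip of camera frames
# plus the on-screen stimulus coordinates shown during capture.
n_frames = 30
sessions = [(rng.random((n_frames, 64, 64)), rng.random((n_frames, 2)), label)
            for label in rng.integers(0, 2, size=200)]

X = np.stack([session_vector(f, s) for f, s, _ in sessions])
y = np.array([label for _, _, label in sessions])

# Train the two-class SVM and evaluate on held-out sessions.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

With real iTracker and PredNet outputs in place of the random placeholders, the same fusion-then-SVM structure is what the abstract describes; the synthetic data here will of course not reproduce the reported 98.6% accuracy.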

Keywords: authentication; gaze tracking; head motions; neural networks; smart mobile devices.

MeSH terms

  • Adult
  • Biometric Identification / methods*
  • Cell Phone*
  • Eye Movements / physiology*
  • Female
  • Fixation, Ocular / physiology
  • Head Movements / physiology*
  • Humans
  • Male
  • Support Vector Machine
  • Young Adult