A.Eye Drive: Gaze-based semi-autonomous wheelchair interface

Annu Int Conf IEEE Eng Med Biol Soc. 2019 Jul;2019:5967-5970. doi: 10.1109/EMBC.2019.8856608.

Abstract

Existing wheelchair control interfaces, such as sip-and-puff or screen-based gaze-controlled cursors, are challenging for severely disabled users to navigate safely and independently, because they demand continuous interaction with an interface during navigation. This places a significant cognitive load on users and prevents them from engaging with the environment in other ways while driving. We have combined eye-tracking/gaze-contingent intention decoding with context-aware computer vision algorithms and autonomous navigation techniques drawn from self-driving vehicles to allow paralysed users to drive by eye, simply by decoding natural gaze to infer where the user wants to go: A.Eye Drive. Our "Zero UI" driving platform allows users to look at and visually interact with an object or destination of interest in their visual scene; the wheelchair then autonomously takes the user to the intended destination, continuously updating the computed path around static and dynamic obstacles. This intention-decoding technology empowers end-users by promising greater independence through their own agency.
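The abstract gives no implementation details, so as a rough illustration of the "decode a gaze fixation, then plan and continuously replan a path" loop it describes, here is a minimal Python sketch. The Fixation type, the DWELL_THRESHOLD_S dwell-time rule, decode_intent, and the straight-line plan_path planner are all hypothetical stand-ins for illustration only, not the authors' method; the real system uses computer-vision scene understanding and navigation drawn from self-driving vehicles.

```python
import math
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float          # gaze point in scene coordinates (metres) - hypothetical
    y: float
    duration: float   # seconds the gaze dwelt on this point

# Hypothetical rule: a fixation held this long is read as a destination choice.
DWELL_THRESHOLD_S = 1.0

def decode_intent(fixations):
    """Return the first fixation long enough to count as intended destination."""
    for f in fixations:
        if f.duration >= DWELL_THRESHOLD_S:
            return (f.x, f.y)
    return None

def plan_path(start, goal, obstacles, step=0.25):
    """Naive stand-in planner: sample points along the straight segment and
    reject the path if any sample falls inside an obstacle's radius."""
    dist = math.hypot(goal[0] - start[0], goal[1] - start[1])
    n = max(1, int(dist / step))
    path = [(start[0] + (goal[0] - start[0]) * i / n,
             start[1] + (goal[1] - start[1]) * i / n) for i in range(n + 1)]
    for px, py in path:
        for ox, oy, r in obstacles:
            if math.hypot(px - ox, py - oy) < r:
                return None   # blocked: a real system would replan around it
    return path

# Example: the user dwells on a doorway 3 m ahead; an obstacle sits off-path.
gaze = [Fixation(0.4, 1.0, 0.2), Fixation(3.0, 0.0, 1.4)]
goal = decode_intent(gaze)
if goal is not None:
    path = plan_path((0.0, 0.0), goal, obstacles=[(1.5, 1.2, 0.5)])
    print("drive along:", path if path else "blocked - replan")
```

In a real deployment this loop would run continuously, so that new fixations and newly detected static or dynamic obstacles trigger replanning, as the abstract describes.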

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Disabled Persons
  • Equipment Design*
  • Fixation, Ocular*
  • Humans
  • Robotics*
  • User-Computer Interface
  • Wheelchairs*