Towards a new modality-independent interface for a robotic wheelchair

IEEE Trans Neural Syst Rehabil Eng. 2014 May;22(3):567-84. doi: 10.1109/TNSRE.2013.2265237. Epub 2013 Jun 4.

Abstract

This work presents the development of a robotic wheelchair that can be commanded by users in a supervised way or driven by a fully automatic, unsupervised navigation system. The interface is modality-independent: users can choose among different modalities to command the wheelchair, which makes it suitable for people with different levels of disability. Commands can be issued through eye blinks, eye movements, head movements, sip-and-puff input, or brain signals. The wheelchair can also operate as an auto-guided vehicle, following metallic tapes, or navigate fully autonomously. An easy-to-use, flexible graphical user interface running on a personal digital assistant (PDA) allows users to select the commands sent to the robotic wheelchair. Several experiments were carried out with people with disabilities, and the results validate the developed system as an assistive tool for people with distinct levels of disability.
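The core idea of a modality-independent interface is that every input channel (blinks, sip-and-puff, EEG, etc.) is decoded into the same small command vocabulary before reaching the wheelchair controller. The sketch below illustrates this pattern in Python under assumptions of my own; the command names, event strings, and decoder mappings are hypothetical and are not taken from the paper.

```python
from enum import Enum, auto
from typing import Callable, Dict, Optional

class Command(Enum):
    """Shared command vocabulary that every input modality maps onto."""
    FORWARD = auto()
    BACKWARD = auto()
    LEFT = auto()
    RIGHT = auto()
    STOP = auto()

# A decoder turns one raw modality event into a shared Command (or None
# if the event carries no command). The controller only ever sees Commands.
Decoder = Callable[[str], Optional[Command]]

def blink_decoder(event: str) -> Optional[Command]:
    # Hypothetical mapping: a double blink confirms "go forward".
    return {"double_blink": Command.FORWARD}.get(event)

def sip_puff_decoder(event: str) -> Optional[Command]:
    # Hypothetical mapping: puff = go, sip = stop.
    return {"puff": Command.FORWARD, "sip": Command.STOP}.get(event)

# Registry of available modalities; adding a modality means adding a
# decoder here, with no change to the wheelchair-facing side.
DECODERS: Dict[str, Decoder] = {
    "blink": blink_decoder,
    "sip_puff": sip_puff_decoder,
}

def dispatch(modality: str, event: str) -> None:
    """Route a raw modality event to the wheelchair as a unified Command."""
    command = DECODERS[modality](event)
    if command is not None:
        print(f"send to wheelchair: {command.name}")

if __name__ == "__main__":
    dispatch("sip_puff", "puff")        # -> send to wheelchair: FORWARD
    dispatch("blink", "double_blink")   # -> send to wheelchair: FORWARD
```

Because the controller depends only on the shared Command type, the same wheelchair-side logic serves users with very different abilities; a GUI such as the PDA interface described above would sit between the decoders and the dispatcher, letting the user confirm the highlighted command.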

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Blinking
  • Electroencephalography
  • Electromyography
  • Eye Movements / physiology
  • Face / physiology
  • Female
  • Head Movements
  • Humans
  • Male
  • Robotics*
  • Signal Processing, Computer-Assisted
  • User-Computer Interface*
  • Wheelchairs*
  • Young Adult