AMiCUS: A Head Motion-Based Interface for Control of an Assistive Robot

Sensors (Basel). 2019 Jun 25;19(12):2836. doi: 10.3390/s19122836.

Abstract

In this work, we present AMiCUS, a Human-Robot Interface that enables tetraplegics to control a multi-degree-of-freedom robot arm in real time using head motion alone, empowering them to perform simple manipulation tasks independently. The article describes the hardware, software and signal processing of AMiCUS and presents the results of a volunteer study with 13 able-bodied subjects and 6 tetraplegics with severe head motion limitations. As part of the study, the subjects performed two different pick-and-place tasks. Usability was assessed with a questionnaire. Overall performance and the main control elements were evaluated with objective measures such as completion rate and interaction time. The results show that the mapping of head motion onto robot motion is intuitive and the feedback provided is useful, enabling smooth, precise and efficient robot control and resulting in high user acceptance. Furthermore, the robot was shown not to move unintentionally, giving a positive prognosis with respect to the safety requirements for certification of a product prototype. Moreover, AMiCUS enabled every subject to control the robot arm, independent of prior experience and degree of head motion limitation, making the system accessible to a wide range of motion-impaired users.
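To illustrate the general idea of mapping head motion onto robot motion, the sketch below shows one possible way to turn head orientation angles from an AHRS/IMU into a Cartesian velocity command for a robot arm. This is not the authors' implementation: the function name, the dead zone, the angle-to-axis assignment and all parameter values are hypothetical assumptions chosen only for illustration.

```python
import numpy as np

# Illustrative parameters (hypothetical values, not taken from the paper)
DEAD_ZONE_DEG = 5.0   # head tilts smaller than this are ignored
MAX_TILT_DEG = 25.0   # tilt at which the maximum speed is commanded
MAX_SPEED = 0.05      # maximum end-effector speed in m/s


def head_tilt_to_velocity(roll_deg, pitch_deg, yaw_deg):
    """Map relative head orientation (from an AHRS/IMU) to a Cartesian
    velocity command for a robot arm.

    Angles are given relative to a neutral (calibrated) head pose.
    Returns (vx, vy, vz) in m/s.
    """
    angles = np.array([roll_deg, pitch_deg, yaw_deg], dtype=float)

    # Apply a dead zone so small involuntary head movements do not
    # cause unintended robot motion.
    magnitude = np.abs(angles) - DEAD_ZONE_DEG
    magnitude = np.clip(magnitude, 0.0, MAX_TILT_DEG - DEAD_ZONE_DEG)

    # Scale linearly to speed and restore the sign of the tilt.
    speed = magnitude / (MAX_TILT_DEG - DEAD_ZONE_DEG) * MAX_SPEED
    velocity = np.sign(angles) * speed

    # Example assignment: pitch -> x, roll -> y, yaw -> z
    # (one possible mapping; a design choice, not the paper's).
    vx, vy, vz = velocity[1], velocity[0], velocity[2]
    return vx, vy, vz


if __name__ == "__main__":
    # A slight nod forward (10 deg pitch) commands a slow forward motion.
    print(head_tilt_to_velocity(roll_deg=0.0, pitch_deg=10.0, yaw_deg=0.0))
```

In such a scheme, the dead zone and speed limit are what keep small, unintended head movements from producing robot motion; the study's finding that the robot did not move unintentionally suggests the actual system uses safeguards of a comparable kind.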

Keywords: AHRS; IMU; assistive technology; gesture recognition; head control; human-machine interaction; motion sensors; real-time control; robot control; tetraplegia.

MeSH terms

  • Adult
  • Equipment Design
  • Female
  • Gestures
  • Head / physiology*
  • Humans
  • Male
  • Man-Machine Systems
  • Middle Aged
  • Motion
  • Quadriplegia / physiopathology*
  • Robotics*
  • Software
  • User-Computer Interface
  • Young Adult