Contactless operating table control based on 3D image processing

Annu Int Conf IEEE Eng Med Biol Soc. 2014;2014:388-92. doi: 10.1109/EMBC.2014.6943610.

Abstract

Interaction with mobile consumer devices has led to greater acceptance of, and affinity for, natural user interfaces and perceptual interaction possibilities. New interaction modalities become accessible and are capable of improving human-machine interaction even in complex and high-risk environments such as the operating room. There, a multitude of medical disciplines gives rise to a great variety of procedures and, consequently, of staff and equipment. One universal challenge is meeting sterility requirements, for which conventional contact-based remote interfaces always pose a potential risk to the process. The proposed operating table control system removes this process risk and thus significantly improves system usability. A 3D sensor, the Microsoft Kinect, captures the motion of the user, allowing touchless manipulation of an operating table. Three gestures enable the user to select, activate and manipulate all segments of the motorised system in a safe and intuitive way, and the gesture dynamics are synchronised with the table movement. In a usability study, 15 participants evaluated the system, yielding a System Usability Scale (SUS, Brooke) score of 79. This indicates a high potential for implementation and acceptance in interventional environments. In the near future, even higher-risk processes could be controlled with the proposed interface, as such interfaces become safer and more direct.
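The abstract describes a select / activate / manipulate gesture flow in which hand motion, tracked by the Kinect, is synchronised with the movement of a chosen table segment. The sketch below is not the authors' implementation; it is a minimal illustration, under stated assumptions, of how such a gated control loop could be structured. The names `GestureTableController`, `TableSegment`, the gain and speed limits, and the per-frame `update()` callback fed with a tracked hand height are all hypothetical stand-ins for the paper's unspecified internals.

```python
# Minimal sketch (not the authors' implementation) of a three-gesture
# select / activate / manipulate flow with gesture-synchronised motion.
# Skeleton tracking (e.g. a hand position from a Kinect SDK) is assumed
# to arrive once per frame via update(); the table drive is abstracted
# behind the hypothetical TableSegment class.

from dataclasses import dataclass
from enum import Enum, auto


class State(Enum):
    IDLE = auto()      # no segment selected
    SELECTED = auto()  # segment chosen, waiting for activation
    ACTIVE = auto()    # hand motion drives the segment


@dataclass
class TableSegment:
    """Stand-in for one motorised axis of the operating table."""
    name: str
    position_mm: float = 0.0

    def move(self, velocity_mm_s: float, dt_s: float) -> None:
        # A real system would command the table drive here;
        # this sketch only integrates the position.
        self.position_mm += velocity_mm_s * dt_s


class GestureTableController:
    """Maps hand motion to table motion, gated by a simple state machine."""

    GAIN = 0.5        # mm of table travel per mm of hand travel (assumed)
    MAX_SPEED = 20.0  # mm/s safety clamp (assumed)

    def __init__(self, segments: list) -> None:
        self.segments = segments
        self.state = State.IDLE
        self.selected = None
        self._last_hand_y = None

    def on_select_gesture(self, segment_index: int) -> None:
        self.selected = self.segments[segment_index]
        self.state = State.SELECTED

    def on_activate_gesture(self) -> None:
        if self.state is State.SELECTED:
            self.state = State.ACTIVE
            self._last_hand_y = None

    def on_release_gesture(self) -> None:
        self.state = State.IDLE
        self.selected = None

    def update(self, hand_y_mm: float, dt_s: float) -> None:
        """Called once per sensor frame with the tracked hand height."""
        if self.state is not State.ACTIVE or self.selected is None:
            self._last_hand_y = hand_y_mm
            return
        if self._last_hand_y is not None and dt_s > 0:
            hand_velocity = (hand_y_mm - self._last_hand_y) / dt_s
            velocity = max(-self.MAX_SPEED,
                           min(self.MAX_SPEED, self.GAIN * hand_velocity))
            self.selected.move(velocity, dt_s)
        self._last_hand_y = hand_y_mm


if __name__ == "__main__":
    controller = GestureTableController([TableSegment("backrest"),
                                         TableSegment("leg plate")])
    controller.on_select_gesture(0)
    controller.on_activate_gesture()
    # Simulate a hand rising at roughly 40 mm/s over one second of frames.
    for frame in range(30):
        controller.update(hand_y_mm=frame * 40 / 30, dt_s=1 / 30)
    print(controller.segments[0].position_mm)  # about 19.3 mm with the assumed gain and clamp
```

The explicit state machine reflects the safety argument in the abstract: the table only moves while a segment has been deliberately selected and activated, and the velocity clamp bounds how fast a gesture can drive it.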

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Biomechanical Phenomena
  • Gestures
  • Humans
  • Imaging, Three-Dimensional / methods*
  • Operating Rooms
  • Operating Tables*
  • User-Computer Interface