Touchfree medical interfaces

Annu Int Conf IEEE Eng Med Biol Soc. 2014;2014:6597-600. doi: 10.1109/EMBC.2014.6945140.

Abstract

Real-time control of visual display systems via mid-air hand gestures offers many advantages over traditional interaction modalities. In medicine, for example, it allows a practitioner to adjust display values, e.g., contrast or zoom, on a medical visualization interface without the need to re-sterilize the interface. However, when users are holding a small tool (such as a pen, surgical needle, or computer stylus), repeatedly putting the tool down in order to make hand-gesture interactions is not ideal. This work presents a novel interface that automatically adjusts for gesturing with hands and hand-held tools to precisely control medical displays. The novelty of our interface is that it uses a single set of gestures designed to be equally effective for fingers and hand-held tools without using markers. This type of interface was previously not feasible with low-resolution depth sensors such as the Kinect, but is now achievable with the recently released Leap Motion controller. Our interface is validated through a user study in which participants were asked to adjust parameters on a medical image.
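To illustrate the kind of mapping such an interface relies on, the sketch below shows how a tracked tip position, whether it comes from a bare fingertip or a hand-held tool, could be converted into a display parameter such as a contrast multiplier. This is a minimal, hypothetical example in Python: the working range, parameter bounds, and the notion of a "tip height above the sensor" are illustrative assumptions, not the authors' actual implementation or the Leap Motion SDK.

    # Hypothetical sketch: map a tracked fingertip/tool-tip height to an
    # image-contrast multiplier. The same mapping applies regardless of
    # whether the pointable is a finger or a hand-held tool, mirroring the
    # "single set of gestures" idea. All ranges below are assumed values.

    def tip_to_contrast(tip_y_mm, y_min_mm=100.0, y_max_mm=400.0,
                        c_min=0.5, c_max=2.0):
        """Linearly map tip height above the sensor (mm) to a contrast
        multiplier, clamped to [c_min, c_max]."""
        t = (tip_y_mm - y_min_mm) / (y_max_mm - y_min_mm)
        t = max(0.0, min(1.0, t))  # clamp to the assumed working range
        return c_min + t * (c_max - c_min)

    # Example usage: three tip heights, each yielding a contrast value.
    for tip_y in (120.0, 250.0, 390.0):
        print(f"tip at {tip_y:.0f} mm -> contrast x{tip_to_contrast(tip_y):.2f}")

In a real system this mapping would be driven by per-frame tip positions reported by the tracking device and applied only while a deliberate control gesture is active, so that incidental hand or tool motion does not change the display.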

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Gestures
  • Hand
  • Humans
  • Infection Control
  • User-Computer Interface*