A non-contact interactive stereo display system for exploring human anatomy

Comput Assist Surg (Abingdon). 2019 Oct;24(sup1):131-136. doi: 10.1080/24699322.2018.1557899. Epub 2019 Feb 11.

Abstract

Stereoscopic display based on virtual reality (VR) helps clinicians observe 3D anatomical models with depth cues, letting them understand the spatial relationships between anatomical structures intuitively. However, few input devices are available in the sterile field of the operating room for controlling 3D anatomical models. This paper presents a cost-effective VR application for stereoscopic display of 3D anatomical models with non-contact interaction. The system integrates hand gesture interaction and voice interaction to achieve non-contact control: hand gesture interaction is based on the Leap Motion controller, and voice interaction is implemented with Bing Speech for English and Aitalk for Chinese. A local database records the anatomical terminologies organized in a tree structure and supplies them to the speech recognition engine for querying these uncommon words. Ten participants practiced the proposed system and compared it with common interactive manners. The results show that the system is more efficient than the common interactive manner and demonstrate the feasibility and practicability of the proposed system in the sterile field.
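The abstract describes a local database of anatomical terminologies organized in a tree structure that is provided to the speech recognition engine. A minimal sketch of such a structure might look like the following; the class and method names (`TermNode`, `vocabulary`, `find`) and the sample hierarchy are illustrative assumptions, not the authors' actual schema:

```python
# Hypothetical sketch of a tree-structured anatomical terminology database.
# Terms are stored hierarchically (e.g. organ -> substructure), so the whole
# tree can be flattened into a custom vocabulary for the speech recognizer,
# and a recognized phrase can be resolved back to a node in the hierarchy.
from dataclasses import dataclass, field


@dataclass
class TermNode:
    term: str                                   # e.g. "left atrium"
    children: list = field(default_factory=list)

    def add(self, term: str) -> "TermNode":
        child = TermNode(term)
        self.children.append(child)
        return child

    def vocabulary(self) -> list:
        """Flatten the tree (preorder) into the word list fed to the engine."""
        words = [self.term]
        for child in self.children:
            words.extend(child.vocabulary())
        return words

    def find(self, spoken: str):
        """Resolve a recognized phrase to a node by case-insensitive match."""
        if spoken.lower() == self.term.lower():
            return self
        for child in self.children:
            hit = child.find(spoken)
            if hit is not None:
                return hit
        return None


# Build a tiny illustrative fragment of the hierarchy.
root = TermNode("heart")
atria = root.add("atria")
atria.add("left atrium")
atria.add("right atrium")

print(root.vocabulary())   # word list registered with the speech engine
print(root.find("Left Atrium").term)
```

Feeding the recognizer a flattened vocabulary of this kind is one plausible way to bias it toward uncommon anatomical terms; resolving the recognized phrase back through the tree then identifies which model node the command targets.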

Keywords: 3D anatomical model; virtual reality; gesture interaction; stereo display; voice interaction.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Humans
  • Imaging, Three-Dimensional*
  • Models, Anatomic*
  • User-Computer Interface*
  • Virtual Reality*
  • Voice