HTC Vive MeVisLab integration via OpenVR for medical applications

PLoS One. 2017 Mar 21;12(3):e0173972. doi: 10.1371/journal.pone.0173972. eCollection 2017.

Abstract

Virtual Reality (VR), an immersive technology that replicates an environment via computer-simulated reality, receives a lot of attention in the entertainment industry. However, VR also has great potential in other areas, such as the medical domain; examples are intervention planning, training and simulation. This is especially useful for medical operations where an aesthetic outcome is important, such as facial surgeries. Alas, importing medical data into Virtual Reality devices is not necessarily trivial, in particular when a direct connection to a proprietary application is desired. Moreover, most researchers do not build their medical applications from scratch, but rather leverage platforms like MeVisLab, MITK, OsiriX or 3D Slicer. These platforms have in common that they use libraries like ITK and VTK, and provide a convenient graphical interface. However, ITK and VTK do not support Virtual Reality directly. In this study, the usage of a Virtual Reality device for medical data under the MeVisLab platform is presented. The OpenVR library is integrated into the MeVisLab platform, allowing direct and uncomplicated use of the head-mounted display HTC Vive inside MeVisLab. Medical data coming from other MeVisLab modules can be connected directly via drag-and-drop to the Virtual Reality module, which renders the data inside the HTC Vive for immersive virtual reality inspection.
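
The abstract itself carries no implementation details, so the following is a minimal C++ sketch of the OpenVR calls such an integration rests on: connecting to the SteamVR runtime, querying the recommended per-eye render target size, and handing rendered eye textures to the compositor each frame. It is an illustration only, not the authors' actual MeVisLab module code; the names leftEyeTex and rightEyeTex are hypothetical OpenGL texture ids, and the per-frame part is left commented out because it presupposes an existing OpenGL context.

    #include <openvr.h>
    #include <cstdio>
    #include <cstdint>

    int main() {
        // Connect to the SteamVR runtime as a scene (rendering) application.
        vr::EVRInitError initError = vr::VRInitError_None;
        vr::IVRSystem *hmd = vr::VR_Init(&initError, vr::VRApplication_Scene);
        if (initError != vr::VRInitError_None) {
            std::fprintf(stderr, "VR_Init failed: %s\n",
                         vr::VR_GetVRInitErrorAsEnglishDescription(initError));
            return 1;
        }

        // Ask the runtime for the per-eye render target resolution.
        uint32_t width = 0, height = 0;
        hmd->GetRecommendedRenderTargetSize(&width, &height);
        std::printf("Per-eye render target: %u x %u\n", width, height);

        // Per-frame loop (sketch): block until the compositor hands out fresh
        // head/device poses, render both eyes, then submit the textures.
        // leftEyeTex/rightEyeTex are hypothetical OpenGL texture ids that the
        // host application (here: a MeVisLab rendering module) would fill.
        /*
        vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
        vr::VRCompositor()->WaitGetPoses(poses, vr::k_unMaxTrackedDeviceCount,
                                         nullptr, 0);
        vr::Texture_t left  = { (void *)(uintptr_t)leftEyeTex,
                                vr::TextureType_OpenGL, vr::ColorSpace_Gamma };
        vr::Texture_t right = { (void *)(uintptr_t)rightEyeTex,
                                vr::TextureType_OpenGL, vr::ColorSpace_Gamma };
        vr::VRCompositor()->Submit(vr::Eye_Left,  &left);
        vr::VRCompositor()->Submit(vr::Eye_Right, &right);
        */

        vr::VR_Shutdown();
        return 0;
    }

Inside MeVisLab, the two eye images would presumably come from the module's internal renderer rather than a standalone main(), but the initialization and compositor hand-off would follow this pattern.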

Publication types

  • Evaluation Study

MeSH terms

  • Computer Graphics
  • Computer Simulation
  • Computer Systems
  • Equipment Design
  • Feasibility Studies
  • Humans
  • Image Processing, Computer-Assisted
  • Imaging, Three-Dimensional
  • Software
  • User-Computer Interface*

Grants and funding

The work received funding from BioTechMed-Graz in Austria (https://biotechmedgraz.at/en/, “Hardware accelerated intelligent medical imaging”), the 6th Call of the Initial Funding Program from the Research & Technology House (F&T-Haus) at the Graz University of Technology (https://www.tugraz.at/en/, “Interactive Planning and Reconstruction of Facial Defects”, PI: Jan Egger) and was supported by the TU Graz Open Access Publishing Fund. Dr. Xiaojun Chen receives support from the Natural Science Foundation of China (www.nsfc.gov.cn, Grant No.: 81511130089) and the Foundation of Science and Technology Commission of Shanghai Municipality (Grants No.: 14441901002, 15510722200 and 16441908400). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.