Bimanual Intravenous Needle Insertion Simulation Using Nonhomogeneous Haptic Device Integrated into Mixed Reality

Sensors (Basel). 2023 Jul 26;23(15):6697. doi: 10.3390/s23156697.

Abstract

In this study, we developed a new haptic-mixed reality intravenous (HMR-IV) needle insertion simulation system that provides a bimanual haptic interface integrated into a mixed reality system, with programmable variability reflecting real clinical environments. The system was designed for nursing students and healthcare professionals to practice IV needle insertion into a virtual arm with unlimited attempts under changing insertion conditions (e.g., skin: color, texture, stiffness, friction; vein: size, shape, location depth, stiffness, friction). To achieve accurate hand-eye coordination in dynamic mixed reality scenarios, two different haptic devices (Dexmo and Geomagic Touch) and a standalone mixed reality system (HoloLens 2) were integrated and synchronized through a multistep calibration across coordinate systems (real world, virtual world, mixed reality world, haptic interface world, HoloLens camera). In addition, the force-profile-based haptic rendering proposed in this study successfully mimicked the tactile feel of real IV needle insertion. Further, a global hand-tracking method combining two depth sensors (HoloLens and Leap Motion) was developed to accurately track a haptic glove and simulate grasping a virtual hand with force feedback. We conducted an evaluation study with 20 participants (9 experts and 11 novices) to measure the usability of the HMR-IV simulation system and user performance under various insertion conditions. Quantitative results from our own metric and qualitative results from the NASA Task Load Index demonstrate the usability of our system.
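The multistep calibration described above amounts to chaining transforms between the real-world, virtual, mixed reality, haptic-interface, and HoloLens camera frames. Below is a minimal sketch of that idea, assuming each pairwise transform has already been calibrated: points are mapped across frames by composing 4x4 homogeneous matrices. All matrix values and frame names here are illustrative placeholders, not the paper's calibration results.

```python
# Minimal sketch of chaining homogeneous transforms across coordinate frames.
# All transforms below are illustrative placeholders; the paper's actual
# calibration procedure and values are not reproduced here.
import numpy as np

def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Hypothetical calibrated transforms (identity rotations for brevity):
T_haptic_to_virtual = make_transform(np.eye(3), np.array([0.10, 0.0, 0.0]))
T_virtual_to_world  = make_transform(np.eye(3), np.array([0.0, 1.2, 0.0]))
T_world_to_hololens = make_transform(np.eye(3), np.array([0.0, -0.3, 0.5]))

def haptic_point_in_hololens(p_haptic: np.ndarray) -> np.ndarray:
    """Map a 3D point from the haptic-device frame into the HoloLens camera frame."""
    p = np.append(p_haptic, 1.0)  # homogeneous coordinates
    p = T_world_to_hololens @ T_virtual_to_world @ T_haptic_to_virtual @ p
    return p[:3]

print(haptic_point_in_hololens(np.array([0.0, 0.0, 0.0])))
```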
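Force-profile-based haptic rendering can be read as looking up a feedback force from a depth-indexed profile with discontinuities at the skin and vein-wall punctures. The sketch below uses invented breakpoints purely for illustration; the study's profiles are derived from real insertion forces, which are not reproduced here.

```python
# Sketch: render needle-insertion force from a depth-indexed force profile.
# The breakpoints are invented for illustration only.
import numpy as np

# (depth in mm, resistance force in N): rise to skin puncture, drop after
# puncture, rise through tissue, small "pop" at the vein wall, low force inside.
_PROFILE = np.array([
    [0.0, 0.0],
    [1.5, 1.2],   # peak just before skin puncture
    [1.6, 0.4],   # sudden drop as the skin gives way
    [4.0, 0.9],   # gradual rise through subcutaneous tissue
    [4.1, 0.3],   # "pop" through the vein wall
    [6.0, 0.35],  # low resistance inside the vein lumen
])

def insertion_force(depth_mm: float) -> float:
    """Interpolate the feedback force (N) for the current insertion depth."""
    return float(np.interp(depth_mm, _PROFILE[:, 0], _PROFILE[:, 1]))

# Example: a haptic servo loop would query this every tick (typically ~1 kHz).
for d in (0.5, 1.5, 1.7, 3.0, 4.05, 5.0):
    print(f"depth {d:4.2f} mm -> force {insertion_force(d):.2f} N")
```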
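For the global hand tracking that combines HoloLens and Leap Motion depth sensing, one plausible reading is a confidence-weighted fusion of the two pose estimates once both are expressed in a common frame. The weighting scheme below is an assumption for illustration, not the paper's published method.

```python
# Sketch: confidence-weighted fusion of two hand-tracking estimates that have
# already been mapped into a shared world frame. The weighting is an assumption.
import numpy as np

def fuse_hand_position(p_hololens: np.ndarray, c_hololens: float,
                       p_leap: np.ndarray, c_leap: float) -> np.ndarray:
    """Blend two 3D palm-position estimates by tracking confidence in [0, 1]."""
    total = c_hololens + c_leap
    if total <= 0.0:
        raise ValueError("no valid tracking source")
    return (c_hololens * p_hololens + c_leap * p_leap) / total

# Example: Leap Motion sees the gloved hand clearly; HoloLens is partly occluded.
p = fuse_hand_position(np.array([0.10, 1.05, 0.48]), 0.3,
                       np.array([0.11, 1.04, 0.50]), 0.9)
print(p)
```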

Keywords: IV needle insertion simulation; bimanual haptic interface; dual haptic rendering; hand motor skill training; haptic-glove-based interaction; mixed reality; nursing education.

MeSH terms

  • Augmented Reality*
  • Computer Simulation
  • Haptic Interfaces
  • Humans
  • Touch
  • Touch Perception*
  • User-Computer Interface

Grants and funding

This material is based upon work supported by the National Science Foundation under Grant No. RETTL-2118380.