UltrARsound: in situ visualization of live ultrasound images using HoloLens 2

Int J Comput Assist Radiol Surg. 2022 Nov;17(11):2081-2091. doi: 10.1007/s11548-022-02695-z. Epub 2022 Jul 1.

Abstract

Purpose: Augmented Reality (AR) has the potential to simplify ultrasound (US) examinations, which usually require a skilled and experienced sonographer to mentally align narrow 2D cross-sectional US images with the 3D anatomy of the patient. This work describes and evaluates a novel approach that tracks retroreflective spheres attached to the US probe using an inside-out technique with the AR glasses HoloLens 2. Finally, live US images are displayed in situ on the imaged anatomy.

Methods: The Unity application UltrARsound performs spatial tracking of the US probe and the attached retroreflective markers using the depth camera integrated into the AR glasses, thus eliminating the need for an external tracking system. Additionally, a Kalman filter is implemented to improve the noisy measurements of the camera. US images are streamed wirelessly via the PLUS toolkit to HoloLens 2. The technical evaluation comprises static and dynamic tracking accuracy, tracking frequency and the latency of the displayed images.
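As an illustration of the filtering step, the sketch below shows one plausible way to smooth noisy 3D marker positions with a constant-velocity Kalman filter running at the reported 20 Hz tracking rate. It is not the paper's actual implementation; the state model, noise parameters and class name are assumptions chosen for demonstration.

```python
# Minimal sketch (assumed, not the authors' code): constant-velocity Kalman
# filter for smoothing noisy 3D sphere-centre measurements from a depth camera.
import numpy as np

class ConstantVelocityKalman:
    def __init__(self, dt=1.0 / 20.0, process_var=1e-3, meas_var=4e-6):
        # State: [x, y, z, vx, vy, vz]; measurement: [x, y, z] in metres.
        self.x = np.zeros(6)
        self.P = np.eye(6)
        # State transition: position advances by velocity * dt.
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)
        # Measurement model: only the position is observed.
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])
        self.Q = process_var * np.eye(6)   # process noise (illustrative value)
        self.R = meas_var * np.eye(3)      # measurement noise (illustrative value)

    def update(self, z):
        """Predict one step ahead, then correct with measurement z (3-vector)."""
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct
        y = z - self.H @ self.x                   # innovation
        S = self.H @ self.P @ self.H.T + self.R   # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]                         # filtered position estimate

# Example: smooth a simulated stream of noisy marker positions at 20 Hz.
kf = ConstantVelocityKalman()
rng = np.random.default_rng(0)
for t in range(100):
    true_pos = np.array([0.10 + 0.01 * t / 20.0, 0.0, 0.30])  # slow drift
    measured = true_pos + rng.normal(0.0, 0.002, size=3)      # ~2 mm noise
    filtered = kf.update(measured)
```

A constant-velocity model is a common default for hand-held probe motion; in practice the probe orientation would be filtered as well, which this position-only sketch omits.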

Results: Tracking is performed with a median accuracy of 1.98 mm/1.81° in the static setting when using the Kalman filter. In the dynamic scenario, the median error was 2.81 mm/1.70°. The tracking frequency is currently limited to 20 Hz. 83% of the displayed US images had a latency lower than 16 ms.

Conclusions: In this work, we showed that spatial tracking of retroreflective spheres with the depth camera of HoloLens 2 is feasible, achieving a promising accuracy for in situ visualization of live US images. Tracking requires neither additional hardware nor modifications to HoloLens 2, making it a cheap and easy-to-use approach. Moreover, the minimal latency of the displayed images enables real-time perception for the sonographer.

Keywords: Augmented reality; Retroreflective spheres; Tracking; Ultrasound guidance.

MeSH terms

  • Cross-Sectional Studies*
  • Humans
  • Ultrasonography