Touchless scanner control to support MRI-guided interventions

Int J Comput Assist Radiol Surg. 2020 Mar;15(3):545-553. doi: 10.1007/s11548-019-02058-1. Epub 2019 Sep 13.

Abstract

Purpose: MRI-guided interventions allow minimally invasive, radiation-free treatment but rely on real-time image data and free slice positioning. Interventional interaction with the data and the MRI scanner is cumbersome due to the diagnostic focus of current systems, the confined space, and sterile conditions.

Methods: We present a touchless, hand-gesture-based interaction concept for controlling functions of the MRI scanner typically used during MRI-guided interventions. The system consists of a hand gesture sensor customised for MRI compatibility and a specialised UI developed based on clinical needs. A user study with 10 radiologists was performed to compare the gesture interaction concept and its components to task delegation, the prevalent method in clinical practice.

Results: Both methods performed comparably in terms of task duration and subjective workload. Subjective performance with gesture input was perceived as worse than with task delegation, but gesture input was rated acceptable in terms of usability while task delegation was not.

Conclusion: This work contributes by (1) providing access to relevant functions of an MRI scanner during percutaneous interventions in a (2) manner suitable for sterile human-computer interaction. The introduced concept removes indirect interaction with the scanner via an assistant, yielding comparable subjective workload and task completion times while showing higher perceived usability.

Keywords: Gestures; Human–computer interaction; Interventional; Magnetic resonance imaging; Radiology; Task delegation; Touchless interaction; Usability.

MeSH terms

  • Gestures*
  • Humans
  • Magnetic Resonance Imaging / methods*
  • User-Computer Interface*