Using open surgery simulation kinematic data for tool and gesture recognition

Int J Comput Assist Radiol Surg. 2022 Jun;17(6):965-979. doi: 10.1007/s11548-022-02615-1. Epub 2022 Apr 13.

Abstract

Purpose: The use of motion sensors is emerging as a means for measuring surgical performance. Motion sensors are typically used for calculating performance metrics and assessing skill. The aim of this study was to identify surgical gestures and tools used during an open surgery suturing simulation based on motion sensor data.

Methods: Twenty-five participants performed a suturing task on a variable tissue simulator, and electromagnetic motion sensors were used to measure their performance. The current study compares GRU and LSTM networks, which are known to perform well on other kinematic datasets, as well as MS-TCN++, which was developed for video data and was adapted in this work for motion sensor data. Finally, we extended all architectures for multi-task learning.
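A multi-task recurrent architecture of the kind described above can be sketched as a shared GRU trunk with one classification head per task (gestures and tools). The following NumPy sketch is illustrative only: the layer sizes, feature dimension, and class counts are assumptions for demonstration, not the configuration reported in this study.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MultiTaskGRU:
    """Single-layer GRU with two per-frame classification heads.

    Hypothetical sketch: sizes and head names are illustrative,
    not the architecture evaluated in the paper.
    """

    def __init__(self, n_in, n_hid, n_gestures, n_tools):
        s = 1.0 / np.sqrt(n_hid)
        # Gate and candidate weights act on the concatenated [input, hidden] vector.
        self.Wz = rng.uniform(-s, s, (n_hid, n_in + n_hid))  # update gate
        self.Wr = rng.uniform(-s, s, (n_hid, n_in + n_hid))  # reset gate
        self.Wh = rng.uniform(-s, s, (n_hid, n_in + n_hid))  # candidate state
        # Task-specific heads share the same hidden state (multi-task trunk).
        self.head_gesture = rng.uniform(-s, s, (n_gestures, n_hid))
        self.head_tool = rng.uniform(-s, s, (n_tools, n_hid))
        self.n_hid = n_hid

    def forward(self, X):
        """X: (T, n_in) kinematic frames -> per-frame logits for both tasks."""
        h = np.zeros(self.n_hid)
        gesture_logits, tool_logits = [], []
        for x in X:
            xh = np.concatenate([x, h])
            z = sigmoid(self.Wz @ xh)                         # update gate
            r = sigmoid(self.Wr @ xh)                         # reset gate
            h_cand = np.tanh(self.Wh @ np.concatenate([x, r * h]))
            h = (1 - z) * h + z * h_cand                      # new hidden state
            gesture_logits.append(self.head_gesture @ h)      # gesture scores
            tool_logits.append(self.head_tool @ h)            # tool scores
        return np.stack(gesture_logits), np.stack(tool_logits)

# Assumed shapes: 100 frames, 12 kinematic features (e.g. 6-DOF per hand),
# 6 gesture classes, 4 tool classes -- all placeholder values.
net = MultiTaskGRU(n_in=12, n_hid=32, n_gestures=6, n_tools=4)
gest, tool = net.forward(rng.normal(size=(100, 12)))
print(gest.shape, tool.shape)  # (100, 6) (100, 4)
```

Sharing one recurrent trunk across tasks is what keeps such a network small relative to training separate per-task models, which is consistent with the size advantage noted for the multi-task GRU in the conclusions.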

Results: In the gesture recognition task, MS-TCN++ achieved the highest performance, with an accuracy of [Formula: see text], F1-Macro of [Formula: see text], edit distance of [Formula: see text], and F1@10 of [Formula: see text]. In the tool usage recognition task for the right hand, MS-TCN++ performed best on most metrics, with an accuracy of [Formula: see text], F1-Macro of [Formula: see text], F1@10 of [Formula: see text], and F1@25 of [Formula: see text]. The multi-task GRU performed best on all metrics for the left hand, with an accuracy of [Formula: see text], edit distance of [Formula: see text], F1-Macro of [Formula: see text], F1@10 of [Formula: see text], and F1@25 of [Formula: see text].

Conclusion: In this study, we automatically identified the surgical gestures and tools used during an open surgery suturing simulation from motion sensor data. Our methods may be used to compute more detailed performance metrics and to assist in automatic workflow analysis. MS-TCN++ performed better in gesture recognition and right-hand tool recognition, while the multi-task GRU gave better results for the left hand. Notably, the multi-task GRU network is significantly smaller and achieved competitive results on the remaining tasks as well.

Keywords: Motion sensor; Surgical gesture recognition; Surgical simulation; Tool identification.

MeSH terms

  • Biomechanical Phenomena
  • Gestures*
  • Humans
  • Motion
  • Sutures*