Unsupervised Trajectory Segmentation for Surgical Gesture Recognition in Robotic Training

IEEE Trans Biomed Eng. 2016 Jun;63(6):1280-91. doi: 10.1109/TBME.2015.2493100. Epub 2015 Oct 26.

Abstract

Dexterity and procedural knowledge are two critical skills that surgeons must master to perform accurate and safe surgical interventions. However, current training systems do not provide the in-depth analysis of surgical gestures needed to precisely assess these skills. Our objective is to develop a method for the automatic and quantitative assessment of surgical gestures. To reach this goal, we propose a new unsupervised algorithm that automatically segments kinematic data from robotic training sessions. Without relying on any prior information or model, this algorithm detects critical points in the kinematic data that define relevant spatio-temporal segments. By associating these segments, we obtain an accurate recognition of the gestures involved in the surgical training task. We then perform an advanced analysis and assess our algorithm on datasets recorded during real expert training sessions. Comparing our approach with manual annotations of the surgical gestures, we observe 97.4% accuracy in the learning phase and an average matching score of 81.9% for the fully automated gesture recognition process. Our results show that trainees' workflow can be followed and that surgical gestures can be automatically evaluated against an expert database. This approach could thus improve training efficiency by shortening the learning curve.
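The abstract does not detail the segmentation algorithm itself. Purely as a hedged illustration of the general idea, the Python sketch below detects candidate critical points in a kinematic trajectory as local minima of the smoothed tool-tip speed profile, a common heuristic for this kind of data; it is not the paper's method, and the function name segment_trajectory and the parameters smooth_window and min_gap are hypothetical.

    # Illustrative sketch, not the paper's algorithm: approximate segment
    # boundaries as local minima of the smoothed tool-tip speed profile.
    import numpy as np
    from scipy.signal import savgol_filter, find_peaks

    def segment_trajectory(positions, dt, smooth_window=15, min_gap=20):
        """Return indices of candidate critical points in a (T, 3) trajectory.

        positions     : (T, 3) array of Cartesian tool-tip positions
        dt            : sampling period in seconds
        smooth_window : Savitzky-Golay window length (samples), must be odd
        min_gap       : minimum spacing between detected points (samples)
        """
        # Speed profile: magnitude of the finite-difference velocity.
        velocity = np.gradient(positions, dt, axis=0)
        speed = np.linalg.norm(velocity, axis=1)

        # Smooth to suppress sensor noise before extremum detection.
        speed = savgol_filter(speed, window_length=smooth_window, polyorder=3)

        # Local minima of speed (peaks of -speed) mark pauses or direction
        # changes that can serve as spatio-temporal segment boundaries.
        minima, _ = find_peaks(-speed, distance=min_gap)
        return minima

    # Synthetic usage: two motion bursts separated by a rest period.
    if __name__ == "__main__":
        move = np.sin(np.linspace(0.0, 2.0 * np.pi, 100))
        x = np.concatenate([move, np.zeros(100), move])
        traj = np.stack([x, 0.5 * x, np.zeros_like(x)], axis=1)
        # Prints candidate critical-point indices, including the pause.
        print(segment_trajectory(traj, dt=0.01))

On real robotic kinematic recordings, smooth_window and min_gap would need tuning to the sampling rate and gesture tempo; the synthetic example only shows that a rest period between two motions is flagged as a candidate boundary.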

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Gestures*
  • Humans
  • Pattern Recognition, Automated / methods*
  • Robotic Surgical Procedures / education*
  • Robotic Surgical Procedures / methods*
  • Unsupervised Machine Learning*