EasyLabels: weak labels for scene segmentation in laparoscopic videos

Int J Comput Assist Radiol Surg. 2019 Jul;14(7):1247-1257. doi: 10.1007/s11548-019-02003-2. Epub 2019 Jun 4.

Abstract

Purpose: We present an approach for weakly annotating laparoscopic images for segmentation and experimentally show that, when a network is trained on these annotations with a partial cross-entropy loss, its accuracy is close to that obtained with fully supervised approaches.

Methods: The proposed approach relies on weak annotations drawn as stripes over the different objects in the image and uses a partial cross-entropy loss to train a fully convolutional neural network that produces a dense pixel-level prediction map.
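To make the loss term concrete, the sketch below shows one common way to implement a partial cross-entropy: the cross-entropy is evaluated only over pixels covered by the weak stripe annotations, while all unlabelled pixels are ignored. The use of PyTorch, the function names, the ignore value, and the stripe coordinates are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

IGNORE = 255  # label value assigned to pixels that carry no stripe annotation

def partial_cross_entropy(logits, weak_labels):
    """logits: (N, C, H, W) network output; weak_labels: (N, H, W) int64
    with IGNORE at every pixel outside the annotated stripes."""
    # F.cross_entropy averages the loss over the labelled pixels only,
    # so unlabelled pixels contribute no gradient.
    return F.cross_entropy(logits, weak_labels, ignore_index=IGNORE)

# Example: a 2-class prediction map where only a few stripe pixels are labelled.
logits = torch.randn(1, 2, 64, 64, requires_grad=True)
labels = torch.full((1, 64, 64), IGNORE, dtype=torch.long)
labels[0, 30:34, 10:50] = 1   # stripe marked as class 1 (e.g. instrument)
labels[0, 5:8, 5:60] = 0      # stripe marked as class 0 (e.g. tissue)
loss = partial_cross_entropy(logits, labels)
loss.backward()
```

Any dense prediction network producing per-pixel class logits can be trained with such a masked loss; the sparsity of the supervision affects only which pixels enter the average.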

Results: We validate our method on three different datasets, providing qualitative results for all of them and quantitative results for two of them. The experiments show that our approach obtains at least [Formula: see text] of the accuracy of fully supervised methods on all tested datasets, while requiring [Formula: see text] less time to create the annotations than full supervision.

Conclusions: With this work, we demonstrate that laparoscopic data can be segmented using very little annotated data while maintaining accuracy comparable to that obtained with full supervision.

Keywords: Computer-assisted interventions; Instrument detection and segmentation; Laparoscopy.

MeSH terms

  • Humans
  • Laparoscopy / methods*
  • Neural Networks, Computer
  • Surgical Instruments*