Teacher-student approach for lung tumor segmentation from mixed-supervised datasets

PLoS One. 2022 Apr 5;17(4):e0266147. doi: 10.1371/journal.pone.0266147. eCollection 2022.

Abstract

Purpose: Cancer is among the leading causes of death in the developed world, and lung cancer is the most lethal type. Early detection is crucial for a better prognosis, but can be resource-intensive to achieve. Automating tasks such as lung tumor localization and segmentation in radiological images can free valuable time for radiologists and other clinical personnel. Convolutional neural networks may be suited for such tasks, but require substantial amounts of labeled data to train. Obtaining labeled data is a challenge, especially in the medical domain.

Methods: This paper investigates the use of a teacher-student design to exploit datasets with different types of supervision to train an automatic model for pulmonary tumor segmentation on computed tomography images. The framework consists of two models: the student, which performs end-to-end automatic tumor segmentation, and the teacher, which supplies the student with additional pseudo-annotated data during training.
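The snippet below is a minimal sketch of this teacher-student idea, assuming a PyTorch-style setup: a teacher pre-trained on the semantically labeled subset generates voxel-wise pseudo-masks inside the bounding boxes of weakly annotated volumes, and the student is updated on ground-truth and pseudo-annotated examples alike. The tiny network, tensor shapes, loss, and thresholds are illustrative placeholders, not the architecture or training scheme used in the paper.

```python
import torch
import torch.nn as nn

def tiny_segmenter() -> nn.Module:
    # Stand-in for a real segmentation network (e.g. a 3D U-Net).
    return nn.Sequential(
        nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv3d(8, 1, kernel_size=3, padding=1),
    )

teacher = tiny_segmenter()   # assumed pre-trained on the semantically labeled subset
student = tiny_segmenter()   # trained end-to-end on real + pseudo annotations
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Dummy batches: one semantically labeled CT volume and one bounding-box-annotated volume.
ct_labeled = torch.randn(1, 1, 16, 32, 32)
mask_labeled = (torch.rand(1, 1, 16, 32, 32) > 0.9).float()
ct_boxed = torch.randn(1, 1, 16, 32, 32)
box_mask = torch.zeros(1, 1, 16, 32, 32)
box_mask[..., 4:12, 8:24, 8:24] = 1.0  # voxels inside the annotated bounding box

# Teacher step: predict a voxel-wise pseudo-mask, but only keep it inside the box.
with torch.no_grad():
    pseudo_mask = (torch.sigmoid(teacher(ct_boxed)) > 0.5).float() * box_mask

# Student step: one update on ground-truth and teacher-generated annotations alike.
student.train()
optimizer.zero_grad()
loss = loss_fn(student(ct_labeled), mask_labeled) + loss_fn(student(ct_boxed), pseudo_mask)
loss.backward()
optimizer.step()
print(f"combined loss: {loss.item():.4f}")
```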

Results: Using only a small proportion of semantically labeled data together with a large amount of bounding box annotated data, the teacher-student design achieved competitive performance. Models trained on larger amounts of semantic annotations did not perform better than those trained on teacher-annotated data. Our model, trained on a small number of semantically labeled cases, achieved a mean Dice similarity coefficient of 71.0 on the MSD Lung dataset.
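For reference, the Dice similarity coefficient (DSC) quantifies the voxel-wise overlap between a predicted tumor mask and the ground-truth mask; the symbols P and G below are generic notation, not taken from the paper.

```latex
% Dice similarity coefficient between predicted mask P and ground-truth mask G
\mathrm{DSC}(P, G) = \frac{2\,|P \cap G|}{|P| + |G|}
```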

Conclusions: Our results demonstrate the potential of teacher-student designs to reduce the annotation load, since annotation schemes requiring weaker supervision can be used without any real degradation in segmentation accuracy.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Humans
  • Image Processing, Computer-Assisted* / methods
  • Lung Neoplasms* / diagnostic imaging
  • Neural Networks, Computer
  • Students
  • Tomography, X-Ray Computed

Grants and funding

This work was supported by The Norwegian National Advisory Unit for Ultrasound and Image-Guided Therapy at St. Olavs hospital; The Liaison Committee for Education, Research and Innovation in Central Norway (No. 2018/42794); The Cancer Foundation, St. Olavs hospital, Trondheim University Hospital (No. 13/2021); SINTEF; and The EEA Grant project "Improving Cancer Diagnostics in Flexible Endoscopy Using Artificial Intelligence and Medical Robotics (IDEAR)" (No. 19/2020).