Automatic Multi-Organ Segmentation on Abdominal CT With Dense V-Networks

IEEE Trans Med Imaging. 2018 Aug;37(8):1822-1834. doi: 10.1109/TMI.2018.2806309. Epub 2018 Feb 14.

Abstract

Automatic segmentation of abdominal anatomy on computed tomography (CT) images can support diagnosis, treatment planning, and treatment delivery workflows. Segmentation methods using statistical models and multi-atlas label fusion (MALF) require inter-subject image registrations, which are challenging for abdominal images, but alternative methods without registration have not yet achieved higher accuracy for most abdominal organs. We present a registration-free deep-learning-based segmentation algorithm for eight organs that are relevant for navigation in endoscopic pancreatic and biliary procedures, including the pancreas, the gastrointestinal tract (esophagus, stomach, and duodenum), and surrounding organs (liver, spleen, left kidney, and gallbladder). We directly compared the segmentation accuracy of the proposed method with that of existing deep-learning and MALF methods in a cross-validation on a multi-centre data set of 90 subjects. The proposed method yielded significantly higher Dice scores for all organs and lower mean absolute distances for most organs, including Dice scores of 0.78 versus 0.71, 0.74, and 0.74 for the pancreas, 0.90 versus 0.85, 0.87, and 0.83 for the stomach, and 0.76 versus 0.68, 0.69, and 0.66 for the esophagus. We conclude that the deep-learning-based segmentation represents a registration-free method for multi-organ abdominal CT segmentation whose accuracy can surpass current methods, potentially supporting image-guided navigation in gastrointestinal endoscopy procedures.
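
As a concrete illustration of the headline evaluation metric reported above, the sketch below computes the Dice score, 2|A∩B| / (|A| + |B|), between a predicted and a reference binary mask, and applies it per organ label to a multi-label CT segmentation. The function names, the use of NumPy, and the label conventions are illustrative assumptions and are not taken from the published method or its implementation.

    import numpy as np

    def dice_score(prediction: np.ndarray, reference: np.ndarray) -> float:
        """Dice score between two binary masks: 2|A ∩ B| / (|A| + |B|)."""
        pred = prediction.astype(bool)
        ref = reference.astype(bool)
        intersection = np.logical_and(pred, ref).sum()
        denominator = pred.sum() + ref.sum()
        if denominator == 0:
            return 1.0  # both masks empty: treat as perfect agreement
        return 2.0 * float(intersection) / float(denominator)

    def per_organ_dice(pred_labels: np.ndarray, ref_labels: np.ndarray, labels) -> dict:
        """Dice per organ, where each integer label indexes one organ
        (hypothetical convention, e.g. 1 = pancreas, 2 = stomach, ...)."""
        return {lab: dice_score(pred_labels == lab, ref_labels == lab) for lab in labels}

A Dice score of 1 indicates perfect voxel-wise overlap and 0 indicates no overlap, so the reported values (e.g., 0.78 for the pancreas versus 0.71-0.74 for the comparison methods) can be read directly as overlap fractions on this scale.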

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Digestive System / diagnostic imaging
  • Humans
  • Kidney / diagnostic imaging
  • Radiographic Image Interpretation, Computer-Assisted / methods*
  • Radiography, Abdominal / methods*
  • Spleen / diagnostic imaging
  • Tomography, X-Ray Computed / methods*