SKOOTS: Skeleton oriented object segmentation for mitochondria

bioRxiv [Preprint]. 2023 May 8:2023.05.05.539611. doi: 10.1101/2023.05.05.539611.

Abstract

The segmentation of individual instances of mitochondria from imaging datasets is informative, yet time-consuming to do by hand, sparking interest in developing automated algorithms using deep neural networks. Existing solutions for various segmentation tasks are largely optimized for one of two types of biomedical imaging: high-resolution, three-dimensional data (whole-neuron segmentation in volumetric electron microscopy datasets) or low-resolution, two-dimensional data (whole-cell segmentation of light microscopy images). The former requires consistently predictable boundaries to segment large structures, while the latter is boundary-invariant but struggles to segment large 3D objects without downscaling. Mitochondria in whole-cell 3D EM datasets often occupy the challenging middle ground: large, with ambiguous borders, which limits the accuracy of existing tools. To rectify this, we have developed skeleton oriented object segmentation (SKOOTS), a new segmentation approach that efficiently handles large, densely packed mitochondria. We show that SKOOTS can accurately and efficiently segment 3D mitochondria in previously difficult situations. Furthermore, we will release a new, manually annotated 3D mitochondria segmentation dataset. Finally, we show that this approach can also be extended to segment objects in 3D light microscopy datasets. These results bridge the gap between existing segmentation approaches and increase the accessibility of three-dimensional biomedical image analysis.
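
For orientation, the sketch below illustrates the general idea of skeleton-oriented instance assignment in its simplest form: skeletonize a binary foreground mask, label each connected skeleton, and assign every foreground voxel to its nearest skeleton. It is only an illustrative approximation under stated assumptions (a precomputed binary mitochondria mask as input, scikit-image and SciPy as tooling, and the hypothetical helper name skeleton_instance_labels), not the SKOOTS method itself.

    # Illustrative sketch only, not the SKOOTS implementation.
    # Assumes `mask` is a precomputed binary 3D mitochondria mask of shape (Z, Y, X).
    import numpy as np
    from scipy import ndimage
    from skimage.morphology import skeletonize_3d

    def skeleton_instance_labels(mask: np.ndarray) -> np.ndarray:
        """Assign each foreground voxel to its nearest connected skeleton."""
        foreground = mask.astype(bool)
        skeleton = skeletonize_3d(foreground) > 0    # 1-voxel-wide medial skeletons
        skel_labels, _ = ndimage.label(skeleton)     # one label per skeleton piece

        # For every voxel, find the coordinates of the nearest skeleton voxel.
        _, nearest = ndimage.distance_transform_edt(~skeleton, return_indices=True)
        labels = skel_labels[tuple(nearest)]         # inherit that skeleton's label
        labels[~foreground] = 0                      # keep background unlabeled
        return labels

In a learned pipeline, the skeletons and voxel-to-skeleton assignments would be predicted rather than computed morphologically, but the grouping step shown here conveys why anchoring instances to skeletons can remain robust when object borders are ambiguous.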

Publication types

  • Preprint