Source-free domain adaptive segmentation with class-balanced complementary self-training

Artif Intell Med. 2023 Dec:146:102694. doi: 10.1016/j.artmed.2023.102694. Epub 2023 Oct 31.

Abstract

Unsupervised domain adaptation (UDA) plays a crucial role in transferring knowledge gained from a labeled source domain so that it can be applied effectively in an unlabeled and diverse target domain. While UDA commonly involves training on data from both domains, access to labeled source-domain data is frequently restricted owing to concerns about patient data privacy or intellectual property. Source-free UDA (SFUDA) is a promising way to sidestep this difficulty. However, without source-domain supervision, SFUDA methods can easily fall into a "winner takes all" dilemma, in which the majority category dominates the deep segmentor and the minority categories are largely ignored. In addition, over-confident pseudo-label noise is a long-standing problem in self-training-based UDA. To address these difficulties, we propose a novel class-balanced complementary self-training (CBCOST) framework for SFUDA segmentation. Specifically, we jointly optimize pseudo-label-based self-training with two mutually reinforcing components. First, class-wise balanced pseudo-label training (CBT) explicitly exploits fine-grained class-wise confidence to select class-balanced pseudo-labeled pixels using adaptive within-class thresholds. Second, to alleviate pseudo-label noise, we propose complementary self-training (COST), which uses a heuristic complementary label selection scheme to exclude the classes to which a pixel does not belong. We evaluated our CBCOST framework on 2D and 3D cross-modality cardiac anatomical segmentation tasks and on brain tumor segmentation tasks. Experimental results showed that CBCOST outperforms existing SFUDA methods and yields performance comparable to UDA methods that have access to the source data.
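To make the two ideas in the abstract more concrete, the following is a minimal, illustrative PyTorch sketch (not the authors' implementation) of class-wise adaptive-threshold pseudo-label selection and a complementary ("not this class") loss. The function names and hyper-parameters (e.g., quantile_q, comp_prob_max) are hypothetical choices made here for demonstration only.

```python
# Illustrative sketch only: class-wise adaptive-threshold pseudo-label
# selection and a complementary-label loss for self-training on an
# unlabeled target domain. Hyper-parameters are hypothetical.
import torch
import torch.nn.functional as F


def select_balanced_pseudo_labels(logits, quantile_q=0.5):
    """Select pseudo-labeled pixels per class with within-class thresholds.

    logits: (B, C, H, W) target-domain predictions from the source model.
    Returns pseudo labels (B, H, W) and a boolean mask of selected pixels.
    """
    probs = torch.softmax(logits, dim=1)            # (B, C, H, W)
    conf, pseudo = probs.max(dim=1)                 # per-pixel confidence / label
    mask = torch.zeros_like(conf, dtype=torch.bool)
    for c in range(logits.shape[1]):
        cls_pix = pseudo == c
        if cls_pix.any():
            # Adaptive threshold: a quantile of this class's own confidences,
            # so minority classes are not filtered out by one global cutoff.
            thr = torch.quantile(conf[cls_pix], quantile_q)
            mask |= cls_pix & (conf >= thr)
    return pseudo, mask


def complementary_loss(logits, comp_prob_max=0.05):
    """Penalize the predicted probability of classes a pixel almost surely is NOT.

    Heuristic complementary-label choice: any class whose predicted probability
    falls below comp_prob_max is treated as "not this class" for that pixel.
    """
    probs = torch.softmax(logits, dim=1)
    comp_mask = probs < comp_prob_max                       # (B, C, H, W)
    neg_log = -torch.log(torch.clamp(1.0 - probs, min=1e-6))
    return (neg_log * comp_mask).sum() / comp_mask.sum().clamp(min=1)


if __name__ == "__main__":
    logits = torch.randn(2, 4, 8, 8)                        # toy predictions
    pseudo, mask = select_balanced_pseudo_labels(logits)
    ce = F.cross_entropy(logits, pseudo, reduction="none")[mask].mean()
    loss = ce + complementary_loss(logits)
    print(float(loss))
```

The per-class quantile keeps the selected pseudo-label set roughly balanced across classes, while the complementary term supervises only the classes the model is already confident a pixel does not belong to, which is less sensitive to over-confident positive pseudo-labels.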

Keywords: Segmentation; Self-training; Source-free domain adaptation.

Publication types

  • Research Support, Non-U.S. Gov't