Prediction of common labels for universal domain adaptation

Neural Netw. 2023 Aug:165:463-471. doi: 10.1016/j.neunet.2023.05.057. Epub 2023 Jun 7.

Abstract

Universal domain adaptation (UniDA) is an unsupervised domain adaptation setting that selectively transfers knowledge between domains whose label sets may differ. However, existing methods do not predict the common labels shared by the two domains and instead manually set a threshold to discriminate private samples; they therefore rely on the target domain to carefully tune the threshold and ignore the problem of negative transfer. In this paper, to address these problems, we propose a novel classification model named Prediction of Common Labels (PCL) for UniDA, in which the common labels are predicted by Category Separation via Clustering (CSC). Notably, we devise a new evaluation metric, called category separation accuracy, to measure the performance of category separation. To weaken negative transfer, we use the predicted common labels to select source samples for fine-tuning the model, yielding better domain alignment. During testing, target samples are discriminated using the predicted common labels and the clustering results. Experimental results on three widely used benchmark datasets demonstrate the effectiveness of the proposed method.
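The abstract does not give implementation details for CSC or for the category separation accuracy metric, so the Python sketch below is only a rough illustration under stated assumptions: source class prototypes are matched to target K-means centroids by a hypothetical distance threshold to predict common labels, and the metric is approximated as the fraction of target samples correctly separated into common versus private. The function names, the matching rule, and the threshold are not taken from the paper.

```python
# Illustrative sketch only: the matching rule and metric below are simplified
# stand-ins for CSC and category separation accuracy, not the authors' method.
import numpy as np
from sklearn.cluster import KMeans


def predict_common_labels(src_feats, src_labels, tgt_feats, n_clusters, dist_thresh=1.0):
    """Predict which source classes are 'common' by matching source class
    prototypes to target cluster centroids (hypothetical matching rule)."""
    # Source class prototypes: mean feature vector per class.
    classes = np.unique(src_labels)
    prototypes = np.stack([src_feats[src_labels == c].mean(axis=0) for c in classes])

    # Cluster the unlabeled target features.
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(tgt_feats)
    centroids = km.cluster_centers_

    # A class is predicted as common if its prototype lies within dist_thresh
    # of the nearest target cluster centroid (assumed threshold, for illustration).
    dists = np.linalg.norm(prototypes[:, None, :] - centroids[None, :, :], axis=-1)
    common_labels = classes[dists.min(axis=1) < dist_thresh]
    return common_labels, km.labels_


def category_separation_accuracy(pred_is_common, true_is_common):
    """Simplified stand-in for the paper's metric: the fraction of target
    samples correctly separated into 'common' vs. 'private' categories."""
    pred_is_common = np.asarray(pred_is_common, dtype=bool)
    true_is_common = np.asarray(true_is_common, dtype=bool)
    return float((pred_is_common == true_is_common).mean())
```

In the pipeline described by the abstract, the predicted common labels would then be used to select source samples for fine-tuning and, together with the clustering results, to discriminate target samples at test time; the selection and discrimination rules are not specified here and are left out of the sketch.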

Keywords: Cross-domain classification; Deep learning; Prediction of common label; Universal domain adaptation.

MeSH terms

  • Benchmarking*
  • Cluster Analysis
  • Knowledge*