Towards Accurate and Robust Domain Adaptation Under Multiple Noisy Environments

IEEE Trans Pattern Anal Mach Intell. 2023 May;45(5):6460-6479. doi: 10.1109/TPAMI.2022.3215150. Epub 2023 Apr 3.

Abstract

In many non-stationary environments, machine learning algorithms must confront distribution-shift scenarios. Although previous domain adaptation methods have achieved great success, they lose robustness in multiple noisy environments, where the examples of the source domain are corrupted by label noise, feature noise, or open-set noise. In this paper, we report our attempt toward achieving noise-robust domain adaptation. We first give a theoretical analysis and find that different types of noise have disparate impacts on the expected target risk. To mitigate the effect of source noise, we propose offline curriculum learning, which minimizes a newly defined empirical source risk. We further introduce a proxy-distribution-based margin discrepancy that gradually decreases the noisy distribution distance, reducing the impact of source noise. We design an energy estimator that assesses the outlier degree of open-set-noise examples to counteract their harmful influence. We also adopt robust parameter learning to further mitigate negative effects and learn domain-invariant feature representations. Finally, we seamlessly integrate these components into an adversarial network that jointly optimizes them efficiently. A series of empirical studies on benchmark datasets and the COVID-19 screening task shows that our algorithm remarkably outperforms the state-of-the-art, with over 10% accuracy improvements on some transfer tasks.
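The abstract does not specify the form of the energy estimator used to score open-set-noise examples. As a minimal sketch, assuming it resembles the standard free-energy score from the out-of-distribution detection literature, E(x) = -T · logsumexp(f(x)/T) over classifier logits, the following illustrates how such an outlier degree could be computed; the function names, `temperature`, and `threshold` are illustrative choices, not the authors' implementation.

```python
# Hedged sketch of an energy-based outlier score for flagging
# open-set-noise source examples. Uses the free-energy score
# E(x) = -T * logsumexp(f(x)/T) over a classifier's logits:
# confident in-distribution examples get very negative energy,
# while outliers score closer to zero (higher energy).
import torch


def energy_score(logits: torch.Tensor, temperature: float = 1.0) -> torch.Tensor:
    """Per-example free-energy score; higher (less negative) = more outlier-like."""
    return -temperature * torch.logsumexp(logits / temperature, dim=1)


def flag_open_set_noise(logits: torch.Tensor, threshold: float) -> torch.Tensor:
    """Boolean mask of examples whose energy exceeds the threshold."""
    return energy_score(logits) > threshold


# Example: logits for a batch of 4 source examples over 3 known classes.
logits = torch.tensor([
    [4.0, 0.1, 0.2],   # confident known-class prediction -> low energy
    [0.3, 0.2, 0.1],   # diffuse logits -> high energy, likely open-set noise
    [3.5, 3.4, 0.0],   # confident between two known classes -> low energy
    [0.0, 0.1, 0.0],   # diffuse logits -> high energy
])
print(energy_score(logits))              # per-example energies
print(flag_open_set_noise(logits, -2.0)) # tensor([False, True, False, True])
```

In practice the threshold would be set from the energy distribution of clean source examples (e.g., a percentile on a validation split) rather than fixed by hand.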