Concept Drift-Tolerant Transfer Learning in Dynamic Environments

IEEE Trans Neural Netw Learn Syst. 2022 Aug;33(8):3857-3871. doi: 10.1109/TNNLS.2021.3054665. Epub 2022 Aug 3.

Abstract

Existing transfer learning methods that focus on problems in stationary environments are usually not applicable to dynamic environments, where concept drift may occur. To the best of our knowledge, concept drift-tolerant transfer learning (CDTL), whose major challenge is adapting the target model and the knowledge of source domains to changing environments, has yet to be well explored in the literature. This article therefore proposes a hybrid ensemble approach to the CDTL problem in which data in the target domain arrive in a streaming, chunk-by-chunk manner from nonstationary environments. At each time step, a class-wise weighted ensemble is presented to adapt the target-domain model to the new environment. It assigns a weight vector to each classifier generated from the previous data chunks, allowing each class of the current data to leverage historical knowledge independently. Then, a domain-wise weighted ensemble is introduced to combine the source and target models and select the useful knowledge of each domain. The source models are updated on source instances transformed by the proposed adaptive weighted CORrelation ALignment (AW-CORAL). AW-CORAL iteratively minimizes the domain discrepancy while decreasing the effect of unrelated source instances. In this way, positive knowledge from the source domains can be promoted while negative knowledge is reduced. Empirical studies on synthetic and real benchmark data sets demonstrate the effectiveness of the proposed algorithm.
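
The sketch below is a minimal, illustrative rendering of two ingredients named in the abstract, not the authors' implementation. It assumes plain CORAL-style second-order feature alignment (the paper's AW-CORAL additionally down-weights unrelated source instances, which is omitted here), a logistic-regression base learner, and per-class accuracy on the newest chunk as the class-wise weight; the domain-wise weighted ensemble is not shown. All function and variable names are hypothetical.

```python
import numpy as np
from scipy import linalg
from sklearn.linear_model import LogisticRegression


def coral_align(Xs, Xt, eps=1.0):
    """Align source features Xs to target features Xt by matching covariances
    (plain CORAL; the paper's AW-CORAL also reduces the effect of unrelated
    source instances, which this sketch omits)."""
    Cs = np.cov(Xs, rowvar=False) + eps * np.eye(Xs.shape[1])
    Ct = np.cov(Xt, rowvar=False) + eps * np.eye(Xt.shape[1])
    # Whiten the source features, then re-color them with the target covariance.
    Xs_whitened = Xs @ linalg.fractional_matrix_power(Cs, -0.5)
    return np.real(Xs_whitened @ linalg.fractional_matrix_power(Ct, 0.5))


class ClasswiseWeightedEnsemble:
    """Keeps one classifier per past target chunk and a per-class weight
    vector per classifier, so each class can leverage history independently."""

    def __init__(self, n_classes):
        self.n_classes = n_classes
        self.members = []        # classifiers trained on previous chunks
        self.class_weights = []  # one weight vector (length n_classes) per member

    def update(self, X_chunk, y_chunk):
        # Re-weight existing members by their per-class accuracy on the new chunk.
        for i, clf in enumerate(self.members):
            pred = clf.predict(X_chunk)
            self.class_weights[i] = np.array([
                np.mean(pred[y_chunk == c] == c) if np.any(y_chunk == c) else 0.0
                for c in range(self.n_classes)
            ])
        # Train a new member on the current chunk and give it full weight.
        new_clf = LogisticRegression(max_iter=1000).fit(X_chunk, y_chunk)
        self.members.append(new_clf)
        self.class_weights.append(np.ones(self.n_classes))

    def predict(self, X):
        # Accumulate class scores, weighted per class and per member.
        scores = np.zeros((X.shape[0], self.n_classes))
        for clf, w in zip(self.members, self.class_weights):
            proba = clf.predict_proba(X)
            for j, c in enumerate(clf.classes_):
                scores[:, c] += w[c] * proba[:, j]
        return scores.argmax(axis=1)
```

A plausible usage pattern under these assumptions: call `coral_align` to map source-domain features toward each new target chunk before refreshing the source models, and call `update` on the class-wise ensemble with every incoming target chunk so that older classifiers are re-weighted as the concept drifts.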