Transfer Learning Under Conditional Shift Based on Fuzzy Residual

IEEE Trans Cybern. 2022 Feb;52(2):960-970. doi: 10.1109/TCYB.2020.2988277. Epub 2022 Feb 16.

Abstract

Transfer learning has received much attention recently and has proven effective in a wide range of applications, whereas studies on regression problems remain scarce. In this article, we focus on the transfer learning problem for regression under conditional shift, where the source and target domains share the same marginal distribution but have different conditional probability distributions. We propose a new framework, transfer learning based on fuzzy residual (ResTL), which learns the target model by preserving the distribution properties of the source data in a model-agnostic way. First, we formulate the target model by adding a fuzzy residual to a model-agnostic source model and reusing the antecedent parameters of the source fuzzy system. Then, two bias-computation methods are provided for different considerations, yielding two ResTL variants, ResTL-LS and ResTL-RD. Finally, we conduct a series of experiments on both a toy example and several real-world datasets to verify the effectiveness of the proposed method.
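To make the residual idea concrete, the sketch below fits a small correction on top of a frozen, model-agnostic source model using a zero-order TSK fuzzy system with Gaussian antecedents and plain least squares for the consequents. This is only a minimal illustration under stated assumptions (hand-placed rule centers, synthetic 1-D data with shared marginal p(x) and shifted p(y|x)); it does not reproduce the paper's exact ResTL procedures or its two bias-computation variants.

```python
# Minimal sketch of a fuzzy-residual correction (ResTL-style idea), assuming a
# zero-order TSK fuzzy system with Gaussian antecedents. All names, rule
# placements, and the toy data are illustrative assumptions, not the authors'
# exact formulation.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data: same marginal p(x), shifted conditional p(y|x).
x_src = rng.uniform(-3, 3, size=(200, 1))
y_src = np.sin(x_src[:, 0]) + 0.05 * rng.standard_normal(200)
x_tgt = rng.uniform(-3, 3, size=(30, 1))            # a few labeled target samples
y_tgt = np.sin(x_tgt[:, 0]) + 0.8 * x_tgt[:, 0] + 0.05 * rng.standard_normal(30)

# Model-agnostic source model: here, a simple polynomial fit on source data.
coef_src = np.polyfit(x_src[:, 0], y_src, deg=5)
f_src = lambda x: np.polyval(coef_src, x[:, 0])

# Antecedents of the fuzzy system (in ResTL these would be reused from the
# source fuzzy system); here they are hand-placed Gaussian rule centers.
centers = np.linspace(-3, 3, 7)
sigma = 1.0

def firing_levels(x):
    """Normalized Gaussian membership of each sample in each rule."""
    g = np.exp(-(x - centers) ** 2 / (2 * sigma ** 2))   # shape (n, n_rules)
    return g / g.sum(axis=1, keepdims=True)

# Fit zero-order consequents to the target residual by least squares.
residual = y_tgt - f_src(x_tgt)                     # what the source model misses
Phi = firing_levels(x_tgt)                          # design matrix
w, *_ = np.linalg.lstsq(Phi, residual, rcond=None)

def f_tgt(x):
    """Target model = frozen source model + fuzzy residual correction."""
    return f_src(x) + firing_levels(x) @ w

# Quick check on a test grid against the true target conditional.
x_test = np.linspace(-3, 3, 9).reshape(-1, 1)
y_true = np.sin(x_test[:, 0]) + 0.8 * x_test[:, 0]
print("source-only MAE    :", np.abs(f_src(x_test) - y_true).mean())
print("with fuzzy residual:", np.abs(f_tgt(x_test) - y_true).mean())
```

In this toy setup, the source model alone misses the shifted conditional, while the fuzzy residual trained on the few labeled target points corrects most of the discrepancy without retraining the source model.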

MeSH terms

  • Learning*
  • Machine Learning*