Task Similarity Estimation Through Adversarial Multitask Neural Network

IEEE Trans Neural Netw Learn Syst. 2021 Feb;32(2):466-480. doi: 10.1109/TNNLS.2020.3028022. Epub 2021 Feb 4.

Abstract

Multitask learning (MTL) aims to solve related tasks simultaneously by exploiting shared knowledge to improve performance on the individual tasks. Although numerous empirical results support the notion that such shared knowledge among tasks plays an essential role in MTL, a theoretical understanding of the relationships between tasks and their impact on learning shared knowledge remains an open problem. In this work, we develop a theoretical perspective on the benefits of using similarity information in MTL. To this end, we first derive an upper bound on the generalization error that adopts the Wasserstein distance as the task similarity metric. This bound suggests practical principles for using similarity information to control the generalization error. Building on these theoretical results, we revisit the adversarial multitask neural network and propose a new training algorithm that automatically learns the task relation coefficients together with the neural network parameters. Experiments on computer vision benchmarks show that the proposed algorithm improves empirical performance. Finally, we evaluate the proposed approach on real medical data sets, demonstrating its advantage in extracting task relations. An illustrative sketch of the general setup follows the abstract.
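
The sketch below is an illustration only, not the authors' implementation: it shows one common way to realize an adversarial multitask network with learnable task relation coefficients, using a shared encoder, per-task prediction heads, softmax-normalized relation coefficients, and a critic whose score gap approximates a Wasserstein-style distance between task feature distributions. All module names, layer sizes, the penalty weight lam, and the single-loss update (a full adversarial scheme would alternate critic and encoder updates) are assumptions made for this example.

    # Hypothetical sketch of an adversarial multitask network with learnable
    # task relation coefficients; architecture and hyperparameters are assumed.
    import torch
    import torch.nn as nn

    class AdversarialMTL(nn.Module):
        def __init__(self, in_dim, hid_dim, num_tasks):
            super().__init__()
            # Shared feature extractor used by all tasks.
            self.encoder = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())
            # One prediction head per task (binary classification here).
            self.heads = nn.ModuleList(nn.Linear(hid_dim, 1) for _ in range(num_tasks))
            # Learnable task relation coefficients, softmax-normalized per task.
            self.relation_logits = nn.Parameter(torch.zeros(num_tasks, num_tasks))
            # Critic scoring shared features; its mean-score gap between two
            # tasks serves as a Wasserstein-style distance estimate.
            self.critic = nn.Sequential(nn.Linear(hid_dim, hid_dim), nn.ReLU(),
                                        nn.Linear(hid_dim, 1))

        def relations(self):
            return torch.softmax(self.relation_logits, dim=1)

        def forward(self, x, task_id):
            z = self.encoder(x)
            return self.heads[task_id](z), z

    def training_step(model, batches, bce, lam=0.1):
        """One simplified update: per-task losses plus a critic-based distance
        penalty weighted by the learned relation coefficients."""
        rel = model.relations()
        feats, task_loss = [], 0.0
        for t, (x, y) in enumerate(batches):      # one mini-batch per task
            pred, z = model(x, t)
            feats.append(z)
            task_loss = task_loss + bce(pred, y)
        # Distance estimate between task 0 and task t feature distributions,
        # weighted by how strongly the tasks are believed to be related.
        dist_penalty = 0.0
        for t in range(1, len(batches)):
            gap = model.critic(feats[0]).mean() - model.critic(feats[t]).mean()
            dist_penalty = dist_penalty + rel[0, t] * gap
        return task_loss + lam * dist_penalty

    if __name__ == "__main__":
        torch.manual_seed(0)
        model = AdversarialMTL(in_dim=16, hid_dim=32, num_tasks=2)
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        bce = nn.BCEWithLogitsLoss()
        # Toy data: two tasks with random features and binary labels.
        batches = [(torch.randn(8, 16), torch.randint(0, 2, (8, 1)).float())
                   for _ in range(2)]
        loss = training_step(model, batches, bce)
        opt.zero_grad()
        loss.backward()
        opt.step()
        print(model.relations())

In this toy setup the relation coefficients are ordinary trainable parameters, so gradients from the weighted distance penalty adjust them alongside the network weights; how the coefficients and the adversarial game are actually coupled in the paper's algorithm is specified in the full text, not here.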

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Alzheimer Disease / diagnostic imaging
  • Benchmarking
  • Data Mining
  • Electronic Data Processing
  • Humans
  • Image Processing, Computer-Assisted
  • Machine Learning*
  • Neural Networks, Computer*