Novel Multitask Conditional Neural-Network Surrogate Models for Expensive Optimization

IEEE Trans Cybern. 2022 May;52(5):3984-3997. doi: 10.1109/TCYB.2020.3014126. Epub 2022 May 19.

Abstract

Multiple related tasks can be learned simultaneously by sharing information among them, avoiding tabula rasa learning and improving performance relative to the no-transfer case (i.e., when each task is learned in isolation). This study investigates multitask learning with conditional neural process (CNP) networks and proposes two multitask network models based on CNPs, namely, the one-to-many multitask CNP (OMc-MTCNP) and the many-to-many MTCNP (MMc-MTCNP). Compared with existing multitask models, the proposed models add an extensible correlation learning layer that learns the correlations among tasks. Moreover, the proposed multitask CNP (MTCNP) networks serve as surrogate models within a Bayesian optimization framework, replacing the Gaussian process (GP) to avoid its expensive covariance computation. The framework infers multiple tasks simultaneously, exploiting possible dependencies among them to share knowledge across tasks. The proposed surrogate models augment the observed dataset with data from related tasks so that model parameters can be estimated confidently. Experimental studies under several scenarios indicate that the proposed algorithms are competitive with GP-based, single-task, and other multitask-model-based Bayesian optimization methods.
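The abstract describes the approach only at a high level. As a rough illustration of the idea, the sketch below shows one way a CNP-style multitask surrogate with a learnable cross-task correlation layer could drive a single Bayesian-optimization step. This is a minimal sketch under assumed design choices, not the paper's implementation: the names MTCNPSurrogate, task_mix, and expected_improvement, the architecture sizes, and the use of a simple task-mixing matrix as a stand-in for the paper's correlation learning layer are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class MTCNPSurrogate(nn.Module):
    """A CNP-like surrogate for several related tasks sharing one encoder."""

    def __init__(self, x_dim: int, num_tasks: int, hidden: int = 64):
        super().__init__()
        self.num_tasks = num_tasks
        # Shared encoder: maps each context pair (x, y) to a representation.
        self.encoder = nn.Sequential(
            nn.Linear(x_dim + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden),
        )
        # Hypothetical stand-in for the paper's correlation learning layer:
        # a learnable matrix that mixes the per-task summary vectors.
        self.task_mix = nn.Parameter(torch.eye(num_tasks))
        # Per-task decoder heads emitting a Gaussian mean and log-variance.
        self.decoders = nn.ModuleList(
            nn.Sequential(
                nn.Linear(hidden + x_dim, hidden), nn.ReLU(),
                nn.Linear(hidden, 2),
            )
            for _ in range(num_tasks)
        )

    def forward(self, ctx_x, ctx_y, tgt_x):
        # ctx_x: (T, N, x_dim), ctx_y: (T, N, 1), tgt_x: (M, x_dim)
        rep = self.encoder(torch.cat([ctx_x, ctx_y], dim=-1)).mean(dim=1)  # (T, H)
        rep = self.task_mix @ rep  # blend task summaries via learned correlations
        means, variances = [], []
        for t, head in enumerate(self.decoders):
            r = rep[t].expand(tgt_x.shape[0], -1)          # broadcast to targets
            out = head(torch.cat([r, tgt_x], dim=-1))
            means.append(out[:, 0])
            variances.append(out[:, 1].exp())              # keep variance positive
        return torch.stack(means), torch.stack(variances)  # each (T, M)


def expected_improvement(mean, var, best_y):
    """EI for minimization under a Gaussian predictive distribution."""
    std = var.clamp_min(1e-9).sqrt()
    z = (best_y - mean) / std
    normal = torch.distributions.Normal(0.0, 1.0)
    return (best_y - mean) * normal.cdf(z) + std * normal.log_prob(z).exp()


# One illustrative BO step for task 0 with toy data: the surrogate scores
# random candidates, and the point maximizing EI is queried next.
model = MTCNPSurrogate(x_dim=2, num_tasks=3)
ctx_x, ctx_y = torch.randn(3, 8, 2), torch.randn(3, 8, 1)  # 8 observations/task
candidates = torch.rand(128, 2)
mean, var = model(ctx_x, ctx_y, candidates)
ei = expected_improvement(mean[0], var[0], ctx_y[0].min())
x_next = candidates[ei.argmax()]
```

Because the surrogate outputs a Gaussian predictive mean and variance per task, it can slot into standard acquisition functions such as EI in place of a GP, which is the substitution the abstract describes.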

MeSH terms

  • Algorithms*
  • Bayes Theorem
  • Learning
  • Neural Networks, Computer*