Generalized Convolution Spectral Mixture for Multitask Gaussian Processes

IEEE Trans Neural Netw Learn Syst. 2020 Dec;31(12):5613-5623. doi: 10.1109/TNNLS.2020.2980779. Epub 2020 Nov 30.

Abstract

Multitask Gaussian processes (MTGPs) are a powerful approach for modeling dependencies between multiple related tasks or functions for joint regression. Current kernels for MTGPs cannot fully model nonlinear task correlations and other types of dependencies. In this article, we address this limitation. We focus on spectral mixture (SM) kernels and propose an enhancement of kernels of this type, called the multitask generalized convolution SM (MT-GCSM) kernel. The MT-GCSM kernel can model nonlinear task correlations and dependence between components, including time- and phase-delay dependence. Each task in MT-GCSM has its own GCSM kernel with its own number of convolution structures, and dependencies between all components from different tasks are considered. Another constraint of current kernels for MTGPs is that components from different tasks are aligned. Here, we lift this constraint by using inner and outer full cross convolution between a base component and the reversed complex conjugate of another base component. Extensive experiments on two synthetic and three real-life data sets illustrate the difference between MT-GCSM and previous SM kernels as well as the practical effectiveness of MT-GCSM.
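For background, the single-task spectral mixture (SM) kernel that MT-GCSM generalizes models a stationary covariance as a weighted sum of Gaussian-envelope cosines, one per spectral component. The sketch below is an illustrative implementation of that standard single-output SM kernel only (it does not implement the paper's cross-task convolution); the function name and parameterization are our own choices.

```python
import numpy as np

def sm_kernel(tau, weights, means, variances):
    """Standard single-task spectral mixture kernel on lag tau (1-D inputs).

    Each component q contributes
        w_q * exp(-2 * pi^2 * tau^2 * v_q) * cos(2 * pi * tau * mu_q),
    the inverse Fourier transform of a Gaussian spectral density with
    mean frequency mu_q and variance v_q. Illustrative sketch; MT-GCSM
    extends this with convolution structures and cross-task terms.
    """
    tau = np.asarray(tau, dtype=float)
    k = np.zeros_like(tau)
    for w, mu, v in zip(weights, means, variances):
        # Gaussian envelope (bandwidth) times cosine (peak frequency).
        k += w * np.exp(-2.0 * np.pi**2 * tau**2 * v) * np.cos(2.0 * np.pi * tau * mu)
    return k
```

Since the kernel is stationary, its maximum is at zero lag, where the cosines equal 1 and the value reduces to the sum of the component weights.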