CTFNet: Long-Sequence Time-Series Forecasting Based on Convolution and Time-Frequency Analysis

IEEE Trans Neural Netw Learn Syst. 2023 Aug 17:PP. doi: 10.1109/TNNLS.2023.3294064. Online ahead of print.

Abstract

Although current time-series forecasting methods have significantly improved the state-of-the-art (SOTA) results for long-sequence time-series forecasting (LSTF), they still struggle to capture and extract the features and dependencies of long-term sequences and suffer from information-utilization bottlenecks and high computational complexity. To address these issues, a lightweight single-hidden-layer feedforward neural network (SLFN) called CTFNet, which combines convolution mapping and time-frequency decomposition, is proposed with three distinctive characteristics. First, time-domain (TD) feature mining: a matrix-factorization-based method is proposed to extract the long-term correlations of horizontal TD features, effectively capturing the interdependence among different sample points of a long time series. Second, multitask frequency-domain (FD) feature mining: different frequency features of the time-series data are extracted from the FD while minimizing the loss of data features. By integrating multiscale dilated convolutions, the model attends to both global and local context dependencies at the sequence level and mines the long-term dependencies of the multiscale frequency information together with the spatial dependencies among frequency information at different scales, breaking the bottleneck of data utilization and preserving the integrity of feature extraction. Third, high efficiency: CTFNet has a short training time and fast inference speed. Our empirical studies on nine benchmark datasets show that, compared with SOTA methods, CTFNet reduces prediction error by 64.7% and 53.7% for multivariate and univariate time series, respectively.
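
To make the two ingredients named in the abstract concrete, the following is a minimal PyTorch sketch (not the authors' implementation) of FFT-based FD feature extraction combined with multiscale dilated convolutions over the TD signal. All module names, layer sizes, kernel sizes, and dilation rates here are illustrative assumptions, not CTFNet's actual configuration.

    # Minimal sketch, assuming a univariate input series; not the authors' code.
    import torch
    import torch.nn as nn


    class MultiScaleDilatedBlock(nn.Module):
        """Parallel 1-D convolutions with different dilation rates, concatenated."""

        def __init__(self, in_ch: int, out_ch: int, dilations=(1, 2, 4)):
            super().__init__()
            # padding = dilation keeps the sequence length unchanged for kernel_size=3
            self.branches = nn.ModuleList(
                [nn.Conv1d(in_ch, out_ch, kernel_size=3, padding=d, dilation=d)
                 for d in dilations]
            )

        def forward(self, x):                     # x: (batch, in_ch, seq_len)
            return torch.cat([branch(x) for branch in self.branches], dim=1)


    class TimeFrequencySketch(nn.Module):
        """Fuses TD multiscale-convolution features with FD magnitude-spectrum features."""

        def __init__(self, seq_len: int, pred_len: int, hidden: int = 64):
            super().__init__()
            self.td = MultiScaleDilatedBlock(in_ch=1, out_ch=hidden)
            n_freq = seq_len // 2 + 1             # number of rFFT bins
            self.fd = nn.Linear(n_freq, hidden)
            self.head = nn.Linear(3 * hidden * seq_len + hidden, pred_len)

        def forward(self, x):                     # x: (batch, seq_len)
            td_feat = self.td(x.unsqueeze(1)).flatten(1)          # TD features
            fd_feat = self.fd(torch.abs(torch.fft.rfft(x, dim=-1)))  # FD features
            return self.head(torch.cat([td_feat, fd_feat], dim=-1))


    if __name__ == "__main__":
        model = TimeFrequencySketch(seq_len=96, pred_len=24)
        y = model(torch.randn(8, 96))             # -> (8, 24) forecast
        print(y.shape)

The single fusion layer mirrors the SLFN flavor of the model described above; the real CTFNet additionally includes the matrix-factorization-based TD correlation module and multitask FD decomposition, which are omitted from this sketch.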