Disentangled Dynamic Deviation Transformer Networks for Multivariate Time Series Anomaly Detection

Sensors (Basel). 2023 Jan 18;23(3):1104. doi: 10.3390/s23031104.

Abstract

Graph neural networks have been widely used in multivariate time series anomaly detection to model the dependencies among system sensors. Previous studies have focused on learning fixed dependency patterns between sensors. However, they ignore that the inter-sensor and temporal dependencies of time series are highly nonlinear and dynamic, leading to inevitable false alarms. In this paper, we propose a novel disentangled dynamic deviation transformer network (D3TN) for anomaly detection in multivariate time series, which jointly exploits multiscale dynamic inter-sensor dependencies and long-term temporal dependencies to improve the accuracy of multivariate time series prediction. Specifically, to learn fixed inter-sensor dependencies from the static topology, we design a novel disentangled multiscale aggregation scheme that disentangles the multiscale graph convolution and better represents the hidden dependencies between sensors. To capture dynamic inter-sensor dependencies determined by real-time monitoring conditions and unexpected anomalies, we introduce a self-attention mechanism that models dynamic directed interactions in various potential subspaces influenced by various factors. In addition, complex temporal correlations across multiple time steps are captured by processing the time series in parallel. Experiments on three real datasets show that the proposed D3TN significantly outperforms state-of-the-art methods.
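The abstract describes a prediction-based detection scheme: a model forecasts each sensor's next values, and anomalies are flagged where observations deviate strongly from the forecast. The sketch below is a minimal, hypothetical illustration of that final scoring step only (it is not the D3TN architecture itself): per-sensor prediction errors are normalized with robust statistics so sensors on different scales are comparable, and each time step is scored by its worst-deviating sensor.

```python
import numpy as np

def deviation_scores(observed, predicted, eps=1e-8):
    """Score each time step by its largest normalized prediction deviation.

    observed, predicted: arrays of shape (T, N) -- T time steps, N sensors.
    Per-sensor errors are normalized by robust statistics (median / IQR)
    so that sensors on different scales contribute comparably.
    """
    err = np.abs(observed - predicted)                # (T, N) absolute deviations
    med = np.median(err, axis=0)                      # per-sensor median error
    p75, p25 = np.percentile(err, [75, 25], axis=0)   # per-sensor IQR
    norm = (err - med) / ((p75 - p25) + eps)          # robust z-score per sensor
    return norm.max(axis=1)                           # worst sensor per time step

# Usage: flag time steps whose score exceeds a threshold tuned on validation data.
rng = np.random.default_rng(0)
pred = rng.normal(size=(100, 3))                      # stand-in model forecasts
obs = pred + rng.normal(scale=0.1, size=(100, 3))     # observations with small noise
obs[50] += 5.0                                        # inject an anomaly at t=50
scores = deviation_scores(obs, pred)
print(int(scores.argmax()))                           # -> 50, the injected anomaly
```

Any forecasting model, graph-transformer or otherwise, can feed such a scoring stage; the choice of robust normalization and the max-over-sensors reduction are illustrative assumptions here, not details taken from the paper.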

Keywords: anomaly detection; graph neural networks; time series; transformer.