Graph Attention Transformer for Unsupervised Multivariate Time Series Anomaly Detection

Tzu Hsuan Hsu*, Yu Chee Tseng, Jen Jee Chen

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

This paper studies anomaly detection in multivariate time series data. Previous methods typically flag anomalies by measuring the difference between reconstructed or forecasted values and the observed data. The main challenges are recognition rate, scalability, and the lack of anomaly labels. We propose a self-supervised model that treats the dependencies among multiple time series as a graph, applies a modified Transformer encoder with graph attention to learn features, and adopts a GRU to forecast future data. In addition, a data selection policy with data offsetting and data dropping is designed to filter out outliers in a self-supervised way, helping the model retrieve useful features and avoid data imbalance. The model is validated on two real-world datasets and outperforms state-of-the-art models by about 1.0 in F1-score.
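The abstract gives only a high-level view of the architecture. The following is a minimal PyTorch sketch of the kind of pipeline it describes: attention across the series dimension as a stand-in for graph attention, a Transformer encoder over time, and a GRU that forecasts the next step. The class name GATForecaster, all layer sizes, and the scoring rule are illustrative assumptions, not the authors' implementation; the paper's data selection policy (offsetting and dropping) is omitted.

```python
# Sketch only: module structure and dimensions are assumptions, not the paper's code.
import torch
import torch.nn as nn


class GATForecaster(nn.Module):
    """Attention over series, Transformer encoder over time, GRU forecaster."""

    def __init__(self, n_series, window, d_model=64):
        super().__init__()
        # Attention across the series dimension: each series' window of values
        # serves as that node's feature vector (stand-in for graph attention).
        self.graph_attn = nn.MultiheadAttention(embed_dim=window, num_heads=1,
                                                batch_first=True)
        # Transformer encoder over the time dimension.
        self.proj = nn.Linear(n_series, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        # GRU head forecasts the next time step for all series.
        self.gru = nn.GRU(d_model, d_model, batch_first=True)
        self.out = nn.Linear(d_model, n_series)

    def forward(self, x):                        # x: (batch, window, n_series)
        nodes = x.transpose(1, 2)                # (batch, n_series, window), one node per series
        attended, _ = self.graph_attn(nodes, nodes, nodes)
        x = attended.transpose(1, 2)             # back to (batch, window, n_series)
        h = self.encoder(self.proj(x))           # temporal features: (batch, window, d_model)
        _, last = self.gru(h)                    # final hidden state: (1, batch, d_model)
        return self.out(last.squeeze(0))         # next-step forecast: (batch, n_series)


model = GATForecaster(n_series=25, window=100)
x = torch.randn(8, 100, 25)                      # a batch of sliding windows
y_hat = model(x)                                 # predicted next values, shape (8, 25)
# An anomaly score can then be derived from the deviation between y_hat and
# the observed next time step, as the abstract's forecasting-based detection suggests.
```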

Original language: English
Journal: CEUR Workshop Proceedings
Volume: 3343
State: Published - 2022
Event: 2022 MACLEAN: MAChine Learning for EArth ObservatioN Workshop, MACLEAN 2022 - Grenoble, France
Duration: 18 Sep 2022 - 22 Sep 2022

Keywords

  • Anomaly detection
  • Graph Neural Network
  • GRU-forecasting
  • Multivariate time series data
  • Spacecraft telemetry data
  • Transformer
