Awesome Self-Supervised Learning for Time Series (SSL4TS)


A professionally curated list of awesome resources (papers, code, data, etc.) on Self-Supervised Learning for Time Series (SSL4TS). To the best of our knowledge, this is the first work to comprehensively and systematically summarize the recent advances of self-supervised learning for modeling time series data.

We will continue to update this list with the newest resources. If you find any missed resources (paper/code) or errors, please feel free to open an issue or make a pull request.

For general AI for Time Series (AI4TS) Papers, Tutorials, and Surveys at the Top AI Conferences and Journals, please check This Repo.

Survey paper

Self-Supervised Learning for Time Series Analysis: Taxonomy, Progress, and Prospects

Kexin Zhang, Qingsong Wen, Chaoli Zhang, Rongyao Cai, Ming Jin, Yong Liu, James Zhang, Yuxuan Liang, Guansong Pang, Dongjin Song, Shirui Pan.

If you find this repository helpful for your work, please kindly cite our survey paper.

@article{zhang2023ssl4ts,
  title={Self-Supervised Learning for Time Series Analysis: Taxonomy, Progress, and Prospects},
  author={Kexin Zhang and Qingsong Wen and Chaoli Zhang and Rongyao Cai and Ming Jin and Yong Liu and James Zhang and Yuxuan Liang and Guansong Pang and Dongjin Song and Shirui Pan},
  journal={arXiv preprint arXiv:2306.10125},
  year={2023}
}

Taxonomy of Self-Supervised Learning for Time Series

Category of Self-Supervised Learning for Time Series

Generative-based Methods on SSL4TS

In this category, the pretext task is to generate the expected data from a given view of the input. For time series, the commonly used pretext tasks include forecasting future windows or specific time stamps from past values, reconstructing the input with an encoder and decoder, and predicting the unseen part of a masked time series. This section organizes the existing self-supervised representation learning methods for time series from three perspectives: autoregressive-based forecasting, autoencoder-based reconstruction, and diffusion-based generation. Note that the autoencoder-based reconstruction task is also commonly viewed as an unsupervised framework; in the SSL context, reconstruction serves only as the pretext task, and the final goal is to obtain representations from the autoencoder. The illustration of the generative-based SSL for time series is shown in Fig. 3.
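As a concrete illustration, here is a minimal PyTorch sketch of the masked-reconstruction pretext task (our own toy example, not any specific surveyed method): random time steps are hidden, a small GRU autoencoder reconstructs them, and the per-step encoder outputs serve as the learned representations.

```python
# Minimal masked-reconstruction pretext task (illustrative sketch only).
import torch
import torch.nn as nn

class MaskedAutoencoder(nn.Module):
    def __init__(self, n_features, d_hidden=64):
        super().__init__()
        self.encoder = nn.GRU(n_features, d_hidden, batch_first=True)
        self.decoder = nn.Linear(d_hidden, n_features)

    def forward(self, x):                      # x: (batch, time, features)
        h, _ = self.encoder(x)                 # per-step representations
        return self.decoder(h), h              # reconstruction + representation

model = MaskedAutoencoder(n_features=7)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(32, 96, 7)                    # toy batch of multivariate series
mask = torch.rand(32, 96, 1) < 0.15           # hide ~15% of the time steps
x_masked = x.masked_fill(mask, 0.0)

recon, h = model(x_masked)
loss = ((recon - x) ** 2)[mask.expand_as(x)].mean()  # error on masked steps only
loss.backward()
opt.step()
```

After pre-training, the decoder is discarded and `h` (or a pooled version of it) is fed to the downstream task.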

Autoregressive-based forecasting

  • Timeseries anomaly detection using temporal hierarchical one-class network, in NeurIPS, 2020. [paper]
  • Self-supervised transformer for sparse and irregularly sampled multivariate clinical time-series, in ACM Transactions on Knowledge Discovery from Data, 2022. [paper]
  • Graph neural network-based anomaly detection in multivariate time series, in AAAI, 2021. [paper]
  • Semi-supervised time series classification model with self-supervised learning, in Engineering Applications of Artificial Intelligence, 2022. [paper]

Autoencoder-based reconstruction

  • TimeNet: Pre-trained deep recurrent neural network for time series classification, in arXiv, 2017. [paper]
  • Unsupervised pre-training of a deep LSTM-based stacked autoencoder for multivariate time series forecasting problems, in Scientific Reports, 2019. [paper]
  • Autowarp: Learning a warping distance from unlabeled time series using sequence autoencoders, in NeurIPS, 2018. [paper]
  • Practical approach to asynchronous multivariate time series anomaly detection and localization, in KDD, 2021. [paper]
  • Learning representations for time series clustering, in NeurIPS, 2019. [paper]
  • USAD: Unsupervised anomaly detection on multivariate time series, in KDD, 2020. [paper]
  • Learning sparse latent graph representations for anomaly detection in multivariate time series, in KDD, 2022. [paper]
  • Wind turbine fault detection using a denoising autoencoder with temporal information, in IEEE/ASME Transactions on Mechatronics, 2018. [paper]
  • Denoising temporal convolutional recurrent autoencoders for time series classification, in Information Sciences, 2022. [paper]
  • Pre-training enhanced spatial-temporal graph neural network for multivariate time series forecasting, in KDD, 2022. [paper]
  • A transformer-based framework for multivariate time series representation learning, in KDD, 2021. [paper]
  • Multi-variate time series forecasting on variable subsets, in KDD, 2022. [paper]
  • TARNet: Task-aware reconstruction for time-series transformer, in KDD, 2022. [paper]
  • Learning latent seasonal-trend representations for time series forecasting, in NeurIPS, 2022. [paper] [repo]
  • Multivariate time series anomaly detection and interpretation using hierarchical inter-metric and temporal embedding, in KDD, 2021. [paper]
  • Robust anomaly detection for multivariate time series through stochastic recurrent neural network, in KDD, 2019. [paper]
  • GRELEN: Multivariate time series anomaly detection from the perspective of graph relational learning, in IJCAI, 2022. [paper]
  • Deep variational graph convolutional recurrent network for multivariate time series anomaly detection, in ICML, 2022. [paper]
  • Heteroscedastic temporal variational autoencoder for irregularly sampled time series, in ICLR, 2022. [paper]
  • Learning from irregularly-sampled time series: A missing data perspective, in ICML, 2020. [paper]

Diffusion-based generation

  • CSDI: Conditional score-based diffusion models for probabilistic time series imputation, in NeurIPS, 2021. [paper]
  • Autoregressive denoising diffusion models for multivariate probabilistic time series forecasting, in ICML, 2021. [paper]
  • Generative time series forecasting with diffusion, denoise, and disentanglement, in NeurIPS, 2022. [paper]
  • ImDiffusion: Imputed diffusion models for multivariate time series anomaly detection, in arXiv, 2023. [paper]
  • Diffusion-based time series imputation and forecasting with structured state space models, in Transactions on Machine Learning Research, 2022. [paper]
  • DiffLoad: Uncertainty quantification in load forecasting with diffusion model, in arXiv, 2023. [paper]
  • DiffSTG: Probabilistic spatio-temporal graph forecasting with denoising diffusion models, in arXiv, 2023. [paper]

Contrastive-based Methods on SSL4TS

Contrastive learning is a widely used self-supervised learning strategy that has shown strong learning ability in computer vision and natural language processing. Unlike discriminative models, which learn a mapping to true labels, and generative models, which try to reconstruct inputs, contrastive-based methods learn data representations by contrasting positive and negative samples: positive samples should have similar representations, while negative samples should have dissimilar ones. The selection of positive and negative samples is therefore crucial to contrastive-based methods. This section organizes and summarizes the existing contrastive-based methods in time series modeling according to how positive and negative samples are selected. The illustration of the contrastive-based SSL for time series is shown in Fig. 4.
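As a toy example of this idea, the sketch below (our own illustration; the jitter augmentation and all names are assumptions, not a surveyed method) builds two augmented views of each series and trains an encoder with the InfoNCE loss, where the other series in the batch act as negatives.

```python
# Minimal augmentation-contrast example with InfoNCE (illustrative sketch only).
import torch
import torch.nn.functional as F

def jitter(x, sigma=0.03):                  # simple time-series augmentation
    return x + sigma * torch.randn_like(x)

def info_nce(z1, z2, temperature=0.1):
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature      # pairwise similarities (batch x batch)
    labels = torch.arange(z1.size(0))       # positive pairs lie on the diagonal
    return F.cross_entropy(logits, labels)

# Stand-in encoder; any sequence backbone (TCN, Transformer, ...) works here.
encoder = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(96 * 7, 128))

x = torch.randn(32, 96, 7)                  # toy batch: (batch, time, features)
loss = info_nce(encoder(jitter(x)), encoder(jitter(x)))  # two views per series
loss.backward()
```

Methods differ mainly in how the positive views and negatives are chosen, which is exactly the axis used to organize the subsections below.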

Sampling contrast

  • Unsupervised scalable representation learning for multivariate time series, in NeurIPS, 2019. [paper]
  • Unsupervised representation learning for time series with temporal neighborhood coding, in ICLR, 2021. [paper]
  • Neighborhood contrastive learning applied to online patient monitoring, in ICML, 2021. [paper]

Prediction contrast

  • Representation learning with contrastive predictive coding, in arXiv, 2018. [paper]
  • Detecting anomalies within time series using local neural transformations, in arXiv, 2022. [paper]
  • Contrastive predictive coding for anomaly detection in multi-variate time series data, in arXiv, 2022. [paper]
  • Time series change point detection with self-supervised contrastive predictive coding, in WWW, 2021. [paper]
  • Time Series Anomaly Detection using Skip-Step Contrastive Predictive Coding, in NeurIPS Workshop: Self-Supervised Learning-Theory and Practice, 2022. [paper]
  • Stock trend prediction with multi-granularity data: A contrastive learning approach with adaptive fusion, in CIKM, 2021. [paper]
  • Time-series representation learning via temporal and contextual contrasting, in IJCAI, 2021. [paper]
  • Self-supervised contrastive representation learning for semi-supervised time-series classification, in arXiv, 2022. [paper]

Augmentation contrast

  • TS2Vec: Towards universal representation of time series, in AAAI, 2022. [paper]
  • CoST: Contrastive learning of disentangled seasonal-trend representations for time series forecasting, in ICLR, 2022. [paper]
  • Unsupervised time-series representation learning with iterative bilinear temporal-spectral fusion, in ICML, 2022. [paper]
  • Self-supervised contrastive pre-training for time series via time-frequency consistency, in NeurIPS, 2022. [paper]
  • TimeCLR: A self-supervised contrastive learning framework for univariate time series representation, in Knowledge-Based Systems, 2022. [paper]
  • CLOCS: Contrastive learning of cardiac signals across space, time, and patients, in ICML, 2021. [paper]
  • Contrastive learning for unsupervised domain adaptation of time series, in arXiv, 2022. [paper]
  • Valve Stiction Detection Using Multitimescale Feature Consistent Constraint for Time-Series Data, in IEEE/ASME Transactions on Mechatronics, 2022. [paper]
  • Multi-Granularity Residual Learning with Confidence Estimation for Time Series Prediction, in WWW, 2022. [paper]
  • Stock trend prediction with multi-granularity data: A contrastive learning approach with adaptive fusion, in CIKM, 2021. [paper]
  • Self-supervised learning with attention-based latent signal augmentation for sleep staging with limited labeled data, in IJCAI, 2022. [paper]
  • DCdetector: Dual Attention Contrastive Representation Learning for Time Series Anomaly Detection, in arXiv, 2023. [paper]
  • Time-series representation learning via temporal and contextual contrasting, in IJCAI, 2021. [paper]
  • Self-supervised contrastive representation learning for semi-supervised time-series classification, in arXiv, 2022. [paper]

Prototype contrast

  • ShapeNet: A shapelet-neural network approach for multivariate time series classification, in AAAI, 2021. [paper]
  • TapNet: Multivariate time series classification with attentional prototypical network, in AAAI, 2020. [paper]
  • Learning discriminative virtual sequences for time series classification, in CIKM, 2020. [paper]
  • MHCCL: Masked Hierarchical Cluster-Wise Contrastive Learning for Multivariate Time Series, in AAAI, 2023. [paper]

Expert knowledge contrast

  • Self-supervised pre-training for time series classification, in IJCNN, 2021. [paper]
  • Utilizing expert features for contrastive learning of time-series representations, in ICML, 2022. [paper]
  • SleepPriorCL: Contrastive representation learning with prior knowledge-based positive mining and adaptive temperature for sleep staging, in arXiv, 2021. [paper]

Adversarial-based Methods on SSL4TS

Adversarial-based self-supervised representation learning methods use generative adversarial networks (GANs) to construct pretext tasks. A GAN consists of a generator $\mathcal{G}$ and a discriminator $\mathcal{D}$: the generator $\mathcal{G}$ produces synthetic data that resembles real data, while the discriminator $\mathcal{D}$ judges whether a given sample is real or synthetic. The goal of the generator is therefore to maximize the discriminator's failure rate, and the goal of the discriminator is to minimize it. According to the final task, the existing adversarial-based representation learning methods can be divided into (i) time series generation and imputation and (ii) auxiliary representation enhancement. The illustration of the adversarial-based SSL for time series is shown in Fig. 5.
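The sketch below shows this adversarial game on time series data in its most basic form (a deliberately simplified illustration with MLP networks; real methods such as TimeGAN use recurrent or attention-based architectures and additional losses):

```python
# Minimal GAN training step for synthetic time series (illustrative sketch only).
import torch
import torch.nn as nn

T, C, Z = 96, 7, 32                         # series length, channels, noise dim
G = nn.Sequential(nn.Linear(Z, 256), nn.ReLU(), nn.Linear(256, T * C))
D = nn.Sequential(nn.Flatten(), nn.Linear(T * C, 256), nn.ReLU(), nn.Linear(256, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real = torch.randn(64, T, C)                # stand-in for a real data batch
fake = G(torch.randn(64, Z)).view(64, T, C)

# Discriminator step: minimize its failure rate on real vs. synthetic data.
d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: maximize the discriminator's failure rate on synthetic data.
g_loss = bce(D(fake), torch.ones(64, 1))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```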

Time series generation and imputation

  • C-RNN-GAN: Continuous recurrent neural networks with adversarial training, in arXiv, 2016. [paper]
  • Time-series generative adversarial networks, in NeurIPS, 2019. [paper]
  • TTS-GAN: A transformer-based time-series generative adversarial network, in AIME, 2022. [paper]
  • E2GAN: End-to-end generative adversarial network for multivariate time series imputation, in IJCAI, 2019. [paper]
  • Generating multivariate time series with COmmon Source CoordInated GAN (COSCI-GAN), in NeurIPS, 2022. [paper]
  • PSA-GAN: Progressive self attention GANs for synthetic time series, in ICLR, 2022. [paper]
  • GT-GAN: General Purpose Time Series Synthesis with Generative Adversarial Networks, in NeurIPS, 2022. [paper]
  • Multivariate time series imputation with generative adversarial networks, in NeurIPS, 2018. [paper]
  • Generative semi-supervised learning for multivariate time series imputation, in AAAI, 2021. [paper]

Auxiliary representation enhancement

  • USAD: Unsupervised anomaly detection on multivariate time series, in KDD, 2020. [paper]
  • Anomaly transformer: Time series anomaly detection with association discrepancy, in ICLR, 2022. [paper]
  • Deep unsupervised binary coding networks for multivariate time series retrieval, in AAAI, 2020. [paper]
  • Learning representations for time series clustering, in NeurIPS, 2019. [paper]
  • Adversarial sparse transformer for time series forecasting, in NeurIPS, 2020. [paper]
  • ACT: Adversarial Convolutional Transformer for Time Series Forecasting, in IJCNN, 2022. [paper]
  • BeatGAN: Anomalous rhythm detection using adversarially generated time series, in IJCAI, 2019. [paper]
  • Adversarial unsupervised representation learning for activity time-series, in AAAI, 2019. [paper]

Applications and Datasets on SSL4TS

Anomaly Detection

| Dataset | Size (train / test) | Dimension | Source | Link | Comment |
|---------|---------------------|-----------|--------|------|---------|
| PSM | 132,481 / 87,841 | 26 | [paper] | [link] | Anomaly ratio: 27.80% |
| SMD | 708,405 / 708,405 | 38 | [paper] | [link] | Anomaly ratio: 4.16% |
| MSL | 58,317 / 73,729 | 55 | [paper] | [link] | Anomaly ratio: 10.72% |
| SMAP | 135,183 / 427,617 | 25 | [paper] | [link] | Anomaly ratio: 13.13% |
| SWaT | 475,200 / 449,919 | 51 | [paper] | [link] | Anomaly ratio: 12.98% |
| WADI | 1,048,571 / 172,801 | 103 | [paper] | [link] | Anomaly ratio: 5.99% |

Forecasting

| Dataset | Size | Dimension | Source | Link | Comment |
|---------|------|-----------|--------|------|---------|
| ETTh | 17,420 | 7 | [paper] | [link] | Sampling interval: 1 hour |
| ETTm | 69,680 | 7 | [paper] | [link] | Sampling interval: 15 min |
| Wind | 10,957 | 28 | N/A | [link] | Sampling interval: 1 day |
| Electricity | 26,304 | 321 | N/A | [link] | Sampling interval: 1 hour |
| ILI | 966 | 7 | N/A | [link] | Sampling interval: 1 week |
| Weather | 52,696 | 21 | N/A | [link] | Sampling interval: 10 min |
| Traffic | 17,544 | 862 | N/A | [link] | Sampling interval: 1 hour |
| Exchange | 7,588 | 8 | [paper] | [link] | Sampling interval: 1 day |
| Solar | 52,560 | 137 | N/A | [link] | Sampling interval: 10 min |

Classification and Clustering

| Dataset | Size | Dimension | Source | Link | Comment |
|---------|------|-----------|--------|------|---------|
| HAR | 173,056 / 173,056 (train / test) | 9 | [paper] | [link] | Classes: 6 |
| UCR | 130 (128 × M) | 1 | [paper] | [link] | N/A |
| UEA | 30 (30 × M) | D | [paper] | [link] | N/A |

Time Series Related Survey

  • Large Models for Time Series and Spatio-Temporal Data: A Survey and Outlook, in arXiv 2023. [paper] [Website]
  • Self-Supervised Learning for Time Series Analysis: Taxonomy, Progress, and Prospects, in arXiv 2023. [paper] [Website]
  • A Survey on Graph Neural Networks for Time Series: Forecasting, Classification, Imputation, and Anomaly Detection, in arXiv 2023. [paper] [Website]
  • Transformers in Time Series: A Survey, in IJCAI 2023. [paper] [link]
  • Time series data augmentation for deep learning: a survey, in IJCAI 2021. [paper]
  • Neural temporal point processes: a review, in IJCAI 2021. [paper]
  • Time-series forecasting with deep learning: a survey, in Philosophical Transactions of the Royal Society A 2021. [paper]
  • Deep learning for time series forecasting: a survey, in Big Data 2021. [paper]
  • Neural forecasting: Introduction and literature overview, in arXiv 2020. [paper]
  • Deep learning for anomaly detection in time-series data: review, analysis, and guidelines, in IEEE Access 2021. [paper]
  • A review on outlier/anomaly detection in time series data, in ACM Computing Surveys 2021. [paper]
  • A unifying review of deep and shallow anomaly detection, in Proceedings of the IEEE 2021. [paper]
  • Deep learning for time series classification: a review, in Data Mining and Knowledge Discovery 2019. [paper]
  • More related time series surveys, tutorials, and papers can be found at this repo.

Self-Supervised Learning Tutorial/Survey in Other Disciplines

  • A cookbook of self-supervised learning, in arXiv 2023. [paper]
  • Self-supervised Learning: Generative or Contrastive, in TKDE 2021. [paper]
  • A Survey on Self-Supervised Learning for Non-Sequential Tabular Data, in arXiv 2024. [paper]
