This repository collects papers and code in the time series domain.
├─ Linear/
├─ RNN and CNN/
├─ Transformer/
├─ GNN/
├─ LLM Framework/
├─ Diffusion Model/
├─ Benchmark and Dataset/
└─ Repositories/
- N-BEATS: Neural basis expansion analysis for interpretable time series forecasting, Oreshkin et al., ICLR 2020. [paper][n-beats][N-BEATS]
- DLinear: Are Transformers Effective for Time Series Forecasting?, Zeng et al., AAAI 2023. [paper][code][DiPE-Linear][TimeLinear]
- TSMixer: Lightweight MLP-Mixer Model for Multivariate Time Series Forecasting, Ekambaram et al., KDD 2023. [paper][model][example]
- FreTS: Frequency-domain MLPs are More Effective Learners in Time Series Forecasting, Yi et al., NeurIPS 2023. [paper][code][FilterNet]
- Tiny Time Mixers (TTMs): Fast Pretrained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series, Ekambaram et al., arXiv 2024. [paper][code]
- FCDNet: Frequency-Guided Complementary Dependency Modeling for Multivariate Time-Series Forecasting, Chen et al., arXiv 2023. [paper][code]
- SOFTS: Efficient Multivariate Time Series Forecasting with Series-Core Fusion, Han et al., NeurIPS 2024. [paper][code]
- SparseTSF: Modeling Long-term Time Series Forecasting with 1k Parameters, Lin et al., ICML 2024 Oral. [paper][code]
- TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting, Wang et al., ICLR 2024. [paper][code]
- DUET: Dual Clustering Enhanced Multivariate Time Series Forecasting, Qiu et al., KDD 2025. [paper][code]
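The decomposition-plus-linear recipe shared by DLinear and several of the MLP models above fits in a few lines. A minimal NumPy sketch, with random weights standing in for trained ones (`dlinear_forecast` and its parameters are illustrative names, not the authors' API):

```python
import numpy as np

def moving_average(x, k=25):
    """Edge-padded moving average over the last axis: the trend extractor
    used by decomposition-based linear models such as DLinear."""
    pad = k // 2
    xp = np.pad(x, ((0, 0), (pad, pad)), mode="edge")
    kernel = np.ones(k) / k
    return np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="valid"), 1, xp)

def dlinear_forecast(x, w_trend, w_season):
    """Decompose into trend + remainder, then map each component from
    lookback to horizon with its own linear layer (here: plain matrices)."""
    trend = moving_average(x)
    season = x - trend
    return trend @ w_trend + season @ w_season

rng = np.random.default_rng(0)
lookback, horizon = 96, 24
x = rng.standard_normal((7, lookback))            # 7 channels, 96 past steps
w_t = rng.standard_normal((lookback, horizon)) * 0.01
w_s = rng.standard_normal((lookback, horizon)) * 0.01
y = dlinear_forecast(x, w_t, w_s)
print(y.shape)  # (7, 24)
```

The point of the papers above is that this tiny model, once the two matrices are fit by least squares or SGD, is a surprisingly strong long-horizon baseline.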
- TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis, Wu et al., ICLR 2023. [paper][code][slides]
- RWKV-TS: Beyond Traditional Recurrent Neural Network for Time Series Tasks, Hou and Yu, arXiv 2024. [paper][code]
- Transformers in Time Series: A Survey, Wen et al., IJCAI 2023. [paper][code]
- Deep Time Series Models: A Comprehensive Survey and Benchmark, Wang et al., arXiv 2024. [paper][code]
- Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting, Zhou et al., AAAI 2021 Best paper. [paper][code]
- Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting, Wu et al., NeurIPS 2021. [paper][code][slides][ETSformer]
- Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy, Xu et al., ICLR 2022. [paper][code][slides][TranAD]
- Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting, Liu et al., NeurIPS 2022. [paper][code]
- iTransformer: Inverted Transformers Are Effective for Time Series Forecasting, Liu et al., ICLR 2024 Spotlight. [paper][code]
- Pyraformer: Low-Complexity Pyramidal Attention for Long-Range Time Series Modeling and Forecasting, Liu et al., ICLR 2022. [paper][code]
- FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting, Zhou et al., ICML 2022. [paper][code][DAMO-DI-ML]
- PatchTST: A Time Series is Worth 64 Words: Long-term Forecasting with Transformers, Nie et al., ICLR 2023. [paper][code]
- Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting, Zhang and Yan, ICLR 2023. [paper][code]
- TimeXer: Empowering Transformers for Time Series Forecasting with Exogenous Variables, Wang et al., NeurIPS 2024. [paper][code]
- UniTST: Effectively Modeling Inter-Series and Intra-Series Dependencies for Multivariate Time Series Forecasting, Liu et al., arXiv 2024. [paper]
- MetaTST: Metadata Matters for Time Series: Informative Forecasting with Transformers, Dong et al., arXiv 2024. [paper]
- Are Language Models Actually Useful for Time Series Forecasting?, Tan et al., NeurIPS 2024. [paper][code][CATS]
- Rethinking the Power of Timestamps for Robust Time Series Forecasting: A Global-Local Fusion Perspective, Wang et al., NeurIPS 2024. [paper][code][ChatTime]
- ElasTST: Towards Robust Varied-Horizon Forecasting with Elastic Time-Series Transformer, Zhang et al., NeurIPS 2024. [paper][code]
- Pathformer: Multi-scale Transformers with Adaptive Pathways for Time Series Forecasting, Chen et al., ICLR 2024. [paper][code]
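A recurring ingredient in the Transformer entries above, made explicit by PatchTST, is patching: slicing each channel's history into overlapping subseries that serve as tokens for a vanilla encoder. A sketch of just that tokenization step (`patchify` is an illustrative name, not the paper's code):

```python
import numpy as np

def patchify(x, patch_len=16, stride=8):
    """Split series of shape (channels, L) into overlapping patches of shape
    (channels, num_patches, patch_len). Each patch becomes one token, and
    PatchTST processes each channel independently through the same encoder."""
    _, L = x.shape
    n = (L - patch_len) // stride + 1
    idx = np.arange(patch_len)[None, :] + stride * np.arange(n)[:, None]
    return x[:, idx]

x = np.arange(2 * 96).reshape(2, 96)   # 2 channels, 96 time steps
p = patchify(x)
print(p.shape)  # (2, 11, 16)
```

With the defaults used in the paper's main setting (patch length 16, stride 8), a 96-step lookback yields 11 tokens per channel instead of 96, which is where most of the efficiency gain comes from.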
- A Survey on Graph Neural Networks for Time Series: Forecasting, Classification, Imputation, and Anomaly Detection, Jin et al., arXiv 2023. [paper][code]
- GPT-ST: Generative Pre-Training of Spatio-Temporal Graph Neural Networks, Li et al., NeurIPS 2023. [paper][code]
- FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure Graph Perspective, Yi et al., NeurIPS 2023. [paper][code]
- MSGNet: Learning Multi-Scale Inter-Series Correlations for Multivariate Time Series Forecasting, Cai et al., AAAI 2024. [paper][code]
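Most of the GNN forecasters above share one core move: infer a graph over channels, then mix information across channels with a message-passing hop. A toy NumPy sketch using a correlation-derived adjacency (a deliberate simplification: the listed papers learn the graph end-to-end, often in the frequency domain or at multiple scales):

```python
import numpy as np

def graph_mix(x):
    """One message-passing hop over a correlation-derived channel graph.
    x: (channels, L). Builds adjacency A from absolute cross-correlations,
    zeros the diagonal, row-normalizes, and mixes histories: h = A @ x."""
    A = np.abs(np.corrcoef(x))
    np.fill_diagonal(A, 0.0)
    A = A / A.sum(axis=1, keepdims=True)
    return A @ x

rng = np.random.default_rng(0)
x = rng.standard_normal((7, 96))   # 7 channels, 96 time steps
h = graph_mix(x)
print(h.shape)  # (7, 96)
```

Stacking such hops with learned (rather than fixed) adjacencies, plus a temporal module per channel, is the basic template these papers refine.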
- Large Models for Time Series and Spatio-Temporal Data: A Survey and Outlook, Jin et al., arXiv 2023. [paper][code]
- Large Language Models for Time Series: A Survey, Zhang et al., arXiv 2024. [paper][code]
- Large Language Models for Forecasting and Anomaly Detection: A Systematic Literature Review, Su et al., arXiv 2024. [paper]
- SimMTM: A Simple Pre-Training Framework for Masked Time-Series Modeling, Dong et al., NeurIPS 2023 Spotlight. [paper][code]
- One Fits All: Power General Time Series Analysis by Pretrained LM, Zhou et al., NeurIPS 2023 Spotlight. [paper][code][AI-for-Time-Series-Papers-Tutorials-Surveys][CALF]
- Large Language Models Are Zero-Shot Time Series Forecasters, Gruver et al., NeurIPS 2023. [paper][code]
- Lag-Llama: Towards Foundation Models for Time Series Forecasting, Rasul et al., arXiv 2023. [paper][code]
- TimesFM: A decoder-only foundation model for time-series forecasting, Das et al., ICML 2024. [paper][code]
- Time-LLM: Time Series Forecasting by Reprogramming Large Language Models, Jin et al., ICLR 2024. [paper][code]
- AutoTimes: Autoregressive Time Series Forecasters via Large Language Models, Liu et al., NeurIPS 2024. [paper][code]
- Timer: Generative Pre-trained Transformers Are Large Time Series Models, Liu et al., ICML 2024. [paper][code][Unified Time Series Dataset][website]
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting, Liu et al., arXiv 2024. [paper]
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling, Dong et al., ICML 2024. [paper][code]
- MOMENT: A Family of Open Time-series Foundation Models, Goswami et al., ICML 2024. [paper][code]
- Unified Training of Universal Time Series Forecasting Transformers, Woo et al., ICML 2024. [paper][code]
- Multi-Patch Prediction: Adapting LLMs for Time Series Representation Learning, Bian et al., arXiv 2024. [paper]
- UNITS: A Unified Multi-Task Time Series Model, Gao et al., NeurIPS 2024. [paper][code]
- Chronos: Learning the Language of Time Series, Ansari et al., arXiv 2024. [paper][code]
- Large language models can be zero-shot anomaly detectors for time series, Alnegheimish et al., arXiv 2024. [paper]
- Foundation Models for Time Series Analysis: A Tutorial and Survey, Liang et al., arXiv 2024. [paper][granite-tsfm]
- Are Language Models Actually Useful for Time Series Forecasting?, Tan et al., arXiv 2024. [paper][code]
- LETS-C: Leveraging Language Embedding for Time Series Classification, Kaur et al., arXiv 2024. [paper]
- Towards Neural Scaling Laws for Time Series Foundation Models, Yao et al., arXiv 2024. [paper]
- VisionTS: Visual Masked Autoencoders Are Free-Lunch Zero-Shot Time Series Forecasters, Chen et al., arXiv 2024. [paper][code]
- Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts, Shi et al., ICLR 2025. [paper][code][Moirai-MoE]
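Among the zero-shot LLM forecasters above, Gruver et al. stress that how numbers are serialized into tokens matters: digits should be separated so the tokenizer splits them predictably. A simplified sketch of a digit-level encoding in that spirit (not their exact scheme; `serialize` and the precision/scaling choices are illustrative):

```python
def serialize(values, precision=2):
    """Encode a numeric series as a digit string an LLM can continue:
    fix the precision, drop the decimal point, space-separate digits,
    and comma-separate time steps, e.g. [0.5, 1.23] -> "0 5 0 , 1 2 3"."""
    out = []
    for v in values:
        digits = f"{abs(v):.{precision}f}".replace(".", "")
        token = " ".join(digits)
        out.append(("-" if v < 0 else "") + token)
    return " , ".join(out)

print(serialize([0.5, 1.23, -2.0]))  # 0 5 0 , 1 2 3 , -2 0 0
```

Forecasting then amounts to sampling continuations of this string from a pretrained LLM and decoding the digits back into numbers.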
- Diffusion-TS: Interpretable Diffusion for General Time Series Generation, Yuan and Qiao, ICLR 2024. [paper][code]
- A Survey on Diffusion Models for Time Series and Spatio-Temporal Data, Yang et al., arXiv 2024. [paper][code]
- TimeDiT: General-purpose Diffusion Transformers for Time Series Foundation Model, Cao et al., arXiv 2024. [paper]
- UTSD: Unified Time Series Diffusion Model, Ma et al., arXiv 2024. [paper]
- Auto-Regressive Moving Diffusion Models for Time Series Forecasting, Gao et al., AAAI 2025. [paper][code]
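The diffusion models above all build on the standard closed-form forward-noising step; the listed papers differ in the denoiser architecture and conditioning, not in this step. A minimal NumPy sketch:

```python
import numpy as np

def q_sample(x0, t, betas, rng):
    """Closed-form forward diffusion: noise a clean series x0 to step t,
    x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps,
    where alpha_bar_t is the cumulative product of (1 - beta)."""
    alpha_bar = np.cumprod(1.0 - betas)
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps

betas = np.linspace(1e-4, 0.02, 1000)          # common linear schedule
x0 = np.sin(np.linspace(0.0, 2 * np.pi, 96))   # a clean toy series
xt = q_sample(x0, t=500, betas=betas, rng=np.random.default_rng(0))
print(xt.shape)  # (96,)
```

Training fits a network to predict `eps` from `(xt, t)` (optionally conditioned on the history), and forecasting runs the learned reverse process from pure noise.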
- TSPP: A Unified Benchmarking Tool for Time-series Forecasting, Bączek et al., arXiv 2023. [paper][code]
- TFB: Towards Comprehensive and Fair Benchmarking of Time Series Forecasting Methods, Qiu et al., arXiv 2024. [paper][code]
- A Survey of Generative Techniques for Spatial-Temporal Data Mining, Zhang et al., arXiv 2024. [paper]
- Time-MMD: A New Multi-Domain Multimodal Dataset for Time Series Analysis, Liu et al., arXiv 2024. [paper][code][MM-TSFlib]
- GIFT-Eval: A Benchmark For General Time Series Forecasting Model Evaluation, Aksu et al., arXiv 2024. [paper][code]
- [multivariate-time-series-data][ETDataset][Awesome-TimeSeries-SpatioTemporal-Diffusion-Model]
- [Time-Series-Library]
- [time-series-transformers-review][awesome-AI-for-time-series-papers][Awesome-TimeSeries-SpatioTemporal-LM-LLM][TSFpaper][deep-learning-time-series][LLMs4TS][awesome-time-series-papers]
- [statsforecast][neuralforecast][gluonts][Merlion][pytorch-forecasting][tsai][pytorch-transformer-ts][flow-forecast][pytorch-ts]
- [AIAlpha]
- [prophet][Kats][tsfresh][sktime][darts][tslearn][pyflux]