Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [Paper].
🚩 News (2025.01) Timer-XL has been accepted by ICLR 2025. See you in Singapore :)
🚩 News (2024.12) Released a univariate pre-trained version [HuggingFace]. A quickstart example is provided here; a minimal sketch also follows below.
🚩 News (2024.10) Model implementation is released in [OpenLTM].
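For reference, a minimal quickstart sketch assuming the checkpoint follows the standard `transformers` `AutoModelForCausalLM` interface with `trust_remote_code`. The checkpoint id below is the earlier Timer base model used as a stand-in; substitute the Timer-XL checkpoint id from the HuggingFace link above. Input and output lengths are illustrative.

```python
import torch
from transformers import AutoModelForCausalLM

# Checkpoint id is a stand-in (earlier Timer base model); replace it with the
# Timer-XL checkpoint from the HuggingFace link above.
model = AutoModelForCausalLM.from_pretrained("thuml/timer-base-84m", trust_remote_code=True)

seqs = torch.randn(1, 2880)                      # (batch, lookback length), univariate
pred = model.generate(seqs, max_new_tokens=96)   # autoregressively forecast 96 points
print(pred.shape)
```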
Timer-XL is a generative Transformer for time series forecasting. It can be used for task-specific training or scalable pre-training, handling time series of arbitrary length and any number of variables.
💪 Various forecasting tasks can be formulated as a long-context generation problem, which is well addressed by generative Transformers.
💡 We propose multivariate next token prediction, a paradigm to uniformly predict univariate and multivariate time series with optional covariates.
🌟 We pre-train Timer-XL, a long-context version of time-series Transformers (Timer), for zero-shot forecasting.
🏆 Timer-XL achieves state-of-the-art performance as a one-for-all time series forecaster.
For our previous work, please refer to Time-Series-Transformer (Timer).
| Time-Series Transformers | PatchTST | iTransformer | Moirai | Timer | Timer-XL (Ours) |
|---|---|---|---|---|---|
| Generative | No | No | No | Yes | Yes |
| Intra-Series Modeling | Yes | No | Yes | Yes | Yes |
| Inter-Series Modeling | No | Yes | Yes | No | Yes |
We generalize next token prediction to multivariate time series. Each token is predicted from all preceding tokens across multiple variables:
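As a hedged sketch of the objective (the notation here is ours; see the paper for the exact formulation): with $N$ variables each tokenized into $T$ patch tokens, training maximizes

$$\prod_{n=1}^{N} \prod_{i=1}^{T-1} p_{\theta}\!\left(\mathbf{x}_{n,\,i+1} \,\middle|\, \left\{\mathbf{x}_{m,\,j} : 1 \le m \le N,\ 1 \le j \le i\right\}\right),$$

i.e., the next token of every variable is conditioned on all preceding tokens of all variables.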
We design TimeAttention, a causal self-attention allowing intra- and inter-series modeling while maintaining the causality and flexibility of generative Transformers. It can be applied to univariate and covariate-informed contexts, enabling unified time series forecasting.
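For intuition, here is a minimal sketch of such a mask (our reading of the design, not the official implementation; function and variable names are ours). Flattening the $N \times T$ token grid variable-major, position $(n, i)$ may attend to $(m, j)$ iff $j \le i$: causal along time for every pair of variables, fully visible across variables.

```python
import torch
import torch.nn.functional as F

def time_attention_mask(num_vars: int, num_tokens: int) -> torch.Tensor:
    """TimeAttention-style boolean mask over a flattened (variable, token) sequence.

    Sketch only: the layout and names here are our assumptions. True = attend.
    """
    causal = torch.tril(torch.ones(num_tokens, num_tokens))  # intra-series causality
    deps = torch.ones(num_vars, num_vars)                    # full inter-series visibility
    # Kronecker product combines the two into an (N*T, N*T) mask.
    return torch.kron(deps, causal).bool()

# Usage with PyTorch's scaled dot-product attention (True entries are kept):
N, T, d = 3, 4, 8
q = k = v = torch.randn(1, 1, N * T, d)
out = F.scaled_dot_product_attention(q, k, v, attn_mask=time_attention_mask(N, T))
```

The dependency matrix `deps` could be restricted (e.g., zeroing selected entries) to express covariate-informed settings where some variables serve as inputs only.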
If you find this repo helpful, please cite our paper.
```bibtex
@article{liu2024timer,
  title={Timer-XL: Long-Context Transformers for Unified Time Series Forecasting},
  author={Liu, Yong and Qin, Guo and Huang, Xiangdong and Wang, Jianmin and Long, Mingsheng},
  journal={arXiv preprint arXiv:2410.04803},
  year={2024}
}
```
We appreciate the following GitHub repositories for their valuable code and contributions:
- Time-Series-Library (https://github.com/thuml/Time-Series-Library)
- Large-Time-Series-Model (https://github.com/thuml/Large-Time-Series-Model)
- AutoTimes (https://github.com/thuml/AutoTimes)
If you have any questions or want to use the code, feel free to contact:
- Yong Liu ([email protected])
- Guo Qin ([email protected])