# Timer-XL

Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [Paper].

🚩 News (2025.01) Timer-XL has been accepted at ICLR 2025. See you in Singapore :)

🚩 News (2024.12) Released a univariate pre-trained version [HuggingFace]. A quickstart is provided here; see also the sketch below.

🚩 News (2024.10) The model implementation is released in [OpenLTM].
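For reference, a minimal zero-shot inference sketch with a pre-trained checkpoint. The checkpoint id `thuml/timer-base-84m`, the context length, and the `generate` call are assumptions based on typical `transformers` remote-code usage; consult the HuggingFace model card for the exact API.

```python
import torch
from transformers import AutoModelForCausalLM

# Assumed checkpoint id -- verify against the HuggingFace link above.
model = AutoModelForCausalLM.from_pretrained("thuml/timer-base-84m", trust_remote_code=True)

series = torch.randn(1, 2880)                          # (batch, lookback): one univariate context
forecast = model.generate(series, max_new_tokens=96)   # autoregressively predict 96 future points
print(forecast.shape)                                  # expected: (1, 96)
```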

## Introduction

Timer-XL is a generative Transformer for time series forecasting. It can be used for task-specific training or scalable pre-training, and handles time series of arbitrary length with any number of variables.

💪 Various forecasting tasks can be formulated as a long-context generation problem, which is well addressed by generative Transformers.

💡 We propose multivariate next token prediction, a paradigm to uniformly predict univariate and multivariate time series with optional covariates.

🌟 We pre-train Timer-XL, a long-context version of time-series Transformers (Timer), for zero-shot forecasting.

🏆 Timer-XL achieves state-of-the-art performance as a one-for-all time series forecaster.

## What is New

For our previous work, please refer to Time-Series-Transformer (Timer).

## Model Architecture

| Time-Series Transformers | PatchTST | iTransformer | Moirai | Timer | Timer-XL (Ours) |
| --- | --- | --- | --- | --- | --- |
| Generative | No | No | No | Yes | Yes |
| Intra-Series Modeling | Yes | No | Yes | Yes | Yes |
| Inter-Series Modeling | No | Yes | Yes | No | Yes |

### Generalize 1D Sequences to 2D Time Series
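As a concrete illustration of the 1D-to-2D generalization, here is a minimal patching sketch (our own, not code from this repo): a multivariate series of shape `(num_vars, length)` is split into per-variable patches and flattened, variable-major, into a single token sequence.

```python
import torch

def patchify(series: torch.Tensor, patch_len: int) -> torch.Tensor:
    """Flatten a 2D multivariate series into a 1D sequence of patch tokens.

    series: (num_vars, length), with length divisible by patch_len.
    Returns: (num_vars * num_tokens, patch_len) in variable-major order,
    i.e. token m * num_tokens + i holds the i-th patch of variable m.
    """
    num_vars, length = series.shape
    num_tokens = length // patch_len
    return series.reshape(num_vars, num_tokens, patch_len).reshape(-1, patch_len)

tokens = patchify(torch.randn(3, 96), patch_len=24)  # shape: (12, 24)
```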

### Multivariate Next Token Prediction

We generalize next token prediction to multivariate time series. Each token is predicted from the tokens of all variables at previous time steps:
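In symbols (our own notation, paraphrasing the paper): with $N$ variables and $x_{n,i}$ denoting the $i$-th patch token of variable $n$, the model learns

$$
\hat{x}_{m,\,i+1} \;=\; F_\theta\bigl(\{\, x_{n,\,j} \;:\; 1 \le n \le N,\ 1 \le j \le i \,\}\bigr),
$$

so each variable's next token is conditioned on the full multivariate history up to step $i$, rather than on its own past alone.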

### Universal TimeAttention

We design TimeAttention, a causal self-attention mechanism that enables both intra- and inter-series modeling while preserving the causality and flexibility of generative Transformers. It applies to univariate and covariate-informed contexts, enabling unified time series forecasting.
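One way to realize such attention is a boolean mask over the flattened (variable, time) token layout from the sketch above: causal along time, unrestricted across variables. This is a minimal illustration under that layout assumption; the paper's TimeAttention additionally encodes variable dependencies (e.g., for covariates), which we omit here.

```python
import torch

def time_attention_mask(num_vars: int, num_tokens: int) -> torch.Tensor:
    """Boolean attention mask for tokens flattened variable-major.

    A query token of variable m at time patch i may attend to every
    token (of any variable) at time patches j <= i.
    """
    causal = torch.tril(torch.ones(num_tokens, num_tokens))  # time: causal
    full = torch.ones(num_vars, num_vars)                    # variables: all-to-all
    # The Kronecker product lifts the two masks to the flattened layout:
    # mask[m * T + i, n * T + j] = full[m, n] * causal[i, j]
    return torch.kron(full, causal).bool()

mask = time_attention_mask(num_vars=3, num_tokens=4)  # shape: (12, 12)
```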

## Main Results

## Citation

If you find this repo helpful, please cite our paper.

@article{liu2024timer,
  title={Timer-XL: Long-Context Transformers for Unified Time Series Forecasting},
  author={Liu, Yong and Qin, Guo and Huang, Xiangdong and Wang, Jianmin and Long, Mingsheng},
  journal={arXiv preprint arXiv:2410.04803},
  year={2024}
}

## Acknowledgment

We appreciate the following GitHub repositories for their valuable code and efforts:

## Contact

If you have any questions or want to use the code, feel free to contact:
