This is also the official repository for the following papers:

- Learning Recursive Multi-Scale Representations for Irregular Multivariate Time Series Forecasting (ICLR 2026) [OpenReview] [arXiv]

        @inproceedings{li_LearningRecursiveMultiScale_2026,
          author    = {Li, Boyuan and Liu, Zhen and Luo, Yicheng and Ma, Qianli},
          booktitle = {International Conference on Learning Representations},
          title     = {Learning Recursive Multi-Scale Representations for Irregular Multivariate Time Series Forecasting},
          year      = {2026}
        }

- HyperIMTS: Hypergraph Neural Network for Irregular Multivariate Time Series Forecasting (ICML 2025) [poster] [OpenReview] [arXiv]

        @inproceedings{li_HyperIMTSHypergraphNeural_2025,
          author    = {Li, Boyuan and Luo, Yicheng and Liu, Zhen and Zheng, Junhao and Lv, Jianming and Ma, Qianli},
          booktitle = {Forty-Second International Conference on Machine Learning},
          title     = {HyperIMTS: Hypergraph Neural Network for Irregular Multivariate Time Series Forecasting},
          year      = {2025}
        }
- Extensibility: Adapt your model/dataset once, then train almost any combination of "model" $\times$ "dataset" $\times$ "loss function".
- Compatibility: Accepts models with any number/type of arguments in `forward`; accepts datasets with any number/type of return values in `__getitem__`; accepts tailored loss calculations for specific models.
- Maintainability: No need to worry about breaking the training code of existing models/datasets/loss functions when adding new ones.
- Reproducibility: Minimal library dependencies for core components. We try our best to avoid fancy third-party libraries (e.g., PyTorch Lightning, EasyTorch).
- Efficiency: Multi-GPU parallel training; Python built-in logger; structured experimental result saving (JSON)...
- Transferability: Even if you don't like our framework, you can still easily find and copy the models/datasets you want. No overwhelming encapsulation.
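The compatibility point above can be illustrated with a minimal, torch-free sketch (the names `TinyForecaster` and `call_with_matching_kwargs` are hypothetical, not PyOmniTS's actual API): the training loop inspects a model's `forward()` signature and passes only the batch entries it declares, so models with different argument sets need no per-model glue code.

```python
import inspect


class TinyForecaster:
    """Toy stand-in for an nn.Module whose forward() declares its own argument set."""

    def forward(self, x, x_mask=None):
        # Pretend "prediction": double every value; the optional mask is ignored.
        return [v * 2 for v in x]


def call_with_matching_kwargs(model, batch):
    """Pass only the batch entries that this model's forward() declares."""
    accepted = inspect.signature(model.forward).parameters
    kwargs = {k: v for k, v in batch.items() if k in accepted}
    return model.forward(**kwargs)


# The dataset emits a dict; keys the model does not declare (here "y") are skipped.
batch = {"x": [1, 2, 3], "x_mask": [1, 1, 1], "y": [2, 4, 6]}
pred = call_with_matching_kwargs(TinyForecaster(), batch)
print(pred)  # [2, 4, 6]
```

With this convention, adding a model with a new `forward()` signature never requires touching the training loop.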
Check out the new documentation website.
51 models, covering regular, irregular, pretrained, and traffic models, have been included in PyOmniTS, and more are coming.
Model classes can be found in `models/`, and their dependencies in `layers/`.
- ✅: supported
- ❌: not supported
- `-`: not implemented
- MTS: regularly sampled multivariate time series
- IMTS: able to handle irregularly sampled multivariate time series
| Model | Venue | Type | Forecasting | Classification | Imputation |
|---|---|---|---|---|---|
| Ada-MSHyper | NeurIPS 2024 | MTS | ✅ | ❌ | ❌ |
| APN | AAAI 2026 | IMTS | ✅ | - | - |
| Autoformer | NeurIPS 2021 | MTS | ✅ | ❌ | ❌ |
| BigST | VLDB 2024 | MTS | ✅ | ❌ | ❌ |
| Crossformer | ICLR 2023 | MTS | ✅ | ❌ | ❌ |
| CRU | ICML 2022 | IMTS | ✅ | ❌ | ❌ |
| DLinear | AAAI 2023 | MTS | ✅ | ❌ | ❌ |
| ETSformer | arXiv 2022 | MTS | ✅ | ❌ | ❌ |
| FEDformer | ICML 2022 | MTS | ✅ | ❌ | ❌ |
| FiLM | NeurIPS 2022 | MTS | ✅ | ❌ | ❌ |
| FourierGNN | NeurIPS 2023 | MTS | ✅ | ❌ | ❌ |
| FreTS | NeurIPS 2023 | MTS | ✅ | ❌ | ❌ |
| GNeuralFlow | NeurIPS 2024 | IMTS | ✅ | ❌ | ❌ |
| GraFITi | AAAI 2024 | IMTS | ✅ | ❌ | ❌ |
| GRU-D | Scientific Reports 2018 | IMTS | ✅ | ❌ | ❌ |
| HD-TTS | ICML 2024 | IMTS | ✅ | - | ❌ |
| Hi-Patch | ICML 2025 | IMTS | ✅ | ❌ | ❌ |
| higp | ICML 2024 | MTS | ✅ | ❌ | ❌ |
| HyperIMTS | ICML 2025 | IMTS | ✅ | - | ❌ |
| Informer | AAAI 2021 | MTS | ✅ | ❌ | ❌ |
| iTransformer | ICLR 2024 | MTS | ✅ | ❌ | ❌ |
| Koopa | NeurIPS 2023 | MTS | ✅ | ❌ | ❌ |
| Latent_ODE | NeurIPS 2019 | IMTS | ✅ | ❌ | ❌ |
| Leddam | ICML 2024 | MTS | ✅ | ❌ | ❌ |
| LightTS | arXiv 2022 | MTS | ✅ | ❌ | ❌ |
| Mamba | COLM 2024 | MTS | ✅ | ❌ | ❌ |
| MICN | ICLR 2023 | MTS | ✅ | ❌ | ❌ |
| MOIRAI | ICML 2024 | Any | ✅ | - | ❌ |
| mTAN | ICLR 2021 | IMTS | ✅ | ❌ | ❌ |
| NeuralFlows | NeurIPS 2021 | IMTS | ✅ | ❌ | ❌ |
| NHITS | AAAI 2023 | MTS | ✅ | - | ❌ |
| Nonstationary Transformer | NeurIPS 2022 | MTS | ✅ | ❌ | ❌ |
| PatchTST | ICLR 2023 | MTS | ✅ | ❌ | ❌ |
| Pathformer | ICLR 2024 | MTS | ✅ | - | ❌ |
| PrimeNet | AAAI 2023 | IMTS | ✅ | ❌ | ❌ |
| Pyraformer | ICLR 2022 | MTS | ✅ | ❌ | ❌ |
| Raindrop | ICLR 2022 | IMTS | ✅ | ❌ | ❌ |
| Reformer | ICLR 2020 | MTS | ✅ | ❌ | ❌ |
| ReIMTS | ICLR 2026 | IMTS | ✅ | ❌ | - |
| Scaleformer | ICLR 2023 | MTS | ✅ | - | ❌ |
| SeFT | ICML 2020 | IMTS | ✅ | ❌ | ❌ |
| SegRNN | arXiv 2023 | MTS | ✅ | ❌ | ❌ |
| Temporal Fusion Transformer | arXiv 2019 | MTS | ✅ | - | - |
| TiDE | TMLR 2023 | MTS | ✅ | ❌ | ❌ |
| TimeCHEAT | AAAI 2025 | IMTS | ✅ | ❌ | ❌ |
| TimeMixer | ICLR 2024 | MTS | ✅ | ❌ | ❌ |
| TimesNet | ICLR 2023 | MTS | ✅ | ❌ | ❌ |
| tPatchGNN | ICML 2024 | IMTS | ✅ | ❌ | ❌ |
| Transformer | NeurIPS 2017 | MTS | ✅ | ❌ | ❌ |
| TSMixer | TMLR 2023 | MTS | ✅ | ❌ | ❌ |
| Warpformer | KDD 2023 | IMTS | ✅ | ❌ | ❌ |
11 datasets, covering regular and irregular ones, have been included in PyOmniTS, and more are coming.
Dataset classes are put in `data/data_provider/datasets`, and dependencies can be found in `data/dependencies`:
- ✅: supported
- ❌: not supported
- `-`: not implemented
- MTS: regularly sampled multivariate time series
- IMTS: irregularly sampled multivariate time series
| Dataset | Type | Field | Forecasting |
|---|---|---|---|
| ECL | MTS | electricity | ✅ |
| ETTh1 | MTS | electricity | ✅ |
| ETTm1 | MTS | electricity | ✅ |
| Human Activity | IMTS | biomechanics | ✅ |
| ILI | MTS | healthcare | ✅ |
| MIMIC III | IMTS | healthcare | ✅ |
| MIMIC IV | IMTS | healthcare | ✅ |
| PhysioNet'12 | IMTS | healthcare | ✅ |
| Traffic | MTS | traffic | ✅ |
| USHCN | IMTS | weather | ✅ |
| Weather | MTS | weather | ✅ |
Datasets for classification and imputation have not been released yet.
The following loss functions are included under loss_fns/:
| Loss Function | Task | Note |
|---|---|---|
| CrossEntropyLoss | Classification | - |
| MAE | Forecasting/Imputation | - |
| ModelProvidedLoss | - | Some models prefer to calculate loss within forward(), such as GNeuralFlows. |
| MSE_Dual | Forecasting/Imputation | - |
| MSE | Forecasting/Imputation | - |
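The `ModelProvidedLoss` idea can be sketched as follows (a minimal illustration; the class and field names here are assumptions, not PyOmniTS's actual interfaces): a model that computes its loss inside `forward()` returns it in its output, and the "loss function" merely reads that value, so the training loop treats both kinds of model uniformly.

```python
class ModelProvidedLoss:
    """Fetch the loss a model already computed inside its forward()."""

    def __call__(self, model_output: dict, **unused):
        return model_output["loss"]


class MSELoss:
    """Ordinary loss: compare predictions against targets."""

    def __call__(self, model_output: dict, y):
        pred = model_output["pred"]
        return sum((p - t) ** 2 for p, t in zip(pred, y)) / len(y)


# The training loop calls whichever loss the config selects, the same way:
output_a = {"pred": [1.0, 2.0], "loss": 0.25}  # model computed its own loss
output_b = {"pred": [1.0, 2.0]}                # plain model, loss computed here

assert ModelProvidedLoss()(output_a) == 0.25
assert MSELoss()(output_b, y=[1.0, 4.0]) == 2.0
```

This is why models like GNeuralFlows can keep their bespoke loss logic without any special-casing in the trainer.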
PyOmniTS is continuously evolving:
- More tutorials.
- Classification support in core components.
- Imputation support in core components.
- Optional python package management via uv.
We encountered the following problems when using existing time series pipelines:
- Argument & return value chaos for models' `forward()`: different models usually take a varying number and shape of arguments, especially ones from different domains. Changes to training logic are needed to support these differences.
- Return value chaos for datasets' `__getitem__()`: datasets can return a number of tensors in different shapes, which have to be aligned with the arguments of models' `forward()` one by one. Changes to training logic are also needed to support these differences.
- Argument & return value chaos for loss functions' `forward()`: loss functions take different types of tensors as input, requiring alignment with the return values of models' `forward()`.
- Overwhelming dependencies: some existing pipelines are built on fancy high-level packages, which can lower the flexibility of code modification.
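One way around the `__getitem__()` alignment chaos described above is a dictionary-based sample format, sketched below (field names like `x`, `timestamps`, and `x_mask` are illustrative, not a documented PyOmniTS schema): the dataset returns a dict whose keys double as the keyword-argument names of a model's `forward()`, so no positional tensor juggling is needed in the training loop.

```python
class ToyIMTSDataset:
    """Toy irregular-time-series dataset returning named fields per sample."""

    def __init__(self):
        # One sample: observed values, their timestamps, and an observation mask.
        self.samples = [
            {"x": [0.1, 0.7], "timestamps": [0.0, 2.5], "x_mask": [1, 1]},
        ]

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        return self.samples[idx]


ds = ToyIMTSDataset()
batch = ds[0]
# A model declaring `forward(self, x, timestamps, x_mask)` can now be invoked
# as model(**batch), with no per-model alignment code in the training loop.
print(sorted(batch))  # ['timestamps', 'x', 'x_mask']
```

Adding a dataset with extra fields then only requires new keys, never changes to the trainer.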
Ladbaby
- Time Series Library: Models and datasets for regularly sampled time series are mostly adapted from it.
- BasicTS: Documentation design reference.
- Google Gemini: Icon creation.
