Ladbaby/PyOmniTS

A Researcher-Friendly Framework for Time Series Analysis.

Train Any Model on Any Dataset.


This is also the official repository for the following paper:

  • Learning Recursive Multi-Scale Representations for Irregular Multivariate Time Series Forecasting (ICLR 2026) [OpenReview] [arXiv]

    @inproceedings{li_LearningRecursiveMultiScale_2026,
    author = {Li, Boyuan and Liu, Zhen and Luo, Yicheng and Ma, Qianli},
        booktitle = {International Conference on Learning Representations},
        title = {Learning Recursive Multi-Scale Representations for Irregular Multivariate Time Series Forecasting},
        year = {2026}
    }
    
  • HyperIMTS: Hypergraph Neural Network for Irregular Multivariate Time Series Forecasting (ICML 2025) [poster] [OpenReview] [arXiv]

    @inproceedings{li_HyperIMTSHypergraphNeural_2025,
        author = {Li, Boyuan and Luo, Yicheng and Liu, Zhen and Zheng, Junhao and Lv, Jianming and Ma, Qianli},
        booktitle = {Forty-Second International Conference on Machine Learning},
        title = {HyperIMTS: Hypergraph Neural Network for Irregular Multivariate Time Series Forecasting},
        year = {2025}
    }
    

1. ✨ Highlighted Features

  • Extensibility: Adapt your model/dataset once, then train almost any combination of "model" $\times$ "dataset" $\times$ "loss function".
  • Compatibility: Accepts models with any number/type of arguments in forward(); accepts datasets with any number/type of return values in __getitem__(); accepts tailored loss calculation for specific models.
  • Maintainability: No need to worry about breaking the training code of existing models/datasets/loss functions when adding new ones.
  • Reproducibility: Minimal library dependencies for core components; we avoid heavyweight third-party frameworks (e.g., PyTorch Lightning, EasyTorch).
  • Efficiency: Multi-GPU parallel training; Python's built-in logger; structured experiment results saved as JSON...
  • Transferability: Even if you don't like our framework, you can still easily find and copy the models/datasets you want, with no overwhelming encapsulation.
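To make the "adapt once, combine freely" idea concrete, here is a minimal, self-contained sketch of a registry pattern in plain Python. This is a generic illustration, not PyOmniTS's actual API; all class and function names are hypothetical.

```python
# Hypothetical sketch (not PyOmniTS's real API): models and datasets register
# under string names, and a trainer pairs any model with any dataset at runtime.

MODELS, DATASETS = {}, {}

def register_model(name):
    def wrap(cls):
        MODELS[name] = cls
        return cls
    return wrap

def register_dataset(name):
    def wrap(cls):
        DATASETS[name] = cls
        return cls
    return wrap

@register_model("ToyLinear")
class ToyLinear:
    def forward(self, x, **_):          # **_ swallows kwargs it does not need
        return [v * 2 for v in x]

@register_dataset("ToyETT")
class ToyETT:
    def __getitem__(self, i):
        # extra keys (e.g. mask) are harmless to models that ignore them
        return {"x": [1.0, 2.0], "mask": [1, 1]}

def run(model_name, dataset_name):
    model, data = MODELS[model_name](), DATASETS[dataset_name]()
    batch = data[0]
    return model.forward(**batch)       # dataset keys feed forward() kwargs

print(run("ToyLinear", "ToyETT"))       # -> [2.0, 4.0]
```

Because datasets emit named values and models declare which names they consume, adding a new model or dataset never requires touching the other side.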

2. 🧭 Documentation

Check out the new documentation website.

3. πŸ€– Models

51 models, covering regular, irregular, pretrained, and traffic settings, have been included in PyOmniTS, and more are coming.

Model classes can be found in models/, and their dependencies in layers/.

  • βœ…: supported
  • ❌: not supported
  • '-': not implemented
  • MTS: regularly sampled multivariate time series
  • IMTS: able to handle irregularly sampled multivariate time series
| Model | Venue | Type | Forecasting | Classification | Imputation |
|---|---|---|---|---|---|
| Ada-MSHyper | NeurIPS 2024 | MTS | βœ… | βœ… | βœ… |
| APN | AAAI 2026 | IMTS | βœ… | - | - |
| Autoformer | NeurIPS 2021 | MTS | βœ… | βœ… | βœ… |
| Scaleformer | ICLR 2023 | MTS | βœ… | - | βœ… |
| BigST | VLDB 2024 | MTS | βœ… | βœ… | βœ… |
| Crossformer | ICLR 2023 | MTS | βœ… | βœ… | βœ… |
| CRU | ICML 2022 | IMTS | βœ… | ❌ | βœ… |
| DLinear | AAAI 2023 | MTS | βœ… | βœ… | βœ… |
| ETSformer | arXiv 2022 | MTS | βœ… | βœ… | βœ… |
| FEDformer | ICML 2022 | MTS | βœ… | βœ… | βœ… |
| FiLM | NeurIPS 2022 | MTS | βœ… | βœ… | βœ… |
| FourierGNN | NeurIPS 2023 | MTS | βœ… | βœ… | βœ… |
| FreTS | NeurIPS 2023 | MTS | βœ… | βœ… | βœ… |
| GNeuralFlow | NeurIPS 2024 | IMTS | βœ… | ❌ | βœ… |
| GraFITi | AAAI 2024 | IMTS | βœ… | βœ… | βœ… |
| GRU-D | Scientific Reports 2018 | IMTS | βœ… | βœ… | βœ… |
| HD-TTS | ICML 2024 | IMTS | βœ… | - | βœ… |
| Hi-Patch | ICML 2025 | IMTS | βœ… | βœ… | βœ… |
| higp | ICML 2024 | MTS | βœ… | βœ… | βœ… |
| HyperIMTS | ICML 2025 | IMTS | βœ… | - | βœ… |
| Informer | AAAI 2021 | MTS | βœ… | βœ… | βœ… |
| iTransformer | ICLR 2024 | MTS | βœ… | βœ… | βœ… |
| Koopa | NeurIPS 2023 | MTS | βœ… | ❌ | βœ… |
| Latent_ODE | NeurIPS 2019 | IMTS | βœ… | ❌ | βœ… |
| Leddam | ICML 2024 | MTS | βœ… | βœ… | βœ… |
| LightTS | arXiv 2022 | MTS | βœ… | βœ… | βœ… |
| Mamba | Language Modeling 2024 | MTS | βœ… | βœ… | βœ… |
| MICN | ICLR 2023 | MTS | βœ… | βœ… | βœ… |
| MOIRAI | ICML 2024 | Any | βœ… | - | βœ… |
| mTAN | ICLR 2021 | IMTS | βœ… | βœ… | βœ… |
| NeuralFlows | NeurIPS 2021 | IMTS | βœ… | ❌ | βœ… |
| NHITS | AAAI 2023 | MTS | βœ… | - | βœ… |
| Nonstationary Transformer | NeurIPS 2022 | MTS | βœ… | βœ… | βœ… |
| PatchTST | ICLR 2023 | MTS | βœ… | βœ… | βœ… |
| Pathformer | ICLR 2024 | MTS | βœ… | - | βœ… |
| PrimeNet | AAAI 2023 | IMTS | βœ… | βœ… | βœ… |
| Pyraformer | ICLR 2022 | MTS | βœ… | βœ… | βœ… |
| Raindrop | ICLR 2022 | IMTS | βœ… | βœ… | βœ… |
| Reformer | ICLR 2020 | MTS | βœ… | βœ… | βœ… |
| ReIMTS | ICLR 2026 | IMTS | βœ… | βœ… | - |
| SeFT | ICML 2020 | IMTS | βœ… | βœ… | βœ… |
| SegRNN | arXiv 2023 | MTS | βœ… | βœ… | βœ… |
| Temporal Fusion Transformer | arXiv 2019 | MTS | βœ… | - | - |
| TiDE | TMLR 2023 | MTS | βœ… | βœ… | βœ… |
| TimeCHEAT | AAAI 2025 | MTS | βœ… | βœ… | βœ… |
| TimeMixer | ICLR 2024 | MTS | βœ… | βœ… | βœ… |
| TimesNet | ICLR 2023 | MTS | βœ… | βœ… | βœ… |
| tPatchGNN | ICML 2024 | IMTS | βœ… | βœ… | βœ… |
| Transformer | NeurIPS 2017 | MTS | βœ… | βœ… | βœ… |
| TSMixer | TMLR 2023 | MTS | βœ… | βœ… | βœ… |
| Warpformer | KDD 2023 | IMTS | βœ… | βœ… | βœ… |

4. πŸ’Ύ Datasets

Dataset classes are located in data/data_provider/datasets, and their dependencies can be found in data/dependencies:

11 datasets, covering regular and irregular ones, have been included in PyOmniTS, and more are coming.

  • βœ…: supported
  • ❌: not supported
  • '-': not implemented
  • MTS: regularly sampled multivariate time series
  • IMTS: irregularly sampled multivariate time series
| Dataset | Type | Field | Forecasting |
|---|---|---|---|
| ECL | MTS | electricity | βœ… |
| ETTh1 | MTS | electricity | βœ… |
| ETTm1 | MTS | electricity | βœ… |
| Human Activity | IMTS | biomechanics | βœ… |
| ILI | MTS | healthcare | βœ… |
| MIMIC III | IMTS | healthcare | βœ… |
| MIMIC IV | IMTS | healthcare | βœ… |
| PhysioNet'12 | IMTS | healthcare | βœ… |
| Traffic | MTS | traffic | βœ… |
| USHCN | IMTS | weather | βœ… |
| Weather | MTS | weather | βœ… |

Datasets for classification and imputation have not been released yet.

5. πŸ“‰ Loss Functions

The following loss functions are included under loss_fns/:

| Loss Function | Task | Note |
|---|---|---|
| CrossEntropyLoss | Classification | - |
| MAE | Forecasting/Imputation | - |
| ModelProvidedLoss | - | Some models prefer to calculate loss within forward(), such as GNeuralFlows. |
| MSE_Dual | Forecasting/Imputation | - |
| MSE | Forecasting/Imputation | - |
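The ModelProvidedLoss pattern can be illustrated with a short sketch. This is a hypothetical, simplified version (the class and model names below are made up for illustration, not PyOmniTS's actual implementation): the model computes its own loss inside forward(), and the framework-level "loss function" simply passes it through.

```python
# Hypothetical sketch of the ModelProvidedLoss idea: some models (e.g. ones
# with ELBO-style objectives) compute their loss inside forward(), so the
# framework-level "loss function" just forwards it unchanged.

class ModelProvidedLoss:
    def __call__(self, model_output, target=None):
        # the model already placed its loss in the output dict; target unused
        return model_output["loss"]

class ToyELBOModel:
    def forward(self, x):
        pred = [v + 1 for v in x]                       # toy "prediction"
        recon = sum((p - v) ** 2 for p, v in zip(pred, x)) / len(x)
        kl = 0.5                                        # stand-in regularizer
        return {"pred": pred, "loss": recon + kl}

out = ToyELBOModel().forward([1.0, 2.0])
print(ModelProvidedLoss()(out))                         # -> 1.5
```

Treating "loss comes from the model" as just another loss function keeps the training loop identical for models that do and do not compute their own objectives.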

6. 🚧 Roadmap

PyOmniTS is continuously evolving:

  • More tutorials.
  • Classification support in core components.
  • Imputation support in core components.
  • Optional Python package management via uv.

Yet Another Code Framework?

We encountered the following problems when using existing frameworks:

  • Argument & return value chaos for models' forward():

    Different models usually take arguments that vary in number and shape, especially models from different domains. Supporting these differences requires changes to the training logic.

  • Return value chaos for datasets' __getitem__():

    Datasets can return varying numbers of tensors in different shapes, which must be aligned one by one with the arguments of each model's forward(). Supporting these differences also requires changes to the training logic.

  • Argument & return value chaos for loss functions' forward():

    Loss functions take different types of tensors as input, which must be aligned with the return values of each model's forward().

  • Overwhelming dependencies:

    Some existing pipelines are built on heavyweight high-level packages, which reduces the flexibility of modifying the code.
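One generic way to tame this signature chaos (a sketch of the general technique, not a description of PyOmniTS's actual mechanism) is to carry each batch as a single dict and filter it against each model's declared forward() signature, so the training loop never changes when a new model arrives:

```python
import inspect

# Generic sketch: the trainer holds one batch dict containing everything any
# model might want; each model's forward() receives only the arguments it
# declares, so a model with a new signature never touches the training loop.

def call_forward(model, batch: dict):
    params = inspect.signature(model.forward).parameters
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return model.forward(**batch)                 # model accepts **kwargs
    accepted = {k: v for k, v in batch.items() if k in params}
    return model.forward(**accepted)

class RegularModel:
    def forward(self, x):                             # MTS model: values only
        return sum(x)

class IrregularModel:
    def forward(self, x, timestamps, mask):           # IMTS model: needs extras
        return sum(v for v, m in zip(x, mask) if m)

batch = {"x": [1, 2, 3], "timestamps": [0.0, 0.4, 1.1], "mask": [1, 0, 1]}
print(call_forward(RegularModel(), batch))            # -> 6
print(call_forward(IrregularModel(), batch))          # -> 4
```

The same filtering idea extends to loss functions: align named tensors by key rather than by positional order.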

Contributors

Ladbaby

Acknowledgement

Packages