Conversation
elephaint
left a comment
LGTM. Maybe do the refactoring in a separate PR? It's non-blocking imho, so we can also handle it separately.
One small detail re: mint.json.
Does it (somewhat) reproduce the paper's results?
Here are the results for ETTm1: paper (avg over 4 horizons) vs. neuralforecast (avg over 4 horizons). I was training for 1000 steps with early stopping on all horizons, and they all reached the full 1000 steps, so training for longer might give closer scores.
Add TimeXer to neuralforecast. TimeXer is a transformer-based model that supports future exogenous features, and it seems to perform very well in long-horizon forecasting according to the benchmarks linked here.
Original code implementation: https://github.com/thuml/TimeXer/tree/main
Paper: https://arxiv.org/abs/2402.19072
Note: the encoder and the embedding are specific to TimeXer, which is why I am not reusing the existing common modules from other models.
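As a rough sketch of how the model might be called once merged (the exact TimeXer constructor arguments below are assumptions patterned on other neuralforecast models, not taken from this PR), neuralforecast models consume a long-format DataFrame with `unique_id`, `ds`, and `y` columns, plus any exogenous features as extra columns:

```python
import numpy as np
import pandas as pd

# Long-format input that neuralforecast models consume:
# one row per (series, timestamp), target in `y`,
# exogenous features as additional columns.
dates = pd.date_range("2024-01-01", periods=48, freq="h")
df = pd.DataFrame({
    "unique_id": "series_1",
    "ds": dates,
    "y": np.sin(np.arange(48) / 4),
    "price": np.random.default_rng(0).normal(size=48),  # a future exogenous feature
})

# Hypothetical usage sketch; argument names are assumptions, not from this PR:
# from neuralforecast import NeuralForecast
# from neuralforecast.models import TimeXer
# nf = NeuralForecast(
#     models=[TimeXer(h=12, input_size=24, n_series=1,
#                     futr_exog_list=["price"])],
#     freq="h",
# )
# nf.fit(df=df)
# forecasts = nf.predict(futr_df=...)  # future `price` values over the horizon

print(df.shape)          # (48, 4)
print(list(df.columns))  # ['unique_id', 'ds', 'y', 'price']
```

Future exogenous features are the distinguishing bit here: at predict time the model is given the known future values of those columns (e.g. via a `futr_df`), rather than only their past history.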