Replies: 1 comment
Hi @kenchiayy, that's a really nice suggestion! I never checked its correctness or tested it on a dataset, though. If you're interested, feel free to optimize or reimplement it. Best regards,
Summary
I want to suggest adding support for the Temporal Fusion Transformer (TFT) architecture to this repository. TFT is a state-of-the-art model developed by Google Research that combines LSTM-based sequence processing with interpretable multi-head attention for multi-horizon, multivariate time series forecasting.
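To make the architecture concrete, here is a minimal PyTorch sketch of the Gated Residual Network (GRN), one of TFT's core building blocks (used in its variable selection and gating layers). This is an illustrative sketch following the layer structure described in the TFT paper, not code from any particular implementation; the class and parameter names are my own.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedResidualNetwork(nn.Module):
    """Sketch of a TFT-style GRN: dense -> ELU -> dense -> GLU gate,
    then a residual skip connection and layer normalization."""

    def __init__(self, d_model: int):
        super().__init__()
        self.fc1 = nn.Linear(d_model, d_model)
        self.fc2 = nn.Linear(d_model, d_model)
        # GLU halves the last dimension, so the gate projects to 2 * d_model
        self.gate = nn.Linear(d_model, 2 * d_model)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.fc2(F.elu(self.fc1(x)))
        h = F.glu(self.gate(h), dim=-1)  # gating suppresses unneeded features
        return self.norm(x + h)          # residual connection + layer norm


# Example: a batch of 8 feature vectors of width 16 passes through unchanged in shape
x = torch.randn(8, 16)
out = GatedResidualNetwork(16)(x)
print(out.shape)  # torch.Size([8, 16])
```

The GLU gating is what lets TFT learn to skip components that a given dataset does not need, which is part of why the model stays interpretable.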
Motivation
Implementation Options
There are a few PyTorch-based implementations available:
Benefits
Would love to hear your thoughts on whether this fits the scope of the project, and if so, I’d be happy to help contribute the integration!