Replies: 1 comment
Duplicate of #7999. I will try to reproduce.
Has anyone else gotten the following warning when using LSTM jumping knowledge (JK) with the GAT model: "UserWarning: RNN module weights are not part of single contiguous chunk of memory"? How do we correctly use GAT with LSTM JK to avoid this warning? Or is this a bug?
Here is the full error message:
It appears that this can be fixed by calling `flatten_parameters()` on the internal LSTM module. But how can we do this when we don't have direct access to the underlying LSTM module?

My model:
Prediction code (ran after training for a number of epochs):
Environment
The code for instantiating the model:
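One way to call `flatten_parameters()` without knowing where the LSTM lives inside the model is to walk `model.modules()` and re-flatten every RNN-family submodule found. Below is a minimal, hedged sketch of that idea in plain PyTorch; `GATWithJK` here is a hypothetical stand-in for a model whose jumping-knowledge layer wraps an LSTM (the real model would be `torch_geometric`'s GAT with `jk='lstm'`, but the module walk works the same way since it doesn't depend on attribute names):

```python
import torch
from torch import nn


class GATWithJK(nn.Module):
    """Hypothetical stand-in for a GNN whose jumping-knowledge
    layer wraps an internal LSTM (not the real PyG GAT)."""

    def __init__(self, hidden: int = 16):
        super().__init__()
        self.jk_lstm = nn.LSTM(hidden, hidden, batch_first=True)

    def forward(self, x):
        out, _ = self.jk_lstm(x)
        return out


def flatten_rnn_weights(model: nn.Module) -> None:
    # Walk all submodules and re-flatten the weights of every
    # RNN-family layer so cuDNN sees them as one contiguous chunk,
    # which silences the UserWarning.
    for module in model.modules():
        if isinstance(module, nn.RNNBase):  # covers LSTM, GRU, RNN
            module.flatten_parameters()


model = GATWithJK()
flatten_rnn_weights(model)          # call after .to(device) / load_state_dict
out = model(torch.randn(2, 5, 16))  # (batch, seq_len, hidden)
print(out.shape)                    # torch.Size([2, 5, 16])
```

The warning typically appears because the LSTM's weight tensors become non-contiguous after the model is moved or its state dict is reloaded, so calling a helper like this once after `.to(device)` (or after loading a checkpoint) should be enough; it does not need to run on every forward pass.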