Using output of NN as edge_weights raises backward gradient runtime error inplace operation #2540
Unanswered
Mandeep-Rathee asked this question in Q&A
Replies: 1 comment 1 reply
We want to use the `edge_weight` parameter of SGC (and later also of other GNNs). However, if we use a neural network to compute the `edge_weight`, we get an error in the `backward` call of the second epoch; the first epoch does not throw an error. Since our first idea was that our own code for the neural network caused the error, we replaced it with GAT, or more precisely with the attention weights returned by GAT. However, the error remains the same. Can you help us solve this problem?

We also tried using the `__explain_mask__` and `__explain__` parameters (ignoring the sigmoid for a moment), but sadly this results in the same issue.

We checked it with the following versions:
Python 3.6.7, torch 1.6.0, pytorch-geometric 1.6.1
Minimal Example
Error
We are able to execute the code with torch 1.4 and pytorch-geometric 1.6.1, but we guess that the problem is simply not caught there. If we have searched correctly, inplace change detection was extended in torch 1.5, so the issue goes undetected in torch 1.4, but the gradients are probably still wrong.
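One common way to hit exactly this error, sketched below in plain PyTorch as an assumption about the cause rather than a confirmed diagnosis: if the weight tensor is computed once before the training loop, the second `backward()` walks a retained graph whose parameters `optimizer.step()` has since modified in place; recomputing the weights inside the loop builds a fresh graph each epoch and avoids the error.

```python
import torch

# Hedged sketch (plain PyTorch, no torch_geometric). The stand-in names
# below (net, feats) are our assumptions, not the original code.
def train(recompute_each_epoch):
    torch.manual_seed(0)
    net = torch.nn.Linear(4, 1)                     # stands in for the edge-weight NN
    feats = torch.randn(10, 4, requires_grad=True)  # stands in for edge features
    opt = torch.optim.SGD(net.parameters(), lr=0.1)
    weights = net(feats).squeeze(-1)                # computed ONCE, outside the loop
    try:
        for _ in range(2):
            opt.zero_grad()
            if recompute_each_epoch:
                weights = net(feats).squeeze(-1)    # fresh graph every epoch
            loss = (weights ** 2).sum()
            # Reusing the old graph requires retaining it across epochs.
            loss.backward(retain_graph=not recompute_each_epoch)
            opt.step()                              # modifies net's params in place
        return True
    except RuntimeError:
        return False                                # inplace-operation error in epoch 2

print(train(recompute_each_epoch=False))  # False: backward fails in the 2nd epoch
print(train(recompute_each_epoch=True))   # True: trains cleanly
```

Since torch 1.5, in-place parameter updates by the optimizer bump the tensors' version counters, so the stale graph is detected at `backward()` time, which matches the observation that torch 1.4 runs the code without complaint.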