What does your training loop look like? If you have no gradients, did you forget to call loss.backward()?
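For reference, gradients only appear on a parameter once loss.backward() runs on a loss that actually depends on that parameter. A minimal sketch of a typical PyG heterogeneous training loop; the names model, loader, optimizer, and the 'paper' node type are illustrative assumptions, not the poster's actual code:

```python
import torch
import torch.nn.functional as F

def train(model, loader, optimizer, device):
    model.train()
    total_loss = 0.0
    for batch in loader:
        batch = batch.to(device)
        optimizer.zero_grad()                # clear gradients from the last step
        # A heterogeneous model typically consumes the feature and edge-index dicts.
        out = model(batch.x_dict, batch.edge_index_dict)
        loss = F.cross_entropy(out, batch['paper'].y)  # 'paper' = placeholder target node type
        loss.backward()                      # this is what populates param.grad
        optimizer.step()                     # apply the update
        total_loss += float(loss)
    return total_loss / len(loader)
```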
Hi,
I'm working on training heterogeneous graphs; here is my previous post:
https://github.com/pyg-team/pytorch_geometric/discussions/6191
Since the loss remains nearly constant, I printed the model parameters after loss.backward() on each batch using model.named_parameters(), and they stay constant during training. Here is one parameter:
I then checked the grad of this parameter, and it's None...
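A None grad after loss.backward() means that parameter never entered the computation graph that produced the loss (or it has requires_grad=False). One way to see which parameters are affected, sketched with the assumption that model and loss are already in scope:

```python
# Prints each parameter's grad status right after backpropagation.
loss.backward()
for name, param in model.named_parameters():
    grad_norm = None if param.grad is None else param.grad.norm().item()
    print(name, param.requires_grad, grad_norm)
```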
I'm wondering how this can happen. I'm following the HeteroConv tutorial at https://pytorch-geometric.readthedocs.io/en/latest/notes/heterogeneous.html, and here is my forward function:
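As a point of comparison, the HeteroConv forward from that tutorial looks roughly like this; the node types, edge types, and layer choices below are placeholders, not necessarily what the poster used:

```python
import torch
from torch_geometric.nn import HeteroConv, SAGEConv, Linear

class HeteroGNN(torch.nn.Module):
    def __init__(self, hidden_channels, out_channels, num_layers):
        super().__init__()
        self.convs = torch.nn.ModuleList()
        for _ in range(num_layers):
            # One conv per edge type; HeteroConv aggregates the per-type results.
            conv = HeteroConv({
                ('paper', 'cites', 'paper'): SAGEConv((-1, -1), hidden_channels),
                ('author', 'writes', 'paper'): SAGEConv((-1, -1), hidden_channels),
            }, aggr='sum')
            self.convs.append(conv)
        self.lin = Linear(hidden_channels, out_channels)

    def forward(self, x_dict, edge_index_dict):
        for conv in self.convs:
            x_dict = conv(x_dict, edge_index_dict)
            x_dict = {key: x.relu() for key, x in x_dict.items()}
        return self.lin(x_dict['paper'])  # read out the target node type
```

The detail that matters for the gradient question: every submodule you register must actually be reached in forward, since a conv that is constructed but never called contributes parameters whose grad stays None.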
x_in is a batch. I think the reason could be that x_in is a batch object and can't be forwarded directly, so I modified the train() function to:
Then I get the error "'Tensor' object has no attribute 'ptr'", because it needs the ptr of a node, which is only available when I pass the batch as the argument to forward. I would appreciate it if someone could help. Thanks!
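On the ptr error: ptr is a bookkeeping vector that the DataLoader's collation adds to Batch objects (it marks where each graph's nodes begin and end), so a bare feature tensor has no such attribute. A sketch of the distinction, again with illustrative names:

```python
# `loader` is assumed to be a torch_geometric.loader.DataLoader over HeteroData.
batch = next(iter(loader))

x = batch['paper'].x   # a plain torch.Tensor
# x.ptr                # -> AttributeError: 'Tensor' object has no attribute 'ptr'

# Pass the batch (or its dicts) into forward and read batch-level
# vectors such as batch['paper'].batch inside the model when pooling:
out = model(batch.x_dict, batch.edge_index_dict)
```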