Hello, I have a series of graph lists, but they don't all have the same length. In order to use our dataloader, I'd like to pad the shorter lists with blank graphs so that every list is the same length. The padded lists work with the dataloader, but when I pass a blank graph to the Weight_GCN layer, it raises an error.

So, how should I construct a blank graph correctly for padding? 😔 Thank you very much~

PS: I use this Weight_GCN.
Replies: 1 comment

Note that x = torch.empty((0, x_dim), dtype=torch.float)
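A minimal sketch of how that suggestion could be turned into a padding helper, assuming the graphs are PyTorch Geometric Data objects (make_blank_graph, pad_graph_lists, and x_dim are illustrative names, not from the original thread):

```python
import torch
from torch_geometric.data import Data  # assumption: graphs are PyG Data objects

def make_blank_graph(x_dim: int) -> Data:
    # Zero nodes, but the same feature width as the real graphs,
    # so batching keeps consistent tensor shapes.
    x = torch.empty((0, x_dim), dtype=torch.float)
    # Zero edges: edge_index must still be a (2, num_edges) LongTensor.
    edge_index = torch.empty((2, 0), dtype=torch.long)
    return Data(x=x, edge_index=edge_index)

def pad_graph_lists(graph_lists, x_dim):
    # Pad every list of graphs to the length of the longest list.
    max_len = max(len(graphs) for graphs in graph_lists)
    return [graphs + [make_blank_graph(x_dim) for _ in range(max_len - len(graphs))]
            for graphs in graph_lists]
```

Whether a layer can handle a graph with zero nodes depends on its implementation; if Weight_GCN indexes into node features, padding with a single dummy node (torch.zeros((1, x_dim))) and no edges may be a safer alternative.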