Replies: 2 comments 9 replies
-
Yes, our
-
Hi @rusty1s, I'm a bit confused by this part of the source code for `GatedGraphConv`. Why do you use a for loop here but only produce one output? I noticed that the number of iterations of the for loop equals the sequence length; is the purpose of the loop to perform message propagation based on the prediction length?
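For reference, here is a minimal usage sketch of `GatedGraphConv` (assuming the standard PyG signature `GatedGraphConv(out_channels, num_layers)`; the toy graph below is made up). As far as I understand, the loop corresponds to the propagation steps of a single GG-NN, i.e. `num_layers` rounds of message passing, and only the node states after the last round are returned; it is not the output sequence length of a GGS-NN.

```python
import torch
from torch_geometric.nn import GatedGraphConv

# Toy graph: 4 nodes with 3 input features each and a small directed cycle.
x = torch.randn(4, 3)
edge_index = torch.tensor([[0, 1, 2, 3],
                           [1, 2, 3, 0]])

# out_channels must be at least the input feature size; the input is
# zero-padded up to out_channels before the propagation loop starts.
conv = GatedGraphConv(out_channels=8, num_layers=5)

# Inside the forward pass, each of the num_layers iterations aggregates
# neighbor messages and updates every node state with a GRU cell; only
# the states after the final iteration are returned, which is why the
# loop runs several times but the layer produces a single output.
out = conv(x, edge_index)
print(out.shape)  # torch.Size([4, 8])
```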
-
Hi, I tried to implement the above architecture from the paper [Gated Graph Sequence Neural Networks](https://arxiv.org/abs/1511.05493), but I'm confused by the PyG `GatedGraphConv` operator: does this operator work as a single GG-NN in the network (e.g., like Fo & Fx in the above figure)? I didn't see an implementation of the node annotation output model for predicting X(k+1) from H(k,T) in the source code, and this is an essential step for predicting the next node. If I set `out_channels` equal to the input size, how can I identify the next link node?

(Figure: input and output for the `GatedGraphConv` operator; each node has 3 features.)