-
There are two ways to add virtual nodes. One is to pool the graph into a global feature and broadcast it back onto every node:

```python
x_global = global_mean_pool(x, batch)
x = x + self.lin(x_global)[batch]
```

Yes, reshaping is all you need to do.
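For context, the pool-and-broadcast trick above can be sketched in plain PyTorch, implementing the per-graph mean by hand instead of importing `global_mean_pool` (the sizes and the `lin` layer here are illustrative, not from the original post):

```python
import torch
import torch.nn as nn

# Two graphs batched together: 3 + 2 = 5 nodes, 4 channels each.
x = torch.randn(5, 4)
batch = torch.tensor([0, 0, 0, 1, 1])  # graph id of every node

# Hand-rolled equivalent of global_mean_pool(x, batch): per-graph scatter mean.
num_graphs = int(batch.max()) + 1
sums = torch.zeros(num_graphs, x.size(1)).index_add_(0, batch, x)
counts = torch.zeros(num_graphs).index_add_(
    0, batch, torch.ones_like(batch, dtype=torch.float)
)
x_global = sums / counts.unsqueeze(1)  # [2, 4], one row per graph

# Broadcast each graph's global feature back to its nodes and add it in.
lin = nn.Linear(4, 4)                  # example projection
out = x + lin(x_global)[batch]         # x_global[batch] has shape [5, 4]
print(out.shape)                       # torch.Size([5, 4])
```

The key step is the fancy index `x_global[batch]`, which repeats each graph's pooled row once per node, so no explicit virtual-node feature needs to be stored in the dataset.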
-
When doing a graph classification task, I want to add a virtual supernode that represents the features of the whole graph. But when mini-batching is used, the shape of the data changes. For example, if the tensor is 32\*9\*128 (32 is the batch size, 9 the number of nodes, and 128 the channel dim), then, assuming the number of channels stays the same, the batched feature x becomes 288\*128. I have no idea how to add virtual nodes. Should I add an all-zero node feature to each graph when building the dataset, or concatenate columns of zeros? Just like this:
And within a batch, would I get the supernode feature like this:

right?
Another problem is that I want to mix GCN and CNN layers. Mini-batching lowers the dimensionality compared to plain PyTorch: for example, a tensor that is 32\*9\*128 in PyTorch becomes (32\*9=288)\*128 after computation in PyG. If I want to feed the result of the GCN into the CNN, I only need to reshape the feature back to 32\*9\*128, and that has no side effects, does it?
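To make the reshape question concrete: since PyG stacks the node dimension into the batch dimension, a `view`/`reshape` recovers the dense layout without copying or reordering data, provided every graph in the batch has the same number of nodes (9 in this example). A minimal sketch with made-up tensors:

```python
import torch

batch_size, num_nodes, channels = 32, 9, 128
x_pyg = torch.randn(batch_size * num_nodes, channels)   # [288, 128] as PyG sees it

# Back to the dense layout a CNN expects; valid because PyG's Batch stores
# nodes graph-by-graph, so rows 0-8 are graph 0, rows 9-17 are graph 1, etc.
x_dense = x_pyg.view(batch_size, num_nodes, channels)   # [32, 9, 128]

# A Conv1d wants [N, C, L], so move channels in front before the CNN.
x_cnn = x_dense.permute(0, 2, 1)                        # [32, 128, 9]

# Flattening back for the next GCN layer is the inverse view.
x_back = x_dense.reshape(-1, channels)                  # [288, 128]
print(torch.equal(x_back, x_pyg))                       # True
```

The round trip is exact, so mixing GCN and CNN layers this way only changes the tensor's shape metadata, not its contents.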
Thanks!