Replies: 1 comment
-
Yes, this is the recommended way. There is no "shared calculation" though, since the resulting graphs are still disjoint. For graph pooling, there exists `x = global_mean_pool(x, data.batch)`.
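To illustrate why the graphs stay disjoint, here is a minimal NumPy sketch of the batching mechanics (this is not PyG itself; the shapes and the toy `global_mean_pool` helper are illustrative): nodes of all graphs are concatenated into one matrix, and a `batch` vector records which graph each node belongs to, so per-graph pooling is still possible.

```python
import numpy as np

# Two toy graphs with different node counts; features are 4-dim.
# (Shapes are illustrative, not taken from the thread.)
x1 = np.arange(12, dtype=float).reshape(3, 4)   # graph 0: 3 nodes
x2 = np.arange(8, dtype=float).reshape(2, 4)    # graph 1: 2 nodes

# PyG-style batching: concatenate node matrices into one disjoint graph
# and record graph membership in a `batch` vector.
x = np.concatenate([x1, x2], axis=0)            # shape [5, 4]
batch = np.array([0, 0, 0, 1, 1])               # node i belongs to graph batch[i]

def global_mean_pool(x, batch):
    """Mean-pool node features per graph (mimics PyG's global_mean_pool)."""
    num_graphs = batch.max() + 1
    return np.stack([x[batch == g].mean(axis=0) for g in range(num_graphs)])

out = global_mean_pool(x, batch)                # shape [2, 4]: one row per graph
```

As long as edges are only built within each graph (which `Batch.from_data_list` guarantees in PyG), message passing over the concatenated node matrix never mixes information across graphs, and the `batch` vector recovers per-graph outputs at pooling time.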
-
I have a batched input to GATv2Conv with a node matrix of shape [batch_sz, num_nodes, node_feature_dim], but GATv2Conv only accepts 2-dimensional input. Searching the internet, I found a solution... (not the one I want)
But with that solution, the distinction between graphs gets lost, because:
batch.x.shape
gives [batch_sz * num_nodes, node_feature_dim]
It simply puts all nodes of all graphs into one single graph. Now there are shared calculations between different graphs, which is strictly undesirable: when applying a graph pooling layer, I don't know which nodes belong to which graph.
Please suggest a fix for this issue.
Thanks in advance