-
It's tricky to implement that in a mini-batch fashion, as you need to ensure that your positive link exists in the mini-batch. I guess you could perform subgraph sampling, where you ensure that your positive link exists in the subgraph, and then proceed with your pipeline as usual.
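A rough sketch of what I mean, assuming a recent PyG version that ships `LinkNeighborLoader` (`model` and `data` are placeholders for your own GNN encoder and graph):

```python
import torch.nn.functional as F
from torch_geometric.loader import LinkNeighborLoader

# Sample a subgraph around each seed edge, so the positive link is guaranteed
# to be contained in the mini-batch.
loader = LinkNeighborLoader(
    data,                               # your full `Data` object
    num_neighbors=[10, 10],             # two-hop neighbor sampling
    edge_label_index=data.edge_index,   # positive (seed) links
    neg_sampling_ratio=1.0,             # one sampled negative per positive
    batch_size=256,
    shuffle=True,
)

for batch in loader:
    z = model(batch.x, batch.edge_index)      # node embeddings on the subgraph
    src, dst = batch.edge_label_index         # supervision links (pos + neg)
    logits = (z[src] * z[dst]).sum(dim=-1)    # e.g., a simple dot-product decoder
    loss = F.binary_cross_entropy_with_logits(logits, batch.edge_label.float())
```

At inference time, you would then rank candidate links by their predicted score.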
Hi Matthias,
I am working on a problem where the dataset consists of different graphs and the label is one of the existing edges in each graph. I try to train a GNN on the graphs and then pick the edge with the highest probability, but in this case global pooling is not used, so how do I use mini-batching?
I also found that if the model predicts the label (one edge of the graph) directly, the result is good. But if I train the GNN, update the node attributes, and use mlp(node1 + node2) to find the edge with the highest probability, the model does not work and the result is poor; maybe this method does not make sense. Can you help me?
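Here is roughly the pipeline I mean (a rough sketch with my own placeholder names, e.g. `encoder` and `edge_mlp`; I assume `batch.y` holds the index of the labeled edge within each graph's edge list):

```python
import torch
from torch_geometric.loader import DataLoader
from torch_geometric.utils import softmax

loader = DataLoader(dataset, batch_size=32, shuffle=True)

for batch in loader:
    z = encoder(batch.x, batch.edge_index)   # update node embeddings via the GNN
    src, dst = batch.edge_index
    scores = edge_mlp(torch.cat([z[src], z[dst]], dim=-1)).squeeze(-1)

    # Each edge belongs to the graph of its source node, so a per-graph softmax
    # turns the scores into a distribution over the existing edges of each graph.
    edge_batch = batch.batch[src]
    probs = softmax(scores, edge_batch)

    # Convert the per-graph target edge index (batch.y) into a global index into
    # the batched edge list, then minimize the NLL of the labeled edge per graph.
    edges_per_graph = torch.bincount(edge_batch, minlength=batch.num_graphs)
    offsets = torch.cat([edges_per_graph.new_zeros(1), edges_per_graph.cumsum(0)[:-1]])
    loss = -torch.log(probs[offsets + batch.y] + 1e-12).mean()
```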
thank you!