-
I haven't personally tried to reproduce the OGB experiments, but you should be able to construct the model input by concatenating (partially masked) one-hot labels to the node features:

```python
import torch

y_one_hot = torch.nn.functional.one_hot(data.y).float()  # cast to match data.x for torch.cat
train_mask = data.train_mask.clone()
rnd_mask = torch.rand(train_mask.size(0)) < 0.5
y_one_hot[(~train_mask) | (~rnd_mask)] = 0.  # Do not leak information
x = torch.cat([data.x, y_one_hot], dim=-1)
train_mask = train_mask & ~rnd_mask  # New mask to train against (nodes whose labels were masked out)
```
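For reference, here is a self-contained sketch of the same label-masking trick on random dummy tensors (all names and shapes here are illustrative, not from a real OGB graph): labels of non-training nodes and of a random half of the training nodes are zeroed out, and the loss is then computed only on the label-hidden half, so the model never sees the label it is asked to predict.

```python
import torch

torch.manual_seed(0)

# Dummy "graph": 10 nodes, 4 input features, 3 classes.
num_nodes, num_feats, num_classes = 10, 4, 3
x_feat = torch.randn(num_nodes, num_feats)
y = torch.randint(0, num_classes, (num_nodes,))
train_mask = torch.zeros(num_nodes, dtype=torch.bool)
train_mask[:6] = True  # pretend the first 6 nodes are training nodes

y_one_hot = torch.nn.functional.one_hot(y, num_classes).float()
rnd_mask = torch.rand(num_nodes) < 0.5          # True -> label stays visible
y_one_hot[(~train_mask) | (~rnd_mask)] = 0.     # hide every other label
x = torch.cat([x_feat, y_one_hot], dim=-1)      # GNN input: features + visible labels
new_train_mask = train_mask & ~rnd_mask         # train only on label-hidden nodes
```

The label columns of `x` are guaranteed to be all-zero for every node selected by `new_train_mask`, which is exactly the no-leakage property the masked-label-prediction setup relies on.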
-
Two questions about:
-
Hi everyone!
Has anyone reproduced the ogbn-proteins experiment from the paper "Masked Label Prediction: Unified Message Passing Model for Semi-Supervised Classification" using the PyG operator TransformerConv? If so, which layers should be used, and how should the labels be embedded? The PGL implementation is not very clear. Thanks!