Link Prediction-based Molecule Generation #4413
Unanswered
fork123aniket asked this question in Q&A
Replies: 1 comment · 2 replies
-
Is that training AP or validation AP? In general, it is expected that you achieve a better training loss with fixed-size negative samples (since the model can easily overfit on those samples). I don't think you are using …
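To make that comparison concrete, here is a minimal evaluation sketch (an illustration under assumptions, not code from this thread) that reports AUC/AP on both the training and the validation split. It assumes a trained GAE/VGAE `model` and `train_data`/`val_data` objects produced by `RandomLinkSplit` with `split_labels=True` (and `add_negative_train_samples=True`, so the training split carries fixed negatives too):

```python
import torch

@torch.no_grad()
def evaluate(model, train_data, val_data):
    # Report AUC/AP on both splits so training and validation AP are not
    # conflated when comparing negative-sampling strategies.
    model.eval()
    scores = {}
    for name, data in (('train', train_data), ('val', val_data)):
        z = model.encode(data.x, data.edge_index)
        # pos_/neg_edge_label_index come from RandomLinkSplit; the training
        # negatives only exist if add_negative_train_samples=True.
        auc, ap = model.test(z, data.pos_edge_label_index, data.neg_edge_label_index)
        scores[name] = (auc, ap)
    return scores
```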
-
This question is an extension of #4341. The train, test, and val functions can be seen below:

As advised in #4341, I applied `batched_negative_sampling` to randomly sample negative edges for multiple graphs. However, I ended up with a bad `VGAE` model, with AUC 0.5162 and AP 0.0683. Interestingly, when I did not use `batched_negative_sampling` while training the `VGAE` model (using `loss = model.recon_loss(z, data.pos_edge_label_index, data.neg_edge_label_index)` inside `train()`), I got an AP as high as 0.7014, which is quite good. Why does employing `batched_negative_sampling` result in worse model performance? Have I applied `batched_negative_sampling` in the wrong manner? Do I need to use `batched_negative_sampling` at validation time as well, i.e. `auc, ap = model.test(z, data.pos_edge_label_index, batched_negative_sampling(data.edge_index, data.batch, method='dense', force_undirected=True))` inside `val()`?

Note that, in order to get negative edges (`neg_edge_label` and `neg_edge_label_index`) for all the graphs in the training set of the ZINC dataset, I've set `add_negative_train_samples=True`, as can be seen below. This should also supersede the need for `batched_negative_sampling`, if I'm not mistaken.
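A minimal sketch of a split along those lines — the ratios, `split_labels=True`, and the per-graph-then-batch wiring are assumptions for illustration, not the original snippet — could look like this:

```python
from torch_geometric.datasets import ZINC
from torch_geometric.loader import DataLoader
from torch_geometric.transforms import RandomLinkSplit

# split_labels=True yields pos_/neg_edge_label(_index);
# add_negative_train_samples=True attaches fixed negatives to the training split too.
transform = RandomLinkSplit(num_val=0.0, num_test=0.0, is_undirected=True,
                            split_labels=True, add_negative_train_samples=True)

# Assumed wiring: split each molecule graph first, then batch the training views,
# so the fixed negatives never cross graph boundaries.
train_graphs = []
for graph in ZINC('data/ZINC', subset=True, split='train'):
    train_data, _, _ = transform(graph)
    train_graphs.append(train_data)

train_loader = DataLoader(train_graphs, batch_size=128, shuffle=True)
```

One practical difference to keep in mind: the negatives attached by `add_negative_train_samples=True` are drawn once and stay fixed across epochs, whereas `batched_negative_sampling` draws fresh within-graph negatives at every step, which is generally harder for the model to fit.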
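For the `train()` / `val()` side of the question, here is a rough sketch of how `batched_negative_sampling` can be wired into a `VGAE` loop over mini-batches of graphs. The loader, optimizer, and encoder call signature are assumptions, not the poster's actual code:

```python
import torch
from torch_geometric.utils import batched_negative_sampling

def train(model, loader, optimizer):
    model.train()
    total_loss = 0.0
    for data in loader:  # `data` is a batched Data object carrying a `batch` vector
        optimizer.zero_grad()
        z = model.encode(data.x, data.edge_index)
        # Resample negatives each step, restricted to node pairs *within* each
        # graph of the batch (this is what batched_negative_sampling is for).
        neg_edge_index = batched_negative_sampling(
            data.edge_index, data.batch, force_undirected=True)
        loss = model.recon_loss(z, data.pos_edge_label_index, neg_edge_index)
        loss = loss + (1 / data.num_nodes) * model.kl_loss()
        loss.backward()
        optimizer.step()
        total_loss += float(loss)
    return total_loss / len(loader)

@torch.no_grad()
def val(model, loader):
    model.eval()
    aucs, aps = [], []
    for data in loader:
        z = model.encode(data.x, data.edge_index)
        # At validation time it is usually cleaner to score against the fixed
        # negatives produced by RandomLinkSplit, so the metric stays comparable
        # across epochs, rather than resampling negatives here.
        auc, ap = model.test(z, data.pos_edge_label_index, data.neg_edge_label_index)
        aucs.append(auc)
        aps.append(ap)
    return sum(aucs) / len(aucs), sum(aps) / len(aps)
```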