Softmax and binary classification problem in MoleculeNet #5597
Unanswered
jiaruHithub asked this question in Q&A
Replies: 1 comment 15 replies
-
I might be missing something, but if you want to train your GNN to output the representation of …
-
In MoleculeNet, there are many binary classification datasets. In general, BCE loss should be used when training on the MoleculeNet datasets. However, I generated a generic representation `g_rep` for each class in a dataset. When a graph is encoded by the GNN, I want its representation to match one of the generic vectors `g_rep`, and the class whose vector `g_rep` has the highest matching probability becomes the predicted label of the graph. So I use softmax to output the matching probability between the current graph representation and every vector `g_rep`, and then use `F.nll_loss` to compute the loss. The target is `label`, because the order of the `g_rep` vectors is the same as the order of the classes in the one-hot label. But surprisingly, the accuracy of the network is only 50%, and it stays the same no matter how long I train. Am I doing something wrong? Help. Thank you very much.