This is hard to say. I don't think it's a problem with the model or the optimization, as those look correct to me. The one thing you might want to fix in that regard is your loss computation: you divide the loss only by the total number of graphs, not by the total number of nodes across all training graphs.
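A minimal sketch of that fix, assuming a node-level classification loss over a PyTorch Geometric loader (`model` and `loader` are placeholders for your own objects):

```python
import torch.nn.functional as F

def mean_loss_per_node(model, loader):
    """Sum the unreduced loss over every batch, then divide once by the
    total node count -- not by the number of graphs."""
    total_loss, total_nodes = 0.0, 0
    for batch in loader:
        out = model(batch.x, batch.edge_index)
        total_loss += F.cross_entropy(out, batch.y, reduction="sum").item()
        total_nodes += batch.num_nodes  # not batch.num_graphs
    return total_loss / total_nodes
```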
I have the following data generation process. Basically, I am trying to generate bank defaults with an algorithm and then classify whether a bank defaulted or not:
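A hypothetical sketch of this kind of generator, assuming PyTorch Geometric; the shock model, the constants, and the name `make_bank_graph` are illustrative, not my exact code:

```python
import torch
from torch_geometric.data import Data

def make_bank_graph(num_banks=50, p_edge=0.1, seed=0):
    """Toy contagion process: each bank holds random equity, a random
    shock hits it, and losses spill over along interbank exposures.
    A bank gets label 1 ("defaulted") once its equity is wiped out."""
    g = torch.Generator().manual_seed(seed)
    equity = torch.rand(num_banks, generator=g)
    shock = 0.5 * torch.rand(num_banks, generator=g)

    # random directed exposure network (no self-loops)
    mask = torch.rand(num_banks, num_banks, generator=g) < p_edge
    mask.fill_diagonal_(False)
    edge_index = mask.nonzero().t().contiguous()

    # first-round defaults, then one round of loss propagation
    defaulted = (equity - shock) < 0
    src, dst = edge_index
    spill = torch.zeros(num_banks).index_add_(0, dst, 0.2 * defaulted[src].float())
    y = ((equity - shock - spill) < 0).long()

    x = torch.stack([equity, shock], dim=1)  # per-node features
    return Data(x=x, edge_index=edge_index, y=y)
```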
My GNN is as follows:
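A plain two-layer GCN sketch in the same hypothetical setup; the layer sizes and dropout rate are placeholders, not the actual architecture:

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class BankGNN(torch.nn.Module):
    def __init__(self, in_dim=2, hidden=32, num_classes=2):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden)
        self.conv2 = GCNConv(hidden, num_classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        x = F.dropout(x, p=0.5, training=self.training)
        return self.conv2(x, edge_index)  # raw per-node logits
```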
The training loop looks as follows:
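A hypothetical version, reusing `make_bank_graph` and `BankGNN` from the sketches above; note the last line averages the summed loss over graphs, which is the division the reply flags:

```python
import torch
import torch.nn.functional as F
from torch_geometric.loader import DataLoader

graphs = [make_bank_graph(seed=s) for s in range(200)]
train_loader = DataLoader(graphs, batch_size=32, shuffle=True)

model = BankGNN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

model.train()
for epoch in range(300):
    total_loss = 0.0
    for batch in train_loader:
        optimizer.zero_grad()
        out = model(batch.x, batch.edge_index)
        loss = F.cross_entropy(out, batch.y, reduction="sum")
        loss.backward()
        optimizer.step()
        total_loss += loss.item()
    # averages over graphs, not nodes -- see the reply above for the fix
    print(f"epoch {epoch}: loss {total_loss / len(graphs):.4f}")
```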
And the test looks as follows:
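A hypothetical evaluation pass on held-out graphs, again reusing the names from the sketches above:

```python
@torch.no_grad()
def evaluate(model, loader):
    model.eval()
    correct, total = 0, 0
    for batch in loader:
        pred = model(batch.x, batch.edge_index).argmax(dim=1)
        correct += (pred == batch.y).sum().item()
        total += batch.num_nodes
    return correct / total  # node-level accuracy

test_graphs = [make_bank_graph(seed=1000 + s) for s in range(50)]
print(f"test accuracy: {evaluate(model, DataLoader(test_graphs, batch_size=32)):.3f}")
```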
I train for 300 epochs. My loss doesn't change by much and my training accuracy decreases. Is there any reason why this could be happening? I have tried playing with batch sizes and learning rates as well.