-
The goal of applying dropout is to avoid overfitting: by randomly zeroing out units during training, you force the model to make correct predictions with only a subset of its features available. If your model already performs well and there is no significant overfitting taking place, you do not necessarily need to apply dropout.
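To make the mechanism concrete, here is a minimal sketch of inverted dropout applied to a node-feature matrix, in the style of `torch.nn.functional.dropout` but written with plain numpy. The drop probability `p` and the matrix shape are illustrative, not taken from the poster's model.

```python
import numpy as np

def feature_dropout(x, p=0.5, training=True, rng=None):
    """Zero each entry of x with probability p during training, and scale
    the surviving entries by 1/(1-p) so the expected activation matches
    what the network sees at inference time (when dropout is disabled)."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p   # keep each entry with prob. 1-p
    return x * mask / (1.0 - p)

# Illustrative node-feature matrix: 4 nodes, 8 features each
x = np.ones((4, 8))
out = feature_dropout(x, p=0.5, rng=np.random.default_rng(0))
```

Note that at inference time (`training=False`) the input passes through unchanged; only the training-time forward pass is perturbed.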
-
Currently I am using dropout in my model, as I have seen that it is quite standard for GNN architectures. I am trying to fine-tune my model (which already gets good results), but I do not really grasp what the benefit of dropping certain node features could be for node prediction. Doesn't it just make prediction more difficult, without any benefit? I am wondering whether I am using a trick that is actually better suited to graph-level prediction.
To put some context, here is the forward function from my model: