That's a cool finding. Does that also hold true when training the model multiple times (e.g., 100 times) and comparing the mean accuracy over the runs?
When I delete 20% of the training nodes in the Cora dataset, I find that the accuracy is about 1% higher than when I don't delete any, and only when I delete more than 50% of the training nodes does the accuracy drop significantly. Why is this happening? Is it because removing some nodes makes the model more robust?
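A minimal sketch of the experiment being described: drop a random fraction of the training nodes and report mean test accuracy over several seeds, so a ~1% difference can be judged against run-to-run noise. This is an illustration only, not the poster's actual setup: it uses synthetic 7-class data and logistic regression as a stand-in for a GNN on Cora, and the split sizes, deletion fractions, and seed count are all placeholder choices.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

def mean_accuracy(drop_frac, n_runs=10):
    """Mean test accuracy after randomly deleting `drop_frac` of the training set."""
    accs = []
    for seed in range(n_runs):
        rng = np.random.default_rng(seed)
        # synthetic stand-in for Cora (7 classes, fixed data across seeds)
        X, y = make_classification(n_samples=1000, n_features=20,
                                   n_informative=10, n_classes=7,
                                   random_state=0)
        # fixed train/test split, mirroring Cora's fixed masks
        train_idx = np.arange(600)
        test_idx = np.arange(600, 1000)
        # delete a random fraction of the training nodes (different per seed)
        n_keep = int(len(train_idx) * (1.0 - drop_frac))
        keep = rng.permutation(train_idx)[:n_keep]
        clf = LogisticRegression(max_iter=1000).fit(X[keep], y[keep])
        accs.append(clf.score(X[test_idx], y[test_idx]))
    return float(np.mean(accs))

for frac in (0.0, 0.2, 0.5, 0.8):
    print(f"drop {frac:.0%}: mean accuracy {mean_accuracy(frac):.3f}")
```

If the gap between 0% and 20% deletion is within the standard deviation across seeds, it is likely noise rather than a robustness effect.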