Setting Seed For Graph Attention Network #4081
akul-goyal asked this question in Q&A · Unanswered
I am setting the seed within my code using `torch_geometric.seed.seed_everything(100)`. However, I still see a lot of variation in my results across iterations under the same seed. To figure out where the non-determinism comes from, I saved the values from each layer in my network and compared them across iterations. The differences first appear after I pass my feature matrix through the first attention layer, which I have configured as follows:

```python
self.GAT = GATConv(hidden_channels, hidden_channels, add_self_loops=False,
                   negative_slope=0.01, dropout=0.6, heads=8)
```

I assumed that dropout or the number of heads might be introducing the variation, but even after removing dropout and reducing the number of heads to 1, I still saw discrepancies. Is there something I am doing wrong, or is this due to something internal to GAT?
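A minimal sketch of the kind of check described above (not from the original post; the helper name, graph size, and feature width are my own illustrative choices):

```python
import torch
from torch_geometric.nn import GATConv
from torch_geometric.seed import seed_everything

def forward_once(seed: int, device: str) -> torch.Tensor:
    # Re-seed, rebuild the layer, and run a single forward pass.
    seed_everything(seed)
    x = torch.randn(100, 16, device=device)                      # 100 nodes, 16 features
    edge_index = torch.randint(0, 100, (2, 500), device=device)  # 500 random edges
    conv = GATConv(16, 16, heads=1, dropout=0.0,
                   add_self_loops=False).to(device)
    conv.eval()  # disable dropout so it cannot explain any difference
    with torch.no_grad():
        return conv(x, edge_index).cpu()

device = "cuda" if torch.cuda.is_available() else "cpu"
out1 = forward_once(100, device)
out2 = forward_once(100, device)
# On CPU this difference is exactly zero; on CUDA it can be small but
# non-zero, because the neighborhood aggregation is scatter-based and
# the floating-point summation order is not fixed between runs.
print(device, "max abs diff:", (out1 - out2).abs().max().item())
```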
Replies: 1 comment

Yes, this is a known "issue". While I agree that determinism is generally desirable, one cannot guarantee it in GNNs due to the permutation invariance constraint: the permutation-invariant aggregation over neighbors is implemented with scatter operations, and on the GPU their floating-point reduction order can differ between runs, so results vary even under a fixed seed.
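Not part of the original reply, but a common way to act on this diagnosis: PyTorch can be asked to fail loudly on non-deterministic kernels. A hedged sketch, assuming a reasonably recent PyTorch (1.8+); note that aggregations implemented in extension packages such as torch-scatter may bypass this flag:

```python
import os
import torch

# Must be set before CUDA initialization; required by some cuBLAS
# operations when deterministic mode is enabled.
os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"

# Raise a RuntimeError (naming the offending op) whenever an operation
# without a deterministic CUDA implementation would run, instead of
# silently producing run-to-run variation.
torch.use_deterministic_algorithms(True)
```

If the scatter-based aggregation does not go through kernels that respect this flag, or if exact reproducibility matters more than speed, running the model on CPU is the reliable fallback, since the CPU kernels reduce in a fixed order.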