Replies: 3 comments 8 replies
-
Yes, that is correct. There is currently no easy fix for this, and it is hard to support every possible combination of GNN and normalization layers in a single module. We would need to fix this by parsing the argument lists of both the GNN and the normalization layer and propagating the necessary arguments accordingly (i.e., GNN layers do not need the `batch` argument that some normalization layers require).
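A minimal sketch of that idea, assuming the dispatch is done by inspecting the normalization layer's `forward` signature (the `apply_norm` helper below is hypothetical and not PyG's actual implementation):

```python
import inspect
from typing import Optional

import torch
from torch import Tensor
from torch_geometric.nn import GraphNorm


def apply_norm(norm: torch.nn.Module, x: Tensor,
               batch: Optional[Tensor] = None) -> Tensor:
    # Only forward `batch` if the layer's forward() actually accepts it,
    # e.g. GraphNorm does, while a plain feature-wise BatchNorm does not.
    if 'batch' in inspect.signature(norm.forward).parameters:
        return norm(x, batch=batch)
    return norm(x)


x = torch.randn(6, 16)
batch = torch.tensor([0, 0, 0, 1, 1, 1])   # two graphs with three nodes each
out = apply_norm(GraphNorm(16), x, batch)  # normalized per graph
```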
-
@rusty1s How about this??
-
Fixed via #8024
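If the merged fix follows the approach discussed above (propagating `batch` to normalization layers that accept it), usage might look roughly like the sketch below. The `batch` keyword on the model's `forward` and `norm='graph_norm'` resolving to `GraphNorm` are assumptions to verify against the linked PR and your installed PyG version:

```python
import torch
from torch_geometric.nn import GraphSAGE

# Assumed post-fix behaviour: `batch` is passed through to the GraphNorm layers.
model = GraphSAGE(in_channels=16, hidden_channels=32, num_layers=2,
                  norm='graph_norm').eval()

x = torch.randn(7, 16)
edge_index = torch.tensor([[0, 1, 2, 4, 5, 6],
                           [1, 2, 0, 5, 6, 4]])
batch = torch.tensor([0, 0, 0, 0, 1, 1, 1])  # two graphs in one batch

out = model(x, edge_index, batch=batch)  # per-graph normalization statistics
```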
-
Models based on `BasicGNN` have a `norm` argument to pass a normalization layer. However, the normalization layers built through this are then only called on `x` (i.e. node features), and I can't see how they would get other arguments such as `batch`. Normalization layers like `GraphNorm` need the `batch` argument; otherwise they default to normalizing across everything (and not within each graph), which is not how `GraphNorm` was defined in the original paper. This leads to odd behaviour during inference: a model in eval mode gives different results for a fixed input depending on the other inputs in the same batch.

How can I use `GraphNorm` in GNN models without having to essentially copy the internals and call the normalization layers myself?

(Also, unless I'm misunderstanding something, the current treatment of normalization layers borders on being a bug, as it happily uses the provided normalization layers, just in an unexpected way.)