Replies: 1 comment 1 reply
-
Thanks for the pointer :) Yes, I'm definitely interested in this PR! I think such regularization should have its place in PyG. What would the API of such regularization techniques look like?
-
Hi @rusty1s,
I'm interested in adding two regularizers to pytg that we proposed in our recently accepted ICML paper: Improving Molecular Graph Neural Network Explainability with Orthonormalization and Induced Sparsity.
These are both used like L1 or L2 weight decay, though typically only for specific layers. So they're not normalization layers; they're probably best implemented as plain functions.
Before I make a PR, I just wanted to 1) make sure it would be likely to be accepted and 2) discuss where something like this might go. I was thinking either
torch_geometric.nn.norm
or torch_geometric.utils
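To make the "used like weight decay" point concrete, here is a minimal sketch of how such a function-style regularizer could be added to a task loss for selected layers. The function name `orthonormal_regularizer` and the soft-orthonormality penalty ||WᵀW − I||²_F are illustrative assumptions, not the exact formulation from the paper:

```python
import torch

def orthonormal_regularizer(weight: torch.Tensor) -> torch.Tensor:
    # Hypothetical sketch: soft orthonormality penalty ||W^T W - I||_F^2.
    # Not necessarily the paper's exact regularizer.
    wtw = weight.t() @ weight
    eye = torch.eye(wtw.size(0), device=weight.device)
    return (wtw - eye).pow(2).sum()

# Applied like weight decay, but only to specific layers:
layer = torch.nn.Linear(16, 8)
x = torch.randn(4, 16)
task_loss = layer(x).pow(2).mean()  # stand-in for the real training objective
loss = task_loss + 1e-3 * orthonormal_regularizer(layer.weight)
loss.backward()
```

Since the penalty is just an extra differentiable term on the loss, it composes with any optimizer and needs no layer or hook machinery, which is why a plain function seems like a natural fit.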
Thanks!