How are contributions computed with the explain class #8595
Closed
arthurserres started this conversation in General
Replies: 1 comment 1 reply
-
For integration with GNNs, we wrap the GNN into a parent module that expects two inputs: the node feature matrix, and a soft edge mask that is multiplied onto every message during message passing. These are the two inputs that Captum will generate attributions for. Hope this answers your question.
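For illustration, here is a minimal sketch of that idea (a toy example written for this thread, not PyG's actual wrapper code): a `GCNConv` is wrapped in a parent module whose two inputs are the node feature matrix and a soft edge mask, passed on as `edge_weight`. The names `CaptumWrapper` and `output_node` are illustrative, and `internal_batch_size=1` keeps the extra batch dimension Captum adds at size 1 so it can be squeezed away inside the wrapper.

```python
import torch
from torch import nn
from captum.attr import IntegratedGradients
from torch_geometric.nn import GCNConv


class CaptumWrapper(nn.Module):
    """Illustrative wrapper: attributes over (node features, soft edge mask)."""
    def __init__(self, gnn, edge_index, output_node):
        super().__init__()
        self.gnn = gnn
        self.edge_index = edge_index
        self.output_node = output_node

    def forward(self, x, edge_mask):
        # Captum adds a leading batch dimension of size 1; strip it here.
        # The edge mask is passed as `edge_weight`, so every message is
        # scaled by its (differentiable) mask entry during message passing.
        out = self.gnn(x.squeeze(0), self.edge_index,
                       edge_weight=edge_mask.squeeze(0))
        # Return the logits of the node being explained, batch dim restored.
        return out[self.output_node].unsqueeze(0)


edge_index = torch.tensor([[0, 1, 2, 3], [1, 0, 3, 2]])
x = torch.randn(4, 3)                        # 4 nodes, 3 features
edge_mask = torch.ones(edge_index.size(1))   # soft mask, initialized to 1

model = CaptumWrapper(GCNConv(3, 2), edge_index, output_node=0)
ig = IntegratedGradients(model)

# Attributions w.r.t. each node feature and w.r.t. each edge's mask entry.
node_attr, edge_attr = ig.attribute(
    (x.unsqueeze(0), edge_mask.unsqueeze(0)),
    target=0,
    internal_batch_size=1,
)
```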
-
Hello all,
I am trying to understand how the contributions are computed by the explain library.
I use the Captum explainer, in particular the Integrated Gradients method. I have read the paper describing Integrated Gradients, so the method itself is clear to me. However, I have looked at the source code as well as the documentation, and I still do not understand how the contribution of each node feature and each edge is computed. Could someone help with that?
Thanks a lot :)
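For context, the setup the question refers to might look roughly like the sketch below (a self-contained toy example, assuming a small two-layer GCN for node classification; the graph, model, and node index are placeholders, not taken from the question):

```python
import torch
from torch_geometric.nn import GCNConv
from torch_geometric.explain import Explainer, CaptumExplainer


class GCN(torch.nn.Module):
    """Small stand-in model for whatever GNN is being explained."""
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(3, 16)
        self.conv2 = GCNConv(16, 2)

    def forward(self, x, edge_index):
        x = self.conv1(x, edge_index).relu()
        return self.conv2(x, edge_index).log_softmax(dim=-1)


x = torch.randn(4, 3)                                     # 4 nodes, 3 features
edge_index = torch.tensor([[0, 1, 2, 3], [1, 0, 3, 2]])   # toy graph

explainer = Explainer(
    model=GCN(),
    algorithm=CaptumExplainer('IntegratedGradients'),
    explanation_type='model',
    node_mask_type='attributes',
    edge_mask_type='object',
    model_config=dict(
        mode='multiclass_classification',
        task_level='node',
        return_type='log_probs',
    ),
)

# Attribute the prediction of node 0 to every node feature and every edge.
explanation = explainer(x, edge_index, index=0)
print(explanation.node_mask.shape)  # [num_nodes, num_features]
print(explanation.edge_mask.shape)  # [num_edges]
```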