-
Copied over from Slack: This is a very good question for which I don't have a good answer. GNNExplainer basically works via a multi-term loss function (decrease attribution in general, but keep the prediction close to the initial prediction). If you are seeing attribution still heavily decreasing after many epochs, this might be a signal that this multi-task loss is not properly weighted.
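To make the trade-off concrete, here is a minimal, dependency-free sketch of a GNNExplainer-style objective. The function name, coefficient names, and coefficient values are illustrative assumptions, not PyG's exact internals: the point is only that the size term is minimized by driving the mask to zero, so with a fixed prediction-loss weight, more epochs keep shrinking total attribution.

```python
import math

def gnnexplainer_style_loss(pred_loss, mask, size_coeff=0.005, ent_coeff=0.1):
    """Sketch of a GNNExplainer-style multi-term objective.

    Illustrative only (names/values are assumptions, not PyG defaults):
    - pred_loss: keeps the masked prediction close to the original one
    - size term: pushes the total mask (i.e. total attribution) toward zero
    - entropy term: pushes each mask entry toward a crisp 0 or 1
    """
    eps = 1e-15  # avoids log(0)
    size_term = size_coeff * sum(mask)
    ent_term = ent_coeff * sum(
        -m * math.log(m + eps) - (1 - m) * math.log(1 - m + eps) for m in mask
    )
    return pred_loss + size_term + ent_term
```

Because the size term has no minimum short of an all-zero mask, only the relative weighting against `pred_loss` (and the entropy term) decides where optimization effectively stops; if that weighting is off, attribution keeps decaying with more epochs.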
-
When using GNNExplainer as the explanation algorithm, on what basis should one choose the number-of-epochs parameter?
I have trained a graph model for a binary node classification task. In the explainer I set the node mask type to 'attributes' (every node has a feature vector of dimension 64) and the edge mask type to 'object'.
I tried different values for epochs.
As I increase epochs, the total sum of the node feature attribution matrix keeps decreasing (e.g. ~70000 for 200 epochs and ~5 for 50000 epochs).
If I check the relative importance of different nodes across epoch settings, the self node always gets the maximum importance compared to other nodes, but for smaller epoch counts (e.g. 200) it is only 0.03%, while after more epochs (e.g. 50000) it is 65%.
So overall, the explanation seems to change hugely based on the choice of the number of epochs.
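The two observations above (total attribution shrinking, relative shares shifting) can be separated by normalizing the mask before comparing runs. The numbers below are hypothetical and only loosely mirror the question's late-epoch figures; they show that total magnitude and relative importance are independent quantities:

```python
# Hypothetical per-node attribution totals at two epoch settings
# (illustrative values, not real GNNExplainer output).
early = [21.0, 20.0, 19.0, 10.0]   # large total, self node barely leads
late = [3.25, 0.9, 0.5, 0.35]      # small total, self node dominates

def relative_importance(attr):
    """Normalize raw attributions so runs of different scale are comparable."""
    total = sum(attr)
    return [a / total for a in attr]

print(relative_importance(early)[0])  # self-node share ~0.30
print(relative_importance(late)[0])   # self-node share ~0.65
```

Comparing normalized shares (rather than raw sums) across epoch settings makes it clearer whether the explanation's ranking has stabilized even while the overall mask magnitude keeps decaying.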
Is there any theoretical or practical criterion for choosing an appropriate number of epochs for GNNExplainer?