Replies: 1 comment 1 reply
That is a current limitation.
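One possible workaround, as a minimal sketch: bypass to_hetero() for the attention layers and run one GATv2Conv per edge type by hand, so that return_attention_weights=True can be passed straight to each layer. The node/edge type names and feature sizes below are made up:

```python
import torch
from torch_geometric.nn import GATv2Conv

# One bipartite GATv2Conv per (src, rel, dst) edge type; hypothetical types.
convs = torch.nn.ModuleDict({
    'author__writes__paper': GATv2Conv((16, 32), 64, add_self_loops=False),
})

x_dict = {'author': torch.randn(4, 16), 'paper': torch.randn(3, 32)}
edge_index_dict = {
    ('author', 'writes', 'paper'): torch.tensor([[0, 1, 2], [0, 1, 2]]),
}

out_dict, att_dict = {}, {}
for (src, rel, dst), edge_index in edge_index_dict.items():
    conv = convs[f'{src}__{rel}__{dst}']
    # Bipartite call with (x_src, x_dst); with return_attention_weights=True
    # the layer also returns an (edge_index, alpha) pair per edge type.
    out, (ei, alpha) = conv((x_dict[src], x_dict[dst]), edge_index,
                            return_attention_weights=True)
    out_dict[dst] = out
    att_dict[(src, rel, dst)] = (ei, alpha)
```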
Hello, I am having a problem getting the attention weights of a heterogeneous graph with GATv2:
File "/opt/homebrew/lib/python3.10/site-packages/torch/fx/graph_module.py", line 662, in call_wrapped
return self._wrapped_call(self, *args, **kwargs)
File "/opt/homebrew/lib/python3.10/site-packages/torch/fx/graph_module.py", line 279, in call
raise e.with_traceback(None)
TypeError: add(): argument 'input' (position 1) must be Tensor, not tuple
This is the error I get. I use two GATv2 layers with 'add_self_loops=False' in the layer constructor and 'return_attention_weights=True' in the forward call. If I use other layers (SAGE or GraphConv) with this data, it always runs well, and it also works if I do not specify return_attention_weights (the default).
How can I solve it?
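For reference, a minimal sketch of the kind of setup that seems to trigger this, assuming the model is converted with to_hetero() (the torch.fx frames in the traceback point to that); the metadata, node types, and sizes are made up:

```python
import torch
from torch_geometric.nn import GATv2Conv, to_hetero

class GAT(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # Lazy (-1, -1) input sizes for bipartite edge types;
        # self-loops disabled, as in the question.
        self.conv1 = GATv2Conv((-1, -1), 64, add_self_loops=False)
        self.conv2 = GATv2Conv((-1, -1), 64, add_self_loops=False)

    def forward(self, x, edge_index):
        # With return_attention_weights=True each layer returns a
        # (out, (edge_index, alpha)) tuple instead of a plain Tensor.
        # When two edge types feed the same destination node type,
        # the generated hetero module sums their outputs with add(),
        # which fails on tuples -- matching the TypeError above.
        x = self.conv1(x, edge_index, return_attention_weights=True)
        x = self.conv2(x, edge_index, return_attention_weights=True)
        return x

metadata = (['author', 'paper'],
            [('author', 'writes', 'paper'), ('paper', 'cites', 'paper')])
model = to_hetero(GAT(), metadata)  # running this model raises the TypeError
```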