-
HypergraphConv first aggregates node features into hyperedges, applying attention weights to evaluate the importance of different nodes within each hyperedge, and then aggregates hyperedge features back into nodes. Should there be a change in the calculation of alpha?
-
As far as I understand the paper, this is correct: the attention module normalizes attention for each node over the set of hyperedges it belongs to, not across all nodes belonging to the same hyperedge.
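The distinction in the reply can be sketched in plain NumPy (this is an illustrative toy, not PyG's actual `HypergraphConv` implementation; the incidence matrix, feature sizes, and attention vector below are made up): the softmax that produces alpha runs over the hyperedges incident to each node, i.e. along the hyperedge axis of the incidence matrix.

```python
import numpy as np

# Toy incidence matrix: incidence[i, e] = 1 if node i belongs to hyperedge e.
incidence = np.array([
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 1],
], dtype=float)

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))          # node features
edge_attr = rng.normal(size=(3, 4))  # hyperedge features
a = rng.normal(size=8)               # attention vector over [x_i || edge_attr_e]

# Raw attention logits for every (node, hyperedge) incidence pair;
# non-incident pairs get -inf so they vanish under the softmax.
logits = np.full_like(incidence, -np.inf)
for i in range(3):
    for e in range(3):
        if incidence[i, e]:
            z = a @ np.concatenate([x[i], edge_attr[e]])
            logits[i, e] = np.maximum(0.2 * z, z)  # LeakyReLU, slope 0.2

# The point of the discussion: softmax is taken along axis 1 (over the
# hyperedges each node belongs to), NOT over the nodes inside a hyperedge.
alpha = np.exp(logits - logits.max(axis=1, keepdims=True))
alpha /= alpha.sum(axis=1, keepdims=True)

# Each row of alpha sums to 1 across that node's incident hyperedges.
print(alpha.sum(axis=1))  # -> [1. 1. 1.]
```

Normalizing along axis 0 instead would give attention over the nodes within each hyperedge, which is the alternative the question asks about.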