The dense GNN layers support adjacency tensors of shape [batch_size, num_nodes, num_nodes], so you can use them directly. If you want to do sparse message passing via edge_index instead, you can convert the dense adjacency like this:

# nonzero() returns a single [nnz, 3] tensor by default, so unpack it as a tuple:
batch, row, col = adj.nonzero(as_tuple=True)
offset = batch * num_nodes

# Shift each graph's node indices into a disjoint range:
row = row + offset
col = col + offset
edge_index = torch.stack([row, col], dim=0)

x = x.view(-1, x.size(-1))  # [batch_size * num_nodes, num_features]
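A minimal end-to-end sketch of this conversion, using a small hand-built batch (the graph sizes, edges, and feature dimension below are made up for illustration):

```python
import torch

# Hypothetical batch: 2 graphs, 3 nodes each, 4 features per node.
batch_size, num_nodes, num_features = 2, 3, 4
adj = torch.zeros(batch_size, num_nodes, num_nodes)
adj[0, 0, 1] = 1  # edge 0 -> 1 in graph 0
adj[1, 2, 0] = 1  # edge 2 -> 0 in graph 1
x = torch.randn(batch_size, num_nodes, num_features)

# Unpack the nonzero coordinates and offset each graph's node
# indices by graph_id * num_nodes, so all graphs share one index space.
batch, row, col = adj.nonzero(as_tuple=True)
offset = batch * num_nodes
edge_index = torch.stack([row + offset, col + offset], dim=0)

# Flatten node features to match: [batch_size * num_nodes, num_features].
x = x.view(-1, x.size(-1))

print(edge_index.tolist())  # [[0, 5], [1, 3]]
print(x.shape)              # torch.Size([6, 4])
```

The edge in graph 1 ends up as 5 -> 3 because its node indices 2 and 0 are shifted by 1 * num_nodes = 3; the flattened x then lines up with edge_index as one big disjoint graph.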

Answer selected by 595693085