edge_index when batching #9054
Hi everyone, I am struggling with getting the edge_index attribute right when microbatching.
I have a custom Torch Dataset that returns, for each sample, a feature tensor together with the (static) edge_index.
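A stripped-down, hypothetical version (the class name and storage details are illustrative, not my actual code; the shapes match the ones given below):

```python
import torch
from torch.utils.data import Dataset

class MyGraphDataset(Dataset):  # hypothetical name, for illustration only
    def __init__(self, features, edge_index):
        # features: [num_samples, 8372] -- one feature vector per sample
        # edge_index: [2, 25116] -- static edge structure shared by all samples
        self.features = features
        self.edge_index = edge_index

    def __len__(self):
        return self.features.size(0)

    def __getitem__(self, idx):
        return self.features[idx], self.edge_index
```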
I am using the torch_geometric.loader.DataLoader and am a bit confused about how batching works. I understand that each batch of Data objects is merged into one large Data object, but are the per-sample edge_index tensors also supposed to be merged into one large edge_index?
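For reference, here is a toy example of what I understand the collation to do: Batch.from_data_list (which DataLoader calls internally) concatenates node features along dimension 0 and offsets each graph's edge_index by the number of nodes that came before it (toy shapes, random data):

```python
import torch
from torch_geometric.data import Data, Batch

# Two toy graphs, 3 nodes each, sharing the same static edge structure.
edge_index = torch.tensor([[0, 1, 2],
                           [1, 2, 0]])
g1 = Data(x=torch.randn(3, 4), edge_index=edge_index)
g2 = Data(x=torch.randn(3, 4), edge_index=edge_index)

batch = Batch.from_data_list([g1, g2])
print(batch.x.shape)     # torch.Size([6, 4]) -- nodes concatenated along dim 0
print(batch.edge_index)  # tensor([[0, 1, 2, 3, 4, 5],
                         #         [1, 2, 0, 4, 5, 3]]) -- graph 2 offset by 3
print(batch.batch)       # tensor([0, 0, 0, 1, 1, 1]) -- graph id per node
```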
In my microbatching (collate) function, I am currently just taking the edge_index from one sample and using it for the whole batch (my edges are static across samples); in my train step I then pass the stacked features and that single edge_index to the model, roughly as in the sketch below.
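A minimal sketch of that pattern (function and variable names are hypothetical):

```python
import torch

def microbatch(samples):
    # Stack the per-sample feature vectors and reuse one sample's
    # edge_index for the whole batch (hypothetical reconstruction).
    x = torch.stack([features for features, _ in samples])  # [16, 8372]
    edge_index = samples[0][1]                               # [2, 25116]
    return x, edge_index

# The train step then does, in effect:
#   out = model(x, edge_index)
```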
Eventually I get an error from a GCNConv layer.
I have made sure that my edge_index does not contain any self-loops and that all edge indices are within bounds. The edge_index being passed to the model in the microbatch is of size (2, 25116) and microbatch.x is of size (16, 8372), where the microbatch size is 16.
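One detail that may matter: as far as I understand, GCNConv treats the first dimension of x as the node dimension, so with x of shape (16, 8372) it sees only 16 nodes, while edge_index refers to node ids up to 8371. A toy snippet with the same shapes (random data) shows the mismatch:

```python
import torch
from torch_geometric.nn import GCNConv

conv = GCNConv(in_channels=8372, out_channels=32)  # out_channels arbitrary
x = torch.randn(16, 8372)                          # PyG sees 16 nodes here
edge_index = torch.randint(0, 8372, (2, 25116))    # node ids up to 8371

# Raises an out-of-bounds indexing error: edge_index addresses
# nodes that do not exist when x has only 16 rows.
out = conv(x, edge_index)
```

If each sample is really one graph over 8372 nodes, I suspect each sample's x should be shaped [8372, num_node_features] rather than being a flat vector.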
I am clearly getting something wrong with the edge_index during microbatching but am not sure how it is usually done in PyG. Is the usual approach something like the sketch below: one Data object per sample, with x shaped [num_nodes, num_features], so that the DataLoader can concatenate the nodes and offset edge_index itself?
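(Assuming scalar node features here, since 8372 matches the node count; all data below is random and illustrative.)

```python
import torch
from torch_geometric.data import Data
from torch_geometric.loader import DataLoader

edge_index = torch.randint(0, 8372, (2, 25116))  # stand-in for my static edges

# One Data object per sample: 8372 nodes, one scalar feature per node.
data_list = [Data(x=torch.randn(8372, 1), edge_index=edge_index)
             for _ in range(64)]
loader = DataLoader(data_list, batch_size=16)

batch = next(iter(loader))
print(batch.x.shape)           # torch.Size([133952, 1]) -- 16 * 8372 nodes
print(batch.edge_index.shape)  # torch.Size([2, 401856]) -- 16 * 25116 edges
```

Thanks for any help!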