If you have a dataset with graphs of varying size, I suggest you use …
I have thousands of graphs, each with a different number of nodes and a different `edge_index`. During training, stacking them along one dimension with the DataLoader takes a long time, so I'd like to try a mini-batch approach. But both DenseDataLoader and the `__cat_dim__` method require all graphs to have the same number of nodes:
    class MyData(Data):
        def __cat_dim__(self, key, item):
            if key == 'foo':
                return None
            else:
                return super().__cat_dim__(key, item)
How could I accelerate the training? My first idea is to batch along a new dimension without stacking, but neither of the two methods above works with a varying number of nodes.
The second idea is to use SparseTensor, but that only works with batch_size=1.
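For context, the usual way to batch graphs of varying size is not to stack them along a new dimension at all, but to concatenate their node features and offset their edge indices, forming one large graph of disconnected components. This is the idea behind PyG's standard mini-batching. A minimal stdlib-only sketch of that collation step (graphs represented as plain dicts; the function name and dict keys are illustrative, not PyG's API):

```python
def collate_graphs(graphs):
    """Batch variable-size graphs into one big disconnected graph.

    Each graph is a dict with:
      'x'          : list of node feature vectors
      'edge_index' : list of (src, dst) index pairs
    Returns batched features, offset edge indices, and a 'batch'
    vector mapping every node back to its source graph.
    """
    x, edge_index, batch = [], [], []
    offset = 0
    for graph_id, g in enumerate(graphs):
        num_nodes = len(g['x'])
        x.extend(g['x'])
        # Shift edge indices so they point at this graph's nodes
        # inside the concatenated node list.
        edge_index.extend((s + offset, d + offset)
                          for s, d in g['edge_index'])
        batch.extend([graph_id] * num_nodes)
        offset += num_nodes
    return x, edge_index, batch

# Example: two graphs with 2 and 3 nodes.
g1 = {'x': [[1.0], [2.0]], 'edge_index': [(0, 1)]}
g2 = {'x': [[3.0], [4.0], [5.0]], 'edge_index': [(0, 2), (1, 2)]}
x, ei, batch = collate_graphs([g1, g2])
# x has 5 node feature rows, g2's edges are shifted by 2,
# and batch tells a pooling layer which nodes belong together.
```

Because the adjacency of the merged graph is block-diagonal, message passing on the batched graph is equivalent to running it on each graph separately, so no padding to a common node count is needed.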