Replies: 1 comment
You are right that currently,
Dear PyTorch Geometric Team,
I am working with dense batches and came across the DenseDataLoader class. According to its documentation, DenseDataLoader requires all graph attributes in a dataset to have the same shape for the loader to function correctly. This implies that each graph in the dataset should have the same number of nodes and consistent feature dimensions. My dataset consists of graphs with varying numbers of nodes, which is a common scenario in graph-based datasets. Indeed, when I try to use DenseDataLoader with my dataset, it throws an error saying it cannot stack my node attributes.
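To illustrate my understanding of the failure, here is a minimal sketch in plain Python (no PyTorch dependency; `stack_node_features` is an illustrative name, not PyG's actual internals) of why stacking forces every graph to have the same shape:

```python
# Sketch of the stacking a dense loader performs (plain Python;
# names are illustrative, not PyG's actual implementation).

def stack_node_features(graphs):
    """Stack per-graph node-feature matrices into one [G, N, F] batch.
    Like torch.stack, this requires every graph to share one shape."""
    shapes = {(len(x), len(x[0])) for x in graphs}
    if len(shapes) != 1:
        raise ValueError(f"cannot stack node features with shapes {sorted(shapes)}")
    return list(graphs)

# Two graphs with the same node count stack fine:
batch = stack_node_features([
    [[1.0, 2.0], [3.0, 4.0]],
    [[5.0, 6.0], [7.0, 8.0]],
])

# A graph with a different node count triggers the kind of error I am seeing:
try:
    stack_node_features([
        [[1.0, 2.0], [3.0, 4.0]],
        [[5.0, 6.0], [7.0, 8.0], [9.0, 0.0]],
    ])
except ValueError as e:
    print(e)
```

So presumably the loader only works once every graph has already been padded to a fixed node count up front (e.g. via a transform), rather than padding per batch.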
Given this context, I am trying to understand the practical utility of DenseDataLoader. Could you elaborate on why DenseDataLoader requires the same shape for all graph attributes? Does this imply that the x attribute (node features) should have an identical shape for every graph in the dataset? I was hoping that using DenseDataLoader would give the same result as using the regular DataLoader followed by to_dense_batch, but that does not seem to be the case, or am I missing something? Why doesn't this data loader produce the same output as to_dense_batch?
Looking at this issue (#881), I see that the author talks about DenseDataLoader returning a mask for each batch, since each graph has a different number of nodes. So what am I missing?
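To make concrete what I expected, here is a pad-and-mask sketch in plain Python (no PyTorch; `dense_batch` is a stand-in name for illustration, not PyG's actual to_dense_batch implementation):

```python
# Illustrative pad-and-mask sketch (plain Python; `dense_batch` is a
# stand-in, not PyG's actual to_dense_batch).

def dense_batch(graphs, fill_value=0.0):
    """Pad per-graph node-feature lists to a common node count.

    Returns (padded, mask): padded has shape [G][max_nodes][F], and
    mask[g][n] is True exactly where node n of graph g is real."""
    max_nodes = max(len(x) for x in graphs)
    feat_dim = len(graphs[0][0])
    padded, mask = [], []
    for x in graphs:
        n_pad = max_nodes - len(x)
        padded.append(list(x) + [[fill_value] * feat_dim for _ in range(n_pad)])
        mask.append([True] * len(x) + [False] * n_pad)
    return padded, mask

# A 2-node and a 3-node graph, both with 2 features per node:
padded, mask = dense_batch([
    [[1.0, 2.0], [3.0, 4.0]],
    [[5.0, 6.0], [7.0, 8.0], [9.0, 10.0]],
])
# padded[0] is the 2-node graph padded to 3 nodes;
# mask[0] == [True, True, False] marks its real nodes.
```

This per-batch padding with a mask is the behaviour I expected DenseDataLoader to provide out of the box.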
Thank you in advance for your insights on this matter.