Replies: 2 comments
-
What is the error you are seeing? Your understanding is fully correct. I am not sure, though, what you mean by "Both".
-
Hi, thanks for your confirmation. By trying out another dataset I discovered that there was a small inconsistency in the timestamps of my partitioning. The behaviour does indeed seem correct now. Thanks for your help!
-
Hi!
I am currently trying to create one full batch that contains all the 1-hop neighbors of my input nodes in the graph. My understanding was that if I only ask NeighborLoader for 1 batch, it should contain only the 1-hop neighbors.
Here is my call:
My graph is a `HeteroData` object. `input_nodes` is of the form `['node_type', Tensor[node IDs]]`, and `input_time` is a tensor of the same size as `input_nodes`.
Is there something that I misunderstood so far?
Sadly, the `batch.time_dict` vectors are much longer than the actual nodes from `batch.tf_dict`.
I have no idea why that could be the case, and any suggestions would be welcome. To tell you a bit more about what I am doing: the graph here is created from a partition of a bigger dataset. The goal is to do distributed training later on, but with manual partitioning for the purpose of the experiment.
I did consider using `k_hop_subgraph`, but I am failing to see how that would work with a `HeteroData` object to get back a usable `HeteroData` from the returned values.
Small edit: I do indeed seem to get the 1-hop graph. The problem is with the time vectors.
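For reference, the 1-hop behaviour being checked here can be reproduced by hand on a tiny typed edge list (pure Python, illustrative only; the type names are invented):

```python
# Illustrative only: a hand-rolled 1-hop neighborhood over typed edges,
# mimicking what a single sampling hop on a hetero graph should return.
edges = {
    ('author', 'writes', 'paper'): [(0, 0), (1, 0), (2, 1)],
    ('paper', 'cites', 'paper'):   [(1, 0), (2, 1)],
}

def one_hop(seed_type, seed_ids):
    """Return, per node type, the nodes reaching the seeds in one hop."""
    seeds = set(seed_ids)
    out = {}
    for (src, _, dst), pairs in edges.items():
        if dst == seed_type:  # incoming edges toward the seed nodes
            hits = {s for s, d in pairs if d in seeds}
            if hits:
                out.setdefault(src, set()).update(hits)
    return out

print(one_hop('paper', [0]))  # {'author': {0, 1}, 'paper': {1}}
```

Any node type appearing in the result beyond these direct neighbors (or time vectors longer than the node count) would indicate the batch holds more than one hop.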