Hi,

I have multiple graphs with imbalanced classes on which I want to train a model for a multi-class node classification task. I would like to apply the `ImbalancedSampler` (https://pytorch-geometric.readthedocs.io/en/latest/_modules/torch_geometric/loader/imbalanced_sampler.html) to a `Dataset` object consisting of multiple graphs (as `Data` objects). As far as I understand the documentation, I can currently either sample at the node level with a single `Data` object, or at the graph level by passing a `Dataset` object. In the second case, the `ImbalancedSampler` throws an error when the number of labels does not match the number of graphs in the `Dataset` (which makes sense, since it is presumably meant for graph classification).

Is there a way to handle my imbalanced graphs while training for a node classification task? Can I adapt the `ImbalancedSampler` so that it is applied to all nodes across all graphs? Or are there other workarounds for imbalanced classes across multiple graphs in node classification?

Thank you in advance.

Replies: 2 comments 2 replies
- You can convert n graphs into a single graph with n disconnected components and then pass that to the `ImbalancedSampler` as a single `Data` object (see the sketch below).
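A minimal sketch of this approach, assuming the graphs are available as a plain Python list of `Data` objects; the `toy_graph` helper and all shapes here are illustrative placeholders, not part of the original answer. `Batch.from_data_list` produces exactly the "one graph with disconnected components" described above:

```python
import torch
from torch_geometric.data import Batch, Data
from torch_geometric.loader import ImbalancedSampler, NeighborLoader

# Tiny synthetic graphs standing in for the real dataset; each carries
# per-node class labels `y` and a boolean `train_mask`.
def toy_graph(num_nodes: int) -> Data:
    edge_index = torch.randint(0, num_nodes, (2, 4 * num_nodes))
    y = torch.randint(0, 3, (num_nodes,))  # 3 classes, arbitrarily skewed
    train_mask = torch.ones(num_nodes, dtype=torch.bool)
    return Data(edge_index=edge_index, y=y, train_mask=train_mask,
                num_nodes=num_nodes)

graphs = [toy_graph(100), toy_graph(80)]

# Merge into one big graph with disconnected components. Batch is a
# Data subclass, so node-level utilities accept it, and per-node
# attributes such as `y` and `train_mask` are concatenated.
data = Batch.from_data_list(graphs)

# Oversample minority-class nodes across *all* graphs at once.
sampler = ImbalancedSampler(data, input_nodes=data.train_mask)
loader = NeighborLoader(
    data,
    input_nodes=data.train_mask,
    num_neighbors=[10, 10],  # two-hop fan-out; tune for your graphs
    batch_size=64,
    sampler=sampler,
)

for batch in loader:
    pass  # batch.y is roughly class-balanced in expectation
```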
- I don't think you can really do imbalanced sampling on node-level tasks across multiple graphs; how would that work? As an alternative, it may be better to simply re-weight your loss. PyTorch provides native support for this via the `weight` argument of `torch.nn.CrossEntropyLoss` (see the sketch below).
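A short sketch of this alternative; `num_classes` and the `graphs` list (reused from the sketch above) are assumptions, and inverse-frequency weighting is just one common choice of weights:

```python
import torch

num_classes = 3  # assumed; match your dataset

# Count class labels over the training nodes of all graphs.
y_train = torch.cat([g.y[g.train_mask] for g in graphs])
counts = torch.bincount(y_train, minlength=num_classes).float()

# Inverse-frequency weights: rarer classes get proportionally more weight.
weight = counts.sum() / (counts.clamp(min=1) * num_classes)

criterion = torch.nn.CrossEntropyLoss(weight=weight)

# In the training loop, with `logits` of shape [num_nodes, num_classes]:
# loss = criterion(logits[batch.train_mask], batch.y[batch.train_mask])
```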