Hi, what's your plan for distributed training in PyG? I think we could implement this feature together.

Replies: 1 comment
Thanks for your interest. There is no concrete plan yet for distributed graph storage (we'd be interested to hear your thoughts on this one), but distributed training is fully supported, either via standard PyTorch or via PyTorch Lightning.
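Since the reply points to standard PyTorch as one supported route, here is a minimal sketch (not from the thread) of data-parallel training of a PyG model with vanilla `DistributedDataParallel`. The two-layer GCN, the synthetic random graph, and the CPU `gloo` backend are illustrative assumptions, not anything prescribed by the PyG team:

```python
# Minimal sketch: data-parallel PyG training with plain PyTorch DDP.
# Assumptions: synthetic graph, CPU-only "gloo" backend, toy GCN model.
import os

import torch
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel as DDP
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv


class GCN(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, out_dim)

    def forward(self, x, edge_index):
        x = self.conv1(x, edge_index).relu()
        return self.conv2(x, edge_index)


def run(rank, world_size):
    os.environ["MASTER_ADDR"] = "127.0.0.1"
    os.environ["MASTER_PORT"] = "29500"
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    # Synthetic graph: 100 nodes, 16 features, 400 random edges, 4 classes.
    # Seeding ensures every rank builds the identical graph.
    torch.manual_seed(0)
    data = Data(
        x=torch.randn(100, 16),
        edge_index=torch.randint(0, 100, (2, 400)),
        y=torch.randint(0, 4, (100,)),
    )

    model = DDP(GCN(16, 32, 4))
    optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

    # Each rank computes the loss on a disjoint slice of the training nodes;
    # DDP averages gradients across ranks after every backward pass.
    train_nodes = torch.arange(100).chunk(world_size)[rank]

    for epoch in range(5):
        optimizer.zero_grad()
        out = model(data.x, data.edge_index)
        loss = torch.nn.functional.cross_entropy(
            out[train_nodes], data.y[train_nodes]
        )
        loss.backward()
        optimizer.step()
        if rank == 0:
            print(f"epoch {epoch}: loss {loss.item():.4f}")

    dist.destroy_process_group()


if __name__ == "__main__":
    world_size = 2
    mp.spawn(run, args=(world_size,), nprocs=world_size)
```

The same pattern carries over to multi-GPU runs by switching to the `nccl` backend and moving model and data to each rank's device. Note that every rank here still holds the full graph in memory; removing that limitation is exactly the distributed graph storage question raised above.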