DataParallel + DataListLoader question #2806
Unanswered
DL-WallModel
asked this question in Q&A
Replies: 1 comment 3 replies
Did you forget to wrap your model in `DataParallel`? Furthermore, I suggest you look at our distributed training examples as well (https://github.com/rusty1s/pytorch_geometric/blob/master/examples/multi_gpu/distributed_batching.py), since this gives you more control and is also the recommended PyTorch way to train models on multiple GPUs.
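To see why wrapping matters, here is a minimal sketch of the mechanism (no GPUs or torch_geometric needed). The tiny `Data`, `Batch`, and `model_forward` stand-ins below are hypothetical simplifications of the real torch_geometric classes; the point is only that a `DataParallel`-style wrapper collates each per-device chunk of the data list into a single batch object before the model's forward ever sees it, whereas calling the model on the raw list reproduces the reported error:

```python
# Hypothetical stand-ins for torch_geometric's Data and Batch classes,
# reduced to a single attribute `x` for illustration.
class Data:
    def __init__(self, x):
        self.x = x

class Batch(Data):
    @staticmethod
    def from_data_list(data_list):
        # Collate per-graph features into one batched object.
        return Batch([v for d in data_list for v in d.x])

def model_forward(data):
    # Works only if `data` is a single Data/Batch object, not a list.
    return data.x

data_list = [Data([1, 2]), Data([3])]

# Calling the model on the raw list reproduces the error from the question:
try:
    model_forward(data_list)
except AttributeError as e:
    print(e)  # 'list' object has no attribute 'x'

# A DataParallel-style wrapper collates each device's chunk first:
batch = Batch.from_data_list(data_list)
print(model_forward(batch))  # [1, 2, 3]
```

So inside the example's forward function, `data` is already a collated batch, which is why accessing `data.x` succeeds there but fails on the bare list.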
Hi everyone!
I'm trying to build a multi-GPU implementation following the example found in /examples/multi_gpu/data_parallel.py.
However, I have a problem with the forward function. In the example mentioned above, the list of data provided by the loader is fed straight to the forward function, and the data is then extracted as follows (with `data` being a data_list):
x, edge_index, edge_attr = data.x, data.edge_index, data.edge_attr
I'm trying to proceed the same way, but when I attempt to extract the data, an error is raised:
AttributeError: 'list' object has no attribute 'x'
Actually, this error makes sense to me, since the attribute x belongs to Data, not to a list. Could you please tell me why, in the example, the attributes x, edge_index, and edge_attr are extracted from a data_list? Sorry, I tried to run the example on my own machine, but for some reason it can't download the MNISTSuperpixels dataset, so I can't check the variable types myself.
Thank you in advance for your help! Best regards,
Joan
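For intuition on why the loader hands back a list at all: a DataParallel-style wrapper has to split the data_list across devices before each replica collates its own chunk. The sketch below shows one plausible contiguous-chunk splitting scheme; this exact chunking is an illustrative assumption, not torch_geometric's actual implementation (which balances chunks by node counts):

```python
# Illustrative assumption: split a data_list into one contiguous chunk per
# device, as a DataParallel-style scatter step might do.
def split_data_list(data_list, num_devices):
    """Split a list of graphs into at most `num_devices` contiguous chunks."""
    chunk = (len(data_list) + num_devices - 1) // num_devices  # ceil division
    return [data_list[i:i + chunk] for i in range(0, len(data_list), chunk)]

# With 5 graphs and 2 devices, device 0 gets 3 graphs and device 1 gets 2:
print(split_data_list(list(range(5)), 2))  # [[0, 1, 2], [3, 4]]
```

Each resulting chunk is then collated into a single batch on its device, which is why the model's forward receives a batch object rather than the original list.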