-
The situation is that I defined my own PyTorch Dataset for DICOM files, and the MONAI transform is applied in the `__getitem__` function. The code snippet is as follows,
The MONAI transform is,
and an error is raised. Major environment config:
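The original snippet is not included above, so the following is only a hypothetical reconstruction of the general pattern being described: a custom `Dataset` (the class name `DicomDataset` and the toy data are illustrative, not the poster's code) that applies a transform inside `__getitem__`. With `num_workers > 0` and the default `fork` start method, a transform that touches CUDA here would re-initialize the CUDA runtime inside a forked worker and fail; a CPU-only transform, as used below, is safe.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class DicomDataset(Dataset):  # illustrative name, not the poster's class
    def __init__(self, volumes, transform=None):
        self.volumes = volumes        # stand-in for loaded DICOM arrays
        self.transform = transform

    def __len__(self):
        return len(self.volumes)

    def __getitem__(self, idx):
        x = self.volumes[idx]
        if self.transform is not None:
            x = self.transform(x)     # a MONAI transform would run here;
        return x                      # it must stay on CPU data if workers fork

# CPU-only transform: safe with fork-based DataLoader workers.
dataset = DicomDataset([torch.randn(1, 4, 4) for _ in range(8)],
                       transform=lambda x: x * 2.0)
loader = DataLoader(dataset, batch_size=4, num_workers=0)
batch = next(iter(loader))
print(batch.shape)  # torch.Size([4, 1, 4, 4])
```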
Replies: 1 comment 1 reply
-
Yes, the root cause is that the CUDA runtime doesn't support the fork start method for multiprocessing (https://pytorch.org/docs/stable/notes/multiprocessing.html#cuda-in-multiprocessing). I think you can try one of the following: make sure the transforms are operating on CPU data, use MONAI's thread data loader (https://github.com/Project-MONAI/tutorials/blob/1.3.0/acceleration/fast_training_tutorial.ipynb), or use
torch.multiprocessing.set_start_method('spawn')
.
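The spawn-based fix above can be sketched with the standard library, since `torch.multiprocessing` re-exports the stdlib `multiprocessing` API. In a real training script you would call `torch.multiprocessing.set_start_method('spawn')` once, guarded by `if __name__ == "__main__":`, before constructing the DataLoader with `num_workers > 0`; the dependency-free sketch below just shows that a spawn context is available and selected.

```python
import multiprocessing as mp

# 'spawn' starts fresh interpreter processes instead of forking the
# parent, which is what the CUDA runtime requires for worker processes.
# Equivalent PyTorch call: torch.multiprocessing.set_start_method('spawn')
ctx = mp.get_context("spawn")       # per-context alternative to the
print(ctx.get_start_method())       # global set_start_method; prints 'spawn'
print("spawn" in mp.get_all_start_methods())  # True on all platforms
```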