Multiprocessing of RandCropByPosNegLabel #7964
Unanswered
StefanFischer asked this question in Q&A
Replies: 1 comment, 2 replies
-
Hi @StefanFischer, to put transforms on the GPU, you can refer to this tutorial for more details: https://github.com/Project-MONAI/tutorials/blob/main/acceleration/fast_model_training_guide.md Hope it helps, thanks.
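As a rough illustration of the general pattern the tutorial describes (this is a hedged sketch, not MONAI's actual `RandCropByPosNegLabel` implementation; the function name and parameters are hypothetical), foreground-biased random crops can be taken directly from tensors already resident on the target device:

```python
import torch

def random_crops_gpu(img, label, roi, num_samples, pos_ratio=0.5, device="cpu"):
    # Hypothetical sketch, NOT MONAI's RandCropByPosNegLabel implementation.
    # img, label: (C, H, W, D) tensors; returns (num_samples, C, *roi) on `device`.
    img, label = img.to(device), label.to(device)
    roi_t = torch.tensor(roi, device=device)
    half = roi_t // 2
    spatial = torch.tensor(img.shape[1:], device=device)
    fg = label[0].nonzero()  # foreground voxel coordinates, shape (M, 3)
    crops = []
    for _ in range(num_samples):
        if len(fg) > 0 and torch.rand(()) < pos_ratio:
            # positive sample: center on a random foreground voxel
            center = fg[torch.randint(len(fg), (1,))][0]
        else:
            # negative sample: uniform random center
            center = torch.stack([torch.randint(int(s), ()) for s in spatial])
        # clamp so the ROI stays inside the volume
        center = torch.clamp(center, half, spatial - (roi_t - half))
        sl = [slice(int(c) - int(h), int(c) - int(h) + int(r))
              for c, h, r in zip(center, half, roi_t)]
        crops.append(img[(slice(None), *sl)])
    return torch.stack(crops)
```

With `device="cuda"` the sampling and slicing run on GPU tensors, so no host-to-device copy happens per patch; the example uses `device="cpu"` defaults only so it runs anywhere.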
-
Hello MONAI-Team and Users,
I would like to know whether there is an option to run the RandCropByPosNegLabel transform directly on the GPU with multiprocessing.
To ensure fast data loading, I use MONAI's CacheDataset to apply the deterministic transforms before training starts. During training, the RandCropByPosNegLabel transform is applied on the GPU. I am training with a very high batch size (>1000 samples per batch) and a minimal patch size (16x16x16, bs=1028), and the data loading becomes quite slow (5 s per iteration). Although the total number of voxels per training batch is the same as with the maximal patch size (128x128x128, bs=2), training with the small patch size is significantly slower than with the maximal patch size. Besides the RandCropByPosNegLabel transform, I do NOT apply any other random transform.
Is there an option to do this more efficiently than what is currently implemented in the MONAI library?
The implementation of RandCropByPosNegLabel contains a Python for loop over the cropping operation:
MONAI/monai/transforms/croppad/array.py
Lines 1047 to 1223 in 59a7211

```python
for i, center in enumerate(self.centers):
    cropper = SpatialCrop(roi_center=center, roi_size=roi_size, lazy=lazy_)
    cropped = cropper(img)
```
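In principle, a per-center loop like this can be collapsed into a single gather using broadcasted advanced indexing, so all patches are extracted in one vectorized call. Below is a minimal sketch of that idea (the `batched_crop` helper is hypothetical and is not part of MONAI; it assumes all centers are far enough from the volume border that every ROI fits, and it ignores MONAI's lazy-evaluation and metadata handling):

```python
import numpy as np

def batched_crop(img, centers, roi_size):
    # Hypothetical helper, not MONAI API: gather all patches in one
    # advanced-indexing call instead of looping over SpatialCrop.
    # img: (C, H, W, D); centers: (N, 3) integer centers; roi_size: (rh, rw, rd)
    centers = np.asarray(centers)
    # per-axis index grids of shape (N, r_i), offset so each patch is centered
    idx = [centers[:, i, None] + (np.arange(r) - r // 2)[None, :]
           for i, r in enumerate(roi_size)]
    # broadcasted fancy indexing yields shape (C, N, rh, rw, rd)
    out = img[:, idx[0][:, :, None, None],
                 idx[1][:, None, :, None],
                 idx[2][:, None, None, :]]
    return out.transpose(1, 0, 2, 3, 4)  # -> (N, C, rh, rw, rd)
```

The same broadcasting trick works with `torch` tensors on the GPU, which may help more at very large batch sizes than multiprocessing would, since it avoids per-patch Python overhead entirely.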
I tried to use multiprocessing (torch.multiprocessing), but I always get CUDA-related issues. I am also an absolute beginner in multiprocessing. Does anyone have experience with CUDA multiprocessing?
Would be happy to hear from you guys :)