Make sure batch_size is the same as defined in DataLoader, even after transforms #6242
simonebonato started this conversation in General
-
perhaps you can try wrapping the multi-sample dataset with a `PatchDataset`:

```python
import monai

dataset = monai.data.Dataset(filenames, transform=xform)  # xform is the Compose with RandSpatialCropSamplesd
p_dataset = monai.data.PatchDataset(dataset, patch_func=lambda x: x, samples_per_image=4)
loader = monai.data.DataLoader(p_dataset, num_workers=1, batch_size=3, shuffle=False)
```

the output batch size would be 3 in this case. (I haven't tested this code in an end-to-end workflow, so please double-check details such as randomness and multiprocessing.) For more info see `MONAI/monai/data/grid_dataset.py`, lines 202 to 226 at commit 8eceabf.
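As a quick sanity check of the wrapping idea above, here is a minimal sketch using hypothetical in-memory arrays in place of `filenames` (real code would load from disk, e.g. with `LoadImaged`):

```python
import numpy as np
import monai
from monai.transforms import Compose, RandSpatialCropSamplesd

# hypothetical in-memory stand-in for `filenames`: two channel-first 3D volumes
filenames = [{"image": np.random.rand(1, 96, 96, 96).astype(np.float32)} for _ in range(2)]
xform = Compose([RandSpatialCropSamplesd(keys=["image"], roi_size=(64, 64, 64), num_samples=4)])

dataset = monai.data.Dataset(filenames, transform=xform)
# PatchDataset flattens the num_samples crops into individual items
# (2 images x 4 samples = 8 items), so the DataLoader's batch_size is honored again.
# Note: each item access re-runs the random crop transform, which is the
# randomness caveat mentioned above.
p_dataset = monai.data.PatchDataset(dataset, patch_func=lambda x: x, samples_per_image=4)
loader = monai.data.DataLoader(p_dataset, num_workers=0, batch_size=3, shuffle=False)

for batch in loader:
    print(batch["image"].shape)  # torch.Size([3, 1, 64, 64, 64]) for the full batches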
-
Hi all,
I'm working on a project with 3D head CT scans, and before feeding them to my segmentation model I want to apply some augmentations using MONAI transforms. The problem is that after I apply RandSpatialCropSamplesd() with num_samples=4, I get an effective batch size of 4 even if I set batch_size=1 in the DataLoader, and since the data is large I cannot fit it on the GPU.
Is there a way to overcome this issue? I tried changing the collate_fn and using other transforms, but without success.
The transforms I am using are the following:
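(The original transform list did not load; below is a minimal sketch that reproduces the behavior described, with a hypothetical random in-memory volume standing in for the real CT data and an assumed roi_size.)

```python
import numpy as np
import monai
from monai.transforms import Compose, RandSpatialCropSamplesd

# hypothetical stand-in for a loaded, channel-first 3D head CT volume
data = [{"image": np.random.rand(1, 96, 96, 96).astype(np.float32)}]

xform = Compose([
    # returns a *list* of num_samples crops per input volume
    RandSpatialCropSamplesd(keys=["image"], roi_size=(64, 64, 64), num_samples=4),
])

dataset = monai.data.Dataset(data, transform=xform)
loader = monai.data.DataLoader(dataset, batch_size=1)

batch = monai.utils.first(loader)
# MONAI's default list_data_collate flattens the list of crops,
# so the effective batch is num_samples=4 despite batch_size=1
print(batch["image"].shape)  # (4, 1, 64, 64, 64)
```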