MaskedPatchWSIDataset on unequal sample sizes #6358
-
I am trying to run some experiments with the CAMELYON dataset using MaskedPatchWSIDataset.

import glob
import os

from monai.data import DataLoader, MaskedPatchWSIDataset
from monai.transforms import Compose, ScaleIntensityRangeD, ToTensorD

dir_path: str = "/Datasets/CHAMELYON_IMG"
extn: str = ".tif"
patch_size = 256  # example patch size (256x256 pixels)

# collect all whole-slide images under the dataset directory
search_parameter = os.path.join(dir_path, "**/*" + extn)
train_data_list = [{"image": image_path} for image_path in glob.glob(search_parameter, recursive=True)]

# load datasets and set the transforms
preprocess_cpu_train = Compose(
    [
        ScaleIntensityRangeD(keys="image", a_min=0.0, a_max=255.0, b_min=0.0, b_max=1.0),
        ToTensorD(keys="image"),
    ]
)

dataset = MaskedPatchWSIDataset(
    data=train_data_list,
    patch_size=patch_size,
    transform=preprocess_cpu_train,
    reader="cuCIM",
)
dataloader = DataLoader(dataset, batch_size=128, num_workers=2)

When I sample from the dataloader, I am faced with an error.
Do all the images have to be of the same size to use MaskedPatchWSIDataset?
-
Hi @a-parida12, I don't think the images need to be of the same size.
-
Thanks @a-parida12 for your question. As we investigated in #6360, the error was not reproducible, and MaskedPatchWSIDataset works fine on input images with various sizes.
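
As a quick sanity check, here is a minimal sketch of the point above, assuming MONAI with the cuCIM backend is installed; the file names and the 256-pixel patch size are placeholders. It builds the dataset from two WSIs of different dimensions and confirms that every extracted patch has the same spatial size.

from monai.data import DataLoader, MaskedPatchWSIDataset
from monai.transforms import Compose, ScaleIntensityRangeD, ToTensorD
from monai.utils import first

# two whole-slide images of different dimensions (hypothetical file names)
data_list = [
    {"image": "/Datasets/CHAMELYON_IMG/tumor_001.tif"},
    {"image": "/Datasets/CHAMELYON_IMG/normal_002.tif"},
]

transform = Compose(
    [
        ScaleIntensityRangeD(keys="image", a_min=0.0, a_max=255.0, b_min=0.0, b_max=1.0),
        ToTensorD(keys="image"),
    ]
)

dataset = MaskedPatchWSIDataset(
    data=data_list,
    patch_size=256,  # every patch is 256x256, regardless of the source WSI size
    transform=transform,
    reader="cuCIM",
)
dataloader = DataLoader(dataset, batch_size=128, num_workers=2)

batch = first(dataloader)
# expected: (batch_size, 3, 256, 256), independent of the original WSI dimensions
print(batch["image"].shape)

Because MaskedPatchWSIDataset extracts fixed-size patches at the non-zero locations of each slide's foreground mask, the batched tensors have a uniform shape even when the source slides do not.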