Issues with CacheDataset function #7228
Unanswered
nguyen-peter asked this question in Q&A
Replies: 1 comment
-
Hi @nguyen-peter, the warning message is from Thanks!
-
Hello! I have been trying to create a 3D image segmentation model for CT scans. I have the following transforms:
```python
train_transforms = Compose(
    [
        LoadImaged(keys=["image", "label"]),
        EnsureChannelFirstd(keys=["image", "label"]),
        Orientationd(keys=["image", "label"], axcodes="RAS"),
        Spacingd(
            keys=["image", "label"],
            pixdim=(1.5, 1.5, 2.0),
            mode=("bilinear", "nearest"),
        ),
        ScaleIntensityRanged(
            keys=["image"],
            a_min=-175,
            a_max=250,
            b_min=0.0,
            b_max=1.0,
            clip=True,
        ),
        CropForegroundd(keys=["image", "label"], source_key="image"),
        RandCropByPosNegLabeld(
            keys=["image", "label"],
            label_key="label",
            spatial_size=(96, 96, 96),
            pos=1,
            neg=1,
            num_samples=4,
            image_key="image",
            image_threshold=0,
        ),
        RandFlipd(keys=["image", "label"], spatial_axis=[0], prob=0.10),
        RandFlipd(keys=["image", "label"], spatial_axis=[1], prob=0.10),
        RandFlipd(keys=["image", "label"], spatial_axis=[2], prob=0.10),
        RandRotate90d(keys=["image", "label"], prob=0.10, max_k=3),
        RandShiftIntensityd(keys=["image"], offsets=0.10, prob=0.50),
    ]
)
val_transforms = Compose(
    [
        LoadImaged(keys=["image", "label"]),
        EnsureChannelFirstd(keys=["image", "label"]),
        Orientationd(keys=["image", "label"], axcodes="RAS"),
        Spacingd(
            keys=["image", "label"],
            pixdim=(1.5, 1.5, 2.0),
            mode=("bilinear", "nearest"),
        ),
        ScaleIntensityRanged(
            keys=["image"], a_min=-175, a_max=250, b_min=0.0, b_max=1.0, clip=True
        ),
        CropForegroundd(keys=["image", "label"], source_key="image"),
    ]
)
print("complete")
```
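For context, `Orientationd(axcodes="RAS")` expects channel-first inputs with three spatial dimensions, one per axcode letter. A minimal sketch of that expectation in plain NumPy (the helper name `check_spatial_dims` is my own, not a MONAI API):

```python
import numpy as np

def check_spatial_dims(array: np.ndarray, axcodes: str = "RAS") -> bool:
    """Return True if a channel-first array has as many spatial dims as axcodes letters."""
    spatial_ndim = array.ndim - 1  # the first axis is the channel axis
    return spatial_ndim == len(axcodes)

# A channel-first CT volume: (channels, H, W, D) -> 3 spatial dims match "RAS"
volume = np.zeros((1, 96, 96, 96))
print(check_spatial_dims(volume))    # True

# A 2D image after channel insertion: (1, H, W) -> only 2 spatial dims
slice_2d = np.zeros((1, 1060, 1085))
print(check_spatial_dims(slice_2d))  # False
```

An input that fails this check is exactly the situation the UserWarning in the log below describes.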
And I am trying to load my data with the following code:
```python
data_dir = "/panfs/jay/groups/25/barkerfk/nguy4214/data/"
split_json = "dataset_0.json"
datasets = data_dir + split_json
datalist = load_decathlon_datalist(datasets, True, "training")
val_files = load_decathlon_datalist(datasets, True, "validation")
train_ds = CacheDataset(
    data=datalist,
    transform=train_transforms,
    cache_num=50,
    cache_rate=1.0,
    num_workers=8,
)
train_loader = DataLoader(train_ds, batch_size=1, shuffle=True, num_workers=8, pin_memory=True)
val_ds = CacheDataset(data=val_files, transform=val_transforms, cache_num=6, cache_rate=1.0, num_workers=4)
val_loader = DataLoader(val_ds, batch_size=1, shuffle=False, num_workers=4, pin_memory=True)
print(train_ds)
print(val_files)
print("done")
```
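As a sanity check on the JSON side, each entry in a Decathlon-style datalist section should carry both an `"image"` and a `"label"` path. A small stdlib-only sketch (the file names and the `check_datalist` helper are hypothetical, for illustration only):

```python
# Hypothetical minimal dataset_0.json content in Decathlon format
sample = {
    "training": [
        {"image": "imagesTr/ct_001.nii.gz", "label": "labelsTr/ct_001.nii.gz"},
    ],
    "validation": [
        {"image": "imagesTr/ct_050.nii.gz", "label": "labelsTr/ct_050.nii.gz"},
    ],
}

def check_datalist(data: dict, section: str) -> list:
    """Return the entries of a section, raising if image/label keys are missing."""
    entries = data.get(section, [])
    for entry in entries:
        missing = {"image", "label"} - entry.keys()
        if missing:
            raise KeyError(f"{section} entry missing keys: {missing}")
    return entries

print(len(check_datalist(sample, "training")))  # 1
```

Running such a check before handing the list to `CacheDataset` separates JSON problems from transform problems.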
My data is a json that looks like this:
dataset_0.json
It appears to be struggling in CacheDataset; it typically produces the following output:
```
Loading dataset:   4%|▍         | 1/23 [00:00<00:07,  2.76it/s]
/home/barkerfk/nguy4214/.local/lib/python3.7/site-packages/monai/transforms/spatial/array.py:705: UserWarning:
axcodes ('RAS') length is smaller than the number of input spatial dimensions D=2.
Orientation: input spatial shape is torch.Size([1060, 1085]), num. channels is 1,
please make sure the input is in the channel-first format.
  f"axcodes ('{self.axcodes}') length is smaller than the number of input spatial dimensions D={sr}.\n"
Loading dataset:  52%|█████▏    | 12/23 [00:00<00:00, 18.29it/s]
```
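The warning suggests at least one cached item is a 2D image (spatial shape `[1060, 1085]`) rather than a 3D volume, so `Orientationd` cannot apply all three "RAS" axcodes to it. A hypothetical way to locate such items, assuming you have each file's spatial shape available (stubbed here with a dict; the helper and file names are not from the original post):

```python
# Hypothetical shapes, standing in for headers read from the actual NIfTI files
shapes = {
    "ct_001.nii.gz": (512, 512, 120),  # a 3D volume
    "ct_002.nii.gz": (1060, 1085),     # a 2D image; would trigger the warning
}

def find_non_volumes(shapes: dict, expected_ndim: int = 3) -> list:
    """List files whose spatial dimensionality differs from the expected one."""
    return [name for name, shape in shapes.items() if len(shape) != expected_ndim]

print(find_non_volumes(shapes))  # ['ct_002.nii.gz']
```

Excluding or fixing such files in the datalist should make the warning disappear during caching.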
Thank you for any help