When I updated to 0.9, the training process that ran normally under 0.8 crashed.
Originally only about 10 GB of memory was needed, but usage suddenly soared to about 60 GB after the data was read. This is a screenshot of my Linux htop output, taken not long after training started.
I have tried modifying num_workers and cache_num, and also tried the original monai.data.Dataset. It doesn't help.
My code is below (I've removed most of the transforms):
from monai.data import CacheDataset, DataLoader, list_data_collate
from monai.transforms import Compose, CropForegroundd, EnsureChannelFirstd, LoadImaged

train_transforms = Compose([
    LoadImaged(keys=['image', 'label']),
    EnsureChannelFirstd(keys=['image', 'label']),
    CropForegroundd(keys=['image', 'label'], source_key='image'),
])
train_ds = CacheDataset(
    data=train_dicts, transform=train_transforms, cache_num=6, copy_cache=True,
    cache_rate=1.0, num_workers=4)
train_loader = DataLoader(train_ds, batch_size=2, shuffle=False,
                          collate_fn=list_data_collate, num_workers=4, pin_memory=True)
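For reference, when I say I also tried the original monai.data.Dataset, I mean roughly the following (a minimal sketch using the same train_dicts and transforms, just dropping the cache-related arguments):

from monai.data import Dataset

# Same data and transforms, but without any in-memory caching
train_ds_nocache = Dataset(data=train_dicts, transform=train_transforms)
train_loader = DataLoader(train_ds_nocache, batch_size=2, shuffle=False,
                          collate_fn=list_data_collate, num_workers=4, pin_memory=True)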