fetch data in a non-epoch way #4141
Replies: 2 comments 2 replies
-
Hi @ericspod , could you please share some training tips for this question? Thanks in advance.
-
What I'm guessing you want is a single training epoch that extracts a set number of batches from your dataset, sampling with duplication so that the total adds up to more data than the dataset contains. What I've done in the past is use PyTorch's samplers with the data loader to define this randomised duplication sampling, which lets me define epochs of however many iterations I want:

```python
import torch
import monai

train_ds = ...    # training dataset
batch_size = 100  # number of items per batch
num_iters = 400   # number of iterations in an epoch

# Sampling with replacement lets one epoch draw more items than len(train_ds)
rs = torch.utils.data.RandomSampler(train_ds, replacement=True, num_samples=num_iters * batch_size)
train_loader = monai.data.DataLoader(train_ds, batch_size=batch_size, num_workers=8, sampler=rs)
```
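To make this concrete, here is a self-contained toy version of the same idea (the `TensorDataset` and the small sizes are illustrative only, and plain PyTorch's `DataLoader` stands in for MONAI's so the snippet runs without MONAI installed):

```python
import torch
from torch.utils.data import DataLoader, RandomSampler, TensorDataset

# Toy stand-in dataset of 50 items (illustrative only).
train_ds = TensorDataset(torch.arange(50).float().unsqueeze(1))

batch_size = 10  # items per batch
num_iters = 20   # iterations per epoch -- more batches than the dataset holds

# Sampling with replacement lets one epoch draw more items than len(train_ds).
rs = RandomSampler(train_ds, replacement=True, num_samples=num_iters * batch_size)
train_loader = DataLoader(train_ds, batch_size=batch_size, sampler=rs)

print(len(train_loader))  # 20 -- epoch length set by the sampler, not the dataset size
```

Note that the epoch length here is controlled entirely by `num_samples`, so you can pick whatever number of iterations per epoch suits your training schedule.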
-
Thanks for the great work.
I wonder whether, when training a network, I could fetch batch data without considering the epoch parameter; that is, in each iteration a batch of data is extracted from the dataset without needing to set the number of training epochs.
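One way to train without any epoch bookkeeping is to wrap the loader in an endless generator and pull one batch per step. A minimal sketch, assuming a standard PyTorch `DataLoader` (the toy dataset and the `infinite_batches` helper are hypothetical, not part of MONAI):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

def infinite_batches(loader):
    """Yield batches forever, restarting (and reshuffling) the loader when exhausted."""
    while True:
        yield from loader

# Hypothetical toy dataset; substitute your own.
ds = TensorDataset(torch.arange(8).float())
loader = DataLoader(ds, batch_size=4, shuffle=True)

stream = infinite_batches(loader)
for step in range(5):            # train for a fixed number of steps, no epochs
    (batch,) = next(stream)      # fetch one batch per iteration
```

Restarting the loader inside the generator (rather than using `itertools.cycle`, which caches the first pass) means `shuffle=True` reshuffles the data each time the dataset is traversed.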