If tuner.scale_batch_size() accepts a train_dataloader, why can't this be used independently of trainer.fit(model, datamodule)? #12264
-
My training code (after a lot of setup) looks like this:

```python
# Create a trainer
trainer = pl.Trainer.from_argparse_args(
    argparse.Namespace(**dict_args),
    callbacks=my_callbacks)

# Look for largest batch size and optimal LR
tuner = Tuner(trainer)
tuner.scale_batch_size(
    model,
    train_dataloaders=data_module.get_descending_size_train_dataloader(),
    init_val=1)
tuner.lr_find(model, data_module.train_dataloader())

# Train the model
trainer.fit(model, data_module)
trainer.test(model, data_module)
```
I thought making a data module was the best practice. Am I not allowed to pass a dataloader to both?
-
Passing a train_dataloader directly to the batch-size scaling call will raise an exception, but passing a datamodule is allowed. The reason is that after each batch-size scaling trial we need to reinitialize the dataloader using the scaled value of the batch_size parameter, and if you pass in the dataloader object itself, it cannot be reinitialized as of now. Maybe we can add support for this in the future.
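In other words, the batch-size finder needs something it can rebuild the dataloader from. A minimal sketch of the datamodule-based pattern (`MyDataModule`, `my_dataset`, and the exact `Tuner` import path are illustrative and may differ by Lightning version): the datamodule exposes a `batch_size` attribute that its `train_dataloader()` reads, so the tuner can overwrite that attribute after each trial and request a fresh dataloader.

```python
import pytorch_lightning as pl
from pytorch_lightning.tuner.tuning import Tuner  # import path may vary by version
from torch.utils.data import DataLoader

class MyDataModule(pl.LightningDataModule):  # hypothetical datamodule for illustration
    def __init__(self, dataset, batch_size=1):
        super().__init__()
        self.dataset = dataset
        # The batch-size finder looks for an attribute named `batch_size`
        # (configurable via `batch_arg_name`) and overwrites it after every trial.
        self.batch_size = batch_size

    def train_dataloader(self):
        # Called again after each scaling trial, so it always picks up the
        # current value of self.batch_size.
        return DataLoader(self.dataset, batch_size=self.batch_size, shuffle=True)

data_module = MyDataModule(my_dataset, batch_size=1)  # my_dataset: your existing Dataset
trainer = pl.Trainer(max_epochs=10)
tuner = Tuner(trainer)

# Pass the datamodule (not a dataloader) so the tuner can rebuild the loader.
new_batch_size = tuner.scale_batch_size(model, datamodule=data_module, init_val=1)
tuner.lr_find(model, datamodule=data_module)

trainer.fit(model, data_module)
```

With this shape, the tuner writes each candidate value into `data_module.batch_size` and then re-fetches `train_dataloader()`, which is exactly the reinitialization step that a bare dataloader object does not support.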