If tuner.scale_batch_size() accepts a train_dataloader, why can't this be used independently of trainer.fit(model, datamodule)? #12264


Passing a `train_dataloader` directly to the batch-size scaling call will raise an exception, but passing a `datamodule` is allowed. The reason is that after each batch-size scaling iteration, the tuner needs to reinitialize the dataloader using the scaled value of the `batch_size` parameter. If you pass in a dataloader instance itself, it cannot be reinitialized as of now. Maybe we can add support for this in the future.
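
To illustrate the distinction, here is a minimal sketch (assuming the 1.x-era API where the tuner is reached via `trainer.tuner`; `TinyModel` and `RandomDataModule` are hypothetical names, not part of Lightning). The datamodule exposes a `batch_size` attribute and builds its dataloader lazily from it, which is exactly what lets the tuner rebuild the loader between trials:

```python
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset


class TinyModel(pl.LightningModule):
    # Hypothetical minimal model, just enough to run the scaling trials.
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


class RandomDataModule(pl.LightningDataModule):
    def __init__(self, batch_size: int = 2):
        super().__init__()
        # The tuner looks for an attribute named `batch_size` by default
        # (configurable via `batch_arg_name`) and overwrites it after
        # every scaling iteration.
        self.batch_size = batch_size

    def train_dataloader(self):
        # The loader is built lazily from self.batch_size, so calling
        # this method again picks up the scaled value.
        dataset = TensorDataset(torch.randn(256, 32),
                                torch.randint(0, 2, (256,)))
        return DataLoader(dataset, batch_size=self.batch_size)


model = TinyModel()
dm = RandomDataModule()
trainer = pl.Trainer(max_epochs=1)

# Works: after each trial the tuner sets dm.batch_size and re-queries
# dm.train_dataloader() for a fresh loader built with the scaled value.
new_size = trainer.tuner.scale_batch_size(model, datamodule=dm)

# Raises: a concrete DataLoader instance carries a fixed batch_size and
# cannot be rebuilt (the exact kwarg name varies by Lightning version).
# trainer.tuner.scale_batch_size(model, train_dataloader=dm.train_dataloader())
```

In other words, the datamodule acts as a dataloader factory, while a bare `DataLoader` is a finished object with its batch size baked in at construction time.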

Answer selected by jonathanking