flatten predictions from batches #15156
Unanswered
pamparana34 asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 0 comments
I am currently using the trainer's predict function to run some inference, and this might be a bit silly. The issue is that the dataloader uses a batch size greater than 1 (typically 256) so that the GPU is used effectively for inference. As a result, the returned object is a list of lists of PyTorch tensors, which makes it difficult to produce the output in the desired format.
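For illustration, the per-batch outputs can be flattened with `torch.cat`. This is a minimal sketch with made-up shapes, not the actual model output:

```python
import torch

# Simulate what trainer.predict typically returns for a single dataloader:
# a list with one tensor per batch (here: 4 batches of 256 samples, 10 outputs each).
batch_outputs = [torch.randn(256, 10) for _ in range(4)]

# Concatenate along the batch dimension to get one (num_samples, 10) tensor.
flat = torch.cat(batch_outputs, dim=0)
print(flat.shape)  # torch.Size([1024, 10])
```

With multiple dataloaders the result is one such list per dataloader, so the same concatenation would be applied to each inner list.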
My predict function simply returns one tensor of predictions per batch.
I tried hooking onto on_predict_epoch_end, but this does not seem to influence the returned tensors. As far as I can tell, nothing returned from this method is used, so I am not sure why it accepts a results parameter. Usually there is a hook like training_epoch_end where we can return values, and I wonder how I can do the equivalent for the prediction steps. For various design reasons, I cannot do this on the caller's side. Is there a way to do this from within the model code through a hook?
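One possible workaround (a hedged sketch, not an official Lightning answer): collect the outputs yourself in predict_step and concatenate them in on_predict_epoch_end, exposing the flat tensor as an attribute the caller can read instead of relying on the hook's return value. The mock below mirrors the hook names without importing Lightning, since whether the results argument is honored varies by version; `PredictCollector` is a hypothetical stand-in for a LightningModule:

```python
import torch

class PredictCollector:
    """Mock of a LightningModule's predict_step / on_predict_epoch_end pattern."""

    def __init__(self):
        self._outputs = []

    def predict_step(self, batch):
        preds = batch * 2  # stand-in for real inference
        self._outputs.append(preds)
        return preds

    def on_predict_epoch_end(self):
        # Instead of trying to change trainer.predict's return value,
        # store the flattened tensor as an attribute for the caller to read.
        self.predictions = torch.cat(self._outputs, dim=0)

collector = PredictCollector()
for batch in [torch.ones(3), torch.zeros(2)]:
    collector.predict_step(batch)
collector.on_predict_epoch_end()
print(collector.predictions.shape)  # torch.Size([5])
```

After `trainer.predict(...)` returns, the caller would read `model.predictions` rather than the nested list, which keeps the flattening logic inside the model code as the question asks.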