Datamodule without Trainer (for inference) #6502
Asked by indigoviolet in: Lightning Trainer API: Trainer, LightningModule, LightningDataModule. Answered by carmocca.
In my usage, LightningDataModule currently encapsulates batch collation, moving data to the device, and batch transformations (via …). However, when I want to run inference on a set of inputs, I want the same steps to happen. What is the recommended way to achieve this? The problem is that the Trainer drives the device transfers and the hooks around them, and I don't have a Trainer during inference.
Answered by carmocca, Apr 20, 2021 (1 comment):
Answer selected by carmocca.
Why would you not want to use the Trainer? You can now use `trainer.predict` for inference (it will be in beta after the 1.3 release).