Predicting from 1 instance, not 1 batch #13729
Unanswered
dorienh
asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment 2 replies
-
Hi, I think you just need to convert the raw data into a tensor, unsqueeze the tensor (so that it avoids a dimension mismatch), then pass it into the model itself. For example (just pseudocode):

```python
def predict_single(raw_data):         # raw_data = input from any source
    tensor = convert_func(raw_data)   # your custom conversion goes here; tensor size: (n,)
    tensor = tensor.unsqueeze(0)      # size becomes (1, n); batch_size normally comes
                                      # first, so this is a batch holding a single sample
    result = model(tensor)
    return result
```
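Since the model in the question is an LSTM, the single-sample batch should have shape (1, time_steps, features) when `batch_first=True`, and inference is fastest with the model in eval mode and gradient tracking disabled. A minimal runnable sketch (the layer sizes and window length are illustrative assumptions, not taken from the original model):

```python
import torch
import torch.nn as nn

# Illustrative sizes; substitute your own trained model.
lstm = nn.LSTM(input_size=8, hidden_size=32, batch_first=True)

window = torch.randn(50, 8)   # one window: (time_steps=50, features=8)
batch = window.unsqueeze(0)   # (1, 50, 8): a batch holding a single window

lstm.eval()                   # inference mode (disables dropout, etc.)
with torch.no_grad():         # skip autograd bookkeeping for speed
    output, (h_n, c_n) = lstm(batch)

print(output.shape)           # torch.Size([1, 50, 32])
```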
-
Hi,
I am deploying a model where speed is of the essence. It's a time-series model, so it is trained in larger batches, each sample containing x time_steps as input to the actual model (the LSTM input size).
I have seen the documentation on standard prediction, e.g. in the simplest case something like this (with `model` and `predict_loader` standing in for the actual objects):
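```python
import pytorch_lightning as pl

trainer = pl.Trainer()
predictions = trainer.predict(model, dataloaders=predict_loader)
```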
But is there any way I can avoid feeding in an entire batch? I only need a prediction for the last time_steps, and I worry that making many unnecessary predictions would slow the model down.
Sorry if this is a bit of a newbie question.