Isn't there duplicate computation in the GAN example? #14506
Answered by akihironitta
wztdream asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
-
Hi,

In the GAN example, the generated images seem to be computed twice. The first forward pass is here, in the generator branch, and the discriminator branch calls `self(z)` again:

```python
# train generator
if optimizer_idx == 0:
    # generate images
    self.generated_imgs = self(z)  # <-- first forward pass; the discriminator branch repeats it
```

The noise sampling also looks duplicated:

```python
def training_step(self, batch, batch_idx, optimizer_idx):
    imgs, _ = batch

    # sample noise
    z = torch.randn(imgs.shape[0], self.hparams.latent_dim)
    z = z.type_as(imgs)
```

I think this is unreasonable. How can I avoid this unnecessary computation in Lightning? I know there are some tricks to achieve this, such as using a …
Answered by akihironitta on Sep 3, 2022
Replies: 1 comment, 2 replies
-
@wztdream Yes, it is redundant. I see a few options:
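One common way to avoid the duplicate forward pass is to cache the generator output for the current batch and reuse it across the two optimizer steps. The sketch below is a hypothetical, framework-free illustration of that idea (plain Python, no torch or Lightning; `ToyGAN` and all names are my own, not from the linked example):

```python
# Toy sketch of per-batch caching: the "generator" runs once per batch,
# and both optimizer branches reuse the cached output.

class ToyGAN:
    def __init__(self):
        self.forward_calls = 0   # counts real generator forward passes
        self._cache = None       # (batch_idx, generated) pair

    def generator_forward(self, z):
        self.forward_calls += 1
        return [x * 2 for x in z]  # stand-in for the real generator network

    def generated_imgs(self, z, batch_idx):
        # Recompute only when we see a new batch; otherwise reuse the cache.
        if self._cache is None or self._cache[0] != batch_idx:
            self._cache = (batch_idx, self.generator_forward(z))
        return self._cache[1]

    def training_step(self, z, batch_idx, optimizer_idx):
        imgs = self.generated_imgs(z, batch_idx)
        if optimizer_idx == 0:
            return sum(imgs)    # generator "loss" (placeholder)
        return -sum(imgs)       # discriminator "loss" (placeholder)

model = ToyGAN()
for batch_idx in range(3):
    z = [1.0, 2.0]
    for optimizer_idx in (0, 1):  # generator step, then discriminator step
        model.training_step(z, batch_idx, optimizer_idx)

print(model.forward_calls)  # → 3: one forward per batch, not two
```

In real Lightning code the same effect can also be achieved by switching to manual optimization (`self.automatic_optimization = False` and driving both optimizers inside a single `training_step`), so the forward pass naturally runs once per batch.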
Answer selected by wztdream