Hi, thank you for sharing the implementation of your great work! I have a question about the following lines of code:
Lines 346 to 348 in f238d37
```python
loss_G.backward(retain_graph=True)
nn.utils.clip_grad_norm_(self.discr.parameters(), 5)
self.gen_opt.step()
```
Why do you clip the gradients of the discriminator when updating the generator (rather than clipping the gradients of the generator)? I could not find a corresponding explanation in your paper.
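For reference, here is a minimal sketch of the pattern I would have expected, where the network being updated (the generator) is the one whose gradients are clipped. The module and optimizer names are stand-ins, not your repo's code:

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the generator and discriminator.
gen = nn.Linear(4, 4)
discr = nn.Linear(4, 1)
gen_opt = torch.optim.Adam(gen.parameters())

z = torch.randn(8, 4)
fake = gen(z)
loss_G = -discr(fake).mean()  # illustrative generator loss

gen_opt.zero_grad()
loss_G.backward()
# Clip the generator's gradients, since the generator is being stepped.
nn.utils.clip_grad_norm_(gen.parameters(), 5)
gen_opt.step()
```

Clipping `self.discr.parameters()` at this point would bound gradients that the discriminator optimizer never consumes in this step, which is why the choice looks unintentional to me.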
I also notice that when updating the discriminator, the fake images are not detached from the generator (detaching is common practice to stop gradients from propagating back into the generator). Is this intentional?
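For clarity, this is the detach pattern I mean, again as a minimal sketch with placeholder modules rather than your actual code:

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the generator and discriminator.
gen = nn.Linear(4, 4)
discr = nn.Linear(4, 1)
discr_opt = torch.optim.SGD(discr.parameters(), lr=0.1)

real = torch.randn(8, 4)
# detach() cuts the graph at the generator output, so the discriminator
# loss cannot backpropagate into the generator's parameters.
fake = gen(torch.randn(8, 4)).detach()

loss_D = discr(fake).mean() - discr(real).mean()  # illustrative loss
discr_opt.zero_grad()
loss_D.backward()
discr_opt.step()

# The generator receives no gradient from the discriminator update.
assert all(p.grad is None for p in gen.parameters())
```

Without the `.detach()`, the discriminator update also populates (and, across iterations, accumulates) gradients in the generator unless they are explicitly zeroed before the generator step.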
I would really appreciate it if you could kindly clarify these points.
Best Regards