Regarding some details of updating the generator and discriminator #25

@yunfan0621

Description

Hi, thank you for sharing the implementation of your great work! I have a couple of questions about the following lines of code:

Adv-Makeup/model.py

Lines 346 to 348 in f238d37

loss_G.backward(retain_graph=True)
nn.utils.clip_grad_norm_(self.discr.parameters(), 5)
self.gen_opt.step()

Why do you clip the gradients of the discriminator when updating the generator (rather than clipping the gradients of the generator)? I could not find a corresponding explanation in the paper.
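For reference, the more common pattern is to clip the gradients of the network whose optimizer is about to step, i.e. the generator here. A minimal sketch of that convention (the `gen`/`discr` modules below are small stand-ins, not the repository's actual models):

```python
import torch
import torch.nn as nn

# Stand-in modules for illustration only.
gen = nn.Linear(4, 4)
discr = nn.Linear(4, 1)
gen_opt = torch.optim.Adam(gen.parameters(), lr=1e-3)

z = torch.randn(2, 4)
fake = gen(z)
# Non-saturating generator loss: push D's score on fakes up.
loss_G = -discr(fake).mean()

gen_opt.zero_grad()
loss_G.backward()
# Clip the *generator's* gradients before stepping the generator optimizer.
total_norm = nn.utils.clip_grad_norm_(gen.parameters(), max_norm=5.0)
gen_opt.step()
```

`clip_grad_norm_` returns the pre-clipping total norm, and afterwards the norm of `gen`'s gradients is at most `max_norm`.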

I also notice that when updating the discriminator, the fake images are not detached from the generator graph (detaching is the common practice to stop gradients from propagating back into the generator). Is this intentional?
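To illustrate the common practice referred to above, here is a minimal sketch of a discriminator update where the fake batch is detached so the generator receives no gradients (again, `gen`/`discr` are illustrative stand-in modules):

```python
import torch
import torch.nn as nn

# Stand-in modules for illustration only.
gen = nn.Linear(4, 4)
discr = nn.Linear(4, 1)
discr_opt = torch.optim.SGD(discr.parameters(), lr=1e-2)
bce = nn.BCEWithLogitsLoss()

real = torch.randn(2, 4)
# detach() cuts the graph: the D update cannot backpropagate into gen.
fake = gen(torch.randn(2, 4)).detach()

loss_D = bce(discr(real), torch.ones(2, 1)) + bce(discr(fake), torch.zeros(2, 1))

discr_opt.zero_grad()
loss_D.backward()
discr_opt.step()
```

After this step the generator's parameters have accumulated no gradients at all, which is exactly the behavior the missing `.detach()` would change.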

I would really appreciate it if you could kindly clarify these points.

Best Regards
