
How does the loss get minimized if autoencoder_B keeps changing the weights learned by autoencoder_A? #31

@mike3454

The training of the single encoder and two decoders happens as in the following simplified code.

self.encoder = self.Encoder()
self.decoder_A = self.Decoder()
self.decoder_B = self.Decoder()
...
# both autoencoders share self.encoder but each has its own decoder
self.autoencoder_A = KerasModel(x, self.decoder_A(self.encoder(x)))
self.autoencoder_B = KerasModel(x, self.decoder_B(self.encoder(x)))

for epoch in range(epochs):
    self.autoencoder_A.train_on_batch(...)
    self.autoencoder_B.train_on_batch(...)  # doesn't this reset the encoder weights?

My understanding is that training autoencoder_A does not change the weights of autoencoder_B's decoder, but it does change the weights of the encoder, since the encoder is shared. Please correct me if I am wrong.
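For what it's worth, here is a minimal standalone sketch of that check (assuming tf.keras; the layer sizes and names are made up for illustration, not taken from this repo). It snapshots the shared encoder's kernel and decoder_B's kernel, trains autoencoder_A on one batch, and compares:

import numpy as np
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.models import Model

# toy stand-ins for the real encoder/decoders (sizes are arbitrary)
x_in = Input(shape=(8,))
encoder = Model(x_in, Dense(4, activation="relu")(x_in), name="encoder")

def make_decoder(name):
    z = Input(shape=(4,))
    return Model(z, Dense(8)(z), name=name)

decoder_A = make_decoder("decoder_A")
decoder_B = make_decoder("decoder_B")

# autoencoder_A shares the encoder's layer objects, so training it
# updates the encoder weights in place
x = Input(shape=(8,))
autoencoder_A = Model(x, decoder_A(encoder(x)))
autoencoder_A.compile(optimizer="adam", loss="mse")

# snapshot the encoder kernel and decoder_B's kernel before training
enc_w0 = encoder.layers[-1].get_weights()[0].copy()
decB_w0 = decoder_B.layers[-1].get_weights()[0].copy()

batch = np.random.rand(16, 8).astype("float32")
autoencoder_A.train_on_batch(batch, batch)

print("encoder updated:  ", not np.allclose(enc_w0, encoder.layers[-1].get_weights()[0]))    # True
print("decoder_B updated:", not np.allclose(decB_w0, decoder_B.layers[-1].get_weights()[0]))  # False

If this prints True then False, it would confirm that each train_on_batch call updates the shared encoder while leaving the other decoder untouched, which is exactly the situation my question is about.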

How does the loss get minimized if the two autoencoders alternately change the weights of the shared encoder?
