optimizer = tf.train.AdamOptimizer(learning_rate = self.learning_rate)

The default optimizer is Adam, and an exponential decay schedule is applied to its learning rate. See the TensorFlow API docs: https://www.tensorflow.org/api_docs/python/tf/keras/optimizers/Adam
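For reference, TensorFlow's exponential decay computes the learning rate as `start_lr * decay_rate ** (step / decay_steps)`. A minimal sketch of that formula in plain Python (the parameter values below are illustrative, not the project's defaults):

```python
def exponential_decay(start_lr, decay_rate, decay_steps, step):
    """Exponentially decayed learning rate, as in tf.train.exponential_decay.

    lr(step) = start_lr * decay_rate ** (step / decay_steps)
    """
    return start_lr * decay_rate ** (step / decay_steps)

# Illustrative values: start at 1e-3, multiply by 0.95 every 5000 steps.
lr_start = exponential_decay(1e-3, 0.95, 5000, step=0)      # 1e-3
lr_later = exponential_decay(1e-3, 0.95, 5000, step=10000)  # 1e-3 * 0.95**2
```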

Answer selected by njzjz
This discussion was converted from issue #307 on December 24, 2020 15:39.