-
Hello! I was wondering whether it would be possible to integrate the Adam optimiser as an option for the training process. Looking at the code, it seems the only learning rate schedule available at the moment is Exponential Decay. Thanks in advance!
Answered by njzjz, Dec 11, 2020
Replies: 2 comments
-
deepmd-kit/source/train/Trainer.py, line 278 in b2662e2: The default optimizer is Adam. Exponential Decay is used for the learning rate of Adam. See the TensorFlow API docs: https://www.tensorflow.org/api_docs/python/tf/keras/optimizers/Adam
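For reference, here is a minimal TensorFlow sketch of the pattern the answer describes (Adam as the optimizer, with an exponential-decay schedule driving its learning rate). The hyperparameter values are illustrative only and are not taken from deepmd-kit's Trainer.py:

```python
# Sketch: Adam optimizer whose learning rate follows an exponential-decay schedule.
# The values below are illustrative, not deepmd-kit's defaults.
import tensorflow as tf

lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3,  # starting learning rate (illustrative)
    decay_steps=5000,            # apply decay every 5000 training steps
    decay_rate=0.95,             # multiply the learning rate by 0.95 per interval
    staircase=True,              # decay in discrete steps rather than continuously
)

optimizer = tf.keras.optimizers.Adam(learning_rate=lr_schedule)
```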
Answer selected by njzjz
-
Oh, I see. Thank you!