How to create an AdamW optimizer with Amsgrad #15356
SalamanderXing asked this question in Q&A
Hi, I'm reproducing a deep learning paper, porting it from PyTorch to JAX. The paper trains with AdamW using the AMSGrad variant.
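For reference, in PyTorch this is just the `amsgrad` flag on `torch.optim.AdamW`; a minimal sketch of what such a call looks like (the model and hyperparameter values below are placeholders, not the paper's):

```python
import torch

# Sketch only: the model and hyperparameter values are placeholders,
# not taken from the paper.
model = torch.nn.Linear(4, 2)  # stand-in for the paper's network
optimizer = torch.optim.AdamW(
    model.parameters(),
    lr=1e-3,
    weight_decay=1e-2,
    amsgrad=True,  # enables the AMSGrad variant of the update rule
)
```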
I found both AdamW and AMSGrad in Optax, but not AMSGrad combined with AdamW-style decoupled weight decay. How can I construct that?
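From reading the Optax docs, it looks like the pieces can be chained the same way `optax.adamw` is built internally; here is a sketch, assuming the installed Optax version provides `optax.scale_by_amsgrad`:

```python
import optax

def adamw_amsgrad(learning_rate: float,
                  b1: float = 0.9,
                  b2: float = 0.999,
                  eps: float = 1e-8,
                  weight_decay: float = 1e-2) -> optax.GradientTransformation:
    """AMSGrad update rule combined with AdamW-style decoupled weight decay.

    Sketch composed from Optax building blocks, mirroring how `optax.adamw`
    chains `scale_by_adam` with `add_decayed_weights`.
    """
    return optax.chain(
        optax.scale_by_amsgrad(b1=b1, b2=b2, eps=eps),  # AMSGrad moment updates
        optax.add_decayed_weights(weight_decay),        # decoupled weight decay
        optax.scale_by_learning_rate(learning_rate),    # final -lr scaling
    )

optimizer = adamw_amsgrad(learning_rate=1e-3)
```

The ordering should matter here: the decay term is added to the AMSGrad-scaled update before the learning-rate scaling, which is what makes the weight decay decoupled in the AdamW sense.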