Personal Optimizer using SimpleOptimizer Problem #3443
Replies: 2 comments
-
Hey 👋 Burn does not officially support nesting autodiff backends to compute higher-order derivatives. While this may compile, I think the issue you are seeing is directly related to shared global state in the autodiff engine, so calling backward within the nested backend conflicts with that state. Also, see the related outstanding issue #1942.
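For reference, a minimal sketch of the kind of nesting this refers to, assuming the autodiff and ndarray features so that Autodiff and NdArray are available from burn::backend (the aliases and the tiny main are purely illustrative, not a supported pattern):

```rust
use burn::backend::{Autodiff, NdArray};
use burn::tensor::Tensor;

// Inner autodiff backend, so an optimizer step could differentiate again.
type Inner = Autodiff<NdArray>;
// Outer autodiff backend wrapped around it for the regular training loop.
// This type-level nesting may compile, but higher-order derivatives through
// it are not officially supported, as noted above.
type Nested = Autodiff<Inner>;

fn main() {
    // Nothing is differentiated here; the aliases only illustrate what
    // "nesting autodiff backends" means at the type level.
    let device = Default::default();
    let _x = Tensor::<Nested, 1>::from_floats([1.0_f32, 2.0], &device);
}
```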
-
Hey, thanks for the quick answer.
-
Hello,
I am currently implementing a new optimizer for my project, and this optimizer uses the Hessian to compute the next state. Therefore I need to call .backward and .grad on the gradient of my loss, which implies that the backend for my SimpleOptimizer needs to be an AutodiffBackend.
This is where my problem occurs.
To provide context, I used the exact same format as the Adam optimizer.
And with my new optimizer, the config needs an InnerBackend that is itself an AutodiffBackend, since the SimpleOptimizer backend is already an autodiff backend.
So I nested two autodiff backends, and with that nothing works: my program just stalls forever!
Is there any other way to use .backward and .grad inside the SimpleOptimizer, or maybe a way to avoid having the InnerBackend be an autodiff backend? A rough sketch of what I mean is below.
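For concreteness, here is a rough sketch of the second-order pattern I mean (not my real optimizer; the function name and the toy loss are only illustrations):

```rust
use burn::tensor::backend::AutodiffBackend;
use burn::tensor::Tensor;

// Rough sketch only: the gradient returned by the first backward pass lives
// on B::InnerBackend, so calling .backward() on a function of that gradient
// again forces B::InnerBackend to also be an AutodiffBackend, i.e. two
// nested autodiff backends.
fn second_order_step<B>(param: Tensor<B, 1>) -> Tensor<B::InnerBackend, 1>
where
    B: AutodiffBackend,
    B::InnerBackend: AutodiffBackend,
{
    let param = param.require_grad();

    // First-order pass: toy loss = sum(param * param).
    let loss = (param.clone() * param.clone()).sum();
    let grads = loss.backward();
    let grad: Tensor<B::InnerBackend, 1> =
        param.grad(&grads).expect("param should have a gradient");

    // Second-order pass: differentiate a function of the gradient itself.
    // This is the call that needs the inner backend to be autodiff-capable,
    // and it is where my program stalls with the nested setup.
    let grad_norm = (grad.clone() * grad.clone()).sum();
    let _second_grads = grad_norm.backward();

    grad
}
```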
Thanks for the help