Since Flux changed its learning-rate scheduler interface, the line

```julia
opt = Flux.Optimiser(ADAM(0.05), ExpDecay(1.0, 0.985, 20))
```

at https://github.com/viniviena/ude_chromatography/blob/f12e20d349957e579074bad35de5edee9c187fd6/UDE_paper_chromatography/PDE_gradients_lux_ude.jl#L314C1-L314C59 fails. I have tried

```julia
using OptimizationOptimisers
opt = Adam(0.05, (1.0, 0.985))
```

which errors as:
```
ERROR: Function argument passed to autodiff cannot be proven readonly.
If the the function argument cannot contain derivative data, instead call autodiff(Mode, Const(f), ...)
See https://enzyme.mit.edu/index.fcgi/julia/stable/faq/#Activity-of-temporary-storage for more information.
```
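As a note on the attempted fix: in Optimisers.jl, `Adam`'s second positional argument is the momentum pair `β = (β1, β2)`, not a decay schedule, so `Adam(0.05, (1.0, 0.985))` changes the optimiser's momenta rather than scheduling the learning rate (and `β1 = 1.0` is itself problematic). A minimal sketch of reproducing the old `ExpDecay(1.0, 0.985, 20)` behaviour with the current Optimisers.jl API, using a toy quadratic with an analytic gradient in place of the real training loop, might look like this:

```julia
using Optimisers  # Flux's current optimiser backend

# Toy parameters and the analytic gradient of f(x) = sum(x .^ 2),
# standing in for the real model and AD backend.
params = [1.0, 2.0]
opt_state = Optimisers.setup(Optimisers.Adam(0.05), params)

for step in 1:100
    grads = 2 .* params  # ∇f(x) = 2x
    opt_state, params = Optimisers.update(opt_state, params, grads)
    # Mimic ExpDecay(1.0, 0.985, 20): multiply the base learning rate
    # by 0.985 every 20 steps.
    if step % 20 == 0
        Optimisers.adjust!(opt_state, 0.05 * 0.985^(step ÷ 20))
    end
end
```

This is only a sketch under the assumption that the old chained `ExpDecay` amounted to scaling Adam's step size; `Optimisers.adjust!` is the documented way to change the learning rate of an existing optimiser state, and ParameterSchedulers.jl offers ready-made schedules if manual bookkeeping is undesirable.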