I did an initial round of cleanup in #882, but there's a lot of unwanted code that should be purged, and most of the handling should be forwarded to Lux.
- GPU Support: currently there is an `adapt` call that copies anything not on the GPU over to the GPU on every call to the function. IMO this should be completely removed; if a user calls a model that is only partially on the GPU, it should be an error (similar to Lux). We can preserve the current behavior for scalars.
- `Phi`/`ODEPhi` need to be rewritten as a Lux layer; that un-blocks all the current shortcomings with nested AD.
- Annotate the closures with `@closure` to avoid boxing.
- Remove the logging subpackage with an extension refactor: remove NeuralPDELogging in favor of extension #901
- Move the Bayesian ones into an extension (or a subpackage `BayesianNeuralPDE`)? I am pretty sure the number of users for those is quite small, but those packages add a significant load time.
- Reproducibility: all models should take in an `rng` instead of relying on the global RNG.
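On the `@closure` point: the macro (from FastClosures.jl) essentially rewrites a closure so captured variables are `let`-bound rather than boxed. A minimal sketch of the underlying problem and the hand-written equivalent of the fix, using only Base Julia (the function names here are illustrative, not from NeuralPDE):

```julia
# Julia boxes (Core.Box) any variable that is both captured by a closure
# and reassigned in the enclosing scope, which costs performance.
function boxed_version(xs)
    s = 0.0
    for x in xs
        s += x          # `s` is reassigned here...
    end
    f = () -> s         # ...so this capture of `s` is boxed
    return f()
end

# The fix that @closure automates: capture through a fresh `let` binding,
# which is never reassigned, so no box is allocated.
function unboxed_version(xs)
    s = 0.0
    for x in xs
        s += x
    end
    f = let s = s       # new binding shadows the mutable one
        () -> s
    end
    return f()
end
```

Both functions return the same result; the difference is only in how the capture is lowered.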
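The reproducibility bullet can be sketched as follows. This is a hypothetical toy layer, not the NeuralPDE API: the point is just that initialization takes an explicit `AbstractRNG` argument, so seeding the `rng` makes the model deterministic.

```julia
using Random

# Toy dense layer whose weights are drawn from a caller-supplied RNG,
# rather than the implicit global RNG.
struct TinyDense{W,B}
    weight::W
    bias::B
end

function TinyDense(rng::AbstractRNG, in_dim::Integer, out_dim::Integer)
    # All randomness flows through `rng`; same seed ⇒ same parameters.
    return TinyDense(randn(rng, out_dim, in_dim), zeros(out_dim))
end

(l::TinyDense)(x) = l.weight * x .+ l.bias

layer = TinyDense(MersenneTwister(42), 3, 2)  # reproducible construction
```

This mirrors the Lux convention, where `rng` is the first argument to parameter initialization.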
P.S. Just because I am opening this issue doesn't mean I am taking it upon myself to do all of this 😓