Integrate DifferentiationInterface.jl for gradient computation #397
Conversation
JuliaBUGS.jl documentation for PR #397 is available at:
Pull Request Test Coverage Report for Build 18304377034 (Coveralls).
Codecov Report:

```
@@            Coverage Diff             @@
##             main     #397      +/-   ##
==========================================
- Coverage   83.26%   82.87%   -0.40%
==========================================
  Files          31       31
  Lines        3759     3801      +42
==========================================
+ Hits         3130     3150      +20
- Misses        629      651      +22
```
Force-pushed from 511c6a6 to f0135a8.
One high-level suggestion is to review DynamicPPL.LogDensityFunction and explore whether it can be shared with JuliaBUGS models. This could help reduce code redundancy. cc @penelopeysm
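For context, a shared wrapper would mostly be generic over the LogDensityProblems.jl interface. A minimal sketch, assuming a hypothetical `SharedGradientModel` type (not code from either package) and the documented DifferentiationInterface API:

```julia
using LogDensityProblems
import DifferentiationInterface as DI

struct SharedGradientModel{M,B,P}
    model::M    # underlying model (BUGS or DynamicPPL); must support `logdensity`
    adtype::B   # ADTypes.jl backend, e.g. AutoReverseDiff()
    prep::P     # DifferentiationInterface preparation object, built once
end

function SharedGradientModel(model, adtype)
    f = Base.Fix1(LogDensityProblems.logdensity, model)
    x = zeros(LogDensityProblems.dimension(model))
    return SharedGradientModel(model, adtype, DI.prepare_gradient(f, adtype, x))
end

LogDensityProblems.dimension(g::SharedGradientModel) =
    LogDensityProblems.dimension(g.model)

function LogDensityProblems.logdensity_and_gradient(g::SharedGradientModel, x)
    f = Base.Fix1(LogDensityProblems.logdensity, g.model)
    return DI.value_and_gradient(f, g.prep, g.adtype, x)
end
```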
name = "JuliaBUGS" | ||
uuid = "ba9fb4c0-828e-4473-b6a1-cd2560fee5bf" | ||
version = "0.10.3" | ||
version = "0.10.4" |
Is it really non-breaking to make such a big change?
The `adtype` parameter is optional (it defaults to `nothing`), so all existing code works unchanged; this is backward compatible. However, if you prefer `0.11.0`, I can change it.
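The two call patterns in question, with `model_def` and `data` as placeholders (`AutoReverseDiff` comes from ADTypes.jl):

```julia
using JuliaBUGS
using ADTypes: AutoReverseDiff

model = compile(model_def, data)                               # adtype defaults to nothing
ad_model = compile(model_def, data; adtype=AutoReverseDiff())  # opts in to gradients
```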
I like this a lot, thanks! (The code really looks fantastic and comprehensive.) I had a similar PR locally, but didn't push it through because I felt we should first test against the AD backends, so that when we say we support a backend, we actually do. Another thing to consider is whether we want to drop support for LogDensityProblemsAD (I think the current PR does, so as it stands, it should be breaking). @shravanngoswamii what do you think?
Let's drop it unless there is a good reason.
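For downstream users, the migration would look roughly like this; the "before" line is typical LogDensityProblemsAD usage, not code from this repo:

```julia
# Before (LogDensityProblemsAD, to be dropped):
# ad_model = LogDensityProblemsAD.ADgradient(:ReverseDiff, compile(model_def, data))

# After (DifferentiationInterface via the new adtype keyword):
ad_model = compile(model_def, data; adtype=AutoReverseDiff())
```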
```julia
# Test that the ReverseDiff backend works
ad_model_compiled = compile(model_def, data; adtype=AutoReverseDiff(; compile=true))
ad_model_nocompile = compile(model_def, data; adtype=AutoReverseDiff(; compile=false))
```
[JuliaFormatter] reported by reviewdog 🐶

```diff
-ad_model_nocompile = compile(model_def, data; adtype=AutoReverseDiff(; compile=false))
+ad_model_nocompile = compile(
+    model_def, data; adtype=AutoReverseDiff(; compile=false)
+)
```
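A possible follow-up assertion for these tests, assuming both models implement the standard `LogDensityProblems.logdensity_and_gradient` method (the evaluation point is illustrative):

```julia
using LogDensityProblems, Test

x = randn(LogDensityProblems.dimension(ad_model_compiled))
lp_c, grad_c = LogDensityProblems.logdensity_and_gradient(ad_model_compiled, x)
lp_n, grad_n = LogDensityProblems.logdensity_and_gradient(ad_model_nocompile, x)
@test lp_c ≈ lp_n      # compiled and uncompiled tapes agree on the value
@test grad_c ≈ grad_n  # and on the gradient
```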
Benchmark results on macOS (aarch64): Stan vs. JuliaBUGS with the Mooncake backend [result tables omitted].
Benchmark results on Ubuntu (x64): Stan vs. JuliaBUGS with the Mooncake backend [result tables omitted].
Replace `LogDensityProblemsAD` with `DifferentiationInterface.jl` for automatic differentiation.

Changes:
- Add an `adtype` parameter to `compile()` for specifying AD backends: `:ReverseDiff`, `:ForwardDiff`, `:Zygote`, `:Enzyme`
- Add a `BUGSModelWithGradient` wrapper with a `logdensity_and_gradient` method
- Existing code without `adtype` continues to work

Usage: see the sketch below.
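A self-contained sketch, assuming the `adtype` keyword and the gradient wrapper behave as described above (the toy model is illustrative):

```julia
using JuliaBUGS
using ADTypes: AutoReverseDiff
using LogDensityProblems

model_def = @bugs begin
    mu ~ dnorm(0, 1)   # prior: mean 0, precision 1
    y ~ dnorm(mu, 1)   # likelihood
end
data = (; y = 1.5)

# Compiling with an adtype returns the gradient-capable model
ad_model = compile(model_def, data; adtype=AutoReverseDiff())

x = randn(LogDensityProblems.dimension(ad_model))  # point in unconstrained space
lp, grad = LogDensityProblems.logdensity_and_gradient(ad_model, x)
```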
Closes #380