
Commit 8eb5aae

Dust off: CI, docs, remove SimpleUnpack. (#111)

* update CI and bounds, remove SimpleUnpack from dependencies
* dust off docs
* dust off CI actions

Parent: d432cc3

File tree

4 files changed (+23 −17 lines)


.github/workflows/CI.yml

Lines changed: 7 additions & 5 deletions
```diff
@@ -17,8 +17,8 @@ jobs:
       fail-fast: false
       matrix:
         version:
-          - '1.6' # Replace this with the minimum Julia version that your package supports. E.g. if your package requires Julia 1.5 or higher, change this to '1.5'.
-          - '1' # Leave this line unchanged. '1' will automatically expand to the latest stable 1.x release of Julia.
+          - 'min'
+          - '1' # will automatically expand to the latest stable 1.x release of Julia.
           # - 'nightly' # NOTE: nightly disabled as it currently fails
         os:
           - ubuntu-latest
@@ -30,7 +30,7 @@ jobs:
         with:
           version: ${{ matrix.version }}
           arch: ${{ matrix.arch }}
-      - uses: actions/cache@v1
+      - uses: actions/cache@v4
         env:
           cache-name: cache-artifacts
         with:
@@ -43,7 +43,9 @@ jobs:
       - uses: julia-actions/julia-buildpkg@v1
       - uses: julia-actions/julia-runtest@v1
       - uses: julia-actions/julia-processcoverage@v1
-      - uses: codecov/codecov-action@v4
+      - uses: codecov/codecov-action@v5
         with:
+          file: lcov.info
+          # NOTE: you need to add the token to Github secrets, see
+          # https://docs.codecov.com/docs/adding-the-codecov-token
           token: ${{ secrets.CODECOV_TOKEN }}
-          fail_ci_if_error: false # or true if you want CI to fail when Codecov fails
```
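The `'min'` entry is resolved by `julia-actions/setup-julia` to the lowest Julia version compatible with the package's `[compat]` bound (1.10 after this commit), so the minimum tested version no longer has to be edited by hand. A sketch of the resulting matrix section — indentation and the exact surrounding layout are reconstructed, not taken verbatim from the repository:

```yaml
strategy:
  fail-fast: false
  matrix:
    version:
      - 'min'   # lowest Julia version allowed by [compat], i.e. 1.10 here
      - '1'     # latest stable 1.x release
    os:
      - ubuntu-latest
```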

Project.toml

Lines changed: 1 addition & 1 deletion
```diff
@@ -11,7 +11,7 @@ Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
 [compat]
 ArgCheck = "1, 2"
 DocStringExtensions = "0.8, 0.9"
-julia = "1.6"
+julia = "1.10"
 
 [extras]
 Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
```

docs/Project.toml

Lines changed: 1 addition & 1 deletion
```diff
@@ -2,8 +2,8 @@
 BenchmarkTools = "6e4b80f9-dd63-53aa-95a3-0cdb28fa8baf"
 Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
 ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
+LogDensityProblems = "6fdf6af0-433a-55f7-b3ed-c6c6e0b8df7c"
 LogDensityProblemsAD = "996a588d-648d-4e1f-a8f0-a84b347e47b1"
-SimpleUnPack = "ce78b400-467f-4804-87d8-8f486da07d0a"
 Tracker = "9f7883ad-71c0-57eb-9f7f-b5c9e6d3789c"
 TransformVariables = "84d833dd-6860-57f9-a1a7-6da5db126cff"
 TransformedLogDensities = "f9bc47f6-f3f8-4f3b-ab21-f8bc73906f26"
```

docs/src/index.md

Lines changed: 14 additions & 10 deletions
````diff
@@ -46,9 +46,13 @@ which is added to the log likelihood to obtain the log posterior.
 
 It is useful to define a *callable* that implements this, taking some vector `x` as an input and calculating the summary statistics, then, when called with a `NamedTuple` containing the parameters, evaluating to the log posterior.
 
-```@example 1
+```@setup A
+using LogDensityProblems
+```
+
+```@example A
 using Random; Random.seed!(1) # hide
-using Statistics, SimpleUnPack # imported for our implementation
+using Statistics
 
 struct NormalPosterior{T} # contains the summary statistics
     N::Int
@@ -63,8 +67,8 @@ end
 
 # define a callable that unpacks parameters, and evaluates the log likelihood
 function (problem::NormalPosterior)(θ)
-    @unpack μ, σ = θ
-    @unpack N, x̄, S = problem
+    (; μ, σ) = θ
+    (; N, x̄, S) = problem
     loglikelihood = -N * (log(σ) + (S + abs2(μ - x̄)) / (2 * abs2(σ)))
     logprior = - abs2(σ)/8 - abs2(μ)/50
     loglikelihood + logprior
@@ -76,7 +80,7 @@ nothing # hide
 
 Let's try out the posterior calculation:
 
-```@repl 1
+```@repl A
 problem((μ = 0.0, σ = 1.0))
 ```
 
@@ -90,13 +94,13 @@ In our example, we require ``\sigma > 0``, otherwise the problem is meaningless.
 !!! note
     Since version 1.0, TransformedLogDensity has been moved to the package TransformedLogDensities.
 
-```@repl 1
+```@repl A
 using LogDensityProblems, TransformVariables, TransformedLogDensities
 ℓ = TransformedLogDensity(as((μ = asℝ, σ = asℝ₊)), problem)
 ```
 
 Then we can query the dimension of this problem, and evaluate the log density:
-```@repl 1
+```@repl A
 LogDensityProblems.dimension(ℓ)
 LogDensityProblems.logdensity(ℓ, zeros(2))
 ```
@@ -108,7 +112,7 @@ LogDensityProblems.logdensity(ℓ, zeros(2))
 
 If you prefer to implement the transformation yourself, you just have to define the following three methods for your problem: declare that it can evaluate log densities (but not their gradient, hence the `0` order), allow the dimension of the problem to be queried, and then finally code the density calculation with the transformation. (Note that using `TransformedLogDensities.TransformedLogDensity` takes care of all of these for you, as shown above).
 
-```@example 1
+```@example A
 function LogDensityProblems.capabilities(::Type{<:NormalPosterior})
     LogDensityProblems.LogDensityOrder{0}()
 end
@@ -123,7 +127,7 @@ end
 nothing # hide
 ```
 
-```@repl 1
+```@repl A
 LogDensityProblems.logdensity(problem, zeros(2))
 ```
 
@@ -134,7 +138,7 @@ Here we use the exponential function to transform from ``\mathbb{R}`` to the positive reals.
 Using either definition, you can transform to another object which is capable of evaluating the *gradient*, using automatic differentiation. For this, you need the [LogDensityProblemsAD.jl](https://github.com/tpapp/LogDensityProblemsAD.jl) package.
 
 Now observe that we can obtain gradients, too:
-```@repl 1
+```@repl A
 import ForwardDiff
 using LogDensityProblemsAD
 ∇ℓ = ADgradient(:ForwardDiff, ℓ)
````
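For readers migrating their own code off `SimpleUnPack`: the replacement used in the docs above is Base Julia's property destructuring, `(; a, b) = x`, available since Julia 1.7 and therefore covered by the new `julia = "1.10"` lower bound. A minimal sketch — the `Stats` struct here is hypothetical, for illustration only:

```julia
# Property destructuring needs no package; it reads fields/properties by name.
struct Stats
    N::Int
    x̄::Float64
    S::Float64
end

s = Stats(10, 0.5, 2.0)
(; N, x̄, S) = s            # equivalent to N = s.N; x̄ = s.x̄; S = s.S

# Works the same way for NamedTuples:
(; μ, σ) = (μ = 0.0, σ = 1.0)
```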

0 comments