.github/workflows/CI.yml (+7 −5)
@@ -17,8 +17,8 @@ jobs:
       fail-fast: false
       matrix:
         version:
-          - '1.6' # Replace this with the minimum Julia version that your package supports. E.g. if your package requires Julia 1.5 or higher, change this to '1.5'.
-          - '1' # Leave this line unchanged. '1' will automatically expand to the latest stable 1.x release of Julia.
+          - 'min'
+          - '1' # will automatically expand to the latest stable 1.x release of Julia.
           # - 'nightly' # NOTE: nightly disabled as it currently fails
         os:
           - ubuntu-latest
@@ -30,7 +30,7 @@ jobs:
         with:
           version: ${{ matrix.version }}
           arch: ${{ matrix.arch }}
-      - uses: actions/cache@v1
+      - uses: actions/cache@v4
         env:
           cache-name: cache-artifacts
         with:
@@ -43,7 +43,9 @@ jobs:
       - uses: julia-actions/julia-buildpkg@v1
       - uses: julia-actions/julia-runtest@v1
       - uses: julia-actions/julia-processcoverage@v1
-      - uses: codecov/codecov-action@v4
+      - uses: codecov/codecov-action@v5
         with:
+          file: lcov.info
+          # NOTE: you need to add the token to Github secrets, see
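Assembled, the affected parts of the workflow would look roughly like the fragment below. This is a sketch reconstructed from the hunks above; the step layout, the meaning of `'min'` (resolved by `julia-actions/setup-julia` from the `julia` compat entry in Project.toml), and the `token:` line (suggested by the truncated NOTE about GitHub secrets) are assumptions, not part of the diff itself.

```yaml
# Sketch of the relevant CI.yml fragments after this change (assumed context).
strategy:
  fail-fast: false
  matrix:
    version:
      - 'min'  # minimum supported Julia version, taken from Project.toml compat
      - '1'    # latest stable 1.x release
    os:
      - ubuntu-latest
steps:
  - uses: actions/cache@v4
    env:
      cache-name: cache-artifacts
  - uses: julia-actions/julia-buildpkg@v1
  - uses: julia-actions/julia-runtest@v1
  - uses: julia-actions/julia-processcoverage@v1
  - uses: codecov/codecov-action@v5
    with:
      file: lcov.info
      token: ${{ secrets.CODECOV_TOKEN }}  # assumed: token stored in repo secrets
```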
docs/src/index.md (+14 −10)
@@ -46,9 +46,13 @@ which is added to the log likelihood to obtain the log posterior.
 
 It is useful to define a *callable* that implements this, taking some vector `x` as an input and calculating the summary statistics, then, when called with a `NamedTuple` containing the parameters, evaluating to the log posterior.
 
-```@example 1
+```@setup A
+using LogDensityProblems
+```
+
+```@example A
 using Random; Random.seed!(1) # hide
-using Statistics, SimpleUnPack # imported for our implementation
+using Statistics
 
 struct NormalPosterior{T} # contains the summary statistics
     N::Int
@@ -63,8 +67,8 @@ end
 
 # define a callable that unpacks parameters, and evaluates the log likelihood

@@ -109,6 +113,6 @@
 If you prefer to implement the transformation yourself, you just have to define the following three methods for your problem: declare that it can evaluate log densities (but not their gradient, hence the `0` order), allow the dimension of the problem to be queried, and then finally code the density calculation with the transformation. (Note that using `TransformedLogDensities.TransformedLogDensity` takes care of all of these for you, as shown above).
 
-```@example 1
+```@example A
 function LogDensityProblems.capabilities(::Type{<:NormalPosterior})
     LogDensityProblems.LogDensityOrder{0}()
 end
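Put together, the three methods described above might look like this for the `NormalPosterior` example. This is only a sketch: the `dimension` and `logdensity` bodies are not visible in the hunks shown here, so the parametrization (a 2-dimensional `x` with `σ = exp(x[2])` and the corresponding log Jacobian term) is an assumption based on the surrounding text about the exponential transformation.

```julia
using LogDensityProblems

# 1. declare that log densities (but not gradients) can be evaluated
function LogDensityProblems.capabilities(::Type{<:NormalPosterior})
    LogDensityProblems.LogDensityOrder{0}()
end

# 2. allow the dimension of the problem to be queried (here μ and σ, so 2)
LogDensityProblems.dimension(::NormalPosterior) = 2

# 3. evaluate the log density, mapping x[2] ∈ ℝ to σ > 0 via exp
function LogDensityProblems.logdensity(problem::NormalPosterior, x)
    μ, σ = x[1], exp(x[2])
    # callable defined earlier evaluates the log posterior; x[2] is the
    # log Jacobian of the exponential transformation (assumed parametrization)
    problem((μ = μ, σ = σ)) + x[2]
end
```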
@@ -123,7 +127,7 @@ end
 nothing # hide
 ```
 
-```@repl 1
+```@repl A
 LogDensityProblems.logdensity(problem, zeros(2))
 ```
 
@@ -134,7 +138,7 @@ Here we use the exponential function to transform from ``\mathbb{R}`` to the pos
 Using either definition, you can transform to another object which is capable of evaluating the *gradient*, using automatic differentiation. For this, you need the [LogDensityProblemsAD.jl](https://github.com/tpapp/LogDensityProblemsAD.jl) package.
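As a sketch of that last step, assuming ForwardDiff as the AD backend and a problem `p` (such as the `NormalPosterior` above) that already supports order-0 evaluation:

```julia
using LogDensityProblems, LogDensityProblemsAD, ForwardDiff

# wrap the order-0 problem so the gradient can also be evaluated
∇p = ADgradient(:ForwardDiff, p)

# query the log density and its gradient together
LogDensityProblems.logdensity_and_gradient(∇p, zeros(LogDensityProblems.dimension(p)))
```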