* Implement a data update likelihood callback
* Make the likelihood accumulator optional
* Make some DataUpdateCallback args into kwargs
* WIP
* Fix the partial observability implementation
* Better type signatures
* Misc
* Add Fenrir to ProbNumDiffEq.jl
* Add proper data likelihood tests
* Split the `data_likelihoods.jl` file into separate files per method
* Fix a bug
* Remove the smooth=false suggestion if dense=false
* Fix another broken test (again?)
* Revert the ManifoldUpdate renaming
* Actually remove the filtering likelihood from the dalton file
* Try out DocStringExtensions.jl
* Add compat entry to DocStringExtensions
* Create a DataLikelihoods submodule
* Remove Fenrir from the docs
* Write docstrings for the fenrir and dalton likelihood and doc them
* Make marginalize a bit more flexible
* Update the Probabilistic Exponential Integrator citation
* Remove the underscores from the likelihoods again
* Add a parameter inference tutorial again
* Update the doc index
* Make the parameter inference example even nicer
* Improve tests
* Make Fenrir compatible with matrix-valued observation noise
* Test matrix-valued observation noise
* JuliaFormatter.jl
* Add support for PSDMatrices as observation noise
* Fix a MarkovKernel docstring
* Implement and test update equations with non-zero observation noise
* Shorten and streamline the `update!` functionality a bit
* Simplify Fenrir quite a bit
* Misc updates to the data update callback
* Make the Fenrir code yet a bit more compact
* JuliaFormatter.jl
* Change the parameter inference example to partial observations
* Add DiffEqCallbacks compat entry
* Make the parameter inference doc code a bit nicer
* Add docstrings for DataUpdateLogLikelihood and DataUpdateCallback
* Polish the docs
* Make sure that Fenrir ll is not inf, even if it would technically be
* JuliaFormatter.jl
* Faster DataUpdateCallback
* Slight fenrir speed improvement
* Remove some changes that I somehow introduced earlier
* Add the data likelihood tests to runtests.jl
* Fix the failing data likelihood tests
* JuliaFormatter.jl
* Test for more data likelihood observation noise types
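Taken together, these commits fold Fenrir-style data likelihoods into ProbNumDiffEq.jl itself, behind a new `DataLikelihoods` submodule. A rough sketch of how the resulting API might be called — the function name `fenrir_data_loglik`, the `data` layout, and the keyword names are inferred from the commit messages above, not checked against the released documentation:

```julia
using LinearAlgebra
using ProbNumDiffEq
using ProbNumDiffEq.DataLikelihoods  # submodule introduced in this PR

# A toy scalar ODE with one unknown decay-rate parameter
f(du, u, p, t) = (du[1] = -p[1] * u[1])
prob = ODEProblem(f, [1.0], (0.0, 2.0), [0.5])

# Hypothetical noisy observations of the solution
times = 0.2:0.2:2.0
data = (t=collect(times), u=[[exp(-0.5t)] .+ 0.01 .* randn(1) for t in times])

# Marginal log-likelihood of the data under the probabilistic solve;
# matrix-valued observation noise is supported per the commits above.
loglik = fenrir_data_loglik(
    prob, EK1(smooth=true);
    data, observation_noise_cov=1e-4 * I(1),
)
```

Going by the "dalton" commits, a `dalton_data_loglik` variant would presumably be called analogously.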
**`docs/src/index.md`** (2 additions, 2 deletions)

````diff
@@ -14,7 +14,7 @@ Run Julia, enter `]` to bring up Julia's package manager, and add the ProbNumDiffEq.jl package:
 
 ```
 julia> ]
-(v1.9) pkg> add ProbNumDiffEq
+(v1.10) pkg> add ProbNumDiffEq
 ```
 
 ## Getting Started
````
````diff
@@ -35,10 +35,10 @@ For a quick introduction check out the "[Solving ODEs with Probabilistic Numerics](@ref)" tutorial.
 - Arbitrary precision via Julia's built-in [arbitrary precision arithmetic](https://docs.julialang.org/en/v1/manual/integers-and-floating-point-numbers/#Arbitrary-Precision-Arithmetic)
 - Specialized solvers for second-order ODEs (see [Second Order ODEs and Energy Preservation](@ref))
 - Compatible with DAEs in mass-matrix ODE form (see [Solving DAEs with Probabilistic Numerics](@ref))
+- Data likelihoods for parameter-inference in ODEs (see [Parameter Inference with ProbNumDiffEq.jl](@ref))
 
 ## Related packages
 - [probdiffeq](https://pnkraemer.github.io/probdiffeq/): Fast and feature-rich filtering-based probabilistic ODE solvers in JAX.
 - [ProbNum](https://probnum.readthedocs.io/en/latest/): Probabilistic numerics in Python. It has not only probabilistic ODE solvers, but also probabilistic linear solvers, Bayesian quadrature, and many filtering and smoothing implementations.
-- [Fenrir.jl](https://github.com/nathanaelbosch/Fenrir.jl): Parameter-inference in ODEs with probabilistic ODE solvers. This package builds on ProbNumDiffEq.jl to provide a negative marginal log-likelihood function, which can then be used with an optimizer or with MCMC for parameter inference.
````
**Parameter inference tutorial**

````diff
-# Parameter Inference with ProbNumDiffEq.jl and Fenrir.jl
+# Parameter Inference with ProbNumDiffEq.jl
 
-!!! note
-    This is mostly just a copy from [the tutorial included in the Fenrir.jl documentation](https://nathanaelbosch.github.io/Fenrir.jl/stable/gettingstarted/), so have a look there too!
-
-```@example fenrir
-using LinearAlgebra
-using OrdinaryDiffEq, ProbNumDiffEq, Plots
-using Fenrir
-using Optimization, OptimizationOptimJL
-stack(x) = copy(reduce(hcat, x)') # convenient
-nothing # hide
-```
-
 ## The parameter inference problem in general
 Let's assume we have an initial value problem (IVP)
````
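The IVP definition itself falls outside this excerpt. Judging by the three-element `p_est = (0.1, 0.1, 2.0)` used further down, and by the Fenrir.jl tutorial this page was adapted from, a FitzHugh-Nagumo setup along the following lines seems likely — the model choice, parameter values, and variable names here are all assumptions, not taken from the actual tutorial source:

```julia
using ProbNumDiffEq, Plots

# FitzHugh-Nagumo model with three parameters (a, b, c) -- assumed form
function fitzhughnagumo(du, u, p, t)
    a, b, c = p
    du[1] = c * (u[1] - u[1]^3 / 3 + u[2])
    du[2] = -(1 / c) * (u[1] - a - b * u[2])
end

u0 = [-1.0, 1.0]
tspan = (0.0, 20.0)
p_true = (0.2, 0.2, 3.0)  # illustrative "true" parameters
true_prob = ODEProblem(fitzhughnagumo, u0, tspan, p_true)
true_sol = solve(true_prob, EK1())  # reference solve to generate noisy data from
plot(true_sol)
```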
````diff
 Our goal is then to recover the true parameter `p` (and thus also the true trajectory plotted above) from the noisy data.
 
 ## Computing the negative log-likelihood
-To do parameter inference - be it maximum-likelihod, maximum a posteriori, or full Bayesian inference with MCMC - we need to evaluate the likelihood of given a parameter estimate ``\theta_\text{est}``.
-This is exactly what Fenrir.jl's [`fenrir_nll`](https://nathanaelbosch.github.io/Fenrir.jl/stable/#Fenrir.fenrir_nll) provides:
-```@example fenrir
-p_est = (0.1, 0.1, 2.0)
-prob = remake(true_prob, p=p_est)
+To do parameter inference - be it maximum-likelihood, maximum a posteriori, or full Bayesian inference with MCMC - we need to evaluate the likelihood of a given parameter estimate ``\theta_\text{est}``, which corresponds to the probability of the data under the trajectory returned by the ODE solver.
````

````diff
-This is the negative marginal log-likelihood of the parameter `p_est`.
+This is the negative marginal log-likelihood of the parameter `θ_est`.
 You can use it as any other NLL: Optimize it to compute maximum-likelihood estimates or MAPs, or plug it into MCMC to sample from the posterior.
 In our paper [tronarp22fenrir](@cite) we compute MLEs by pairing Fenrir with [Optimization.jl](http://optimization.sciml.ai/stable/) and [ForwardDiff.jl](https://juliadiff.org/ForwardDiff.jl/stable/).
 Let's quickly explore how to do this next.
````
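Schematically, the negative marginal log-likelihood discussed here has the usual Gauss-Markov form (a sketch in Kalman-filter notation; the symbols below are assumptions for illustration, not the tutorial's own notation):

```math
-\log p(\mathcal{D} \mid \theta)
= -\sum_{n=1}^{N} \log \mathcal{N}\!\left( y_n ;\; H \mu_n(\theta),\; H \Sigma_n(\theta) H^\top + R \right),
```

where ``y_n`` are the noisy observations, ``H`` is the observation matrix, ``\mu_n, \Sigma_n`` are the posterior moments of the probabilistic ODE solution at the data locations, and ``R`` is the observation noise covariance.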
````diff
@@ -80,23 +84,29 @@ Let's quickly explore how to do this next.
 To compute a maximum-likelihood estimate (MLE), we just need to maximize ``\theta \to p(\mathcal{D} \mid \theta)`` - that is, minimize the `nll` from above.
 We use [Optimization.jl](https://docs.sciml.ai/Optimization/stable/) for this.
 First, define a loss function and create an `OptimizationProblem`:
-```@example fenrir
+```@example parameterinference
+using Optimization, OptimizationOptimJL
+
 function loss(x, _)
     ode_params = x[begin:end-1]
     prob = remake(true_prob, p=ode_params)
-    κ² = exp(x[end]) # the diffusion parameter of the EK1
````
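The diff cuts off inside `loss`. A hedged completion of the loss and the optimizer call, assuming the in-package `fenrir_data_loglik` and the `FixedDiffusion` keyword names (all inferred from the commit list, not from the released docs; `data` and `observation_noise_var` would be defined earlier in the tutorial):

```julia
using Optimization, OptimizationOptimJL
using ProbNumDiffEq, ProbNumDiffEq.DataLikelihoods  # assumed submodule from this PR

function loss(x, _)
    ode_params = x[begin:end-1]
    prob = remake(true_prob, p=ode_params)
    κ² = exp(x[end])  # EK1 diffusion, optimized in log-space for positivity
    return -fenrir_data_loglik(
        prob,
        EK1(smooth=true, diffusionmodel=FixedDiffusion(κ², false));
        data, observation_noise_cov=observation_noise_var,
    )
end

x0 = [0.1, 0.1, 2.0, log(1.0)]  # the `p_est` guess from above, plus log-diffusion
fun = OptimizationFunction(loss, Optimization.AutoForwardDiff())
optprob = OptimizationProblem(fun, x0)
optsol = solve(optprob, LBFGS())
```

Pairing `AutoForwardDiff()` with `LBFGS()` matches the ForwardDiff.jl + Optimization.jl combination the tutorial text mentions.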