For more examples of adjoint usage on models with large numbers of parameters, consult the [DiffEqFlux documentation](https://diffeqflux.sciml.ai/dev/).
## Inference of a Stochastic Differential Equation
A [Stochastic Differential Equation (SDE)](https://diffeq.sciml.ai/stable/tutorials/sde_example/) is a differential equation that has a stochastic (noise) term in the expression of the derivatives.
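In general form (a standard formulation, stated here for context), such an equation can be written as

$$
\mathrm{d}X_t = f(X_t, t)\,\mathrm{d}t + g(X_t, t)\,\mathrm{d}W_t,
$$

where $f$ is the deterministic drift, $g$ scales the noise, and $W_t$ denotes a Wiener process.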
Here we fit a stochastic version of the Lotka-Volterra system.
We use a quasi-likelihood approach in which all trajectories of a solution are compared, instead of a reduction such as the mean. This increases the robustness of the fit and makes the likelihood more identifiable.
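As a rough sketch of what setting up such a system might look like (the function names, noise magnitudes, initial condition, time span, and parameter values below are illustrative assumptions, not the tutorial's exact listing), a drift function and a diffusion function can be passed to an `SDEProblem`:

```julia
# Sketch only: a stochastic Lotka-Volterra model with multiplicative noise.
using DifferentialEquations

function lotka_volterra!(du, u, p, t)
    α, β, γ, δ = p
    x, y = u
    du[1] = (α - β * y) * x  # prey
    du[2] = (δ * x - γ) * y  # predator
    return nothing
end

function multiplicative_noise!(du, u, p, t)
    x, y = u
    du[1] = 0.1 * x  # noise magnitudes are assumptions for illustration
    du[2] = 0.1 * y
    return nothing
end

u0 = [1.0, 1.0]
tspan = (0.0, 10.0)
p = [1.5, 1.0, 3.0, 1.0]
prob_sde = SDEProblem(lotka_volterra!, multiplicative_noise!, u0, tspan, p)
```

Inside the Turing model, the excerpt below handles the likelihood: it exits early when the solver does not return successfully and otherwise compares every saved point of the simulated trajectory with the corresponding column of the data.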
```julia
    # Early exit if simulation could not be computed successfully.
    if predicted.retcode !== :Success
        Turing.@addlogprob! -Inf
        return nothing
    end

    # Observations.
    for i in 1:length(predicted)
        data[:, i] ~ MvNormal(predicted[i], σ^2 * I)
    end

    return nothing
end;
```
The probabilistic nature of the SDE solution makes the likelihood function noisy, which poses a challenge for NUTS since the gradient changes with every evaluation.
Therefore, we use NUTS with a low target acceptance rate of `0.25` and specify a set of initial parameters, as sketched below.
SGHMC might be a more suitable algorithm here.
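A minimal sketch of what the sampling call could look like, assuming the model above is constructed as `fitlv_sde(data, prob_sde)` (the model name, the number of draws, and the starting values are assumptions for illustration; the keyword for initial values is `initial_params` in recent Turing.jl releases and `init_params` in older ones):

```julia
# Sketch only: names and values are assumptions, not the tutorial's exact settings.
using Turing

model_sde = fitlv_sde(data, prob_sde)

chain_sde = sample(
    model_sde,
    NUTS(0.25),                                # low target acceptance rate for the noisy likelihood
    5000;                                      # number of draws (illustrative)
    initial_params=[1.5, 1.3, 1.2, 2.7, 1.2],  # one starting value per sampled parameter (illustrative)
    progress=false,
)
```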