
Commit 59aafbc

TorkelE authored and isaacsas committed
Update docs/src/inverse_problems/optimization_ode_param_fitting.md
Co-authored-by: Sam Isaacson <[email protected]>
1 parent 952a380 commit 59aafbc

1 file changed: +1 −1 lines changed

docs/src/inverse_problems/optimization_ode_param_fitting.md

Lines changed: 1 addition & 1 deletion

@@ -120,7 +120,7 @@ optprob = OptimizationProblem(loss_function, [1.0, 1.0, 1.0]; lb = [0.1, 0.1, 0.
 nothing # hide
 ```
 
-In addition to boundaries, Optimization.jl also supports setting [linear and non-linear constraints](https://docs.sciml.ai/Optimization/stable/tutorials/constraints/#constraints) on its output solution.
+In addition to boundaries, Optimization.jl also supports setting [linear and non-linear constraints](https://docs.sciml.ai/Optimization/stable/tutorials/constraints/#constraints) on its output solution for some optimizers.
 
 ## Parameter fitting with known parameters
 If we from previous knowledge know that *kD = 0.1*, and only would like to fit the values of *kB* and *kP*, this can be achieved through `build_loss_objective`'s `prob_generator` argument. First, we create a function (`fixed_p_prob_generator`) that modifies our `ODEProblem` to incorporate this knowledge:
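
The sentence amended by this commit points readers to Optimization.jl's constraints interface. As a rough illustration of that interface (this sketch is not part of the Catalyst tutorial: the objective, constraint function, bounds, and starting point are invented for demonstration), constraints are supplied through an `OptimizationFunction`'s `cons` keyword and bounded via the problem's `lcons`/`ucons` keywords, and only constraint-capable optimizers, such as `IPNewton()` from OptimizationOptimJL, can make use of them:

```julia
using Optimization, OptimizationOptimJL

# Illustrative objective (not the tutorial's ODE loss function).
objective(x, p) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# In-place constraint function: `res` receives the constraint values g(x).
cons!(res, x, p) = (res .= [x[1]^2 + x[2]^2, x[1] * x[2]])

optf = OptimizationFunction(objective, Optimization.AutoForwardDiff(); cons = cons!)

# lcons/ucons bound each constraint: here x1^2 + x2^2 <= 0.8 and -1 <= x1*x2 <= 1.
optprob = OptimizationProblem(optf, [0.5, 0.5]; lcons = [-Inf, -1.0], ucons = [0.8, 1.0])
sol = solve(optprob, IPNewton())  # a constraint-capable optimizer
```

Optimizers without constraint support cannot use these bounds, which is the qualification ("for some optimizers") this commit adds to the docs.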
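
The "Parameter fitting with known parameters" section visible in the hunk's trailing context relies on `build_loss_objective`'s `prob_generator` keyword (from DiffEqParamEstim.jl). A minimal sketch of how such a generator can pin `kD` is shown below; it assumes the problem's parameters are ordered `[kB, kD, kP]` and reuses names from earlier in the tutorial (`oprob`, `data_ts`, `data_vals`) that do not appear in this diff:

```julia
using OrdinaryDiffEq, DiffEqParamEstim, Optimization

# Pin kD = 0.1; the optimizer now only supplies the two free parameters.
# Assumes the ODEProblem's parameter vector is ordered [kB, kD, kP].
fixed_p_prob_generator(prob, p) = remake(prob; p = [p[1], 0.1, p[2]])

# `oprob`, `data_ts`, and `data_vals` come from earlier tutorial steps (not shown here).
loss_function_fixed_kD = build_loss_objective(oprob, Tsit5(), L2Loss(data_ts, data_vals),
                                              Optimization.AutoForwardDiff();
                                              prob_generator = fixed_p_prob_generator)
```

The resulting objective is then optimized over a two-element parameter vector, e.g. `OptimizationProblem(loss_function_fixed_kD, [1.0, 1.0])`.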
