algebra instead of a `for` loop. This feature can accelerate the optimization, especially
for the constraint handling, and is not available in any other package, to my knowledge.

elements) and ``\mathbf{D̂_e}`` (`nd*(Hp+1)` elements) as arguments. If `LHS` represents
the left-hand side result in the inequality ``\mathbf{g_c}(\mathbf{U_e}, \mathbf{Ŷ_e},
\mathbf{D̂_e}, \mathbf{p}, ϵ) ≤ \mathbf{0}``, `gc` can be implemented in two ways:

1. **Non-mutating function** (out-of-place): define it as `gc(Ue, Ŷe, D̂e, p, ϵ) -> LHS`.
   This syntax is simple and intuitive, but it allocates more memory.
2. **Mutating function** (in-place): define it as `gc!(LHS, Ue, Ŷe, D̂e, p, ϵ) -> nothing`.
   This syntax reduces the allocations and potentially the computational burden as well
   (a sketch of both forms follows the list).
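
For illustration, here is a minimal sketch of both forms, assuming a hypothetical
constraint that keeps one element of `Ŷe` below a bound `p.y1max` (the bound, the
single-element `LHS` and the named-tuple parameters are assumptions for this sketch, not
defaults of the constructor):

```julia
# Hypothetical constraint: keep the first element of the extended output
# predictions Ŷe below p.y1max, relaxed by the slack variable ϵ.

# 1. Non-mutating form: allocates and returns the LHS vector.
gc(Ue, Ŷe, D̂e, p, ϵ) = [Ŷe[1] - p.y1max - ϵ]

# 2. Mutating form: writes the result in-place and returns nothing.
function gc!(LHS, Ue, Ŷe, D̂e, p, ϵ)
    LHS[1] = Ŷe[1] - p.y1max - ϵ
    return nothing
end
```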
The keyword argument `nc` is the number of elements in the `LHS` vector, and `gc!` is an
alias for the `gc` argument (both accept non-mutating and mutating functions).
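
As a usage sketch (the plant `model`, the horizons and the parameter tuple below are
placeholders, not recommended values), the mutating form defined above would be passed
through the `gc` keyword together with `nc` and `p`:

```julia
# `model` is assumed to be a NonLinModel constructed beforehand.
p = (y1max=10.0,)  # hypothetical parameters read by gc!
# nc=1 since the LHS vector of gc! holds a single element.
mpc = NonLinMPC(model; Hp=10, Hc=2, Cwt=1e5, gc=gc!, nc=1, p=p)
```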
The optimization relies on [`JuMP`](https://github.com/jump-dev/JuMP.jl) automatic
differentiation (AD) to compute the objective and constraint derivatives. Optimizers
generally benefit from exact derivatives like AD. However, the [`NonLinModel`](@ref)
state-space functions must be compatible with this feature. See [Automatic differentiation](https://jump.dev/JuMP.jl/stable/manual/nlp/#Automatic-differentiation)
for common mistakes when writing these functions.
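
As one concrete instance of the pitfalls listed there (the dynamics, names and signature
below are made up for illustration), the usual culprit is code restricted to `Float64`,
since AD tools evaluate the functions with other number types (e.g. dual numbers):

```julia
# Breaks AD: the annotation rejects the number types used to compute derivatives.
f_bad(x::Vector{Float64}, u, d, p) = [p.a*x[1] + p.b*u[1]]

# AD-friendly: leave the arguments generic so the same code runs with any number type.
f_good(x, u, d, p) = [p.a*x[1] + p.b*u[1]]
```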
Note that if `Cwt≠Inf`, the attribute `nlp_scaling_max_gradient` of `Ipopt` is set to
`10/Cwt` (if not already set), to scale the small values of ``ϵ``.
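
If that automatic scaling is not wanted, one possible workaround (a sketch, not part of
this docstring's examples) is to pre-set the attribute on a custom `JuMP` model and pass
it through the `optim` keyword, relying on the "if not already set" behaviour above:

```julia
using JuMP, Ipopt

# Pre-set the attribute so the constructor leaves it untouched.
optim = JuMP.Model(Ipopt.Optimizer)
set_attribute(optim, "nlp_scaling_max_gradient", 100.0)
mpc = NonLinMPC(model; Hp=10, Hc=2, Cwt=1e5, optim=optim)
```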