`src/controller/nonlinmpc.jl` (+38 −13 lines)
@@ -223,7 +223,8 @@ This controller allocates memory at each time step for the optimization.
 - `jacobian=default_jacobian(transcription)` : an `AbstractADType` backend for the Jacobian
   of the nonlinear constraints, see `gradient` above for the options (default in Extended Help).
 - `hessian=false` : an `AbstractADType` backend for the Hessian of the Lagrangian, see
-  `gradient` above for the options (`false` to skip it and use `optim` approximation).
+  `gradient` above for the options. The default `false` skips it and uses the quasi-Newton
+  method of `optim`, which is always the case when `oracle=false` (see Extended Help).
 - `oracle=JuMP.solver_name(optim)=="Ipopt"`: use the efficient [`VectorNonlinearOracle`](@extref MathOptInterface MathOptInterface.VectorNonlinearOracle)
   for the nonlinear constraints (not supported by most optimizers for now).
 - additional keyword arguments are passed to [`UnscentedKalmanFilter`](@ref) constructor
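
For context, here is a minimal sketch of how the `hessian` and `oracle` keywords above might be combined, assuming the `NonLinMPC` API shown in this diff; the plant model below is hypothetical, and the `f(x, u, d, p)`/`h(x, d, p)` function signatures are assumed from the `NonLinModel` convention:

```julia
using ModelPredictiveControl
using ADTypes: AutoForwardDiff

# Hypothetical continuous-time SISO plant: dx/dt = -0.2x + u, y = x
f(x, u, _, _) = [-0.2*x[1] + u[1]]
h(x, _, _) = [x[1]]
model = NonLinModel(f, h, 10.0, 1, 1, 1)

# Per the docstring, second-order derivatives are only supported with
# `oracle=true`, so an explicit `hessian` backend is paired with it here:
mpc = NonLinMPC(model; Hp=10, hessian=AutoForwardDiff(), oracle=true)
```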
@@ -282,16 +283,20 @@ NonLinMPC controller with a sample time Ts = 10.0 s:
 the `gc` argument (both `gc` and `gc!` accept non-mutating and mutating functions).

 By default, the optimization relies on dense [`ForwardDiff`](@extref ForwardDiff)
-automatic differentiation (AD) to compute the objective and constraint derivatives. One
-exception: if `transcription` is not a [`SingleShooting`](@ref), the `jacobian` argument
-defaults to this [sparse backend](@extref DifferentiationInterface AutoSparse-object):
+automatic differentiation (AD) to compute the objective and constraint derivatives. Two
+exceptions: if `transcription` is not a [`SingleShooting`](@ref), the `jacobian`
+argument defaults to this [sparse backend](@extref DifferentiationInterface AutoSparse-object):
 ```julia
 AutoSparse(
     AutoForwardDiff();
     sparsity_detector  = TracerSparsityDetector(),
     coloring_algorithm = GreedyColoringAlgorithm()
 )
 ```
+This is also the sparse backend selected for the Hessian of the Lagrangian function if
+`oracle=true` and `hessian=true`, which is the second exception. Second-order
+derivatives are only supported with the `oracle=true` option.
+
 Optimizers generally benefit from exact derivatives like AD. However, the [`NonLinModel`](@ref)
 state-space functions must be compatible with this feature. See [`JuMP` documentation](@extref JuMP Common-mistakes-when-writing-a-user-defined-operator)
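
To make the sparse default above concrete, here is a hedged sketch that builds the same backend by hand and passes it through the `jacobian` keyword. It assumes the `MultipleShooting` transcription and that `TracerSparsityDetector` and `GreedyColoringAlgorithm` come from SparseConnectivityTracer.jl and SparseMatrixColorings.jl, respectively (`model` as in the previous sketch):

```julia
using ModelPredictiveControl
using ADTypes: AutoSparse, AutoForwardDiff
using SparseConnectivityTracer: TracerSparsityDetector
using SparseMatrixColorings: GreedyColoringAlgorithm

# Reproduce the documented default sparse Jacobian backend explicitly:
sparse_backend = AutoSparse(
    AutoForwardDiff();
    sparsity_detector  = TracerSparsityDetector(),
    coloring_algorithm = GreedyColoringAlgorithm()
)

# With a non-SingleShooting transcription, `jacobian` defaults to this
# backend, so passing it explicitly should be equivalent:
mpc = NonLinMPC(model; Hp=10, transcription=MultipleShooting(), jacobian=sparse_backend)
```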