-Return the operators for the nonlinear optimization of `mpc` [`NonLinMPC`](@ref) controller.
+Return the nonlinear constraint oracles for [`NonLinMPC`](@ref) `mpc`.

Return `g_oracle` and `geq_oracle`, the inequality and equality [`VectorNonlinearOracle`](@extref MathOptInterface MathOptInterface.VectorNonlinearOracle)
for the two respective constraints. Note that `g_oracle` only includes the non-`Inf`
-inequality constraints, thus it must be re-constructed if they change. Also return `J_op`,
-the [`NonlinearOperator`](@extref JuMP NonlinearOperator) for the objective function, based
-on the splatting syntax. This method is really intricate and that's because of 3 elements:
-
-- These functions are used inside the nonlinear optimization, so they must be type-stable
-  and as efficient as possible. All the function outputs and derivatives are cached and
-  updated in-place if required to use the efficient [`value_and_jacobian!`](@extref DifferentiationInterface DifferentiationInterface.value_and_jacobian!).
-- The splatting syntax for objective functions implies the use of `Vararg{T,N}` (see the [performance tip](@extref Julia Be-aware-of-when-Julia-avoids-specializing))
-  and memoization to avoid redundant computations. This is already complex, but it's even
-  worse knowing that the automatic differentiation tools do not support splatting.
-- The signature of gradient and hessian functions is not the same for univariate (`nZ̃ == 1`)
-  and multivariate (`nZ̃ > 1`) operators in `JuMP`. Both must be defined.
+inequality constraints, thus it must be re-constructed if they change. This method is really
+intricate because the oracles are used inside the nonlinear optimization, so they must be
+type-stable and as efficient as possible. All the function outputs and derivatives are
+cached and updated in-place if required to use the efficient [`value_and_jacobian!`](@extref DifferentiationInterface DifferentiationInterface.value_and_jacobian!).
"""
-function get_nonlinops(mpc::NonLinMPC, optim::JuMP.GenericModel{JNT}) where JNT<:Real
+function get_nonlincon_oracle(mpc::NonLinMPC, ::JuMP.GenericModel{JNT}) where JNT<:Real
# ----------- common cache for all functions ----------------------------------------
model = mpc.estim.model
transcription = mpc.transcription
-grad, jac, hess = mpc.gradient, mpc.jacobian, mpc.hessian
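
As an aside on the caching pattern this new docstring describes, here is a minimal sketch of the in-place DifferentiationInterface.jl workflow, with a toy constraint function and hypothetical sizes (the actual oracle code in the package is more involved):

```julia
using DifferentiationInterface
import ForwardDiff

# Toy in-place inequality constraint (hypothetical stand-in for the real g):
function g!(g, Z̃)
    g .= Z̃ .^ 2 .- 1
    return nothing
end

nZ̃, ng = 3, 3
Z̃  = [0.5, 1.0, 2.0]
g  = zeros(ng)        # cached constraint values, mutated in-place
∇g = zeros(ng, nZ̃)    # cached Jacobian, mutated in-place
backend = AutoForwardDiff()

# One-time preparation allocates the AD workspace up front, so repeated
# calls in the optimizer's hot loop stay allocation-free and type-stable:
prep = prepare_jacobian(g!, g, backend, Z̃)

# Each oracle callback can then refresh the value and the Jacobian in a
# single pass, writing into the caches:
value_and_jacobian!(g!, g, ∇g, prep, backend, Z̃)
```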

+Return the nonlinear operator for the objective function of `mpc` [`NonLinMPC`](@ref).
+
+It is based on the splatting syntax. This method is really intricate and that's because of:
+
+- These functions are used inside the nonlinear optimization, so they must be type-stable
+  and as efficient as possible. All the function outputs and derivatives are cached and
+  updated in-place if required to use the efficient [`value_and_gradient!`](@extref DifferentiationInterface DifferentiationInterface.value_and_gradient!).
+- The splatting syntax for objective functions implies the use of `Vararg{T,N}` (see the [performance tip](@extref Julia Be-aware-of-when-Julia-avoids-specializing))
+  and memoization to avoid redundant computations. This is already complex, but it's even
+  worse knowing that the automatic differentiation tools do not support splatting.
+- The signature of gradient and hessian functions is not the same for univariate (`nZ̃ == 1`)
+  and multivariate (`nZ̃ > 1`) operators in `JuMP`. Both must be defined.
+"""
+function get_nonlinobj_op(mpc::NonLinMPC, optim::JuMP.GenericModel{JNT}) where JNT<:Real
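
To make the splatting-plus-memoization point concrete, here is a self-contained sketch of the pattern; `make_J_op` and the sum-of-squares objective are hypothetical stand-ins, and the real method also maintains separate caches for the AD dual numbers:

```julia
# Hypothetical sketch of the Vararg-splatting + memoization pattern.
function make_J_op(nZ̃::Int)
    Z̃_cache = fill(NaN, nZ̃)   # last decision vector seen by the operator
    J_cache = Ref(NaN)        # memoized objective value
    # JuMP calls the operator as Jfunc(Z̃...), hence the Vararg{T,N} method
    # (N in the type parameters forces specialization, per the linked tip):
    function Jfunc(Z̃arg::Vararg{T, N}) where {T<:Real, N}
        # The AD tools differentiate functions of an AbstractVector, not
        # splatted arguments, so the tuple is re-collected into the cache,
        # and the objective is recomputed only when Z̃ actually changed:
        if any(i -> Z̃arg[i] != Z̃_cache[i], eachindex(Z̃_cache))
            Z̃_cache .= Z̃arg
            J_cache[] = sum(abs2, Z̃_cache)   # stand-in for the real objective
        end
        return J_cache[]
    end
    return Jfunc
end

J = make_J_op(3)
J(1.0, 2.0, 3.0)   # computes and caches
J(1.0, 2.0, 3.0)   # cache hit: no recomputation
```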
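And for the last bullet, a sketch of the two gradient signatures `JuMP` expects when registering a nonlinear operator (hypothetical names; `Ipopt` is only an example solver):

```julia
using JuMP, Ipopt

f(x::Real...) = sum(abs2, x)

# Univariate form (nZ̃ == 1): the gradient returns the derivative directly.
∇f(x::Real) = 2x
# Multivariate form (nZ̃ > 1): the gradient fills `g` in-place, with the
# decision variables splatted after it. Both methods must exist when the
# operator's dimension can be 1.
∇f(g::AbstractVector{T}, x::T...) where {T<:Real} = (g .= 2 .* x)

model = Model(Ipopt.Optimizer)
nZ̃ = 2
@variable(model, Z̃[1:nZ̃])
op_f = add_nonlinear_operator(model, nZ̃, f, ∇f; name = :op_f)
@objective(model, Min, op_f(Z̃...))
```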