`F` represents a vector of objective functions, where `u` are the optimization variables and `p` are fixed parameters or data. This struct defines all related functions such as Jacobians, Hessians, constraint functions, and their derivatives needed to solve multi-objective optimization problems.
- `F(u, p, args...)`: The vector-valued multi-objective function `F` to be minimized. `u` is the vector of decision variables, `p` contains the parameters, and extra arguments can be passed as needed. Each element of `F` corresponds to an individual objective function `f1, f2, ..., fn`.
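
As a sketch of this calling convention (the function names and objectives here are illustrative, not part of the API), a two-objective function can be written either out-of-place, returning the vector `F(u, p)`, or in-place, mutating a preallocated cost vector:

```julia
# Out-of-place form: returns the vector of objective values [f1, f2].
function multi_obj(u, p)
    f1 = (u[1] - p[1])^2 + u[2]^2   # first objective
    f2 = u[1]^2 + (u[2] - p[2])^2   # second objective
    return [f1, f2]
end

# In-place form: mutates a preallocated cost vector instead of allocating.
function multi_obj!(cost, u, p)
    cost[1] = (u[1] - p[1])^2 + u[2]^2
    cost[2] = u[1]^2 + (u[2] - p[2])^2
    return nothing
end
```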
## Keyword Arguments
- `jac(J, u, p)` or `J = jac(u, p)`: Jacobian of the multi-objective function `F` with respect to `u`. Can accept additional arguments: `jac(J, u, p, args...)`.
- `hess(H, u, p)` or `H = hess(u, p)`: Hessian matrix of the multi-objective function `F`. Accepts additional arguments: `hess(H, u, p, args...)`.
- `hv(Hv, u, v, p)` or `Hv = hv(u, v, p)`: Hessian-vector product for the multi-objective function `F`. Extra arguments: `hv(Hv, u, v, p, args...)`.
- `cons(res, u, p)` or `res = cons(u, p)`: the constraints function, which should mutate the passed `res` array with the value of the `i`th constraint evaluated at the current values of the variables inside the optimization routine. This supplies only the function evaluations; the equality or inequality assertion is applied by the solver based on the constraint bounds passed as `lcons` and `ucons` to [`OptimizationProblem`](@ref). For equality constraints, `lcons` and `ucons` should be passed equal values.
- `cons_j(J, u, p)` or `J = cons_j(u, p)`: the Jacobian of the constraints.
- `cons_jvp(Jv, u, v, p)` or `Jv = cons_jvp(u, v, p)`: the Jacobian-vector product of the constraints.
- `cons_vjp(Jv, u, v, p)` or `Jv = cons_vjp(u, v, p)`: the vector-Jacobian product of the constraints.
- `cons_h(H, u, p)` or `H = cons_h(u, p)`: the Hessian of the constraints, provided as an array of Hessians with `res[i]` being the Hessian with respect to the `i`th output of `cons`.
- `hess_prototype`: a prototype matrix matching the type of the Hessian. For example, if the Hessian is tridiagonal, then an appropriately sized `Tridiagonal` matrix can be used as the prototype, and optimization solvers will specialize on this structure where possible. Non-structured sparsity patterns should use a `SparseMatrixCSC` with the correct sparsity pattern for the Hessian. The default is `nothing`, which means a dense Hessian.
- `cons_jac_prototype`: a prototype matrix matching the type of the constraint Jacobian. The default is `nothing`, which means a dense constraint Jacobian.
- `cons_hess_prototype`: a prototype matrix matching the type of the constraint Hessian. This is defined as an array of matrices, where `hess[i]` is the Hessian with respect to the `i`th output. For example, if the Hessian is sparse, then `hess` is a `Vector{SparseMatrixCSC}`. The default is `nothing`, which means a dense constraint Hessian.
- `lag_h(res, u, sigma, mu, p)` or `res = lag_h(u, sigma, mu, p)`: the Hessian of the Lagrangian, where `sigma` is a multiplier of the cost function and `mu` are the Lagrange multipliers multiplying the constraints. This can be provided instead of `hess` and `cons_h` to solvers that directly use the Hessian of the Lagrangian.
- `hess_colorvec`: a color vector according to the SparseDiffTools.jl definition for the sparsity pattern of the `hess_prototype`. This specializes the Hessian construction when using finite differences and automatic differentiation, so that it is computed in an accelerated manner based on the sparsity pattern. Defaults to `nothing`, which means a color vector will be computed internally on demand when required. The cost of this operation is highly dependent on the sparsity pattern.
- `cons_jac_colorvec`: a color vector according to the SparseDiffTools.jl definition for the sparsity pattern of the `cons_jac_prototype`.
- `cons_hess_colorvec`: an array of color vectors according to the SparseDiffTools.jl definition for the sparsity pattern of the `cons_hess_prototype`.
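
Putting the pieces above together, a minimal sketch of constructing such a function looks as follows. This assumes the `MultiObjectiveOptimizationFunction` constructor mirrors the `OptimizationFunction` pattern of taking the objective positionally and the derivative/constraint functions as keyword arguments; the objective and constraint here are hypothetical examples:

```julia
using SciMLBase

# Hypothetical two-objective function: minimize distance to (1, 0) and to (0, 1).
f(u, p) = [(u[1] - 1)^2 + u[2]^2, u[1]^2 + (u[2] - 1)^2]

# One scalar constraint, mutating `res` in place; the bound (e.g. u[1] + u[2] <= 1)
# is expressed later via `lcons`/`ucons` on the OptimizationProblem, not here.
cons(res, u, p) = (res[1] = u[1] + u[2]; nothing)

# Unspecified derivatives (jac, hess, ...) are left to be filled in by the
# chosen AD backend at solve time.
mof = MultiObjectiveOptimizationFunction(f; cons = cons)
```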
When the [Symbolic Problem Building with ModelingToolkit](https://docs.sciml.ai/Optimization/stable/tutorials/symbolic/) interface is used, the following arguments are also relevant:
- `observed`: an algebraic combination of optimization variables that is of interest to the user, which will be available in the solution. This can be a single expression or multiple expressions.
- `sys`: field that stores the `OptimizationSystem`.
## iip: In-Place vs Out-Of-Place
For more details on this argument, see the ODEFunction documentation.
## specialize: Controlling Compilation and Specialization
For more details on this argument, see the ODEFunction documentation.
## Fields
The fields of the `MultiObjectiveOptimizationFunction` type directly match the names of the inputs.