
Commit 990b907

Merge pull request #792 from alonsoC1s/wrap_nlpmodel

Add a conversion mechanism from `NLPModel` to `OptimizationProblem`

2 parents: d8cf4fb + 2c80a2b

8 files changed: +283 additions, −1 deletion

.github/workflows/CI.yml

Lines changed: 1 addition & 0 deletions

```diff
@@ -36,6 +36,7 @@ jobs:
           - OptimizationQuadDIRECT
           - OptimizationSpeedMapping
           - OptimizationPolyalgorithms
+          - OptimizationNLPModels
         version:
           - '1'
     steps:
```

docs/Project.toml

Lines changed: 4 additions & 0 deletions

```diff
@@ -12,6 +12,8 @@ Juniper = "2ddba703-00a4-53a7-87a5-e8b9971dde84"
 Manifolds = "1cead3c2-87b3-11e9-0ccd-23c62b72b94e"
 Manopt = "0fc0a36d-df90-57f3-8f93-d78a9fc72bb5"
 ModelingToolkit = "961ee093-0014-501f-94e3-6117800e7a78"
+NLPModels = "a4795742-8479-5a88-8948-cc11e1c8c1a6"
+NLPModelsTest = "7998695d-6960-4d3a-85c4-e1bceb8cd856"
 NLopt = "76087f3c-5699-56af-9a33-bf431cd00edd"
 Optimization = "7f7a1694-90dd-40f0-9382-eb1efda571ba"
 OptimizationBBO = "3e6eede4-6085-4f62-9a71-46d9bc1eb92b"
@@ -25,6 +27,7 @@ OptimizationManopt = "e57b7fff-7ee7-4550-b4f0-90e9476e9fb6"
 OptimizationMetaheuristics = "3aafef2f-86ae-4776-b337-85a36adf0b55"
 OptimizationMultistartOptimization = "e4316d97-8bbb-4fd3-a7d8-3851d2a72823"
 OptimizationNLopt = "4e6fcdb7-1186-4e1f-a706-475e75c168bb"
+OptimizationNLPModels = "064b21be-54cf-11ef-1646-cdfee32b588f"
 OptimizationNOMAD = "2cab0595-8222-4775-b714-9828e6a9e01b"
 OptimizationOptimJL = "36348300-93cb-4f02-beb5-3c3902f8871e"
 OptimizationOptimisers = "42dfb2eb-d2b4-4451-abcd-913932933ac1"
@@ -66,6 +69,7 @@ OptimizationManopt = "0.0.2, 0.0.3"
 OptimizationMetaheuristics = "0.1, 0.2"
 OptimizationMultistartOptimization = "0.1, 0.2"
 OptimizationNLopt = "0.1, 0.2"
+OptimizationNLPModels = "0.0.1"
 OptimizationNOMAD = "0.1, 0.2"
 OptimizationOptimJL = "0.1, 0.2, 0.3"
 OptimizationOptimisers = "0.1, 0.2"
```

docs/pages.jl

Lines changed: 2 additions & 1 deletion

```diff
@@ -36,6 +36,7 @@ pages = ["index.md",
         "PRIMA.jl" => "optimization_packages/prima.md",
         "Polyalgorithms.jl" => "optimization_packages/polyopt.md",
         "QuadDIRECT.jl" => "optimization_packages/quaddirect.md",
-        "SpeedMapping.jl" => "optimization_packages/speedmapping.md"
+        "SpeedMapping.jl" => "optimization_packages/speedmapping.md",
+        "NLPModels.jl" => "optimization_packages/nlpmodels.md"
     ]
 ]
```
docs/src/optimization_packages/nlpmodels.md

Lines changed: 54 additions & 0 deletions (new file)

# NLPModels.jl

[NLPModels](https://jso.dev/NLPModels.jl/latest/), similarly to Optimization.jl itself, provides a standardized modeling interface for representing non-linear programs, which facilitates using different solvers on the same problem. The Optimization.jl extension of NLPModels aims to provide a thin translation layer that makes `NLPModel`s, the main export of the package, compatible with the optimizers in the Optimization.jl ecosystem.

## Installation: NLPModels.jl

To translate an `NLPModel`, install the OptimizationNLPModels package:

```julia
import Pkg
Pkg.add("OptimizationNLPModels")
```

The package NLPModels.jl itself contains no optimizers or models. Several packages provide optimization problems ([CUTEst.jl](https://jso.dev/CUTEst.jl/stable/), [NLPModelsTest.jl](https://jso.dev/NLPModelsTest.jl/dev/)), which can then be solved with any optimizer supported by Optimization.jl.

## Usage

As an example, consider solving a problem defined in `NLPModelsTest` with [`Ipopt.jl`](https://github.com/jump-dev/Ipopt.jl). First, install the packages:

```julia
import Pkg
Pkg.add(["NLPModelsTest", "Ipopt"])
```

We instantiate [problem 10](https://jso.dev/NLPModelsTest.jl/dev/reference/#NLPModelsTest.HS10) of the Hock–Schittkowski optimization suite, available from `NLPModelsTest` as `HS10`, then translate it to an `OptimizationProblem`:

```@example NLPModels
using OptimizationNLPModels, Optimization, NLPModelsTest, Ipopt
using Optimization: OptimizationProblem
nlpmodel = NLPModelsTest.HS10()
prob = OptimizationProblem(nlpmodel, AutoForwardDiff())
```

which can now be solved like any other `OptimizationProblem`:

```@example NLPModels
sol = solve(prob, Ipopt.Optimizer())
```

## API

Problems represented as `NLPModel`s can be used to create [`OptimizationProblem`](@ref)s and [`OptimizationFunction`](@ref)s.
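Both conversion entry points can also be exercised separately. The following sketch (assuming the packages installed above are available) builds an `OptimizationFunction` from an `NLPModel` and then assembles the problem by hand, pulling the initial point and constraint bounds from the model's `meta` field:

```julia
# Sketch: convert an NLPModel into an OptimizationFunction, then wrap it in an
# OptimizationProblem manually using the model's metadata. Assumes
# OptimizationNLPModels, Optimization, and NLPModelsTest are installed.
using OptimizationNLPModels, Optimization, NLPModelsTest

nlpmodel = NLPModelsTest.HS10()
# Derivatives stored in the model are reused; anything missing is filled in
# by the AD backend given here.
optf = OptimizationFunction(nlpmodel, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, nlpmodel.meta.x0;
    lcons = nlpmodel.meta.lcon, ucons = nlpmodel.meta.ucon)
```

This is equivalent in spirit to calling `OptimizationProblem(nlpmodel, adtype)` directly, which performs the same metadata extraction internally.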

lib/OptimizationNLPModels/LICENSE

Lines changed: 21 additions & 0 deletions (new file)

MIT License

Copyright (c) 2023 Vaibhav Dixit <[email protected]> and contributors

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
lib/OptimizationNLPModels/Project.toml

Lines changed: 21 additions & 0 deletions (new file)

```toml
name = "OptimizationNLPModels"
uuid = "064b21be-54cf-11ef-1646-cdfee32b588f"
authors = ["Vaibhav Dixit <[email protected]> and contributors"]
version = "0.0.1"

[deps]
ADTypes = "47edcb42-4c32-4615-8424-f2b9edc5f35b"
NLPModels = "a4795742-8479-5a88-8948-cc11e1c8c1a6"
Optimization = "7f7a1694-90dd-40f0-9382-eb1efda571ba"
Reexport = "189a3867-3050-52da-a836-e630ba90ab69"

[extras]
NLPModelsTest = "7998695d-6960-4d3a-85c4-e1bceb8cd856"
OptimizationOptimJL = "36348300-93cb-4f02-beb5-3c3902f8871e"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"
Ipopt = "b6b21f68-93f8-5de0-b562-5493be1d77c9"
OptimizationMOI = "fd9f6733-72f4-499f-8506-86b2bdd0dea1"

[targets]
test = ["Test", "NLPModelsTest", "OptimizationOptimJL", "Zygote", "Ipopt", "OptimizationMOI"]
```
lib/OptimizationNLPModels/src/OptimizationNLPModels.jl

Lines changed: 63 additions & 0 deletions (new file)

```julia
module OptimizationNLPModels

using Reexport
@reexport using NLPModels, Optimization, ADTypes

"""
    OptimizationFunction(nlpmodel::AbstractNLPModel, adtype::AbstractADType = NoAD())

Returns an `OptimizationFunction` from the `NLPModel` defined in `nlpmodel`, where the
available derivatives are reused from the model and the rest are populated with the
automatic differentiation backend specified by `adtype`.
"""
function SciMLBase.OptimizationFunction(nlpmodel::AbstractNLPModel,
        adtype::ADTypes.AbstractADType = SciMLBase.NoAD(); kwargs...)
    f(x, p) = NLPModels.obj(nlpmodel, x)
    grad(G, u, p) = NLPModels.grad!(nlpmodel, u, G)
    hess(H, u, p) = (H .= NLPModels.hess(nlpmodel, u))
    hv(Hv, u, v, p) = NLPModels.hprod!(nlpmodel, u, v, Hv)

    if !unconstrained(nlpmodel) && !bound_constrained(nlpmodel)
        cons(res, x, p) = NLPModels.cons!(nlpmodel, x, res)
        cons_j(J, x, p) = (J .= NLPModels.jac(nlpmodel, x))
        cons_jvp(Jv, v, x, p) = NLPModels.jprod!(nlpmodel, x, v, Jv)

        return OptimizationFunction(
            f, adtype; grad, hess, hv, cons, cons_j, cons_jvp, kwargs...)
    end

    return OptimizationFunction(f, adtype; grad, hess, hv, kwargs...)
end

"""
    OptimizationProblem(nlpmodel::AbstractNLPModel, adtype::AbstractADType = NoAD())

Returns an `OptimizationProblem` with the bounds and constraints defined in `nlpmodel`.
The optimization function and its derivatives are reused from `nlpmodel` when available
or populated with the automatic differentiation backend specified by `adtype`.
"""
function SciMLBase.OptimizationProblem(nlpmodel::AbstractNLPModel,
        adtype::ADTypes.AbstractADType = SciMLBase.NoAD(); kwargs...)
    f = OptimizationFunction(nlpmodel, adtype; kwargs...)
    u0 = nlpmodel.meta.x0
    lb, ub = if has_bounds(nlpmodel)
        (nlpmodel.meta.lvar, nlpmodel.meta.uvar)
    else
        (nothing, nothing)
    end

    lcons, ucons = if has_inequalities(nlpmodel) || has_equalities(nlpmodel)
        (nlpmodel.meta.lcon, nlpmodel.meta.ucon)
    else
        (nothing, nothing)
    end
    sense = nlpmodel.meta.minimize ? Optimization.MinSense : Optimization.MaxSense

    # The number of variables, geometry of u0, etc. are valid and were checked when the
    # nlpmodel was created.
    return Optimization.OptimizationProblem(
        f, u0; lb = lb, ub = ub, lcons = lcons, ucons = ucons, sense = sense, kwargs...)
end

end
```
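A brief usage sketch of this conversion (assuming `NLPModelsTest` is installed): the initial point, box bounds, and optimization sense are forwarded from the model's `meta`, matching the branches in `OptimizationProblem` above.

```julia
# Sketch exercising the conversion defined above. HS5 is a box-bounded test
# problem from NLPModelsTest (assumed installed); it has no general
# constraints, so lcons/ucons are left as `nothing`.
using OptimizationNLPModels, NLPModelsTest

m = NLPModelsTest.HS5()
prob = OptimizationProblem(m, Optimization.AutoForwardDiff())

prob.u0 == m.meta.x0      # initial point forwarded from the model
prob.lb == m.meta.lvar    # box bounds forwarded because has_bounds(m) is true
prob.ucons === nothing    # no general constraints on HS5
```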
lib/OptimizationNLPModels/test/runtests.jl

Lines changed: 117 additions & 0 deletions (new file)

```julia
using OptimizationNLPModels, Optimization, NLPModelsTest, Ipopt, OptimizationMOI, Zygote,
      OptimizationOptimJL
using Test

@testset "NLPModels" begin
    # First problem: Problem 5 in the Hock-Schittkowski suite
    # https://jso.dev/NLPModelsTest.jl/dev/reference/#NLPModelsTest.HS5
    # Problem with box bounds
    hs5f(u, p) = sin(u[1] + u[2]) + (u[1] - u[2])^2 - (3 / 2) * u[1] + (5 / 2)u[2] + 1
    f = Optimization.OptimizationFunction(hs5f, Optimization.AutoZygote())
    lb = [-1.5; -3]
    ub = [4.0; 3.0]
    u0 = [0.0; 0.0]
    oprob = Optimization.OptimizationProblem(
        f, u0, lb = lb, ub = ub, sense = Optimization.MinSense)

    nlpmo = NLPModelsTest.HS5()
    converted = OptimizationNLPModels.OptimizationProblem(nlpmo, Optimization.AutoZygote())

    sol_native = solve(oprob, Optimization.LBFGS(), maxiters = 1000)
    sol_converted = solve(converted, Optimization.LBFGS(), maxiters = 1000)

    @test sol_converted.retcode == sol_native.retcode
    @test sol_converted.u ≈ sol_native.u

    # Second problem: Brown and Dennis function
    # https://jso.dev/NLPModelsTest.jl/dev/reference/#NLPModelsTest.BROWNDEN
    # Problem without bounds
    function brown_dennis(u, p)
        return sum([((u[1] + (i / 5) * u[2] - exp(i / 5))^2 +
                     (u[3] + sin(i / 5) * u[4] - cos(i / 5))^2)^2 for i in 1:20])
    end
    f = Optimization.OptimizationFunction(brown_dennis, Optimization.AutoZygote())
    u0 = [25.0; 5.0; -5.0; -1.0]
    oprob = Optimization.OptimizationProblem(f, u0, sense = Optimization.MinSense)

    nlpmo = NLPModelsTest.BROWNDEN()
    converted = OptimizationNLPModels.OptimizationProblem(nlpmo, Optimization.AutoZygote())

    sol_native = solve(oprob, BFGS())
    sol_converted = solve(converted, BFGS())

    @test sol_converted.retcode == sol_native.retcode
    @test sol_converted.u ≈ sol_native.u

    # Third problem: Problem 10 in the Hock-Schittkowski suite
    # https://jso.dev/NLPModelsTest.jl/dev/reference/#NLPModelsTest.HS10
    # Problem with inequality bounds
    hs10(u, p) = u[1] - u[2]
    hs10_cons(res, u, p) = (res .= -3.0 * u[1]^2 + 2.0 * u[1] * u[2] - u[2]^2 + 1.0)
    lcons = [0.0]
    ucons = [Inf]
    u0 = [-10.0; 10.0]
    f = Optimization.OptimizationFunction(
        hs10, Optimization.AutoForwardDiff(); cons = hs10_cons)
    oprob = Optimization.OptimizationProblem(
        f, u0, lcons = lcons, ucons = ucons, sense = Optimization.MinSense)

    nlpmo = NLPModelsTest.HS10()
    converted = OptimizationNLPModels.OptimizationProblem(
        nlpmo, Optimization.AutoForwardDiff())

    sol_native = solve(oprob, Ipopt.Optimizer())
    sol_converted = solve(converted, Ipopt.Optimizer())

    @test sol_converted.retcode == sol_native.retcode
    @test sol_converted.u ≈ sol_native.u

    # Fourth problem: Problem 13 in the Hock-Schittkowski suite
    # https://jso.dev/NLPModelsTest.jl/dev/reference/#NLPModelsTest.HS13
    # Problem with box & inequality bounds
    hs13(u, p) = (u[1] - 2.0)^2 + u[2]^2
    hs13_cons(res, u, p) = (res .= (1.0 - u[1])^3 - u[2])
    lcons = [0.0]
    ucons = [Inf]
    lb = [0.0; 0.0]
    ub = [Inf; Inf]
    u0 = [-2.0; -2.0]
    f = Optimization.OptimizationFunction(
        hs13, Optimization.AutoForwardDiff(); cons = hs13_cons)
    oprob = Optimization.OptimizationProblem(f, u0, lb = lb, ub = ub, lcons = lcons,
        ucons = ucons, sense = Optimization.MinSense)

    nlpmo = NLPModelsTest.HS13()
    converted = OptimizationNLPModels.OptimizationProblem(
        nlpmo, Optimization.AutoForwardDiff())

    sol_native = solve(oprob, Ipopt.Optimizer())
    sol_converted = solve(converted, Ipopt.Optimizer())

    @test sol_converted.retcode == sol_native.retcode
    @test sol_converted.u ≈ sol_native.u

    # Fifth problem: Problem 14 in the Hock-Schittkowski suite
    # https://jso.dev/NLPModelsTest.jl/dev/reference/#NLPModelsTest.HS14
    # Problem with mixed equality & inequality constraints
    hs14(u, p) = (u[1] - 2.0)^2 + (u[2] - 1.0)^2
    hs14_cons(res, u, p) = (res .= [u[1] - 2.0 * u[2];
                                    -0.25 * u[1]^2 - u[2]^2 + 1.0])
    lcons = [-1.0; 0.0]
    ucons = [-1.0; Inf]
    u0 = [2.0; 2.0]
    f = Optimization.OptimizationFunction(
        hs14, Optimization.AutoForwardDiff(); cons = hs14_cons)
    oprob = Optimization.OptimizationProblem(
        f, u0, lcons = lcons, ucons = ucons, sense = Optimization.MinSense)

    nlpmo = NLPModelsTest.HS14()
    converted = OptimizationNLPModels.OptimizationProblem(
        nlpmo, Optimization.AutoForwardDiff())

    sol_native = solve(oprob, Ipopt.Optimizer())
    sol_converted = solve(converted, Ipopt.Optimizer())

    @test sol_converted.retcode == sol_native.retcode
    @test sol_converted.u ≈ sol_native.u
end
```
