Commit 076dc0e (parent 4770f22)

Rename `solve` to `minimize`

13 files changed: +106 −106 lines

docs/src/index.md (+2 −2)

````diff
@@ -15,13 +15,13 @@ All these solvers rely on the `NLPModel API` from [NLPModels.jl](https://github.
 
 The package `JSOSuite` exports a function [`solve`](@ref):
 ```
-output = solve(args...; kwargs...)
+output = minimize(args...; kwargs...)
 ```
 The arguments are used to define the problem, see [Tutorial](@ref tutorial-section).
 
 It is also possible to define an `NLPModel` or a `JuMP` model representing the problem, and then call `solve`:
 ```
-output = solve(nlpmodel; kwargs...)
+output = minimize(nlpmodel; kwargs...)
 ```
 
 The `NLPModel API` is a general, consistent API for solvers to interact with models by providing flexible data types to represent the objective and constraint functions, to evaluate their derivatives, and to provide essentially any information that a solver might request from a model. [JuliaSmoothOrganization's website](https://juliasmoothoptimizers.github.io) or [NLPModels.jl's documentation](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/) provide more tutorials on this topic.
````
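After this commit, both documented entry points use the new name. A minimal usage sketch of the two call forms, assuming `JSOSuite` (with the renamed API) and `ADNLPModels` are installed; the Rosenbrock objective is the one used throughout the package's own tutorial:

```julia
using JSOSuite, ADNLPModels

# Rosenbrock objective, as in the JSOSuite tutorial
f = x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
x0 = [-1.2; 1.0]

# Form 1: pass the problem data directly; an ADNLPModel is built internally
stats = minimize(f, x0, verbose = 0)

# Form 2: build the NLPModel explicitly, then minimize it
nlp = ADNLPModel(f, x0)
stats = minimize(nlp, verbose = 0)
```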

docs/src/nls.md (+4 −4)

````diff
@@ -44,21 +44,21 @@ nls = ADNLSModel(F, x0, nres, c, l, l, name="AD-Rosenbrock")
 Note that the length of the residual function is given explicitly to avoid any superfluous evaluation of this (potentially very large) function.
 
 ```@example ex1
-stats = solve(nls)
+stats = minimize(nls)
 ```
 
 `JSOSuite.jl` uses automatic differentiation by default, so the following code would be equivalent:
 
 ```@example ex1
-stats = solve(F, x0, nres, c, l, l)
+stats = minimize(F, x0, nres, c, l, l)
 ```
 
 By default, `JSOSuite.solve` will use a solver tailored to nonlinear least squares problems.
 Nevertheless, it is also possible to specify the solver to be used.
 
 ```@example ex1
 using NLPModelsIpopt
-stats = solve("IPOPT", F, x0, nres, c, l, l)
+stats = minimize("IPOPT", F, x0, nres, c, l, l)
 ```
 
 We refer to the documentation of [`ADNLPModels.jl`](https://juliasmoothoptimizers.github.io/ADNLPModels.jl/dev/backend/) for more details on the AD system used and how to modify it.
@@ -78,7 +78,7 @@ x0 = [-1.2; 1.0]
 @NLconstraint(model, x[1] * x[2] == 1)
 
 nls = MathOptNLSModel(model, [F1, F2], name="Ju-Rosenbrock")
-stats = solve(nls)
+stats = minimize(nls)
 ```
 
 ## Find a feasible point of an optimization problem or solve a nonlinear system
````

docs/src/qp.md (+7 −7)

````diff
@@ -18,10 +18,10 @@ In JuliaSmoothOptimizers, the package [`QuadraticModels.jl`](https://github.com/
 The function `solve` with the following sets of arguments will automatically build a `QuadraticModel` and choose the adequate solver.
 
 ```julia
-stats = solve(c, H, c0 = c0, x0 = x0, name = name; kwargs...)
-stats = solve(c, H, lvar, uvar, c0 = c0, x0 = x0, name = name; kwargs...)
-stats = solve(c, H, A, lcon, ucon, c0 = c0, x0 = x0, name = name; kwargs...)
-stats = solve(c, H, lvar, uvar, A, lcon, ucon, c0 = c0, x0 = x0, name = name; kwargs...)
+stats = minimize(c, H, c0 = c0, x0 = x0, name = name; kwargs...)
+stats = minimize(c, H, lvar, uvar, c0 = c0, x0 = x0, name = name; kwargs...)
+stats = minimize(c, H, A, lcon, ucon, c0 = c0, x0 = x0, name = name; kwargs...)
+stats = minimize(c, H, lvar, uvar, A, lcon, ucon, c0 = c0, x0 = x0, name = name; kwargs...)
 ```
 
 ## Example
@@ -41,20 +41,20 @@ The quadratic model can then be solved using [`solve`](@ref).
 
 ```@example ex1
 using JSOSuite
-stats = solve(c, H, A, lcon, ucon, name = "eqconqp_QP")
+stats = minimize(c, H, A, lcon, ucon, name = "eqconqp_QP")
 ```
 
 This is equivalent to building a `QuadraticModel` and then calling [`solve`](@ref).
 
 ```@example ex1
 using QuadraticModels, JSOSuite
 qp_model = QuadraticModel(c, H, A, lcon, ucon, name = "eqconqp_QP")
-stats = solve(qp_model)
+stats = minimize(qp_model)
 ```
 
 As usual, it is also possible to manually select the solver to be used.
 
 ```@example ex1
 using RipQP
-stats = solve("RipQP", c, H, A, lcon, ucon, name = "eqconqp_QP")
+stats = minimize("RipQP", c, H, A, lcon, ucon, name = "eqconqp_QP")
 ```
````

docs/src/speed-up.md (+1 −1)

````diff
@@ -16,7 +16,7 @@ The latter is usually a good tradeoff for very large problems.
 using JSOSuite
 f = x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
 x0 = [-1.2; 1.0]
-stats = solve(f, x0, verbose = 0, highest_derivative_available = 1)
+stats = minimize(f, x0, verbose = 0, highest_derivative_available = 1)
 stats
 ```
 
````

docs/src/tutorial.md (+12 −12)

````diff
@@ -19,7 +19,7 @@ All these optimizers rely on the `NLPModel API` from [NLPModels.jl](https://gith
 
 The function `solve` accepts as an argument any model `nlp` subtype of `AbstractNLPModel`.
 ```julia
-output = solve(nlpmodel::AbstractNLPModel; kwargs...)
+output = minimize(nlpmodel::AbstractNLPModel; kwargs...)
 ```
 
 In the rest of this section, we focus on examples using generic modeling tools.
@@ -35,7 +35,7 @@ model = Model()
 @variable(model, y)
 @NLobjective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)
 
-solve(model)
+minimize(model)
 ```
 
 We refer to [`JuMP tutorial`](https://jump.dev/JuMP.jl/stable/).
@@ -50,15 +50,15 @@ We refer to [`ADNLPModel`](https://juliasmoothoptimizers.github.io/ADNLPModels.j
 using JSOSuite
 f = x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
 x0 = [-1.2; 1.0]
-stats = solve(f, x0, verbose = 0)
+stats = minimize(f, x0, verbose = 0)
 ```
 
 ```@example
 using ADNLPModels, JSOSuite
 f = x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
 x0 = [-1.2; 1.0]
 nlp = ADNLPModel(f, x0)
-stats = solve(nlp)
+stats = minimize(nlp)
 ```
 
 One of the main advantages of this constructor is the possibility to run computations in different arithmetics.
@@ -67,7 +67,7 @@ One of the main advantages of this constructor is the possibility to run computa
 using JSOSuite
 f = x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
 x0 = Float32[-1.2; 1.0]
-stats = solve(f, x0, verbose = 0)
+stats = minimize(f, x0, verbose = 0)
 ```
 
 #### Bound-constrained
@@ -77,15 +77,15 @@ using JSOSuite
 f = x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
 x0 = [-1.2; 1.0]
 lvar, uvar = 2 * ones(2), 4 * ones(2)
-stats = solve(f, x0, lvar, uvar, verbose = 0)
+stats = minimize(f, x0, lvar, uvar, verbose = 0)
 ```
 
 ```@example
 using ADNLPModels, JSOSuite
 f = x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
 x0 = [-1.2; 1.0]
 nlp = ADNLPModel(f, x0)
-stats = solve(nlp)
+stats = minimize(nlp)
 ```
 
 #### Nonlinear constrained
@@ -96,7 +96,7 @@ f = x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
 x0 = [-1.2; 1.0]
 c = x -> [x[1]]
 l = ones(1)
-stats = solve(f, x0, c, l, l, verbose = 0)
+stats = minimize(f, x0, c, l, l, verbose = 0)
 ```
 
 #### Linearly constrained
@@ -110,7 +110,7 @@ A = sparse([
   2.0 3.0
 ])
 l = ones(2)
-stats = solve(f, x0, A, l, l, verbose = 0)
+stats = minimize(f, x0, A, l, l, verbose = 0)
 ```
 
 #### All constraints
@@ -122,7 +122,7 @@ x0 = [-1.2; 1.0]
 A = sparse([2.0 3.0])
 c = x -> [x[1]]
 l = ones(2)
-stats = solve(f, x0, A, c, l, l, verbose = 0)
+stats = minimize(f, x0, A, c, l, l, verbose = 0)
 ```
 
 ## Solving
@@ -170,7 +170,7 @@ using ADNLPModels, JSOSuite
 f = x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
 x0 = [-1.2; 1.0]
 nlp = ADNLPModel(f, x0)
-stats = solve(nlp, atol = 1e-5, rtol = 1e-7, max_time = 10.0, max_eval = 10000, verbose = 1)
+stats = minimize(nlp, atol = 1e-5, rtol = 1e-7, max_time = 10.0, max_eval = 10000, verbose = 1)
 ```
 
 Further possible options are documented in each solver's documentation. For instance, we can update the `mem` parameter of `LBFGS`.
@@ -180,5 +180,5 @@ using ADNLPModels, JSOSuite
 f = x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
 x0 = [-1.2; 1.0]
 nlp = ADNLPModel(f, x0)
-stats = solve("LBFGS", nlp, mem = 10, verbose = 1)
+stats = minimize("LBFGS", nlp, mem = 10, verbose = 1)
 ```
````

src/JSOSuite.jl (+20 −20)

````diff
@@ -297,28 +297,28 @@ push!(
 
 include("selection.jl")
 
-export solve, solve!, feasible_point
+export minimize, solve!, feasible_point
 
 """
-    stats = solve(nlp::Union{AbstractNLPModel, JuMP.Model}; kwargs...)
+    stats = minimize(nlp::Union{AbstractNLPModel, JuMP.Model}; kwargs...)
 
 Compute a local minimum of the optimization problem `nlp`.
 
-    stats = solve(f::Function, x0::AbstractVector, args...; kwargs...)
-    stats = solve(F::Function, x0::AbstractVector, nequ::Integer, args...; kwargs...)
+    stats = minimize(f::Function, x0::AbstractVector, args...; kwargs...)
+    stats = minimize(F::Function, x0::AbstractVector, nequ::Integer, args...; kwargs...)
 
 Define an NLPModel using [`ADNLPModel`](https://juliasmoothoptimizers.github.io/ADNLPModels.jl/stable/).
 
-    stats = solve(c, H, c0 = c0, x0 = x0, name = name; kwargs...)
-    stats = solve(c, H, lvar, uvar, c0 = c0, x0 = x0, name = name; kwargs...)
-    stats = solve(c, H, A, lcon, ucon, c0 = c0, x0 = x0, name = name; kwargs...)
-    stats = solve(c, H, lvar, uvar, A, lcon, ucon, c0 = c0, x0 = x0, name = name; kwargs...)
+    stats = minimize(c, H, c0 = c0, x0 = x0, name = name; kwargs...)
+    stats = minimize(c, H, lvar, uvar, c0 = c0, x0 = x0, name = name; kwargs...)
+    stats = minimize(c, H, A, lcon, ucon, c0 = c0, x0 = x0, name = name; kwargs...)
+    stats = minimize(c, H, lvar, uvar, A, lcon, ucon, c0 = c0, x0 = x0, name = name; kwargs...)
 
 Define a QuadraticModel using [`QuadraticModel`](https://juliasmoothoptimizers.github.io/QuadraticModels.jl/stable/).
 
-The solver can be chosen as follows.
+The optimizer can be chosen as follows.
 
-    stats = solve(solver_name::String, args...; kwargs...)
+    stats = minimize(optimizer_name::String, args...; kwargs...)
 
 `JuMP.Model`s are converted into NLPModels via NLPModelsJuMP.jl.
 
@@ -359,7 +359,7 @@ The value returned is a `GenericExecutionStats`, see `SolverCore.jl`.
 # Examples
 ```jldoctest; output = false
 using JSOSuite
-stats = solve(x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2, [-1.2; 1.0], verbose = 0)
+stats = minimize(x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2, [-1.2; 1.0], verbose = 0)
 stats
 
 # output
@@ -371,7 +371,7 @@ The list of available solvers can be obtained using `JSOSuite.optimizers[!, :name
 
 ```jldoctest; output = false
 using JSOSuite
-stats = solve("TRON", x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2, [-1.2; 1.0], verbose = 0)
+stats = minimize("TRON", x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2, [-1.2; 1.0], verbose = 0)
 stats
 
 # output
@@ -383,13 +383,13 @@ Some optimizers are available after loading only.
 
 ```jldoctest; output = false
 using JSOSuite
-# We solve here a quadratic problem with bound-constraints
+# We minimize here a quadratic problem with bound-constraints
 c = [1.0; 1.0]
 H = [-2.0 0.0; 3.0 4.0]
 uvar = [1.0; 1.0]
 lvar = [0.0; 0.0]
 x0 = [0.5; 0.5]
-stats = solve("TRON", c, H, lvar, uvar, x0 = x0, name = "bndqp_QP", verbose = 0)
+stats = minimize("TRON", c, H, lvar, uvar, x0 = x0, name = "bndqp_QP", verbose = 0)
 stats
 
 # output
@@ -399,7 +399,7 @@ stats
 ```
 
 """
-function solve end
+function minimize end
 
 """
     solve!(solver::AbstractOptimizationSolver, model::Union{AbstractNLPModel, JuMP.Model}; kwargs...)
@@ -421,7 +421,7 @@ include("solve.jl")
 @init begin
   @require CaNNOLeS = "5a1c9e79-9c58-5ec0-afc4-3298fdea2875" begin
     JSOSuite.optimizers[JSOSuite.optimizers.name .== "CaNNOLeS", :is_available] .= 1
-    function solve(::Val{:CaNNOLeS}, nlp; kwargs...)
+    function minimize(::Val{:CaNNOLeS}, nlp; kwargs...)
       return CaNNOLeS.cannoles(nlp; linsolve = :ldlfactorizations, kwargs...)
     end
 
@@ -431,7 +431,7 @@ end
 @init begin
   @require DCISolver = "bee2e536-65f6-11e9-3844-e5bb4c9c55c9" begin
     JSOSuite.optimizers[JSOSuite.optimizers.name .== "DCISolver", :is_available] .= 1
-    function solve(::Val{:DCISolver}, nlp; kwargs...)
+    function minimize(::Val{:DCISolver}, nlp; kwargs...)
       return DCISolver.dci(nlp; kwargs...)
     end
 end
@@ -440,7 +440,7 @@ end
 @init begin
   @require FletcherPenaltySolver = "e59f0261-166d-4fee-8bf3-5e50457de5db" begin
     JSOSuite.optimizers[JSOSuite.optimizers.name .== "FletcherPenaltySolver", :is_available] .= 1
-    function solve(::Val{:FletcherPenaltySolver}, nlp; kwargs...)
+    function minimize(::Val{:FletcherPenaltySolver}, nlp; kwargs...)
       return FletcherPenaltySolver.fps_solve(nlp; kwargs...)
     end
 end
@@ -576,7 +576,7 @@ function bmark_solvers end
 )
 for s in solver_names
   solvers[Symbol(s)] =
-    nlp -> solve(
+    nlp -> minimize(
       s,
       nlp;
       atol = atol,
@@ -619,7 +619,7 @@ function feasible_point end
 
 function feasible_point(nlp::AbstractNLPModel, args...; kwargs...)
   nls = FeasibilityFormNLS(FeasibilityResidual(nlp))
-  stats_nls = solve(nls, args...; kwargs...)
+  stats_nls = minimize(nls, args...; kwargs...)
   stats = GenericExecutionStats(nlp)
   set_status!(stats, stats_nls.status)
   set_solution!(stats, stats_nls.solution[1:get_nvar(nlp)])
````
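This commit renames the export outright and does not keep the old name. A rename like this is often paired with a deprecation shim so downstream code calling `solve` keeps working for one release cycle. A minimal, self-contained sketch of that pattern (the `MiniSuite` module and its `minimize` body are hypothetical stand-ins, not JSOSuite code):

```julia
# Hypothetical sketch of a rename with a deprecation bridge.
module MiniSuite

export minimize

# Stand-in for the real solver call; returns its inputs for illustration.
minimize(f, x0; kwargs...) = (f = f, x0 = x0)

# `Base.@deprecate` keeps the old name callable but emits a warning,
# giving users time to migrate to `minimize`.
Base.@deprecate solve(args...) minimize(args...)

end

# New name works directly; the old `MiniSuite.solve` still works but warns.
result = MiniSuite.minimize(x -> sum(abs2, x), [1.0, 2.0])
```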

src/solve.jl (+7 −7)

````diff
@@ -1,4 +1,4 @@
-function solve(
+function minimize(
   nlp::AbstractNLPModel;
   verbose = 1,
   highest_derivative_available::Integer = 2,
@@ -7,10 +7,10 @@ function solve(
   select = select_optimizers(nlp, verbose, highest_derivative_available)
   (verbose ≥ 1) && println("Solve using $(first(select).name):")
   solver = first(select)
-  return solve(Val(Symbol(solver.name)), nlp; verbose = verbose, kwargs...)
+  return minimize(Val(Symbol(solver.name)), nlp; verbose = verbose, kwargs...)
 end
 
-function solve(
+function minimize(
   nlp::AbstractNLSModel;
   verbose = 1,
   highest_derivative_available::Integer = 2,
@@ -24,16 +24,16 @@ function solve(
     first(select)
   end
   (verbose ≥ 1) && println("Solve using $(solver.name):")
-  return solve(Val(Symbol(solver.name)), nlp; verbose = verbose, kwargs...)
+  return minimize(Val(Symbol(solver.name)), nlp; verbose = verbose, kwargs...)
 end
 
-function solve(solver_name::String, nlp; kwargs...)
+function minimize(solver_name::String, nlp; kwargs...)
   solver = optimizers[optimizers.name .== solver_name, :]
   if isempty(solver)
     @warn "$(solver_name) does not exist."
     return GenericExecutionStats(nlp)
   end
-  return solve(Val(Symbol(solver_name)), nlp; kwargs...)
+  return minimize(Val(Symbol(solver_name)), nlp; kwargs...)
 end
 
 function throw_error_solve(solver::Symbol)
@@ -47,7 +47,7 @@ function throw_error_solve(solver::Symbol)
   return throw(ArgumentError(str))
 end
 
-function solve(::Val{solver_name}, nlp; kwargs...) where {solver_name}
+function minimize(::Val{solver_name}, nlp; kwargs...) where {solver_name}
   solver = optimizers[optimizers.name .== string(solver_name), :]
   if !is_available(solver_name)
     throw_error_solve(solver_name)
````
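The string-to-`Val` pattern in `solve.jl` above turns a runtime solver name into dispatch on a type, so optional packages can plug in by defining a method for their own `Val{:Name}`. A stripped-down, package-free sketch of the mechanism (the backend names and return strings are illustrative, not real JSO solvers):

```julia
# String entry point: convert the name to Val(Symbol(...)) and re-dispatch.
minimize(name::String, problem; kwargs...) = minimize(Val(Symbol(name)), problem; kwargs...)

# Each "backend" registers itself by adding a method for its own Val type.
minimize(::Val{:Bisect}, problem; kwargs...) = "bisection on $(problem)"
minimize(::Val{:Newton}, problem; kwargs...) = "newton on $(problem)"

# Unknown names fall through to a catch-all, mirroring throw_error_solve.
minimize(::Val{S}, problem; kwargs...) where {S} = error("$(S) is not available")

println(minimize("Newton", "rosenbrock"))  # dispatches to the Val{:Newton} method
```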

0 commit comments