docs/src/benchmark.md: 3 additions, 2 deletions
@@ -4,7 +4,7 @@ Benchmarking is very important when researching new algorithms or selecting the
The package [`SolverBenchmark`](https://github.com/JuliaSmoothOptimizers/SolverBenchmark.jl) exports the function [`bmark_solvers`](https://github.com/JuliaSmoothOptimizers/SolverBenchmark.jl/blob/main/src/bmark_solvers.jl) that runs a set of optimizers on a set of problems. `JSOSuite.jl` specializes this function; see `bmark_solvers`.
- The [JuliaSmoothOptimizers organization](https://juliasmoothoptimizers.github.io) contains several packages of test problems ready to use for benchmarking. The main ones are
+ The [JuliaSmoothOptimizers organization](https://jso.dev) contains several packages of test problems ready to use for benchmarking. The main ones are
-[`OptimizationProblems.jl`](https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl): This package provides a collection of optimization problems in JuMP and ADNLPModels syntax;
- Then, we generate the list of problems using [`ADNLPModel`](https://juliasmoothoptimizers.github.io/ADNLPModels.jl/dev/reference/#ADNLPModels.ADNLPModel-Union{Tuple{S},%20Tuple{Any,%20S}}%20where%20S).
+ Then, we generate the list of problems using [`ADNLPModel`](https://jso.dev/ADNLPModels.jl/dev/reference/#ADNLPModels.ADNLPModel-Union{Tuple{S},%20Tuple{Any,%20S}}%20where%20S).
```@example op
ad_problems = [
@@ -39,6 +39,7 @@ length(ad_problems) # return the number of problems
We now want to select appropriate optimizers using `JSOSuite.optimizers`.
```@example op
+ using NLPModelsIpopt
selected_optimizers = JSOSuite.optimizers
# optimizers can solve general `nlp` as some are specific to variants (NLS, ...)
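Putting the pieces together, a benchmark run could then be sketched as follows. This is a hedged sketch: the exact positional and keyword arguments of `bmark_solvers` should be checked against `SolverBenchmark.jl`'s documentation, and `ad_problems` and `selected_optimizers` are the objects built above.

```julia
using JSOSuite, SolverBenchmark

# Run every selected optimizer on every problem; the result maps each
# optimizer to a table of per-problem statistics (status, time, objective, ...).
stats = bmark_solvers(ad_problems, selected_optimizers)

# Performance profiles are a standard way to compare the optimizers,
# e.g. on elapsed time; see `performance_profile` in SolverBenchmark.jl.
```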
docs/src/index.md: 12 additions, 11 deletions
@@ -9,45 +9,46 @@ All these solvers rely on the `NLPModel API` from [NLPModels.jl](https://github.
\min \quad & f(x) \\
& c_L \leq c(x) \leq c_U \\
& c_A \leq Ax \leq u_A, \\
- & \ell \leq x \leq u,
+ & \ell \leq x \leq u.
\end{aligned}
```
The package `JSOSuite` exports a function [`minimize`](@ref):
```
output = minimize(args...; kwargs...)
```
- The arguments are used to define the problem, see [Tutorial](@ref tutorial-section).
+ where the arguments define the problem, see [Tutorial](@ref tutorial-section).
It is also possible to define an `NLPModel` or a `JuMP` model representing the problem, and then call `minimize`:
```
output = minimize(nlpmodel; kwargs...)
+ output = minimize(jump; kwargs...)
```
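As a minimal sketch of the first call shape (the objective below is purely illustrative), an `NLPModel` built by automatic differentiation can be passed directly:

```julia
using ADNLPModels, JSOSuite

# Illustrative smooth unconstrained objective
f(x) = (x[1] - 1)^2 + 4 * (x[2] - x[1]^2)^2
x0 = [-1.2; 1.0]
nlp = ADNLPModel(f, x0)  # NLPModel built via automatic differentiation

output = minimize(nlp)  # JSOSuite selects an appropriate optimizer
```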
- The `NLPModel API` is a general consistent API for solvers to interact with models by providing flexible data types to represent the objective and constraint functions to evaluate their derivatives, and to provide essentially any information that a solver might request from a model. [JuliaSmoothOrganization's website](https://juliasmoothoptimizers.github.io) or [NLPModels.jl's documentation](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/) provide more tutorials on this topic.
+ The `NLPModel API` is a general API for solvers to interact with models: it provides flexible data types to represent the objective and constraint functions, to evaluate their derivatives, and to provide essentially any information that a solver might request from a model. The [JuliaSmoothOptimizers website jso.dev](https://jso.dev) and [NLPModels.jl's documentation](https://jso.dev/NLPModels.jl/dev/) provide more tutorials on this topic.
### NLPModel
- JuliaSmoothOptimizers' compliant solvers accept any model compatible with the NLPModel API. See the [Tutorial](@ref tutorial-section) section for examples.
+ JuliaSmoothOptimizers' compliant solvers accept any model compatible with the `NLPModel API`. See the [Tutorial](@ref tutorial-section) section for examples.
Depending on the origin of the problem, several modeling tools are available. The following generic modeling tools are accepted:
- - `JuMP` models are internally made compatible with NLPModel via [NLPModelsJuMP.jl](https://github.com/JuliaSmoothOptimizers/NLPModelsJuMP.jl).
- - `Ampl` models stored in a `.nl` file can `AmplModel("name_of_file.nl")` using [AmplNLReader.jl](https://github.com/JuliaSmoothOptimizers/AmplNLReader.jl).
- - [QPSReader.jl](https://github.com/JuliaSmoothOptimizers/QPSReader.jl) reads linear problems in MPS format and quadratic problems in QPS format.
- - Models using automatic differentiation can be generated using [ADNLPModels.jl](https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl).
+ - `JuMP` models are internally made compatible with NLPModel via [NLPModelsJuMP.jl](https://github.com/JuliaSmoothOptimizers/NLPModelsJuMP.jl);
+ - `Ampl` models stored in a `.nl` file can be instantiated with `AmplModel("name_of_file.nl")` using [AmplNLReader.jl](https://github.com/JuliaSmoothOptimizers/AmplNLReader.jl);
+ - [QPSReader.jl](https://github.com/JuliaSmoothOptimizers/QPSReader.jl) reads linear problems in MPS format and quadratic problems in QPS format;
+ - Models using automatic differentiation can be generated using [ADNLPModels.jl](https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl);
- Models with manually input derivatives can be defined using [ManualNLPModels.jl](https://github.com/JuliaSmoothOptimizers/ManualNLPModels.jl).
- It is also possible to define your NLPModel variant. Several examples are available within JuliaSmoothOptimizers' umbrella:
+ It is also possible to define your `NLPModel` variant. Several examples are available within JuliaSmoothOptimizers' umbrella:
-[KnetNLPModels.jl](https://github.com/JuliaSmoothOptimizers/KnetNLPModels.jl): An NLPModels Interface to Knet.
-[PDENLPModels.jl](https://github.com/JuliaSmoothOptimizers/PDENLPModels.jl): An NLPModel API for optimization problems with PDE constraints.
A nonlinear least squares problem is a special case with the objective function defined as ``f(x) = \tfrac{1}{2}\|F(x)\|^2_2``.
Although the problem can be solved using only ``f``, knowing ``F`` independently allows the development of more efficient methods.
- See the [Nonlinear Least Squares](@ref nls-section) for special treatment of these problems.
+ See the [Nonlinear Least Squares](@ref nls-section) section for more on the special treatment of these problems.
### Output
- The value returned is a [`GenericExecutionStats`](https://juliasmoothoptimizers.github.io/SolverCore.jl/dev/reference/#SolverCore.GenericExecutionStats), which is a structure containing the available information at the end of the execution, such as a solver status, the objective function value, the norm of the residuals, the elapsed time, etc.
+ The value returned is a [`GenericExecutionStats`](https://jso.dev/SolverCore.jl/dev/reference/#SolverCore.GenericExecutionStats), which is a structure containing the available information at the end of the execution, such as a solver status, the objective function value, the norm of the residuals, the elapsed time, etc.
It contains the following fields:
-`status`: Indicates the output of the solver. Use `show_statuses()` for the full list;
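For instance, these fields can be inspected after a solve. The model below is purely illustrative; the field names follow `SolverCore.GenericExecutionStats`:

```julia
using JSOSuite

# Solve a trivial illustrative problem
stats = minimize(x -> sum(x .^ 2), ones(3))

stats.status        # a Symbol, e.g. :first_order
stats.objective     # objective value at the final iterate
stats.solution      # final iterate
stats.elapsed_time  # wall-clock time in seconds
```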
docs/src/nls.md: 9 additions, 10 deletions
@@ -7,7 +7,7 @@ The nonlinear least squares (NLS) optimization problem is a specific case where
\min \quad & f(x):=\tfrac{1}{2}\|F(x)\|^2_2 \\
& c_L \leq c(x) \leq c_U \\
& c_A \leq Ax \leq u_A, \\
- & \ell \leq x \leq u,
+ & \ell \leq x \leq u.
\end{aligned}
```
@@ -24,9 +24,7 @@ In this tutorial, we consider the following equality-constrained problem
```
where ``1 \leq x[1] x[2] \leq 1`` implies that ``x[1] x[2] = 1``.
- There are two important challenges in solving an optimization problem: (i) model the problem, and (ii) solve the problem with an appropriate solve.
-
- Let's see two ways to model this problem exploiting the knowledge of the structure of the problem.
+ In the rest of this tutorial, we will see two ways to model this problem, exploiting our knowledge of its structure.
### NLS using automatic differentiation
@@ -53,19 +51,19 @@ stats = minimize(nls)
stats = minimize(F, x0, nres, c, l, l)
```
- By default, `JSOSuite.solve` will use a solver tailored for nonlineat least squares problem.
+ By default, `JSOSuite.minimize` will use a solver tailored for nonlinear least squares problems.
Nevertheless, it is also possible to specify the solver to be used.
```@example ex1
using NLPModelsIpopt
stats = minimize("IPOPT", F, x0, nres, c, l, l)
```
- We refer to the documentation of [`ADNLPModels.jl`](https://juliasmoothoptimizers.github.io/ADNLPModels.jl/dev/backend/) for more details on the AD system use and how to modify it.
+ We refer to the documentation of [`ADNLPModels.jl`](https://jso.dev/ADNLPModels.jl/dev/backend/) for more details on the AD system used and how to modify it.
### NLS using JuMP
- The package [NLPModelsJuMP.jl](https://github.com/JuliaSmoothOptimizers/NLPModelsJuMP.jl) exports a constructor, [`MathOptNLSModel`](https://juliasmoothoptimizers.github.io/NLPModelsJuMP.jl/dev/tutorial/#NLPModelsJuMP.MathOptNLSModel), to build an `AbstractNLSModel` using `JuMP`.
+ The package [NLPModelsJuMP.jl](https://github.com/JuliaSmoothOptimizers/NLPModelsJuMP.jl) exports a constructor, [`MathOptNLSModel`](https://jso.dev/NLPModelsJuMP.jl/dev/tutorial/#NLPModelsJuMP.MathOptNLSModel), to build an `AbstractNLSModel` using `JuMP`.
```@example
using JuMP, JSOSuite, NLPModelsJuMP
@@ -89,12 +87,13 @@ We show here how to find the feasible point of a given model.
\begin{aligned}
\min \quad & \tfrac{1}{2}\|s\|^2_2 \\
& 0 \leq s - c(x) \leq 0 \\
- & \ell \leq x \leq u,
+ & \ell \leq x \leq u.
\end{aligned}
```
This formulation can also be used to solve a set of nonlinear equations.
- Finding a feasible point of an optimization problem is useful to find the problem is feasible and it is a good practice to find an initial guess.
+ Finding a feasible point of an optimization problem is useful to determine whether the problem is feasible or not.
+ Moreover, it is a good way to obtain an initial guess.
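As a sketch of the slack formulation above, assembled by hand as a constrained nonlinear least squares model with `ADNLPModels` (the constraint function `c` below is purely illustrative, and JSOSuite may also provide a dedicated helper for this; check its API reference):

```julia
using ADNLPModels, JSOSuite

# Find x with c(x) = 0 by minimizing ½‖s‖² s.t. s - c(x) = 0, in variables (x, s).
c(x) = [x[1]^2 + x[2]^2 - 4; x[1] * x[2] - 1]  # illustrative constraints
F(xs) = xs[3:4]                                # residual is the slack s
cons(xs) = xs[3:4] - c(xs[1:2])                # s - c(x) = 0
x0 = [1.0; 1.0; 0.0; 0.0]
nls = ADNLSModel(F, x0, 2, cons, zeros(2), zeros(2))

stats = minimize(nls)  # the problem is feasible iff the optimal residual is (near) zero
```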
docs/src/qp.md: 3 additions, 3 deletions
@@ -6,14 +6,14 @@ The quadratic model with linear constraints is another specific case where the o
\begin{aligned}
\min \quad & x^TQx + c^Tx + c_0 \\
& c_A \leq Ax \leq u_A, \\
- & \ell \leq x \leq u,
+ & \ell \leq x \leq u.
\end{aligned}
```
This problem is convex whenever the matrix `Q` is positive semi-definite. A key aspect here is the modeling of the matrices `Q` and `A`.
The main data structures available in Julia are: `LinearAlgebra.Matrix`, `SparseArrays.sparse`, `SparseMatricesCOO.sparse`, and `LinearOperators.LinearOperator`.
- In JuliaSmoothOptimizers, the package [`QuadraticModels.jl`](https://github.com/JuliaSmoothOptimizers/QuadraticModels.jl) can be used to access the NLPModel API for such instance.
+ In JuliaSmoothOptimizers, the package [`QuadraticModels.jl`](https://github.com/JuliaSmoothOptimizers/QuadraticModels.jl) can be used to access the `NLPModel API` for such instances.
The function `minimize` with the following sets of arguments will automatically build a `QuadraticModel` and choose an adequate solver.
@@ -24,7 +24,7 @@ The function `minimize` with the following sets of arguments will automatically
stats = minimize(c, H, lvar, uvar, A, lcon, ucon, c0 = c0, x0 = x0, name = name; kwargs...)
This classification extends straightforwardly to handling constraints, with the Jacobian matrix given explicitly or via matrix-vector products.
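For instance, mirroring the signature above on a small convex instance. This is a hedged sketch: whether `H` enters as `½xᵀHx` or `xᵀQx` should be checked against `QuadraticModels.jl`'s documentation, and the data below is purely illustrative.

```julia
using JSOSuite, SparseArrays

# minimize a quadratic in H, c, c0  s.t.  lcon ≤ Ax ≤ ucon, lvar ≤ x ≤ uvar
c0 = 0.0
c = [-1.0; -1.0]
H = sparse([2.0 0.0; 0.0 2.0])   # positive definite, so the problem is convex
lvar, uvar = zeros(2), fill(10.0, 2)
A = sparse([1.0 1.0])
lcon, ucon = [1.0], [1.0]        # equality constraint x₁ + x₂ = 1
x0 = [0.5; 0.5]

stats = minimize(c, H, lvar, uvar, A, lcon, ucon, c0 = c0, x0 = x0)
```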
## Find a better initial guess
The majority of derivative-based optimizers are local methods whose performance depends on the initial guess.
@@ -41,14 +43,29 @@ Once a solver has been chosen it is also possible to play with the key parameter
Note that all optimizers presented here have been carefully optimized. All have different strengths. Trying another solver on the same problem sometimes provides a different solution.
- ### Unconstrained/Bound-constrained
+ ### Unconstrained
- ##### LBFGS
+ ##### LBFGS (1st order)
-`mem::Int = 5`: memory parameter of the `lbfgs` algorithm;
-`τ₁::T = T(0.9999)`: slope factor in the Wolfe condition when performing the line search;
-`bk_max::Int = 25`: maximum number of backtracks when performing the line search.
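These parameters are passed as keyword arguments to `minimize`; for instance, a sketch with the solver-name-first variant used elsewhere in these docs (the objective and parameter values are only illustrative):

```julia
using JSOSuite

rosenbrock(x) = (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2
x0 = [-1.2; 1.0]

# Larger L-BFGS memory, fewer backtracking steps in the line search
stats = minimize("LBFGS", rosenbrock, x0, mem = 10, bk_max = 20)
```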
docs/src/tutorial.md: 16 additions, 14 deletions
@@ -2,7 +2,7 @@
In this tutorial, we provide examples of usage of the `minimize` function exported by `JSOSuite.jl`.
- There are two important challenges in solving an optimization problem: (i) model the problem, and (ii) solve the problem with an appropriate solver.
+ There are two important challenges in solving an optimization problem: (i) model the problem, and (ii) solve the problem with an appropriate optimizer.
## Modeling
@@ -13,7 +13,7 @@ All these optimizers rely on the `NLPModel API` from [NLPModels.jl](https://gith
In the rest of this section, we focus on examples using generic modeling tools.
- It is generally of great interest if available to use a modeling that handles the structure of the problem, see [Nonlinear Least Squares](@ref nls-section) for an example with nonlinear least squares.
+ When available, it is generally worthwhile to use a modeling tool that exploits the structure of the problem; see [Nonlinear Least Squares](@ref nls-section) for an example with nonlinear least squares.
### JuMP Model
@@ -38,11 +38,11 @@ model = Model()
minimize(model)
```
- We refer to [`JuMP tutorial`](https://jump.dev/JuMP.jl/stable/).
+ We refer to [`JuMP tutorial`](https://jump.dev/JuMP.jl/stable/) for more on modeling problems with JuMP.
### NLPModel with Automatic Differentiation
- We refer to [`ADNLPModel`](https://juliasmoothoptimizers.github.io/ADNLPModels.jl/dev/reference/#ADNLPModels.ADNLPModel-Union{Tuple{S},%20Tuple{Any,%20S}}%20where%20S) for the description of the different constructors.
+ We refer to [`ADNLPModel`](https://jso.dev/ADNLPModels.jl/dev/reference/#ADNLPModels.ADNLPModel-Union{Tuple{S},%20Tuple{Any,%20S}}%20where%20S) for the description of the different constructors.
#### Unconstrained
@@ -125,9 +125,9 @@ l = ones(2)
stats = minimize(f, x0, A, c, l, l, verbose = 0)
```
- ## Solving
+ ## Optimizing
- Internally, the `minimize` function selects optimizers according to the problem's property and JSO-compliant optimizers available.
+ Internally, the `minimize` function selects an optimizer according to the problem's properties and the JSO-compliant optimizers available.