docs/src/benchmark.md: 10 additions, 10 deletions
@@ -1,8 +1,8 @@
-# Benchmarking solvers
+# Benchmarking optimizers

 Benchmarking is very important when researching new algorithms or selecting the most appropriate ones.

-The package [`SolverBenchmark`](https://github.com/JuliaSmoothOptimizers/SolverBenchmark.jl) exports the function [`bmark_solvers`](https://github.com/JuliaSmoothOptimizers/SolverBenchmark.jl/blob/main/src/bmark_solvers.jl) that runs a set of solvers on a set of problems. `JSOSuite.jl` specializes this function, see `bmark_solvers`.
+The package [`SolverBenchmark`](https://github.com/JuliaSmoothOptimizers/SolverBenchmark.jl) exports the function [`bmark_solvers`](https://github.com/JuliaSmoothOptimizers/SolverBenchmark.jl/blob/main/src/bmark_solvers.jl) that runs a set of optimizers on a set of problems. `JSOSuite.jl` specializes this function, see `bmark_solvers`.

 The [JuliaSmoothOptimizers organization](https://juliasmoothoptimizers.github.io) contains several packages of test problems ready to use for benchmarking. The main ones are
 - [`OptimizationProblems.jl`](https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl): This package provides a collection of optimization problems in JuMP and ADNLPModels syntax;
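For context, a problem collection in ADNLPModels syntax is typically assembled from `OptimizationProblems.jl` as sketched below; this mirrors the `ad_problems = [` context line of the next hunk. The `meta` table and its `:ncon`/`:name` columns are assumptions based on that package's documentation, not part of this diff.

```julia
# Hedged sketch: build a small set of unconstrained ADNLPModels test problems.
using ADNLPModels, OptimizationProblems

meta = OptimizationProblems.meta                        # DataFrame describing each problem
unconstrained = first(meta[meta.ncon .== 0, :name], 5)  # keep a few unconstrained names
ad_problems = [
  OptimizationProblems.ADNLPProblems.eval(Meta.parse(p))() for p in unconstrained
]
length(ad_problems)  # number of problems in the benchmark set
```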
@@ -36,21 +36,21 @@ ad_problems = [
 length(ad_problems) # return the number of problems
 ```

-We now want to select appropriate solvers using the `JSOSuite.solvers`.
+We now want to select appropriate optimizers using the `JSOSuite.optimizers`.

 ```@example op
-selected_solvers = JSOSuite.solvers
-# solvers can solve general `nlp` as some are specific to variants (NLS, ...)
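To illustrate the benchmarking workflow this file describes, a rough sketch follows. The string-named `minimize("LBFGS", nlp)` calls and the `verbose = 0` keyword are assumptions about the JSOSuite interface, not a verbatim excerpt of the documentation.

```julia
# Hedged sketch: benchmark two optimizers on a small ADNLPModels problem set.
using ADNLPModels, JSOSuite, SolverBenchmark

problems = [
  ADNLPModel(x -> (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2, [-1.2; 1.0], name = "rosenbrock"),
  ADNLPModel(x -> sum((x .- 1) .^ 2), zeros(5), name = "shifted_sphere"),
]

optimizers = Dict(
  :LBFGS => (nlp -> minimize("LBFGS", nlp, verbose = 0)),
  :TRUNK => (nlp -> minimize("TRUNK", nlp, verbose = 0)),
)

stats = bmark_solvers(optimizers, problems)  # one results DataFrame per optimizer
```

The resulting statistics can then be turned into tables and performance profiles with the tools exported by `SolverBenchmark.jl`.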
docs/src/speed-up.md: 7 additions, 7 deletions
@@ -4,11 +4,11 @@ The following contains a list of tips to speed up the solver selection and usage

 ## Derivatives

-The solvers available in `JSOSuite.jl` are all using first and sometimes second-order derivatives. There are mainly three categories:
+The optimizers available in `JSOSuite.jl` are all using first and sometimes second-order derivatives. There are mainly three categories:
 - 1st order methods use only gradient information;
 - 1st order quasi-Newton methods require only gradient information, and use it to build an approximation of the Hessian;
 - 2nd order methods: Those are using gradients and Hessian information.
-- 2nd order methods matrix-free: Those are solvers using Hessian information, but without ever forming the matrix, so only matrix-vector products are computed.
+- 2nd order methods matrix-free: Those are optimizers using Hessian information, but without ever forming the matrix, so only matrix-vector products are computed.

 The latter is usually a good tradeoff for very large problems.
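The distinction drawn in the bullet list above can be made concrete with the NLPModels API: a matrix-free method only ever asks the model for Hessian-vector products. A minimal sketch using standard NLPModels calls (not specific to this PR):

```julia
# First-order vs. matrix-free second-order information on an ADNLPModel.
using ADNLPModels, NLPModels

nlp = ADNLPModel(x -> (x[1] - 1)^2 + 4 * (x[2] + 2)^2, zeros(2))
x = nlp.meta.x0
v = [1.0, -1.0]

g  = grad(nlp, x)       # 1st order: gradient only
Hv = hprod(nlp, x, v)   # 2nd order, matrix-free: H(x) * v without forming H(x)
```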
@@ -22,7 +22,7 @@ stats

 ## Find a better initial guess

-The majority of derivative-based solvers are local methods whose performance is dependent on the initial guess.
+The majority of derivative-based optimizers are local methods whose performance is dependent on the initial guess.
 This usually relies on specific knowledge of the problem.

 The function [`feasible_point`](@ref) computes a point satisfying the constraints of the problem that can be used as an initial guess.
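A hedged sketch of this warm-start idea follows, assuming `feasible_point(nlp)` returns an object exposing a `solution` field (an assumption to check against the JSOSuite documentation):

```julia
# Warm-start a constrained solve from a feasible point.
using ADNLPModels, JSOSuite

f(x) = (x[1] - 1)^2 + (x[2] - 2)^2
c(x) = [x[1] + 2 * x[2] - 1]                    # single equality constraint c(x) = 0
nlp  = ADNLPModel(f, zeros(2), c, [0.0], [0.0])

x0    = feasible_point(nlp).solution            # assumed field name
warm  = ADNLPModel(f, x0, c, [0.0], [0.0])      # same model, better initial guess
stats = minimize(warm)
```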
@@ -31,15 +31,15 @@ An alternative is to solve a simpler version of the problem and reuse the soluti
 ## Use the structure of the problem

 If the problem has linear constraints, then it is efficient to specify it at the modeling stage to avoid having them treated like nonlinear ones.
-Some of the solvers will also exploit this information.
+Some of the optimizers will also exploit this information.

-Similarly, quadratic objective or least squares problems have tailored modeling tools and solvers.
+Similarly, quadratic objective or least squares problems have tailored modeling tools and optimizers.

 ## Change the parameters of the solver

-Once a solver has been chosen, it is also possible to play with the key parameters. Find below a list of the available solvers and parameters.
+Once a solver has been chosen, it is also possible to play with the key parameters. Find below a list of the available optimizers and parameters.

-Note that all solvers presented here have been carefully optimized. All have different strengths. Trying another solver on the same problem sometimes provides a different solution.
+Note that all optimizers presented here have been carefully optimized. All have different strengths. Trying another solver on the same problem sometimes provides a different solution.
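As a hedged illustration of tuning a chosen optimizer, the sketch below passes typical JuliaSmoothOptimizers keywords; the exact set of accepted keywords (`atol`, `rtol`, `max_time`, `verbose`) depends on the optimizer and should be checked against the table mentioned above.

```julia
# Tighten tolerances and cap the runtime of a specific optimizer.
using ADNLPModels, JSOSuite

nlp = ADNLPModel(x -> (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2, [-1.2; 1.0])
stats = minimize("TRUNK", nlp, atol = 1e-8, rtol = 1e-8, max_time = 30.0, verbose = 0)
```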
docs/src/tutorial.md: 11 additions, 11 deletions
@@ -6,7 +6,7 @@ There are two important challenges in solving an optimization problem: (i) model

 ## Modeling

-All these solvers rely on the `NLPModel API` from [NLPModels.jl](https://github.com/JuliaSmoothOptimizers/NLPModels.jl) for general nonlinear optimization problems of the form
+All these optimizers rely on the `NLPModel API` from [NLPModels.jl](https://github.com/JuliaSmoothOptimizers/NLPModels.jl) for general nonlinear optimization problems of the form
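For reference (the formula itself is cut off in this excerpt), the general form used throughout NLPModels.jl is

```math
\min_{x \in \mathbb{R}^n} \ f(x)
\quad \text{subject to} \quad
\ell_{con} \le c(x) \le u_{con}, \quad
\ell_{var} \le x \le u_{var}.
```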
-The first argument should be of type `SolverCore.AbstractOptimizationSolver`; see for instance `JSOSuite.solvers[!, :name_solver]`.
+The first argument should be of type `SolverCore.AbstractOptimizationSolver`; see for instance `JSOSuite.optimizers[!, :name_solver]`.
 """
 function SolverCore.solve!(solver, args...; kwargs...)
   throw(
-    "solve! not implemented: the first argument should be of type `SolverCore.AbstractOptimizationSolver` and not $(typeof(solver)), see for instance `JSOSuite.solvers[!, :name_solver]`.",
+    "solve! not implemented: the first argument should be of type `SolverCore.AbstractOptimizationSolver` and not $(typeof(solver)), see for instance `JSOSuite.optimizers[!, :name_solver]`.",
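The fallback above exists to steer users toward the in-place interface. A hedged sketch of that workflow with a concrete solver, assuming `LBFGSSolver` from JSOSolvers.jl and `GenericExecutionStats` from SolverCore.jl (names taken from those packages, not from this diff):

```julia
# In-place `solve!` workflow that the error message points to.
using ADNLPModels, JSOSolvers, SolverCore

nlp    = ADNLPModel(x -> (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2, [-1.2; 1.0])
solver = LBFGSSolver(nlp)            # a concrete SolverCore.AbstractOptimizationSolver
stats  = GenericExecutionStats(nlp)
solve!(solver, nlp, stats)           # dispatches to the solver's method, not the fallback
```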