Commit c9c667e

Merge pull request #380 from ArnoStrouwen/fixtuts
example and strict docs
2 parents c9556c9 + 135fd8c, commit c9c667e

16 files changed: 80 additions, 58 deletions

docs/Project.toml

Lines changed: 7 additions & 0 deletions
@@ -7,15 +7,22 @@ ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
 Ipopt = "b6b21f68-93f8-5de0-b562-5493be1d77c9"
 Ipopt_jll = "9cc047cb-c261-5740-88fc-0cf96f7bdcc7"
 IterTools = "c8e1da08-722c-5040-9ed9-7db0dc04731e"
+Juniper = "2ddba703-00a4-53a7-87a5-e8b9971dde84"
 ModelingToolkit = "961ee093-0014-501f-94e3-6117800e7a78"
+NLopt = "76087f3c-5699-56af-9a33-bf431cd00edd"
 Optimization = "7f7a1694-90dd-40f0-9382-eb1efda571ba"
 OptimizationBBO = "3e6eede4-6085-4f62-9a71-46d9bc1eb92b"
 OptimizationCMAEvolutionStrategy = "bd407f91-200f-4536-9381-e4ba712f53f8"
 OptimizationEvolutionary = "cb963754-43f6-435e-8d4b-99009ff27753"
+OptimizationGCMAES = "6f0a0517-dbc2-4a7a-8a20-99ae7f27e911"
 OptimizationMOI = "fd9f6733-72f4-499f-8506-86b2bdd0dea1"
+OptimizationMetaheuristics = "3aafef2f-86ae-4776-b337-85a36adf0b55"
+OptimizationMultistartOptimization = "e4316d97-8bbb-4fd3-a7d8-3851d2a72823"
 OptimizationNLopt = "4e6fcdb7-1186-4e1f-a706-475e75c168bb"
+OptimizationNOMAD = "2cab0595-8222-4775-b714-9828e6a9e01b"
 OptimizationOptimJL = "36348300-93cb-4f02-beb5-3c3902f8871e"
 OptimizationOptimisers = "42dfb2eb-d2b4-4451-abcd-913932933ac1"
+OptimizationSpeedMapping = "3d669222-0d7d-4eb9-8a9f-d8528b0d9b91"
 OrdinaryDiffEq = "1dea7af3-3e70-54e6-95c3-0bf5283fa5ed"
 ReverseDiff = "37e2e3b7-166d-5795-8a7a-e32c996b4267"
 SciMLSensitivity = "1ed8b502-d754-442c-8d5d-10ac956f44a1"

docs/make.jl

Lines changed: 9 additions & 2 deletions
@@ -5,10 +5,17 @@ include("pages.jl")
 
 makedocs(sitename = "Optimization.jl",
          authors = "Chris Rackauckas, Vaibhav Kumar Dixit et al.",
-         clean = true,
-         doctest = false,
          modules = [Optimization, Optimization.SciMLBase, FiniteDiff,
             ForwardDiff, ModelingToolkit, ReverseDiff, Tracker, Zygote],
+         clean = true, doctest = false,
+         strict = [
+             :doctest,
+             :linkcheck,
+             :parse_error,
+             :example_block,
+             # Other available options are
+             # :autodocs_block, :cross_references, :docs_block, :eval_block, :example_block, :footnote, :meta_block, :missing_docs, :setup_block
+         ],
         format = Documenter.HTML(analytics = "UA-90474609-3",
                                  assets = ["assets/favicon.ico"],
                                  canonical = "https://Optimization.sciml.ai/stable/"),
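For reference, the resulting `makedocs` call reads roughly as follows. This is a sketch stitched together from the context and added lines of the hunk above (trailing arguments outside the hunk are not shown); it illustrates how Documenter's `strict` keyword takes a vector of error classes that should fail the docs build:

```julia
# Sketch of the post-commit `makedocs` call, reconstructed from the diff above.
# Passing a vector to `strict` makes only the listed error classes fatal,
# so e.g. a broken `@example` block or dead link fails CI.
makedocs(sitename = "Optimization.jl",
         authors = "Chris Rackauckas, Vaibhav Kumar Dixit et al.",
         modules = [Optimization, Optimization.SciMLBase, FiniteDiff,
             ForwardDiff, ModelingToolkit, ReverseDiff, Tracker, Zygote],
         clean = true, doctest = false,
         strict = [:doctest, :linkcheck, :parse_error, :example_block],
         format = Documenter.HTML(analytics = "UA-90474609-3",
                                  assets = ["assets/favicon.ico"],
                                  canonical = "https://Optimization.sciml.ai/stable/"))
```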

docs/src/API/optimization_function.md

Lines changed: 6 additions & 6 deletions
@@ -20,11 +20,11 @@ The choices for the auto-AD fill-ins with quick descriptions are:
 The following sections describe the Auto-AD choices in detail.
 
 ```@docs
-AutoForwardDiff
-AutoFiniteDiff
-AutoReverseDiff
-AutoZygote
-AutoTracker
-AutoModelingToolkit
+Optimization.AutoForwardDiff
+Optimization.AutoFiniteDiff
+Optimization.AutoReverseDiff
+Optimization.AutoZygote
+Optimization.AutoTracker
+Optimization.AutoModelingToolkit
 ```
 
docs/src/optimization_packages/blackboxoptim.md

Lines changed: 2 additions & 1 deletion
@@ -48,7 +48,8 @@ The currently available algorithms are listed [here](https://github.com/robertfe
 
 The Rosenbrock function can optimized using the `BBO_adaptive_de_rand_1_bin_radiuslimited()` as follows:
 
-```julia
+```@example BBO
+using Optimization, OptimizationBBO
 rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
 x0 = zeros(2)
 p = [1.0, 100.0]
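The hunk cuts off at its boundary, so the tail of the converted block is not shown. A minimal complete `@example` block of this shape would look like the following sketch; the `OptimizationProblem` construction with bounds and the `solve` call are assumptions based on the solver named in the context line, since BBO's solvers require box constraints:

```julia
using Optimization, OptimizationBBO
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]
# The two lines below fall outside the shown hunk and are a sketch of the
# usual pattern: BBO needs lower/upper bounds on the problem.
prob = Optimization.OptimizationProblem(rosenbrock, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, BBO_adaptive_de_rand_1_bin_radiuslimited())
```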

docs/src/optimization_packages/cmaevolutionstrategy.md

Lines changed: 2 additions & 1 deletion
@@ -21,7 +21,8 @@ constraint equations. However, lower and upper constraints set by `lb` and `ub`
 
 The Rosenbrock function can optimized using the `CMAEvolutionStrategyOpt()` as follows:
 
-```julia
+```@example CMAEvolutionStrategy
+using Optimization, OptimizationCMAEvolutionStrategy
 rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
 x0 = zeros(2)
 p = [1.0, 100.0]

docs/src/optimization_packages/evolutionary.md

Lines changed: 2 additions & 1 deletion
@@ -31,7 +31,8 @@ Algorithm specific options are defined as `kwargs`. See the respective documenta
 
 The Rosenbrock function can optimized using the `Evolutionary.CMAES()` as follows:
 
-```julia
+```@example Evolutionary
+using Optimization, OptimizationEvolutionary
 rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
 x0 = zeros(2)
 p = [1.0, 100.0]

docs/src/optimization_packages/gcmaes.md

Lines changed: 4 additions & 6 deletions
@@ -21,7 +21,8 @@ constraint equations. However, lower and upper constraints set by `lb` and `ub`
 
 The Rosenbrock function can optimized using the `GCMAESOpt()` without utilizing the gradient information as follows:
 
-```julia
+```@example GCMAES
+using Optimization, OptimizationGCMAES
 rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
 x0 = zeros(2)
 p = [1.0, 100.0]
@@ -32,11 +33,8 @@ sol = solve(prob, GCMAESOpt())
 
 We can also utilise the gradient information of the optimization problem to aid the optimization as follows:
 
-```julia
-rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
-x0 = zeros(2)
-p = [1.0, 100.0]
-f = OptimizationFunction(rosenbrock, Optimization.ForwardDiff)
+```@example GCMAES
+f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
 prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0,-1.0], ub = [1.0,1.0])
 sol = solve(prob, GCMAESOpt())
 ```
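Both hunks use the same `@example GCMAES` sandbox, so the second block can reuse `rosenbrock`, `x0`, and `p` from the first; that is why the deleted redefinitions (and the `Optimization.ForwardDiff` typo, corrected to `Optimization.AutoForwardDiff()`) are safe to drop. Stitched together, the page's examples amount to roughly the following sketch; the gradient-free `prob`/`solve` lines sit between the two hunks and are assumed, not shown in the diff:

```julia
using Optimization, OptimizationGCMAES
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

# Gradient-free run (construction assumed from the surrounding context):
prob = Optimization.OptimizationProblem(rosenbrock, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, GCMAESOpt())

# Gradient-assisted run, reusing the definitions above (from the second hunk):
f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, GCMAESOpt())
```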

docs/src/optimization_packages/mathoptinterface.md

Lines changed: 2 additions & 3 deletions
@@ -61,16 +61,15 @@ sol = solve(prob, Ipopt.Optimizer(); option_name = option_value, ...)
 [Juniper documentation](https://github.com/lanl-ansi/Juniper.jl) for more
 detail.
 
-```julia
-using Optimization, ForwardDiff
+```@example MOI
+using Optimization, OptimizationMOI, Juniper, Ipopt
 rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
 x0 = zeros(2)
 _p = [1.0, 100.0]
 
 f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
 prob = Optimization.OptimizationProblem(f, x0, _p)
 
-using OptimizationMOI, Juniper, Ipopt
 opt = OptimizationMOI.MOI.OptimizerWithAttributes(
     Juniper.Optimizer,
     "nl_solver"=>OptimizationMOI.MOI.OptimizerWithAttributes(Ipopt.Optimizer, "print_level"=>0),

docs/src/optimization_packages/metaheuristics.md

Lines changed: 3 additions & 7 deletions
@@ -48,7 +48,8 @@ constraint equations. However, lower and upper constraints set by `lb` and `ub`
 
 The Rosenbrock function can optimized using the Evolutionary Centers Algorithm `ECA()` as follows:
 
-```julia
+```@example Metaheuristics
+using Optimization, OptimizationMetaheuristics
 rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
 x0 = zeros(2)
 p = [1.0, 100.0]
@@ -59,12 +60,7 @@ sol = solve(prob, ECA(), maxiters=100000, maxtime=1000.0)
 
 Per default `Metaheuristics` ignores the initial values `x0` set in the `OptimizationProblem`. In order to for `Optimization` to use `x0` we have to set `use_initial=true`:
 
-```julia
-rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
-x0 = zeros(2)
-p = [1.0, 100.0]
-f = OptimizationFunction(rosenbrock)
-prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0,-1.0], ub = [1.0,1.0])
+```@example Metaheuristics
 sol = solve(prob, ECA(), use_initial=true, maxiters=100000, maxtime=1000.0)
 ```
 

docs/src/optimization_packages/multistartoptimization.md

Lines changed: 3 additions & 8 deletions
@@ -26,9 +26,8 @@ constraint equations. However, lower and upper constraints set by `lb` and `ub`
 
 The Rosenbrock function can optimized using `MultistartOptimization.TikTak()` with 100 initial points and the local method `NLopt.LD_LBFGS()` as follows:
 
-```julia
-using OptimizationMultistartOptimization
-using OptimizationNLopt
+```@example MultiStart
+using Optimization, OptimizationMultistartOptimization, OptimizationNLopt
 rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
 x0 = zeros(2)
 p = [1.0, 100.0]
@@ -39,12 +38,8 @@ sol = solve(prob, MultistartOptimization.TikTak(100), NLopt.LD_LBFGS())
 
 You can use any `Optimization` optimizers you like. The global method of the `MultistartOptimization` is a positional argument and followed by the local method. This for example means we can perform a multistartoptimization with LBFGS as the optimizer using either the `NLopt.jl` or `Optim.jl` implementation as follows. Moreover, this interface allows you access and adjust all the optimizer settings as you normally would:
 
-```julia
+```@example MultiStart
 using OptimizationOptimJL
-using ForwardDiff
-rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
-x0 = zeros(2)
-p = [1.0, 100.0]
 f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
 prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0,-1.0], ub = [1.0,1.0])
 sol = solve(prob, MultistartOptimization.TikTak(100), LBFGS(), maxiters=5)
