Commit 129389e

2 parents: 7fbca50 + d308614

27 files changed: +2571 −19 lines


.github/workflows/CI.yml

Lines changed: 2 additions & 0 deletions
```diff
@@ -33,7 +33,9 @@ jobs:
         - OptimizationOptimJL
         - OptimizationOptimisers
         - OptimizationPRIMA
+        - OptimizationPyCMA
         - OptimizationQuadDIRECT
+        - OptimizationSciPy
         - OptimizationSpeedMapping
         - OptimizationPolyalgorithms
         - OptimizationNLPModels
```

Project.toml

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,6 +1,6 @@
 name = "Optimization"
 uuid = "7f7a1694-90dd-40f0-9382-eb1efda571ba"
-version = "4.3.0"
+version = "4.4.0"
 
 [deps]
 ADTypes = "47edcb42-4c32-4615-8424-f2b9edc5f35b"
```

docs/Project.toml

Lines changed: 7 additions & 0 deletions
```diff
@@ -44,6 +44,9 @@ SymbolicAnalysis = "4297ee4d-0239-47d8-ba5d-195ecdf594fe"
 Symbolics = "0c5d862f-8b57-4792-8d23-62f2024744c7"
 Tracker = "9f7883ad-71c0-57eb-9f7f-b5c9e6d3789c"
 Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"
+BlackBoxOptim = "a134a8b2-14d6-55f6-9291-3336d3ab0209"
+Metaheuristics = "bcdb8e00-2c21-11e9-3065-2b553b22f898"
+Evolutionary = "86b6b26d-c046-49b6-aa0b-5f0f74682bd6"
 
 [compat]
 AmplNLWriter = "1"
@@ -90,3 +93,7 @@ SymbolicAnalysis = "0.3"
 Symbolics = "6"
 Tracker = ">= 0.2"
 Zygote = ">= 0.5"
+BlackBoxOptim = "0.6"
+Metaheuristics = "3"
+Evolutionary = "0.11"
+
```

docs/pages.jl

Lines changed: 4 additions & 2 deletions
```diff
@@ -23,7 +23,7 @@ pages = ["index.md",
         "API/modelingtoolkit.md",
         "API/FAQ.md"
     ],
-    "Optimizer Packages" => [
+    "Optimizer Packages" => [
         "BlackBoxOptim.jl" => "optimization_packages/blackboxoptim.md",
         "CMAEvolutionStrategy.jl" => "optimization_packages/cmaevolutionstrategy.md",
         "Evolutionary.jl" => "optimization_packages/evolutionary.md",
@@ -40,7 +40,9 @@ pages = ["index.md",
         "Optimization.jl" => "optimization_packages/optimization.md",
         "Polyalgorithms.jl" => "optimization_packages/polyopt.md",
         "PRIMA.jl" => "optimization_packages/prima.md",
+        "PyCMA.jl" => "optimization_packages/pycma.md",
         "QuadDIRECT.jl" => "optimization_packages/quaddirect.md",
-        "SpeedMapping.jl" => "optimization_packages/speedmapping.md"
+        "SpeedMapping.jl" => "optimization_packages/speedmapping.md",
+        "SciPy.jl" => "optimization_packages/scipy.md"
     ]
 ]
```

(The change to the `"Optimizer Packages"` line is whitespace-only.)

docs/src/API/modelingtoolkit.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -16,4 +16,4 @@ Secondly, one can generate `OptimizationProblem`s for use in
 Optimization.jl from purely a symbolic front-end. This is the form
 users will encounter when using ModelingToolkit.jl directly, and it is
 also the form supplied by domain-specific languages. For more information,
-see the [OptimizationSystem documentation](https://docs.sciml.ai/ModelingToolkit/stable/systems/OptimizationSystem/).
+see the [OptimizationSystem documentation](https://docs.sciml.ai/ModelingToolkit/stable/API/problems/#SciMLBase.OptimizationProblem).
```

docs/src/optimization_packages/blackboxoptim.md

Lines changed: 18 additions & 0 deletions
## Multi-objective optimization

The multi-objective optimizer is `BBO_borg_moea()`. Your objective function should return the objective values (the example below returns a `Tuple`), the fitness scheme should typically be set to Pareto fitness, and the number of objectives must be specified. Otherwise, usage is the same:

```@example MOO-BBO
using OptimizationBBO, Optimization, BlackBoxOptim
using SciMLBase: MultiObjectiveOptimizationFunction
u0 = [0.25, 0.25]
opt = OptimizationBBO.BBO_borg_moea()
function multi_obj_func(x, p)
    f1 = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2  # Rosenbrock function
    f2 = -20.0 * exp(-0.2 * sqrt(0.5 * (x[1]^2 + x[2]^2))) -
         exp(0.5 * (cos(2π * x[1]) + cos(2π * x[2]))) + exp(1) + 20.0  # Ackley function
    return (f1, f2)
end
mof = MultiObjectiveOptimizationFunction(multi_obj_func)
prob = Optimization.OptimizationProblem(mof, u0; lb = [0.0, 0.0], ub = [2.0, 2.0])
sol = solve(prob, opt, NumDimensions = 2,
    FitnessScheme = ParetoFitnessScheme{2}(is_minimizing = true))
```

docs/src/optimization_packages/evolutionary.md

Lines changed: 17 additions & 0 deletions
## Multi-objective optimization

The Rosenbrock and Ackley functions can be optimized together using `Evolutionary.NSGA2()` as follows:

```@example MOO-Evolutionary
using Optimization, OptimizationEvolutionary, Evolutionary
using SciMLBase: MultiObjectiveOptimizationFunction
function func(x, p = nothing)::Vector{Float64}
    f1 = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2  # Rosenbrock function
    f2 = -20.0 * exp(-0.2 * sqrt(0.5 * (x[1]^2 + x[2]^2))) -
         exp(0.5 * (cos(2π * x[1]) + cos(2π * x[2]))) + exp(1) + 20.0  # Ackley function
    return [f1, f2]
end
initial_guess = [1.0, 1.0]
obj_func = MultiObjectiveOptimizationFunction(func)
algorithm = OptimizationEvolutionary.NSGA2()
problem = OptimizationProblem(obj_func, initial_guess)
result = solve(problem, algorithm)
```

docs/src/optimization_packages/metaheuristics.md

Lines changed: 43 additions & 0 deletions
### With Constraint Equations

While `Metaheuristics.jl` supports such constraints, `Optimization.jl` currently does not relay these constraints.

## Multi-objective optimization

The ZDT1 test function can be optimized with `Metaheuristics.jl` as follows:

```@example MOO-Metaheuristics
using Optimization, OptimizationMetaheuristics, Metaheuristics
using Statistics: mean
using SciMLBase: MultiObjectiveOptimizationFunction
function zdt1(x)
    f1 = x[1]
    g = 1 + 9 * mean(x[2:end])
    h = 1 - sqrt(f1 / g)
    f2 = g * h
    # In this example, we have no constraints
    gx = [0.0]  # Inequality constraints (not used)
    hx = [0.0]  # Equality constraints (not used)
    return [f1, f2], gx, hx
end
multi_obj_fun = MultiObjectiveOptimizationFunction((x, p) -> zdt1(x))

# Define the problem bounds
lower_bounds = [0.0, 0.0, 0.0]
upper_bounds = [1.0, 1.0, 1.0]

# Define the initial guess
initial_guess = [0.5, 0.5, 0.5]

# Create the optimization problem
prob = OptimizationProblem(multi_obj_fun, initial_guess; lb = lower_bounds,
    ub = upper_bounds)

nobjectives = 2
npartitions = 100

# Reference points (Das and Dennis's method)
weights = Metaheuristics.gen_ref_dirs(nobjectives, npartitions)

# Choose an algorithm and solve the problem
sol1 = solve(prob, Metaheuristics.NSGA2(); maxiters = 100, use_initial = true)
sol2 = solve(prob, Metaheuristics.NSGA3(); maxiters = 100, use_initial = true)
sol3 = solve(prob, Metaheuristics.SPEA2(); maxiters = 100, use_initial = true)
sol4 = solve(prob, Metaheuristics.CCMO(NSGA2(N = 100, p_m = 0.001)))
sol5 = solve(prob,
    Metaheuristics.MOEAD_DE(weights, options = Options(debug = false, iterations = 250));
    maxiters = 100, use_initial = true)
sol6 = solve(prob, Metaheuristics.SMS_EMOA(); maxiters = 100, use_initial = true)
```

docs/src/optimization_packages/pycma.md

Lines changed: 63 additions & 0 deletions
# PyCMA.jl

[`PyCMA`](https://github.com/CMA-ES/pycma) is a Python implementation of CMA-ES and a few related numerical optimization tools. `OptimizationPyCMA.jl` gives access to the CMA-ES optimizer through the unified `Optimization.jl` interface, just like any native Julia optimizer.

`OptimizationPyCMA.jl` relies on [`PythonCall`](https://github.com/cjdoris/PythonCall.jl). A minimal Python distribution containing PyCMA will be installed automatically on first use, so no manual Python setup is required.

## Installation: OptimizationPyCMA.jl

```julia
import Pkg
Pkg.add("OptimizationPyCMA")
```

## Methods

`PyCMAOpt` supports the usual keyword arguments `maxiters`, `maxtime`, `abstol`, `reltol` and `callback`, in addition to any PyCMA-specific options (passed verbatim via keyword arguments to `solve`).
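
As a quick sketch of how these common keywords combine (this assumes the `prob` constructed in the example below; the values here are illustrative, not defaults):

```julia
# Sketch: these keywords are interpreted by Optimization.jl itself,
# so they behave the same as with native Julia optimizers.
sol = solve(prob, PyCMAOpt();
    maxiters = 10_000,  # cap on iterations
    maxtime = 60.0,     # wall-clock limit in seconds
    abstol = 1e-8)      # absolute tolerance (exact stopping rule is solver-side)
```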

## Example

```@example PyCMA
using OptimizationPyCMA

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
_p = [1.0, 100.0]
l1 = rosenbrock(x0, _p)
f = OptimizationFunction(rosenbrock)
prob = OptimizationProblem(f, x0, _p, lb = [-1.0, -1.0], ub = [0.8, 0.8])
sol = solve(prob, PyCMAOpt())
```
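
The solution object follows the standard SciML interface, so the minimizer and the objective value can be read off directly:

```julia
@show sol.u          # best parameters found by CMA-ES
@show sol.objective  # objective value at sol.u
```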

## Passing solver-specific options

Any keyword that `Optimization.jl` does not interpret is forwarded directly to PyCMA.

If an `Optimization.jl` keyword overlaps with a PyCMA keyword, the `Optimization.jl` keyword takes precedence.

An exhaustive list of keyword arguments can be found by running the following Python script:

```python
import cma
options = cma.CMAOptions()
print(options)
```

An example passing the PyCMA keywords `verbose` and `seed`:

```julia
sol = solve(prob, PyCMAOpt(), verbose = -9, seed = 42)
```

## Troubleshooting

The original Python result object is attached to the solution in the `original` field:

```julia
sol = solve(prob, PyCMAOpt())
println(sol.original)
```

## Contributing

Bug reports and feature requests are welcome in the [Optimization.jl](https://github.com/SciML/Optimization.jl) issue tracker. Pull requests that improve either the Julia wrapper or the documentation are highly appreciated.

docs/src/optimization_packages/scipy.md

Lines changed: 133 additions & 0 deletions
# SciPy.jl

[`SciPy`](https://scipy.org/) is a mature Python library that offers a rich family of optimization, root-finding and linear-programming algorithms. `OptimizationSciPy.jl` gives access to these routines through the unified `Optimization.jl` interface, just like any native Julia optimizer.

!!! note
    
    `OptimizationSciPy.jl` relies on [`PythonCall`](https://github.com/cjdoris/PythonCall.jl). A minimal Python distribution containing SciPy will be installed automatically on first use, so no manual Python setup is required.

## Installation: OptimizationSciPy.jl

```julia
import Pkg
Pkg.add("OptimizationSciPy")
```

## Methods

Below is a catalogue of the solver families exposed by `OptimizationSciPy.jl`, together with their convenience constructors. All of them accept the usual keyword arguments `maxiters`, `maxtime`, `abstol`, `reltol`, `callback` and `progress`, in addition to any SciPy-specific options (passed verbatim via keyword arguments to `solve`).
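
As a sketch (assuming an `OptimizationProblem` named `prob`, like the ones constructed in the examples below), the common keywords work the same way for every method, and a `callback` that returns `true` halts the run early:

```julia
# Sketch: `maxiters` and `abstol` are handled by Optimization.jl;
# unrecognised keywords would be forwarded verbatim to SciPy.
sol = solve(prob, ScipyNelderMead();
    maxiters = 2_000,
    abstol = 1e-8,
    callback = (state, loss) -> loss < 1e-10)  # stop once essentially converged
```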

### Local Optimizer

#### Derivative-Free

  * `ScipyNelderMead()` – Simplex Nelder–Mead algorithm
  * `ScipyPowell()` – Powell search along conjugate directions
  * `ScipyCOBYLA()` – Linear approximation of constraints (supports nonlinear constraints)

#### Gradient-Based

  * `ScipyCG()` – Non-linear conjugate gradient
  * `ScipyBFGS()` – Quasi-Newton BFGS
  * `ScipyLBFGSB()` – Limited-memory BFGS with simple bounds
  * `ScipyNewtonCG()` – Newton conjugate gradient (requires Hessian-vector products)
  * `ScipyTNC()` – Truncated Newton with bounds
  * `ScipySLSQP()` – Sequential least-squares programming (supports constraints)
  * `ScipyTrustConstr()` – Trust-region method for non-linear constraints

#### Hessian-Based / Trust-Region

  * `ScipyDogleg()`, `ScipyTrustNCG()`, `ScipyTrustKrylov()`, `ScipyTrustExact()` – Trust-region algorithms that optionally use or build Hessian information

### Global Optimizer

  * `ScipyDifferentialEvolution()` – Differential evolution (requires bounds)
  * `ScipyBasinhopping()` – Basin-hopping with local search
  * `ScipyDualAnnealing()` – Dual annealing simulated annealing
  * `ScipyShgo()` – Simplicial homology global optimisation (supports constraints)
  * `ScipyDirect()` – Deterministic `DIRECT` algorithm (requires bounds)
  * `ScipyBrute()` – Brute-force grid search (requires bounds)

### Linear & Mixed-Integer Programming

  * `ScipyLinprog("highs")` – LP solvers from the HiGHS project and legacy interior-point/simplex methods
  * `ScipyMilp()` – Mixed-integer linear programming via HiGHS branch-and-bound

### Root Finding & Non-Linear Least Squares *(experimental)*

Support for `ScipyRoot`, `ScipyRootScalar` and `ScipyLeastSquares` is available behind the scenes and will be documented once the APIs stabilise.

## Examples

### Unconstrained minimisation

```@example SciPy1
using Optimization, OptimizationSciPy

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

f = OptimizationFunction(rosenbrock, Optimization.AutoZygote())
prob = OptimizationProblem(f, x0, p)

sol = solve(prob, ScipyBFGS())
@show sol.objective  # ≈ 0 at the optimum
```

### Constrained optimisation with COBYLA

```@example SciPy2
using Optimization, OptimizationSciPy

# Objective
obj(x, p) = (x[1] + x[2] - 1)^2

# Single non-linear constraint: x₁² + x₂² ≈ 1 (with a small tolerance)
cons(res, x, p) = (res .= [x[1]^2 + x[2]^2 - 1.0])

x0 = [0.5, 0.5]
prob = OptimizationProblem(
    OptimizationFunction(obj; cons = cons),
    x0, nothing, lcons = [-1e-6], ucons = [1e-6])  # small tolerance instead of exact equality

sol = solve(prob, ScipyCOBYLA())
@show sol.u, sol.objective
```

### Differential evolution (global) with custom options

```@example SciPy3
using Optimization, OptimizationSciPy, Random, Statistics
Random.seed!(123)

ackley(x, p) = -20exp(-0.2 * sqrt(mean(x .^ 2))) - exp(mean(cos.(2π .* x))) + 20 + ℯ
x0 = zeros(2)  # the initial guess is ignored by differential evolution
prob = OptimizationProblem(ackley, x0; lb = [-5.0, -5.0], ub = [5.0, 5.0])

sol = solve(prob, ScipyDifferentialEvolution(); popsize = 20, mutation = (0.5, 1))
@show sol.objective
```

## Passing solver-specific options

Any keyword that `Optimization.jl` does not interpret is forwarded directly to SciPy. Refer to the [SciPy optimisation API](https://docs.scipy.org/doc/scipy/reference/optimize.html) for the exhaustive list of options.

```julia
sol = solve(prob, ScipyTrustConstr(); verbose = 3, maxiter = 10_000)
```

## Troubleshooting

The original Python result object is attached to the solution in the `original` field:

```julia
sol = solve(prob, ScipyBFGS())
println(sol.original)  # SciPy OptimizeResult
```

If SciPy raises an error, it is re-thrown as a Julia `ErrorException` carrying the Python message, so look there first.
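
A minimal sketch of catching such a failure (the exact message text comes from Python):

```julia
try
    solve(prob, ScipyNewtonCG())  # may fail, e.g. without derivative information
catch err
    # The SciPy error message is carried inside the Julia exception.
    err isa ErrorException ? println(err.msg) : rethrow()
end
```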

## Contributing

Bug reports and feature requests are welcome in the [Optimization.jl](https://github.com/SciML/Optimization.jl) issue tracker. Pull requests that improve either the Julia wrapper or the documentation are highly appreciated.
