# SciPy.jl

[`SciPy`](https://scipy.org/) is a mature Python library that offers a rich family of optimization, root-finding and linear-programming algorithms. `OptimizationSciPy.jl` gives access to these routines through the unified `Optimization.jl` interface, just like any native Julia optimizer.

!!! note

    `OptimizationSciPy.jl` relies on [`PythonCall`](https://github.com/cjdoris/PythonCall.jl). A minimal Python distribution containing SciPy will be installed automatically on first use, so no manual Python setup is required.
## Installation: OptimizationSciPy.jl

```julia
import Pkg
Pkg.add("OptimizationSciPy")
```
## Methods

Below is a catalogue of the solver families exposed by `OptimizationSciPy.jl`, together with their convenience constructors. All of them accept the usual keyword arguments `maxiters`, `maxtime`, `abstol`, `reltol`, `callback` and `progress`, in addition to any SciPy-specific options, which are passed verbatim as keyword arguments to `solve`.
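
For instance, the common keyword arguments combine freely with any constructor below. A minimal sketch with an arbitrary quadratic objective (the tolerance values are illustrative only):

```julia
using Optimization, OptimizationSciPy

# An arbitrary smooth objective; Nelder–Mead needs no derivatives.
f = OptimizationFunction((x, p) -> sum(abs2, x))
prob = OptimizationProblem(f, [1.0, 2.0])

# `maxiters` and `abstol` are interpreted by Optimization.jl itself.
sol = solve(prob, ScipyNelderMead(); maxiters = 500, abstol = 1e-8)
```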
### Local Optimizers

#### Derivative-Free

* `ScipyNelderMead()` – Nelder–Mead simplex algorithm
* `ScipyPowell()` – Powell search along conjugate directions
* `ScipyCOBYLA()` – Constrained optimization by linear approximation (supports nonlinear constraints)
#### Gradient-Based

* `ScipyCG()` – Non-linear conjugate gradient
* `ScipyBFGS()` – Quasi-Newton BFGS
* `ScipyLBFGSB()` – Limited-memory BFGS with simple bounds (see the sketch after this list)
* `ScipyNewtonCG()` – Newton conjugate gradient (requires Hessian-vector products)
* `ScipyTNC()` – Truncated Newton with bounds
* `ScipySLSQP()` – Sequential least-squares programming (supports constraints)
* `ScipyTrustConstr()` – Trust-region method for non-linear constraints
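
As a quick illustration of the bounded methods, here is a minimal sketch that runs `ScipyLBFGSB()` on a made-up box-constrained quadratic; the bounds are supplied via `lb`/`ub` on the problem:

```julia
using Optimization, OptimizationSciPy

# A quadratic whose unconstrained minimum (2, -1) lies outside the box.
f = OptimizationFunction((x, p) -> (x[1] - 2)^2 + (x[2] + 1)^2,
    Optimization.AutoForwardDiff())
prob = OptimizationProblem(f, [0.0, 0.0]; lb = [-1.0, -2.0], ub = [1.0, 2.0])

sol = solve(prob, ScipyLBFGSB())
@show sol.u # the box clips the solution to ≈ [1.0, -1.0]
```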
#### Hessian-Based / Trust-Region

* `ScipyDogleg()`, `ScipyTrustNCG()`, `ScipyTrustKrylov()`, `ScipyTrustExact()` – Trust-region algorithms that optionally use or build Hessian information (see the sketch below)
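
These methods want second-order information. A minimal sketch, assuming the wrapper forwards the AD-generated gradient and Hessian to SciPy (here via `AutoForwardDiff`):

```julia
using Optimization, OptimizationSciPy

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# AutoForwardDiff supplies both the gradient and the Hessian.
f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(f, zeros(2), [1.0, 100.0])

sol = solve(prob, ScipyTrustExact())
@show sol.u # ≈ [1.0, 1.0]
```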
### Global Optimizers

* `ScipyDifferentialEvolution()` – Differential evolution (requires bounds)
* `ScipyBasinhopping()` – Basin-hopping with local search
* `ScipyDualAnnealing()` – Dual annealing, a simulated-annealing variant
* `ScipyShgo()` – Simplicial homology global optimization (supports constraints)
* `ScipyDirect()` – Deterministic `DIRECT` algorithm (requires bounds)
* `ScipyBrute()` – Brute-force grid search (requires bounds)
49+
50+
### Linear & Mixed-Integer Programming
51+
52+
* `ScipyLinprog("highs")` – LP solvers from the HiGHS project and legacy interior-point/simplex methods
53+
* `ScipyMilp()` – Mixed-integer linear programming via HiGHS branch-and-bound
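
A minimal sketch of a linear program, assuming the wrapper accepts a general `OptimizationProblem` whose objective and constraints are linear in `x` (the `lcons`/`ucons` pair brackets the constraint value, as in the COBYLA example below):

```julia
using Optimization, OptimizationSciPy

# Maximize x₁ + 2x₂ (i.e. minimize the negative) subject to
# x₁ + x₂ ≤ 4 and the bounds 0 ≤ xᵢ ≤ 3.
f = OptimizationFunction((x, p) -> -x[1] - 2x[2];
    cons = (res, x, p) -> (res .= [x[1] + x[2]]))
prob = OptimizationProblem(f, zeros(2); lb = [0.0, 0.0], ub = [3.0, 3.0],
    lcons = [-Inf], ucons = [4.0])

sol = solve(prob, ScipyLinprog("highs"))
@show sol.u # the optimum for this data is [1.0, 3.0]
```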
### Root Finding & Non-Linear Least Squares *(experimental)*

Support for `ScipyRoot`, `ScipyRootScalar` and `ScipyLeastSquares` is available behind the scenes and will be documented once the APIs stabilize.
## Examples

### Unconstrained minimization

```@example SciPy1
using Optimization, OptimizationSciPy

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

f = OptimizationFunction(rosenbrock, Optimization.AutoZygote())
prob = OptimizationProblem(f, x0, p)

sol = solve(prob, ScipyBFGS())
@show sol.objective # ≈ 0 at the optimum
```
### Constrained optimization with COBYLA

```@example SciPy2
using Optimization, OptimizationSciPy

# Objective
obj(x, p) = (x[1] + x[2] - 1)^2

# Single non-linear equality constraint: x₁² + x₂² = 1
cons(res, x, p) = (res .= [x[1]^2 + x[2]^2 - 1.0])

x0 = [0.5, 0.5]
prob = OptimizationProblem(
    OptimizationFunction(obj; cons = cons),
    x0, nothing, lcons = [0.0], ucons = [0.0])

sol = solve(prob, ScipyCOBYLA())
@show sol.u, sol.objective
```
### Differential evolution (global) with custom options

```@example SciPy3
using Optimization, OptimizationSciPy, Random, Statistics
Random.seed!(123)

ackley(x, p) = -20exp(-0.2 * sqrt(mean(x .^ 2))) - exp(mean(cos.(2π .* x))) + 20 + ℯ
x0 = zeros(2) # the initial guess is ignored by differential evolution
prob = OptimizationProblem(ackley, x0; lb = [-5.0, -5.0], ub = [5.0, 5.0])

sol = solve(prob, ScipyDifferentialEvolution(); popsize = 20, mutation = (0.5, 1))
@show sol.objective
```
## Passing solver-specific options

Any keyword argument that `Optimization.jl` does not interpret itself is forwarded directly to SciPy. Refer to the [SciPy optimization API](https://docs.scipy.org/doc/scipy/reference/optimize.html) for the exhaustive list of options.

```julia
sol = solve(prob, ScipyTrustConstr(); verbose = 3, maxiter = 10_000)
```
## Troubleshooting

The original Python result object is attached to the solution in the `original` field:

```julia
sol = solve(prob, ScipyBFGS())
println(sol.original) # SciPy OptimizeResult
```

If SciPy raises an error, it is re-thrown as a Julia `ErrorException` carrying the Python message, so look there first; the sketch below shows one way to inspect it.
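
A minimal sketch of inspecting such a failure: differential evolution requires bounds, so omitting them should raise an error (whether the message originates in Python or in a Julia-side check is an implementation detail):

```julia
using Optimization, OptimizationSciPy

f = OptimizationFunction((x, p) -> sum(abs2, x))
prob = OptimizationProblem(f, [1.0, 2.0]) # note: no bounds

try
    solve(prob, ScipyDifferentialEvolution()) # requires bounds, should throw
catch err
    @show err # the Python-side message, if any, is carried here
end
```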
## Contributing

Bug reports and feature requests are welcome on the [Optimization.jl](https://github.com/SciML/Optimization.jl) issue tracker. Pull requests that improve either the Julia wrapper or the documentation are highly appreciated.
