docs/src/getting_started.md (+3 -3)
@@ -37,7 +37,7 @@ Tada! That's how you do it. Now let's dive in a little more into what each part

 ## Understanding the Solution Object

-The solution object is a `SciMLBase.AbstractNoTimeSolution`, and thus it follows the
+The solution object is a `SciMLBase.AbstractNoTimeSolution`, and thus it follows the
 [SciMLBase Solution Interface for non-timeseries objects](https://docs.sciml.ai/SciMLBase/stable/interfaces/Solutions/) and is documented at the [solution type page](@ref solution).
 However, for simplicity let's show a bit of it in action.
@@ -61,13 +61,13 @@ rosenbrock(sol.u, p)
 sol.objective
 ```

-The `sol.retcode` gives us more information about the solution process.
+The `sol.retcode` gives us more information about the solution process.

 ```@example intro
 sol.retcode
 ```

-Here it says `ReturnCode.Success`, which means that the problem was solved successfully. We can learn more about the different return codes at
+Here it says `ReturnCode.Success`, which means that the problem was solved successfully. We can learn more about the different return codes at
 [the ReturnCode part of the SciMLBase documentation](https://docs.sciml.ai/SciMLBase/stable/interfaces/Solutions/#retcodes).

 If we are interested in some of the statistics of the solving process, for example to help choose a better solver, we can investigate the `sol.stats`
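For context on the lines this hunk touches, here is a minimal sketch of the workflow the getting-started page documents: solve the Rosenbrock problem, then inspect `sol.objective`, `sol.retcode`, and `sol.stats`. The solver choice (`NelderMead` from `OptimizationOptimJL`) is an assumption for illustration and may differ from the tutorial's own setup.

```julia
using Optimization, OptimizationOptimJL

# Rosenbrock objective in the (u, p) form that Optimization.jl expects.
rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
u0 = zeros(2)
p = [1.0, 100.0]

prob = OptimizationProblem(rosenbrock, u0, p)
sol = solve(prob, NelderMead())   # derivative-free method, so no AD backend is needed

sol.objective   # final objective value, i.e. rosenbrock(sol.u, p)
sol.retcode     # ReturnCode.Success on a successful solve
sol.stats       # solver statistics such as iteration and function-evaluation counts
```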
docs/src/optimization_packages/pycma.md (+5 -5)
@@ -31,9 +31,9 @@ sol = solve(prob, PyCMAOpt())

 ## Passing solver-specific options

-Any keyword that `Optimization.jl` does not interpret is forwarded directly to PyCMA.
+Any keyword that `Optimization.jl` does not interpret is forwarded directly to PyCMA.

-In the event an `Optimization.jl` keyword overlaps with a `PyCMA` keyword, the `Optimization.jl` keyword takes precedence.
+In the event an `Optimization.jl` keyword overlaps with a `PyCMA` keyword, the `Optimization.jl` keyword takes precedence.

 An exhaustive list of keyword arguments can be found by running the following python script:

@@ -44,6 +44,7 @@ print(options)
 ```

 An example passing the `PyCMA` keywords "verbose" and "seed":
+
 ```julia
 sol = solve(prob, PyCMA(), verbose = -9, seed = 42)
 ```
@@ -54,10 +55,9 @@ The original Python result object is attached to the solution in the `original`

 ```julia
 sol = solve(prob, PyCMAOpt())
-println(sol.original)
+println(sol.original)
 ```

 ## Contributing

-Bug reports and feature requests are welcome in the [Optimization.jl](https://github.com/SciML/Optimization.jl) issue tracker. Pull requests that improve either the Julia wrapper or the documentation are highly appreciated.
-
+Bug reports and feature requests are welcome in the [Optimization.jl](https://github.com/SciML/Optimization.jl) issue tracker. Pull requests that improve either the Julia wrapper or the documentation are highly appreciated.
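To make the forwarding and precedence rules quoted above concrete, here is a hedged sketch. The wrapper module name `OptimizationPyCMA` and the use of `maxiters` as the overlapping `Optimization.jl` keyword are assumptions not shown in this diff; `verbose` and `seed` are taken from the example in the hunk.

```julia
using Optimization, OptimizationPyCMA  # module name assumed, not shown in this diff

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
prob = OptimizationProblem(rosenbrock, zeros(2), [1.0, 100.0])

# `maxiters` is a common keyword interpreted by Optimization.jl itself, so it takes
# precedence over any overlapping PyCMA option; `verbose` and `seed` mean nothing to
# Optimization.jl and are forwarded to PyCMA unchanged.
sol = solve(prob, PyCMAOpt(); maxiters = 500, verbose = -9, seed = 42)

println(sol.original)  # the raw PyCMA result object attached to the solution
```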
docs/src/optimization_packages/scipy.md (+24 -24)
@@ -3,6 +3,7 @@
 [`SciPy`](https://scipy.org/) is a mature Python library that offers a rich family of optimization, root–finding and linear‐programming algorithms. `OptimizationSciPy.jl` gives access to these routines through the unified `Optimization.jl` interface just like any native Julia optimizer.

 !!! note
+
 `OptimizationSciPy.jl` relies on [`PythonCall`](https://github.com/cjdoris/PythonCall.jl). A minimal Python distribution containing SciPy will be installed automatically on first use, so no manual Python set-up is required.

 ## Installation: OptimizationSciPy.jl
@@ -20,37 +21,37 @@ Below is a catalogue of the solver families exposed by `OptimizationSciPy.jl` to
 - `ScipyTrustConstr()` – Trust-region method for non-linear constraints

 #### Hessian–Based / Trust-Region

-* `ScipyDogleg()`, `ScipyTrustNCG()`, `ScipyTrustKrylov()`, `ScipyTrustExact()` – Trust-region algorithms that optionally use or build Hessian information
+- `ScipyDogleg()`, `ScipyTrustNCG()`, `ScipyTrustKrylov()`, `ScipyTrustExact()` – Trust-region algorithms that optionally use or build Hessian information
 x0, nothing, lcons = [-1e-6], ucons = [1e-6]) # Small tolerance instead of exact equality
@@ -129,5 +130,4 @@ If SciPy raises an error it is re-thrown as a Julia `ErrorException` carrying the

 ## Contributing

-Bug reports and feature requests are welcome in the [Optimization.jl](https://github.com/SciML/Optimization.jl) issue tracker. Pull requests that improve either the Julia wrapper or the documentation are highly appreciated.
-
+Bug reports and feature requests are welcome in the [Optimization.jl](https://github.com/SciML/Optimization.jl) issue tracker. Pull requests that improve either the Julia wrapper or the documentation are highly appreciated.
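The stray constraint line captured in the middle hunk hints at a pattern the page explains: expressing an equality constraint as a tight `lcons`/`ucons` interval. Below is a hedged sketch of that pattern with one of the listed trust-region methods; the objective, constraint, and starting point are illustrative and are not taken from scipy.md.

```julia
using Optimization, OptimizationSciPy, ForwardDiff

rosenbrock(x, p) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2

# One constraint residual: x1^2 + x2^2 - 1, required to lie within [lcons, ucons].
cons(res, x, p) = (res .= [x[1]^2 + x[2]^2 - 1.0])

optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(); cons = cons)
prob = OptimizationProblem(optf, [0.5, 0.5], nothing,
    lcons = [-1e-6], ucons = [1e-6])  # small tolerance instead of exact equality

sol = solve(prob, ScipyTrustConstr())
```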
docs/src/tutorials/reusage_interface.md (+13 -13)
@@ -1,6 +1,5 @@
 # Optimization Problem Reusage and Caching Interface

-
 ## Reusing Optimization Caches with `reinit!`

 The `reinit!` function allows you to efficiently reuse an existing optimization cache with new parameters or initial values. This is particularly useful when solving similar optimization problems repeatedly with different parameter values, as it avoids the overhead of creating a new cache from scratch.
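Since this hunk only touches the page's opening lines, here is a hedged sketch of the cache-reuse workflow the quoted paragraph describes. It assumes the standard `init`/`solve!` cache interface and a `NelderMead` solver from `OptimizationOptimJL`; the keyword accepted by `reinit!` (here `p`) follows the paragraph's mention of new parameters and initial values, and the tutorial's own example may differ.

```julia
using Optimization, OptimizationOptimJL

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
prob = OptimizationProblem(rosenbrock, zeros(2), [1.0, 100.0])

# Build the solver cache once and solve.
cache = init(prob, NelderMead())
sol1 = solve!(cache)

# Reuse the same cache with new parameter values instead of rebuilding it.
reinit!(cache; p = [2.0, 50.0])
sol2 = solve!(cache)
```

Reusing the cache this way skips the setup work (problem wrapping and solver state allocation) on each subsequent solve, which is the overhead the paragraph above refers to.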
0 commit comments