if: github.event_name == 'pull_request' && github.repository == github.event.pull_request.head.repo.full_name # if this is a pull request build AND the pull request is NOT made from a fork
-          github.issues.createComment({ issue_number, owner, repo, body: 'Once the build has completed, you can preview your PR at this URL: https://juliasmoothoptimizers.github.io/JSOTutorials.jl/' });
+          github.issues.createComment({ issue_number, owner, repo, body: 'Once the build has completed, you can preview your PR at this URL: https://jso.dev/JSOTutorials.jl/' });
tutorials/generic-adnlpmodels/index.jmd (5 additions & 5 deletions)
@@ -10,10 +10,10 @@ using ADNLPModels, ForwardDiff, NLPModels, OptimizationProblems
One of the main strengths of Julia for scientific computing is its native usage of [arbitrary precision arithmetic](https://docs.julialang.org/en/v1/manual/integers-and-floating-point-numbers/#Arbitrary-Precision-Arithmetic).
The same can be exploited for optimization models and solvers.
-In the organization [JuliaSmoothOptimizers](https://juliasmoothoptimizers.github.io), the package [ADNLPModels.jl](https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl) provides automatic differentiation (AD)-based model implementations that conform to the NLPModels API.
-This package is modular in the sense that it implements a backend system allowing the user to use essentially any AD system available, see [ADNLPModels.jl/dev/backend/](https://juliasmoothoptimizers.github.io/ADNLPModels.jl/dev/backend/) for a tutorial.
+In the organization [JuliaSmoothOptimizers](https://jso.dev), the package [ADNLPModels.jl](https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl) provides automatic differentiation (AD)-based model implementations that conform to the NLPModels API.
+This package is modular in the sense that it implements a backend system allowing the user to use essentially any AD system available, see [ADNLPModels.jl/dev/backend/](https://jso.dev/ADNLPModels.jl/dev/backend/) for a tutorial.
-Note that most of the solvers available in [JuliaSmoothOptimizers](https://juliasmoothoptimizers.github.io) will accept generic types.
+Note that most of the solvers available in [JuliaSmoothOptimizers](https://jso.dev) will accept generic types.
For instance, it is possible to use the classical L-BFGS method implemented in [JSOSolvers.jl](https://github.com/JuliaSmoothOptimizers/JSOSolvers.jl/) in single precision.
```julia
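# The body of this block is collapsed in the diff view; the lines below are a
# minimal sketch of the single-precision idea, not the tutorial's original code.
# The test function and starting point are illustrative.
using ADNLPModels, JSOSolvers

# Build an AD-based model whose data is Float32; evaluations then stay in Float32.
nlp = ADNLPModel(x -> (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2, Float32[-1.2; 1.0])

stats = lbfgs(nlp)          # classical L-BFGS from JSOSolvers
typeof(stats.objective)     # Float32
```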
@@ -115,7 +115,7 @@ The same can be done for the other backends jacobian, hessian, etc.
## Multiprecision test problems
Designing a multi-precision algorithm is very often connected with benchmarking and test problems.
-The package [OptimizationProblems.jl](https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl) provides a collection of optimization problems in JuMP and ADNLPModels syntax, see [introduction to OptimizationProblems.jl tutorial](https://juliasmoothoptimizers.github.io/tutorials/introduction-to-optimizationproblems/).
+The package [OptimizationProblems.jl](https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl) provides a collection of optimization problems in JuMP and ADNLPModels syntax, see [introduction to OptimizationProblems.jl tutorial](https://jso.dev/tutorials/introduction-to-optimizationproblems/).
This package provides a `DataFrame` with all the information on the implemented problems.
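As a quick sketch of querying that `DataFrame` (it assumes the documented `OptimizationProblems.meta` table with columns such as `name`, `nvar`, and `ncon`; exact contents may vary by version):

```julia
using OptimizationProblems, DataFrames

meta = OptimizationProblems.meta                      # one row per problem
unconstrained = meta[meta.ncon .== 0, [:name, :nvar]] # e.g. keep unconstrained problems
first(unconstrained, 5)                               # peek at the first five
```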
@@ -145,4 +145,4 @@ grad!(nlp, x16, g) # returns a vector of Float16
We should pay additional attention when using multiple precisions: casting `x0`, for instance, from `Float64` to `Float16` introduces rounding errors.
Therefore, `x0` differs from `x16`, and so do the gradients evaluated at these points.
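A tiny sketch of the rounding effect (values are illustrative):

```julia
x0 = [0.1; 0.2]       # Float64 by default
x16 = Float16.(x0)    # casting rounds each entry
Float64.(x16) .- x0   # small but nonzero differences
```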
-Feel free to look at [OptimizationProblems.jl documentation](https://juliasmoothoptimizers.github.io/OptimizationProblems.jl/dev/) to learn more or the tutorials at [juliasmoothoptimizers.github.io](https://juliasmoothoptimizers.github.io).
+Feel free to look at [OptimizationProblems.jl documentation](https://jso.dev/OptimizationProblems.jl/dev/) to learn more or the tutorials at [juliasmoothoptimizers.github.io](https://jso.dev).
The `DataFrame` lists the matrices you can access, but they still need to be downloaded.
-Following the example above, we filtered two problems.
+Following the example above, we filtered two problems.
We now want to select the first one in the listing.
```julia
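# The body of this block is collapsed in the diff view; this is a minimal
# sketch of the filtering step, with illustrative (not original) thresholds.
using BundleAdjustmentModels, DataFrames

df = problems_df()                                     # metadata for all problems
filter_df = df[(df.nequ .≥ 50_000) .& (df.nvar .≤ 34_000), :]
```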
@@ -43,7 +43,7 @@ name = filter_df[1, :name] # select the name of the first problem
Now that the name is selected, we need to access the problem itself, and there are two solutions:
- You can download the problem's archive file;
-- You can automatically create a nonlinear least squares problem using [`NLPModels`](https://github.com/JuliaSmoothOptimizers/NLPModels.jl) from [JuliaSmoothOptimizers](https://juliasmoothoptimizers.github.io/).
+- You can automatically create a nonlinear least squares problem using [`NLPModels`](https://github.com/JuliaSmoothOptimizers/NLPModels.jl) from [JuliaSmoothOptimizers](https://jso.dev/).
## Get the problem archive file
@@ -53,21 +53,21 @@ This package uses Julia Artifacts to handle the problems archives so that
2. They are identified with a unique hash;
3. They can be deleted with a single command.
-The method [`fetch_ba_name`](https://juliasmoothoptimizers.github.io/BundleAdjustmentModels.jl/dev/reference/#BundleAdjustmentModels.fetch_ba_name-Tuple{AbstractString}) will automatically download the problem (if needed) and return its path.
+The method [`fetch_ba_name`](https://jso.dev/BundleAdjustmentModels.jl/dev/reference/#BundleAdjustmentModels.fetch_ba_name-Tuple{AbstractString}) will automatically download the problem (if needed) and return its path.
```julia
path = fetch_ba_name(name)
```
-It is also possible to directly download and get access to an entire group of problems using [`fetch_ba_group`](https://juliasmoothoptimizers.github.io/BundleAdjustmentModels.jl/dev/reference/#BundleAdjustmentModels.fetch_ba_group-Tuple{AbstractString}).
+It is also possible to directly download and get access to an entire group of problems using [`fetch_ba_group`](https://jso.dev/BundleAdjustmentModels.jl/dev/reference/#BundleAdjustmentModels.fetch_ba_group-Tuple{AbstractString}).
```julia
paths = fetch_ba_group("ladybug")
```
## Generate a nonlinear least squares model
-Now, it is possible to load the model using [`BundleAdjustmentModel`](https://juliasmoothoptimizers.github.io/BundleAdjustmentModels.jl/dev/reference/#BundleAdjustmentModels.BundleAdjustmentModel-Tuple{AbstractString})
+Now, it is possible to load the model using [`BundleAdjustmentModel`](https://jso.dev/BundleAdjustmentModels.jl/dev/reference/#BundleAdjustmentModels.BundleAdjustmentModel-Tuple{AbstractString})
```julia
df = problems_df()
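# sketch continuation: the tutorial's own filtering lines are collapsed in this diff
name = df[1, :name]                  # pick a problem name from the table
model = BundleAdjustmentModel(name)  # instantiates the model, downloading if needed
```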
@@ -84,21 +84,21 @@ model = BundleAdjustmentModel("problem-49-7776-pre");
The function `BundleAdjustmentModel` will instantiate the model and automatically download it if needed.
The resulting structure is an instance of `AbstractNLPModel`.
-So, it is possible to access its API as any other [`NLPModel`](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/).
+So, it is possible to access its API as any other [`NLPModel`](https://jso.dev/NLPModels.jl/dev/).
```julia
using NLPModels
```
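For instance, the usual accessors apply (a short sketch; `get_x0`, `get_nvar`, and `obj` are standard NLPModels API functions):

```julia
x = get_x0(model)     # starting point, same as `model.meta.x0`
n = get_nvar(model)   # number of variables
fx = obj(model, x)    # objective value, here ½‖F(x)‖²
```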
-Using [`residual`](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/api/#NLPModels.residual), it is possible to compute the residual of the model
+Using [`residual`](https://jso.dev/NLPModels.jl/dev/api/#NLPModels.residual), it is possible to compute the residual of the model
```julia
model = BundleAdjustmentModel("problem-49-7776-pre.txt.bz2")
x = get_x0(model) # or `model.meta.x0`
Fx = residual(model, x)
```
-or use the in-place method [`residual!`](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/api/#NLPModels.residual!)
+or use the in-place method [`residual!`](https://jso.dev/NLPModels.jl/dev/api/#NLPModels.residual!)
```julia
model = BundleAdjustmentModel("problem-49-7776-pre.txt.bz2")
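# continuation sketch: the remaining lines are collapsed in this diff view
x = get_x0(model)
Fx = similar(x, nls_meta(model).nequ)   # preallocate the residual vector
residual!(model, x, Fx)
```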
@@ -110,7 +110,7 @@ residual!(model, x, Fx);
You can also access the [`LinearOperator`](https://github.com/JuliaSmoothOptimizers/LinearOperators.jl) of the Jacobian of the residual of the model, which is computed analytically (as opposed to by automatic differentiation).
-You need to call [`jac_structure_residual!`](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/api/#NLPModels.jac_structure_residual!) at least once before calling [`jac_op_residual!`](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/api/#NLPModels.jac_op_residual!).
+You need to call [`jac_structure_residual!`](https://jso.dev/NLPModels.jl/dev/api/#NLPModels.jac_structure_residual!) at least once before calling [`jac_op_residual!`](https://jso.dev/NLPModels.jl/dev/api/#NLPModels.jac_op_residual!).
-You need to call [`jac_coord_residual!`](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/api/#NLPModels.jac_coord_residual!) to update it to the current point.
+You need to call [`jac_coord_residual!`](https://jso.dev/NLPModels.jl/dev/api/#NLPModels.jac_coord_residual!) to update it to the current point.
```julia
model = BundleAdjustmentModel("problem-49-7776")
@@ -130,7 +130,7 @@ vals = similar(x, nnzj)
jac_coord_residual!(model, x, vals)
```
-Finally you can use [`jac_op_residual!`](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/api/#NLPModels.jac_op_residual!):
+Finally you can use [`jac_op_residual!`](https://jso.dev/NLPModels.jl/dev/api/#NLPModels.jac_op_residual!):
All solvers are based on [NLPModels.jl](https://github.com/JuliaSmoothOptimizers/NLPModels.jl) and [SolverCore.jl](https://github.com/JuliaSmoothOptimizers/SolverCore.jl).
@@ -115,12 +115,12 @@ The following table provides the correspondence between the solvers and the solv
It is also possible to pre-allocate the output structure `stats` and call `solve!(solver, nlp, stats)`.
```julia
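# The body of this block is collapsed in the diff view; below is a minimal
# sketch assuming an unconstrained `nlp` and the L-BFGS solver.
# `LBFGSSolver` and `GenericExecutionStats` come from JSOSolvers / SolverCore.
using ADNLPModels, JSOSolvers, SolverCore

nlp = ADNLPModel(x -> (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2, [-1.2; 1.0])
solver = LBFGSSolver(nlp)           # pre-allocate the solver workspace
stats = GenericExecutionStats(nlp)  # pre-allocate the output structure
solve!(solver, nlp, stats)
```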
@@ -133,7 +133,7 @@ solve!(solver, nlp, stats)
## Callback
-All the solvers have a callback mechanism called at each iteration, see also the [Using callbacks tutorial](https://juliasmoothoptimizers.github.io/tutorials/using-callbacks/).
+All the solvers have a callback mechanism called at each iteration, see also the [Using callbacks tutorial](https://jso.dev/tutorials/using-callbacks/).
The expected signature of the callback is `callback(nlp, solver, stats)`, and its output is ignored.
Changing any of the input arguments will affect the subsequent iterations.
In particular, setting `stats.status = :user` will stop the algorithm.
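As a brief sketch of this mechanism (the problem and the iteration limit are illustrative):

```julia
using ADNLPModels, JSOSolvers

nlp = ADNLPModel(x -> (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2, [-1.2; 1.0])

# stop the solver by hand after 5 iterations; the callback's return value is ignored
cb(nlp, solver, stats) = stats.iter ≥ 5 && (stats.status = :user)

stats = lbfgs(nlp, callback = cb)
stats.status   # :user
```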
-[LinearOperators.jl](https://juliasmoothoptimizers.github.io/LinearOperators.jl/stable) is a package for matrix-like operators. Linear operators are defined by how they act on a vector, which is useful in a variety of situations where you don't want to materialize the matrix.
+[LinearOperators.jl](https://jso.dev/LinearOperators.jl/stable) is a package for matrix-like operators. Linear operators are defined by how they act on a vector, which is useful in a variety of situations where you don't want to materialize the matrix.
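A small sketch of the idea (wrapping a dense matrix is just the simplest way to build an operator):

```julia
using LinearOperators

A = rand(5, 5)
op = LinearOperator(A)   # defined by its action v -> A * v
v = rand(5)

op * v ≈ A * v           # same product as the matrix
B = 2 * op + op'         # operators compose lazily; no new matrix is formed
B * v
```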