Commit 577dd2c (1 parent: 51c3d83)

Update URL to use jso.dev

File tree: 13 files changed, +60 −63 lines
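The change set below is a mechanical domain swap: every `https://juliasmoothoptimizers.github.io/...` link becomes `https://jso.dev/...` with the path preserved. As an illustration only (this script is not part of the commit), a migration of this kind could be done with a small rewrite pass:

```python
import re

# Match the old GitHub Pages domain, but only when followed by a path
# separator, a closing delimiter, whitespace, or end of string, so longer
# hostnames are left untouched.
OLD_DOMAIN = re.compile(r"https://juliasmoothoptimizers\.github\.io(?=[/>\)\s\"']|$)")

def migrate(text: str) -> str:
    """Rewrite old GitHub Pages URLs to the jso.dev domain, keeping paths."""
    return OLD_DOMAIN.sub("https://jso.dev", text)
```

Applied to each file in the repository, this reproduces the kind of one-line substitutions seen in the hunks below.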

.github/PULL_REQUEST_TEMPLATE.md
Lines changed: 1 addition & 1 deletion

````diff
@@ -14,5 +14,5 @@
 - [ ] The preview has your tutorial listed.
 - [ ] The preview looks like what you want.
 
-[preview]: https://juliasmoothoptimizers.github.io/JSOTutorials.jl/
+[preview]: https://jso.dev/JSOTutorials.jl/
 [Grammarly]: https://www.grammarly.com
````

.github/workflows/Deploy-internal.yml
Lines changed: 4 additions & 4 deletions

````diff
@@ -43,7 +43,7 @@ jobs:
       - name: Install Julia
         uses: julia-actions/setup-julia@v1
         with:
-          version: 1
+          version: 1
       - name: Build tutorial
         run: |
           bash .github/workflows/build_tutorial.sh ${{ matrix.tutorial }}
@@ -73,7 +73,7 @@ jobs:
           cd JSOTutorials.jl-gh-pages
           echo "## JSOTutorials preview page
 
-          For the complete list of tutorials, go to <https://juliasmoothoptimizers.github.io/tutorials/>.
+          For the complete list of tutorials, go to <https://jso.dev/tutorials/>.
 
           " > index.md
           for file in **/*.md; do NAME=$(echo $file | cut -d/ -f 1); TITLE=$(grep "title:" $file | cut -d\" -f2); echo "- [$TITLE]($NAME/)"; done >> index.md
@@ -88,11 +88,11 @@ jobs:
     runs-on: ubuntu-latest
     if: ${{ needs.generate-job-strategy-matrix.outputs.job-strategy-matrix != '[]' }}
     steps:
-      - name: 'Comment PR'
+      - name: "Comment PR"
        uses: actions/[email protected]
        if: github.event_name == 'pull_request' && github.repository == github.event.pull_request.head.repo.full_name # if this is a pull request build AND the pull request is NOT made from a fork
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
          script: |
            const { issue: { number: issue_number }, repo: { owner, repo } } = context;
-            github.issues.createComment({ issue_number, owner, repo, body: 'Once the build has completed, you can preview your PR at this URL: https://juliasmoothoptimizers.github.io/JSOTutorials.jl/' });
+            github.issues.createComment({ issue_number, owner, repo, body: 'Once the build has completed, you can preview your PR at this URL: https://jso.dev/JSOTutorials.jl/' });
````
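The workflow's shell one-liner builds the preview index by scraping each page's `title:` front-matter field (`NAME` from the directory, `TITLE` from the quoted string after `title:`). A Python sketch of that logic, with a hypothetical directory layout for illustration, looks like:

```python
import re
from pathlib import Path

def build_index(pages_dir: Path) -> str:
    """Mimic the workflow loop: emit one '- [Title](name/)' entry per page."""
    entries = []
    for md in sorted(pages_dir.glob("*/*.md")):
        # NAME=$(echo $file | cut -d/ -f 1): first path component
        name = md.relative_to(pages_dir).parts[0]
        # TITLE=$(grep "title:" $file | cut -d\" -f2): quoted title value
        m = re.search(r'title:\s*"([^"]*)"', md.read_text())
        if m:
            entries.append(f"- [{m.group(1)}]({name}/)")
    return "\n".join(entries)
```

The shell version appends these entries to `index.md` after the preview-page preamble.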

README.md
Lines changed: 1 addition & 1 deletion

````diff
@@ -1,7 +1,7 @@
 # JSO Tutorials
 
 Curated repository of tutorials regarding all things JSO.
-See the renderer tutorials in <https://juliasmoothoptimizers.github.io>.
+See the renderer tutorials in <https://jso.dev>.
 
 ## Development
 
````

src/JSOTutorials.jl
Lines changed: 19 additions & 16 deletions

````diff
@@ -9,7 +9,7 @@ using Weave, Pkg, IJulia, InteractiveUtils, Markdown, YAML
 repo_directory = joinpath(@__DIR__, "..")
 # cssfile = joinpath(@__DIR__, "..", "templates", "skeleton_css.css")
 # latexfile = joinpath(@__DIR__, "..", "templates", "julia_tex.tpl")
-default_builds = (:github, )
+default_builds = (:github,)
 
 function weave_file(folder, file, build_list = default_builds)
   target = joinpath(repo_directory, "tutorials", folder, file)
@@ -54,7 +54,7 @@ function weave_file(folder, file, build_list = default_builds)
     dir = joinpath(repo_directory, "markdown", basename(folder))
     mkpath(dir)
     weave(target, doctype = "github", out_path = dir, args = args)
-    add_package_info(joinpath(dir, file[1:end-4] * ".md"))
+    add_package_info(joinpath(dir, file[1:(end - 4)] * ".md"))
   end
   if :notebook ∈ build_list
     println("Building Notebook")
@@ -89,7 +89,7 @@ end
 function package_information()
   proj = sprint(io -> Pkg.status(io = io))
   re_str = r"\[[0-f]+\]\s+(.*) v(.*)"
-  pkg_info = Dict{String,String}()
+  pkg_info = Dict{String, String}()
   for line in split(proj, "\n")
     m = match(re_str, line)
     if m !== nothing
@@ -108,7 +108,7 @@ function badge(name, version)
 
   badge_img = "![$name $version](https://img.shields.io/badge/$name-$version-$color?style=flat-square&labelColor=$lbl_color)"
   if name in jso_pkgs
-    link = "https://juliasmoothoptimizers.github.io/$name.jl/stable/"
+    link = "https://jso.dev/$name.jl/stable/"
     "[$badge_img]($link)"
   else
     badge_img
@@ -120,11 +120,11 @@ function add_package_info(filename)
   j = findall(lines .== "---")[2]
   pkg_info = package_information()
   out = [
-    lines[1:j];
-    "";
-    [badge(pkg, ver) for (pkg, ver) in pkg_info];
-    "";
-    lines[j+1:end]
+    lines[1:j]
+    ""
+    [badge(pkg, ver) for (pkg, ver) in pkg_info]
+    ""
+    lines[(j + 1):end]
   ]
   open(filename, "w") do io
     print(io, join(out, "\n"))
@@ -203,13 +203,16 @@ function parse_markdown_into_franklin(infile, outfile)
   end
 
   open(outfile, "w") do io
-    println(io, """
-    @def title = "$(yaml["title"])"
-    @def showall = true
-    @def tags = $(yaml["tags"])
-
-    \\preamble{$(yaml["author"])}
-    """)
+    println(
+      io,
+      """
+      @def title = "$(yaml["title"])"
+      @def showall = true
+      @def tags = $(yaml["tags"])
+
+      \\preamble{$(yaml["author"])}
+      """,
+    )
 
   code_fence = startswith("```")
   inside_fenced_code = false
````
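`package_information` in the hunks above parses `Pkg.status` output with the regex `r"\[[0-f]+\]\s+(.*) v(.*)"`: a hex UUID prefix in brackets, then a package name and a version. A Python sketch of the same parse, using made-up sample output purely for illustration, behaves like this:

```python
import re

# Same shape as the Julia regex: "[<hex uuid>]  Name vX.Y.Z"
PKG_RE = re.compile(r"\[[0-f]+\]\s+(.*) v(.*)")

def package_information(status_output: str) -> dict[str, str]:
    """Collect {package name: version} from Pkg.status-style lines."""
    info = {}
    for line in status_output.splitlines():
        m = PKG_RE.search(line)
        if m:
            info[m.group(1)] = m.group(2)
    return info
```

Each entry is then turned into a shields.io badge by `badge`, with JSO packages additionally linked to their documentation pages.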

tutorials/generic-adnlpmodels/index.jmd
Lines changed: 5 additions & 5 deletions

````diff
@@ -10,10 +10,10 @@ using ADNLPModels, ForwardDiff, NLPModels, OptimizationProblems
 
 One of the main strengths of Julia for scientific computing is its native usage of [arbitrary precision arithmetic](https://docs.julialang.org/en/v1/manual/integers-and-floating-point-numbers/#Arbitrary-Precision-Arithmetic).
 The same can be exploited for optimization models and solvers.
-In the organization [JuliaSmoothOptimizers](https://juliasmoothoptimizers.github.io), the package [ADNLPModels.jl](https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl) provides automatic differentiation (AD)-based model implementations that conform to the NLPModels API.
-This package is modular in the sense that it implements a backend system allowing the user to use essentially any AD system available, see [ADNLPModels.jl/dev/backend/](https://juliasmoothoptimizers.github.io/ADNLPModels.jl/dev/backend/) for a tutorial.
+In the organization [JuliaSmoothOptimizers](https://jso.dev), the package [ADNLPModels.jl](https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl) provides automatic differentiation (AD)-based model implementations that conform to the NLPModels API.
+This package is modular in the sense that it implements a backend system allowing the user to use essentially any AD system available, see [ADNLPModels.jl/dev/backend/](https://jso.dev/ADNLPModels.jl/dev/backend/) for a tutorial.
 
-Note that most of the solvers available in [JuliaSmoothOptimizers](https://juliasmoothoptimizers.github.io) will accept generic types.
+Note that most of the solvers available in [JuliaSmoothOptimizers](https://jso.dev) will accept generic types.
 For instance, it is possible to use the classical L-BFGS method implemented in [JSOSolvers.jl](https://github.com/JuliaSmoothOptimizers/JSOSolvers.jl/) in single precision.
 
 ```julia
@@ -115,7 +115,7 @@ The same can be done for the other backends jacobian, hessian, etc.
 ## Multiprecision test problems
 
 Designing a multi-precision algorithm is very often connected with benchmarking and test problems.
-The package [OptimizationProblems.jl](https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl) provides a collection of optimization problems in JuMP and ADNLPModels syntax, see [introduction to OptimizationProblems.jl tutorial](https://juliasmoothoptimizers.github.io/tutorials/introduction-to-optimizationproblems/).
+The package [OptimizationProblems.jl](https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl) provides a collection of optimization problems in JuMP and ADNLPModels syntax, see [introduction to OptimizationProblems.jl tutorial](https://jso.dev/tutorials/introduction-to-optimizationproblems/).
 
 This package provides a `DataFrame` with all the information on the implemented problems.
 
@@ -145,4 +145,4 @@ grad!(nlp, x16, g) # returns a vector of Float16
 We should pay additional attention when using multiple precisions as casting, for instance `x0`, from `Float64` into `Float16` implies that rounding errors occur.
 Therefore, `x0` is different than `x16`, and the gradients evaluated for these values too.
 
-Feel free to look at [OptimizationProblems.jl documentation](https://juliasmoothoptimizers.github.io/OptimizationProblems.jl/dev/) to learn more or the tutorials at [juliasmoothoptimizers.github.io](https://juliasmoothoptimizers.github.io).
+Feel free to look at [OptimizationProblems.jl documentation](https://jso.dev/OptimizationProblems.jl/dev/) to learn more or the tutorials at [juliasmoothoptimizers.github.io](https://jso.dev).
````

tutorials/introduction-to-bundleadjustmentmodels/index.jmd
Lines changed: 11 additions & 11 deletions

````diff
@@ -33,7 +33,7 @@ filter_df = df[ ( df.nequ .≥ 50000 ) .& ( df.nvar .≤ 34000 ), :]
 
 The `Dataframe` is listing the matrices that you can have access to, but they still need to be downloaded.
 
-Following the example above, we filtered two problems.
+Following the example above, we filtered two problems.
 What we want to do now is to select the first one in the listing.
 
 ```julia
@@ -43,7 +43,7 @@ name = filter_df[1, :name] # select the name of the first problem
 Now that the name is selected, we need to access the problem itself, and there are 2 solutions:
 
 - You can download the problem's archive file;
-- You can automatically create a nonlinear least squares problem using [`NLPModels`](https://github.com/JuliaSmoothOptimizers/NLPModels.jl) from [JuliaSmoothOptimizers](https://juliasmoothoptimizers.github.io/).
+- You can automatically create a nonlinear least squares problem using [`NLPModels`](https://github.com/JuliaSmoothOptimizers/NLPModels.jl) from [JuliaSmoothOptimizers](https://jso.dev/).
 
 ## Get the problem archive file
 
@@ -53,21 +53,21 @@ This package uses Julia Artifacts to handle the problems archives so that
 2. They are identified with a unique hash;
 3. They can be deleted with a single command line.
 
-The method [`fetch_ba_name`](https://juliasmoothoptimizers.github.io/BundleAdjustmentModels.jl/dev/reference/#BundleAdjustmentModels.fetch_ba_name-Tuple{AbstractString}) will automatically download the problem (if needed) and return its path.
+The method [`fetch_ba_name`](https://jso.dev/BundleAdjustmentModels.jl/dev/reference/#BundleAdjustmentModels.fetch_ba_name-Tuple{AbstractString}) will automatically download the problem (if needed) and return its path.
 
 ```julia
 path = fetch_ba_name(name)
 ```
 
-It is also possible to directly download and get access to an entire group of problems using [`fetch_ba_group`](https://juliasmoothoptimizers.github.io/BundleAdjustmentModels.jl/dev/reference/#BundleAdjustmentModels.fetch_ba_group-Tuple{AbstractString}).
+It is also possible to directly download and get access to an entire group of problems using [`fetch_ba_group`](https://jso.dev/BundleAdjustmentModels.jl/dev/reference/#BundleAdjustmentModels.fetch_ba_group-Tuple{AbstractString}).
 
 ```julia
 paths = fetch_ba_group("ladybug")
 ```
 
 ## Generate a nonlinear least squares model
 
-Now, it is possible to load the model using [`BundleAdjustmentModel`](https://juliasmoothoptimizers.github.io/BundleAdjustmentModels.jl/dev/reference/#BundleAdjustmentModels.BundleAdjustmentModel-Tuple{AbstractString})
+Now, it is possible to load the model using [`BundleAdjustmentModel`](https://jso.dev/BundleAdjustmentModels.jl/dev/reference/#BundleAdjustmentModels.BundleAdjustmentModel-Tuple{AbstractString})
 
 ```julia
 df = problems_df()
@@ -84,21 +84,21 @@ model = BundleAdjustmentModel("problem-49-7776-pre");
 
 The function `BundleAdjustmentModel` will instantiate the model and automatically download it if needed.
 The resulting structure is an instance of `AbstractNLPModel`.
-So, it is possible to access its API as any other [`NLPModel`](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/).
+So, it is possible to access its API as any other [`NLPModel`](https://jso.dev/NLPModels.jl/dev/).
 
 ```julia
 using NLPModels
 ```
 
-Using [`residual`](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/api/#NLPModels.residual), it is possible to compute the residual of the model
+Using [`residual`](https://jso.dev/NLPModels.jl/dev/api/#NLPModels.residual), it is possible to compute the residual of the model
 
 ```julia
 model = BundleAdjustmentModel("problem-49-7776-pre.txt.bz2")
 x = get_x0(model) # or `model.meta.x0`
 Fx = residual(model, x)
 ```
 
-or use the in-place method [`residual!`](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/api/#NLPModels.residual!)
+or use the in-place method [`residual!`](https://jso.dev/NLPModels.jl/dev/api/#NLPModels.residual!)
 
 ```julia
 model = BundleAdjustmentModel("problem-49-7776-pre.txt.bz2")
@@ -110,7 +110,7 @@ residual!(model, x, Fx);
 
 You can also have access to the [`LinearOperator`](https://github.com/JuliaSmoothOptimizers/LinearOperators.jl) of the Jacobian matrix of the residual of the model which is calculated by hand (in contradiction to automatic differentiation).
 
-You need to call [`jac_structure_residual!`](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/api/#NLPModels.jac_structure_residual!) at least once before calling [`jac_op_residual!`](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/api/#NLPModels.jac_op_residual!).
+You need to call [`jac_structure_residual!`](https://jso.dev/NLPModels.jl/dev/api/#NLPModels.jac_structure_residual!) at least once before calling [`jac_op_residual!`](https://jso.dev/NLPModels.jl/dev/api/#NLPModels.jac_op_residual!).
 
 ```julia
 model = BundleAdjustmentModel("problem-49-7776")
@@ -121,7 +121,7 @@ cols = Vector{Int}(undef, nnzj)
 jac_structure_residual!(model, rows, cols);
 ```
 
-You need to call [`jac_coord_residual!`](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/api/#NLPModels.jac_coord_residual!) to update it to the current point.
+You need to call [`jac_coord_residual!`](https://jso.dev/NLPModels.jl/dev/api/#NLPModels.jac_coord_residual!) to update it to the current point.
 
 ```julia
 model = BundleAdjustmentModel("problem-49-7776")
@@ -130,7 +130,7 @@ vals = similar(x, nnzj)
 jac_coord_residual!(model, x, vals)
 ```
 
-Finally you can use [`jac_op_residual!`](https://juliasmoothoptimizers.github.io/NLPModels.jl/dev/api/#NLPModels.jac_op_residual!):
+Finally you can use [`jac_op_residual!`](https://jso.dev/NLPModels.jl/dev/api/#NLPModels.jac_op_residual!):
 
 ```julia
 model = BundleAdjustmentModel("problem-49-7776")
````

tutorials/introduction-to-cutest/index.jmd
Lines changed: 2 additions & 4 deletions

````diff
@@ -21,7 +21,7 @@ CUTEst can be accessed in two ways.
 
 NLPModels defines an abstract interface to access the objective, constraints,
 derivatives, etc. of the problem. A
-[reference guide](https://juliasmoothoptimizers.github.io/NLPModels.jl/latest/api/)
+[reference guide](https://jso.dev/NLPModels.jl/latest/api/)
 is available to check what you need.
 
 Once CUTEst has been installed, open a problem with
@@ -59,7 +59,7 @@ println("Hx = $( hess(nlp, nlp.meta.x0) )")
 ```
 
 Remember to check the
-[API](https://juliasmoothoptimizers.github.io/NLPModels.jl/latest/api/)
+[API](https://jso.dev/NLPModels.jl/latest/api/)
 in case of doubts about these functions.
 
 Notice how `hess` returns a symmetric matrix.
@@ -228,5 +228,3 @@ length(problems)
 problems = CUTEst.select(objtype="quadratic", contype="linear")
 length(problems)
 ```
-
-
````
tutorials/introduction-to-jsosolvers/index.jmd
Lines changed: 8 additions & 8 deletions

````diff
@@ -9,7 +9,7 @@ author: "Tangi Migot"
 # JSOSolvers.jl Tutorial
 
 This package provides optimization solvers curated by the
-[JuliaSmoothOptimizers](https://juliasmoothoptimizers.github.io)
+[JuliaSmoothOptimizers](https://jso.dev)
 organization.
 All solvers are based on [NLPModels.jl](https://github.com/JuliaSmoothOptimizers/NLPModels.jl) and [SolverCore.jl](https://github.com/JuliaSmoothOptimizers/SolverCore.jl).
 
@@ -115,12 +115,12 @@ The following table provides the correspondance between the solvers and the solv
 
 | Algorithm | Solver structure |
 | ------------------- | ---------------- |
-| [lbfgs](https://juliasmoothoptimizers.github.io/JSOSolvers.jl/stable/reference/#JSOSolvers.lbfgs-Union{Tuple{NLPModels.AbstractNLPModel},%20Tuple{V}}%20where%20V) | LBFGSSolver |
-| [R2](https://juliasmoothoptimizers.github.io/JSOSolvers.jl/stable/reference/#JSOSolvers.R2-Union{Tuple{NLPModels.AbstractNLPModel{T,%20V}},%20Tuple{V},%20Tuple{T}}%20where%20{T,%20V}) | R2Solver |
-| [tron](https://juliasmoothoptimizers.github.io/JSOSolvers.jl/stable/reference/#JSOSolvers.tron-Union{Tuple{V},%20Tuple{Val{:Newton},%20NLPModels.AbstractNLPModel}}%20where%20V) | TronSolver |
-| [trunk](https://juliasmoothoptimizers.github.io/JSOSolvers.jl/stable/reference/#JSOSolvers.trunk-Union{Tuple{V},%20Tuple{Val{:Newton},%20NLPModels.AbstractNLPModel}}%20where%20V) | TrunkSolver |
-| [tron (nls-variant)](https://juliasmoothoptimizers.github.io/JSOSolvers.jl/stable/reference/#JSOSolvers.tron-Union{Tuple{V},%20Tuple{Val{:GaussNewton},%20NLPModels.AbstractNLSModel}}%20where%20V) | TronSolverNLS |
-| [trunk (nls-variant)](https://juliasmoothoptimizers.github.io/JSOSolvers.jl/stable/reference/#JSOSolvers.trunk-Union{Tuple{V},%20Tuple{Val{:GaussNewton},%20NLPModels.AbstractNLSModel}}%20where%20V) | TrunkSolverNLS |
+| [lbfgs](https://jso.dev/JSOSolvers.jl/stable/reference/#JSOSolvers.lbfgs-Union{Tuple{NLPModels.AbstractNLPModel},%20Tuple{V}}%20where%20V) | LBFGSSolver |
+| [R2](https://jso.dev/JSOSolvers.jl/stable/reference/#JSOSolvers.R2-Union{Tuple{NLPModels.AbstractNLPModel{T,%20V}},%20Tuple{V},%20Tuple{T}}%20where%20{T,%20V}) | R2Solver |
+| [tron](https://jso.dev/JSOSolvers.jl/stable/reference/#JSOSolvers.tron-Union{Tuple{V},%20Tuple{Val{:Newton},%20NLPModels.AbstractNLPModel}}%20where%20V) | TronSolver |
+| [trunk](https://jso.dev/JSOSolvers.jl/stable/reference/#JSOSolvers.trunk-Union{Tuple{V},%20Tuple{Val{:Newton},%20NLPModels.AbstractNLPModel}}%20where%20V) | TrunkSolver |
+| [tron (nls-variant)](https://jso.dev/JSOSolvers.jl/stable/reference/#JSOSolvers.tron-Union{Tuple{V},%20Tuple{Val{:GaussNewton},%20NLPModels.AbstractNLSModel}}%20where%20V) | TronSolverNLS |
+| [trunk (nls-variant)](https://jso.dev/JSOSolvers.jl/stable/reference/#JSOSolvers.trunk-Union{Tuple{V},%20Tuple{Val{:GaussNewton},%20NLPModels.AbstractNLSModel}}%20where%20V) | TrunkSolverNLS |
 
 It is also possible to pre-allocate the output structure `stats` and call `solve!(solver, nlp, stats)`.
 ```julia
@@ -133,7 +133,7 @@ solve!(solver, nlp, stats)
 
 ## Callback
 
-All the solvers have a callback mechanism called at each iteration, see also the [Using callbacks tutorial](https://juliasmoothoptimizers.github.io/tutorials/using-callbacks/).
+All the solvers have a callback mechanism called at each iteration, see also the [Using callbacks tutorial](https://jso.dev/tutorials/using-callbacks/).
 The expected signature of the callback is `callback(nlp, solver, stats)`, and its output is ignored.
 Changing any of the input arguments will affect the subsequent iterations.
 In particular, setting `stats.status = :user` will stop the algorithm.
````

tutorials/introduction-to-linear-operators/index.jmd
Lines changed: 1 addition & 1 deletion

````diff
@@ -4,7 +4,7 @@ tags: ["linear-algebra", "linear-operators"]
 author: "Geoffroy Leconte and Dominique Orban"
 ---
 
-[LinearOperators.jl](https://juliasmoothoptimizers.github.io/LinearOperators.jl/stable) is a package for matrix-like operators. Linear operators are defined by how they act on a vector, which is useful in a variety of situations where you don't want to materialize the matrix.
+[LinearOperators.jl](https://jso.dev/LinearOperators.jl/stable) is a package for matrix-like operators. Linear operators are defined by how they act on a vector, which is useful in a variety of situations where you don't want to materialize the matrix.
 
 \toc
 
````
tutorials/introduction-to-pdenlpmodels/index.jmd
Lines changed: 2 additions & 2 deletions

````diff
@@ -68,7 +68,7 @@ ncon = Gridap.FESpaces.num_free_dofs(Ycon)
 x0 = zeros(npde + ncon);
 ```
 
-Overall, we built a GridapPDENLPModel, which implements the [NLPModel](https://juliasmoothoptimizers.github.io/NLPModels.jl/stable/) API.
+Overall, we built a GridapPDENLPModel, which implements the [NLPModel](https://jso.dev/NLPModels.jl/stable/) API.
 
 ```julia
 nlp = GridapPDENLPModel(x0, f, trian, Ypde, Ycon, Xpde, Xcon, op, name = "Control elastic membrane")
@@ -128,7 +128,7 @@ Reinitialize the counters before re-solving.
 reset!(nlp);
 ```
 
-Most JSO-compliant solvers are using logger for printing iteration information.
+Most JSO-compliant solvers are using logger for printing iteration information.
 `NullLogger` avoids printing iteration information.
 
 ```julia
````
