Commit b5b61f9

Use SparseConnectivityTracer.jl
1 parent 15d897d commit b5b61f9

20 files changed (+94 −448 lines)

Project.toml

Lines changed: 4 additions & 0 deletions

@@ -3,16 +3,20 @@ uuid = "54578032-b7ea-4c30-94aa-7cbd1cce6c9a"
 version = "0.7.2"
 
 [deps]
+ADTypes = "47edcb42-4c32-4615-8424-f2b9edc5f35b"
 ColPack = "ffa27691-3a59-46ab-a8d4-551f45b8d401"
 ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
 LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
 NLPModels = "a4795742-8479-5a88-8948-cc11e1c8c1a6"
 Requires = "ae029012-a4dd-5104-9daa-d747884805df"
 ReverseDiff = "37e2e3b7-166d-5795-8a7a-e32c996b4267"
 SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"
+SparseConnectivityTracer = "9f842d2f-2579-4b1d-911e-f412cf18a3f5"
 
 [compat]
+ADTypes = "1.2.1"
 ColPack = "0.4"
+SparseConnectivityTracer = "0.5"
 ForwardDiff = "0.9.0, 0.10.0"
 NLPModels = "0.18, 0.19, 0.20, 0.21"
 Requires = "1"

README.md

Lines changed: 0 additions & 2 deletions

@@ -96,8 +96,6 @@ The following AD packages are supported:
 and as optional dependencies (you must load the package before):
 
 - `Enzyme.jl`;
-- `SparseDiffTools.jl`;
-- `Symbolics.jl`;
 - `Zygote.jl`.
 
 ## Bug reports and discussions

docs/Project.toml

Lines changed: 0 additions & 4 deletions

@@ -10,8 +10,6 @@ OptimizationProblems = "5049e819-d29b-5fba-b941-0eee7e64c1c6"
 Percival = "01435c0c-c90d-11e9-3788-63660f8fbccc"
 Plots = "91a5bcdd-55d7-5caf-9e0b-520d859cae80"
 SolverBenchmark = "581a75fa-a23a-52d0-a590-d6201de2218a"
-SymbolicUtils = "d1185830-fcd6-423d-90d6-eec64667417b"
-Symbolics = "0c5d862f-8b57-4792-8d23-62f2024744c7"
 Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"
 
 [compat]
@@ -24,6 +22,4 @@ OptimizationProblems = "0.7"
 Percival = "0.7"
 Plots = "1"
 SolverBenchmark = "0.5"
-SymbolicUtils = "=1.5.1"
-Symbolics = "5.3"
 Zygote = "0.6.62"

docs/src/backend.md

Lines changed: 10 additions & 10 deletions

@@ -7,14 +7,14 @@ The backend information is in a structure [`ADNLPModels.ADModelBackend`](@ref) i
 
 The functions used internally to define the NLPModel API and the possible backends are defined in the following table:
 
-| Functions | FowardDiff backends | ReverseDiff backends | Zygote backends | Enzyme backend | SparseDiffTools backend | Symbolics backend |
-| ----------- | ----------- | ----------- | ----------- | ----------- | ----------- | ----------- |
-| `gradient` and `gradient!` | `ForwardDiffADGradient`/`GenericForwardDiffADGradient` | `ReverseDiffADGradient`/`GenericReverseDiffADGradient` | `ZygoteADGradient` | `EnzymeADGradient` | -- | -- |
-| `jacobian` | `ForwardDiffADJacobian` | `ReverseDiffADJacobian` | `ZygoteADJacobian` | -- | `SDTSparseADJacobian` | `SparseADJacobian`/`SparseSymbolicsADJacobian` |
-| `hessian` | `ForwardDiffADHessian` | `ReverseDiffADHessian` | `ZygoteADHessian` | -- | -- | `SparseADHessian`/`SparseSymbolicsADHessian` |
-| `Jprod` | `ForwardDiffADJprod`/`GenericForwardDiffADJprod` | `ReverseDiffADJprod`/`GenericReverseDiffADJprod` | `ZygoteADJprod` | -- | `SDTForwardDiffADJprod` | -- |
-| `Jtprod` | `ForwardDiffADJtprod`/`GenericForwardDiffADJtprod` | `ReverseDiffADJtprod`/`GenericReverseDiffADJtprod` | `ZygoteADJtprod` | -- | -- | -- |
-| `Hvprod` | `ForwardDiffADHvprod`/`GenericForwardDiffADHvprod` | `ReverseDiffADHvprod`/`GenericReverseDiffADHvprod` | -- | -- | `SDTForwardDiffADHvprod` | -- |
+| Functions | ForwardDiff backends | ReverseDiff backends | Zygote backends | Enzyme backend | Sparse backend |
+| ----------- | ----------- | ----------- | ----------- | ----------- | ----------- |
+| `gradient` and `gradient!` | `ForwardDiffADGradient`/`GenericForwardDiffADGradient` | `ReverseDiffADGradient`/`GenericReverseDiffADGradient` | `ZygoteADGradient` | `EnzymeADGradient` | -- |
+| `jacobian` | `ForwardDiffADJacobian` | `ReverseDiffADJacobian` | `ZygoteADJacobian` | -- | `SparseADJacobian` |
+| `hessian` | `ForwardDiffADHessian` | `ReverseDiffADHessian` | `ZygoteADHessian` | -- | `SparseADHessian` |
+| `Jprod` | `ForwardDiffADJprod`/`GenericForwardDiffADJprod` | `ReverseDiffADJprod`/`GenericReverseDiffADJprod` | `ZygoteADJprod` | -- | -- |
+| `Jtprod` | `ForwardDiffADJtprod`/`GenericForwardDiffADJtprod` | `ReverseDiffADJtprod`/`GenericReverseDiffADJtprod` | `ZygoteADJtprod` | -- | -- |
+| `Hvprod` | `ForwardDiffADHvprod`/`GenericForwardDiffADHvprod` | `ReverseDiffADHvprod`/`GenericReverseDiffADHvprod` | -- | -- | -- |
 | `directional_second_derivative` | `ForwardDiffADGHjvprod` | -- | -- | -- | -- |
 
 The functions `hess_structure!`, `hess_coord!`, `jac_structure!` and `jac_coord!` defined in `ad.jl` are generic to all the backends for now.
@@ -49,7 +49,7 @@ Thanks to the backends inside `ADNLPModels.jl`, it is easy to change the backend
 
 ```@example adnlp
 nlp = ADNLPModel(f, x0, gradient_backend = ADNLPModels.ReverseDiffADGradient)
-grad(nlp, nlp.meta.x0) # returns the gradient at x0 using `ReverseDiff`
+grad(nlp, nlp.meta.x0) # returns the gradient at x0 using `ReverseDiff`
 ```
 
 It is also possible to try some new implementation for each function. First, we define a new `ADBackend` structure.
@@ -81,7 +81,7 @@ Finally, we use the homemade backend to compute the gradient.
 
 ```@example adnlp
 nlp = ADNLPModel(sum, ones(3), gradient_backend = NewADGradient)
-grad(nlp, nlp.meta.x0) # returns the gradient at x0 using `NewADGradient`
+grad(nlp, nlp.meta.x0) # returns the gradient at x0 using `NewADGradient`
 ```
 
 ### Change backend
@@ -104,7 +104,7 @@ set_adbackend!(nlp, adback)
 get_adbackend(nlp)
 ```
 
-The alternative is to use ``set_adbackend!` and pass the new backends via `kwargs`. In the second approach, it is possible to pass either the type of the desired backend or an instance as shown below.
+The alternative is to use `set_adbackend!` and pass the new backends via `kwargs`. In the second approach, it is possible to pass either the type of the desired backend or an instance as shown below.
 
 ```@example adnlp2
 set_adbackend!(
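Putting the fragments of these hunks together, selecting a backend at construction time works as in the sketch below. It mirrors the doc's own `@example adnlp` block; `f` and `x0` are sample data, not taken from the diff:

```julia
using ADNLPModels, NLPModels, ReverseDiff

f(x) = (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2  # sample objective (Rosenbrock)
x0 = [-1.2, 1.0]

# Select the gradient backend explicitly; the other backends keep their defaults.
nlp = ADNLPModel(f, x0, gradient_backend = ADNLPModels.ReverseDiffADGradient)
grad(nlp, nlp.meta.x0)  # gradient at x0, evaluated through ReverseDiff
```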

docs/src/performance.md

Lines changed: 1 addition & 1 deletion

@@ -84,7 +84,7 @@ v = ones(2)
 It is tempting to define the most generic and efficient `ADNLPModel` from the start.
 
 ```@example ex2
-using ADNLPModels, NLPModels, Symbolics
+using ADNLPModels, NLPModels
 f(x) = (x[1] - x[2])^2
 x0 = ones(2)
 lcon = ucon = ones(1)
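The hunk truncates the constrained example that follows `lcon = ucon = ones(1)`. A hedged completion is sketched below; the constraint `c!` is hypothetical, chosen only to match the one-element bounds, and is not taken from the commit:

```julia
using ADNLPModels, NLPModels

f(x) = (x[1] - x[2])^2
x0 = ones(2)
lcon = ucon = ones(1)
c!(cx, x) = (cx[1] = x[1] + x[2]; cx)  # hypothetical constraint of matching dimension

# In-place constructor for a constrained model: lcon ≤ c(x) ≤ ucon.
nlp = ADNLPModel!(f, x0, c!, lcon, ucon)
cons(nlp, x0)  # evaluate the constraint at x0
```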

docs/src/predefined.md

Lines changed: 5 additions & 21 deletions

@@ -49,30 +49,12 @@ get_adbackend(nlp)
 
 ## Hessian and Jacobian computations
 
-It is to be noted that by default the Jacobian and Hessian matrices are dense.
+It is to be noted that by default the Jacobian and Hessian matrices are sparse.
 
 ```@example ex1
-(get_nnzj(nlp), get_nnzh(nlp)) # number of nonzeros elements in the Jacobian and Hessian
+(get_nnzj(nlp), get_nnzh(nlp)) # number of nonzero elements in the Jacobian and Hessian
 ```
 
-To enable sparse computations of these entries, one needs to first load the package [`Symbolics.jl`](https://github.com/JuliaSymbolics/Symbolics.jl)
-
-```@example ex1
-using Symbolics
-```
-
-and now
-
-```@example ex1
-ADNLPModels.predefined_backend[:optimized][:jacobian_backend]
-```
-
-```@example ex1
-ADNLPModels.predefined_backend[:optimized][:hessian_backend]
-```
-
-Choosing another optimization problem with the optimized backend will compute sparse Jacobian and Hessian matrices.
-
 ```@example ex1
 f(x) = (x[1] - 1)^2
 T = Float64
@@ -92,4 +74,6 @@ x = rand(T, 2)
 jac(nlp, x)
 ```
 
-The package [`Symbolics.jl`](https://github.com/JuliaSymbolics/Symbolics.jl) is used to compute the sparsity pattern of the sparse matrix. The evaluation of the number of directional derivatives needed to evaluate the matrix is done by [`ColPack.jl`](https://github.com/michel2323/ColPack.jl).
+The package [`SparseConnectivityTracer.jl`](https://github.com/adrhill/SparseConnectivityTracer.jl) is used to compute the sparsity pattern of Jacobians and Hessians.
+The evaluation of the number of directional derivatives and the seeds needed to evaluate the compressed Jacobians and Hessians is done by [`ColPack.jl`](https://github.com/exanauts/ColPack.jl).
+We acknowledge Guillaume Dalle (@gdalle), Adrian Hill (@adrhill), and Michel Schanen (@michel2323) for the development of these packages.
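The sparsity-detection step introduced here can be exercised directly through SparseConnectivityTracer's `ADTypes` interface. A minimal sketch, outside of ADNLPModels; the functions `f` and `c` are sample problems, not taken from the commit:

```julia
using ADTypes, SparseConnectivityTracer

f(x) = sum(abs2, diff(x))        # objective with a tridiagonal Hessian pattern
c(x) = [x[1] * x[2], x[3]^2]     # sample constraint residual

detector = TracerSparsityDetector()
x = rand(4)

J = ADTypes.jacobian_sparsity(c, x, detector)  # 2×4 sparse Boolean pattern
H = ADTypes.hessian_sparsity(f, x, detector)   # 4×4 sparse Boolean pattern
```

ColPack.jl then colors these Boolean patterns to decide how few compressed directional derivatives suffice to recover the full matrices.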

docs/src/reference.md

Lines changed: 6 additions & 6 deletions (whitespace-only changes; the rendered page is unchanged)

@@ -1,17 +1,17 @@
 # Reference
-
+
 ## Contents
-
+
 ```@contents
 Pages = ["reference.md"]
 ```
-
+
 ## Index
-
+
 ```@index
 Pages = ["reference.md"]
 ```
-
+
 ```@autodocs
 Modules = [ADNLPModels]
-```
+```

src/ADNLPModels.jl

Lines changed: 5 additions & 42 deletions

@@ -2,8 +2,11 @@ module ADNLPModels
 
 # stdlib
 using LinearAlgebra, SparseArrays
+
 # external
-using ColPack, ForwardDiff, ReverseDiff
+using ADTypes: ADTypes, AbstractSparsityDetector
+using SparseConnectivityTracer, ColPack, ForwardDiff, ReverseDiff
+
 # JSO
 using NLPModels
 using Requires
@@ -16,39 +19,13 @@ const ADModel{T, S} = Union{AbstractADNLPModel{T, S}, AbstractADNLSModel{T, S}}
 include("ad.jl")
 include("ad_api.jl")
 
-"""
-    compute_jacobian_sparsity(c!, cx, x0)
-
-Return a sparse matrix.
-"""
-function compute_jacobian_sparsity(args...)
-  throw(
-    ArgumentError(
-      "Please load Symbolics.jl to enable sparse Jacobian or implement `compute_jacobian_sparsity`.",
-    ),
-  )
-end
-
-"""
-    compute_hessian_sparsity(f, nvar, c!, ncon)
-
-Return a sparse matrix.
-"""
-function compute_hessian_sparsity(args...)
-  throw(
-    ArgumentError(
-      "Please load Symbolics.jl to enable sparse Hessian or implement `compute_hessian_sparsity`.",
-    ),
-  )
-end
-
+include("sparsity_pattern.jl")
 include("sparse_jacobian.jl")
 include("sparse_hessian.jl")
 
 include("forward.jl")
 include("reverse.jl")
 include("enzyme.jl")
-include("sparse_diff_tools.jl")
 include("zygote.jl")
 include("predefined_backend.jl")
 include("nlp.jl")
@@ -181,20 +158,6 @@ function ADNLSModel!(model::AbstractNLSModel; kwargs...)
 end
 end
 
-@init begin
-  @require Symbolics = "0c5d862f-8b57-4792-8d23-62f2024744c7" begin
-    include("sparse_sym.jl")
-
-    predefined_backend[:default][:jacobian_backend] = SparseADJacobian
-    predefined_backend[:default][:jacobian_residual_backend] = SparseADJacobian
-    predefined_backend[:optimized][:jacobian_backend] = SparseADJacobian
-    predefined_backend[:optimized][:jacobian_residual_backend] = SparseADJacobian
-
-    predefined_backend[:default][:hessian_backend] = SparseADHessian
-    predefined_backend[:optimized][:hessian_backend] = SparseReverseADHessian
-  end
-end
-
 export get_adbackend, set_adbackend!
 
 """

src/predefined_backend.jl

Lines changed: 6 additions & 6 deletions

@@ -3,13 +3,13 @@ default_backend = Dict(
   :hprod_backend => ForwardDiffADHvprod,
   :jprod_backend => ForwardDiffADJprod,
   :jtprod_backend => ForwardDiffADJtprod,
-  :jacobian_backend => ForwardDiffADJacobian,
-  :hessian_backend => ForwardDiffADHessian,
+  :jacobian_backend => SparseADJacobian, # ForwardDiffADJacobian
+  :hessian_backend => SparseADHessian, # ForwardDiffADHessian
   :ghjvprod_backend => ForwardDiffADGHjvprod,
   :hprod_residual_backend => ForwardDiffADHvprod,
   :jprod_residual_backend => ForwardDiffADJprod,
   :jtprod_residual_backend => ForwardDiffADJtprod,
-  :jacobian_residual_backend => ForwardDiffADJacobian,
+  :jacobian_residual_backend => SparseADJacobian, # ForwardDiffADJacobian,
   :hessian_residual_backend => ForwardDiffADHessian,
 )
 
@@ -18,13 +18,13 @@ optimized = Dict(
   :hprod_backend => ReverseDiffADHvprod,
   :jprod_backend => ForwardDiffADJprod,
   :jtprod_backend => ReverseDiffADJtprod,
-  :jacobian_backend => ForwardDiffADJacobian,
-  :hessian_backend => ForwardDiffADHessian,
+  :jacobian_backend => SparseADJacobian, # ForwardDiffADJacobian
+  :hessian_backend => SparseReverseADHessian, # ForwardDiffADHessian,
   :ghjvprod_backend => ForwardDiffADGHjvprod,
   :hprod_residual_backend => ReverseDiffADHvprod,
   :jprod_residual_backend => ForwardDiffADJprod,
   :jtprod_residual_backend => ReverseDiffADJtprod,
-  :jacobian_residual_backend => ForwardDiffADJacobian,
+  :jacobian_residual_backend => SparseADJacobian, # ForwardDiffADJacobian
   :hessian_residual_backend => ForwardDiffADHessian,
 )
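Both dictionaries now default to sparse matrix backends without requiring any optional package load. A hedged usage sketch, assuming the package's `backend` keyword selects between these predefined dictionaries:

```julia
using ADNLPModels, NLPModels

f(x) = (x[1] - 1)^2 + (x[2] - x[1]^2)^2  # sample objective
nlp = ADNLPModel(f, ones(2), backend = :optimized)

# With a sparse Hessian backend, only structural nonzeros are stored and counted.
get_nnzh(nlp)
```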
