Commit d0a992c

docs: fix docstring citations

1 parent bcecae7

File tree

5 files changed: +109 −77 lines

README.md

Lines changed: 37 additions & 19 deletions

@@ -27,6 +27,38 @@ Use the
 [in-development documentation](https://docs.sciml.ai/ReservoirComputing/dev/)
 to take a look at not yet released features.
 
+## Citing
+
+If you use this library in your work, please cite:
+
+```bibtex
+@article{martinuzzi2022reservoircomputing,
+  author  = {Francesco Martinuzzi and Chris Rackauckas and Anas Abdelrehim and Miguel D. Mahecha and Karin Mora},
+  title   = {ReservoirComputing.jl: An Efficient and Modular Library for Reservoir Computing Models},
+  journal = {Journal of Machine Learning Research},
+  year    = {2022},
+  volume  = {23},
+  number  = {288},
+  pages   = {1--8},
+  url     = {http://jmlr.org/papers/v23/22-0611.html}
+}
+```
+
+## Installation
+
+ReservoirComputing.jl can be installed using either of
+
+```julia_repl
+julia> ] #actually press the closing square brackets
+pkg> add ReservoirComputing
+```
+or
+
+```julia
+using Pkg
+Pkg.add("ReservoirComputing")
+```
+
 ## Quick Example
 
 To illustrate the workflow of this library we will showcase

@@ -36,7 +68,9 @@ For the `Generative` prediction we need the target data
 to be one step ahead of the training data:
 
 ```julia
-using ReservoirComputing, OrdinaryDiffEq
+using ReservoirComputing, OrdinaryDiffEq, Random
+Random.seed!(42)
+rng = MersenneTwister(17)
 
 #lorenz system parameters
 u0 = [1.0, 0.0, 0.0]

@@ -74,7 +108,8 @@ res_size = 300
 esn = ESN(input_data, input_size, res_size;
     reservoir=rand_sparse(; radius=1.2, sparsity=6 / res_size),
     input_layer=weighted_init,
-    nla_type=NLAT2())
+    nla_type=NLAT2(),
+    rng=rng)
 ```
 
 The echo state network can now be trained and tested.

@@ -110,23 +145,6 @@ plot!(transpose(test)[:, 1], transpose(test)[:, 2], transpose(test)[:, 3]; label
 
 ![lorenz_attractor](https://user-images.githubusercontent.com/10376688/81470281-5a34b580-91ea-11ea-9eea-d2b266da19f4.png)
 
-## Citing
-
-If you use this library in your work, please cite:
-
-```bibtex
-@article{JMLR:v23:22-0611,
-  author  = {Francesco Martinuzzi and Chris Rackauckas and Anas Abdelrehim and Miguel D. Mahecha and Karin Mora},
-  title   = {ReservoirComputing.jl: An Efficient and Modular Library for Reservoir Computing Models},
-  journal = {Journal of Machine Learning Research},
-  year    = {2022},
-  volume  = {23},
-  number  = {288},
-  pages   = {1--8},
-  url     = {http://jmlr.org/papers/v23/22-0611.html}
-}
-```
-
 ## Acknowledgements
 
 This project was possible thanks to initial funding through

docs/Project.toml

Lines changed: 1 addition & 0 deletions

@@ -2,6 +2,7 @@
 CellularAutomata = "878138dc-5b27-11ea-1a71-cb95d38d6b29"
 DifferentialEquations = "0c46a032-eb83-5123-abaf-570d42b7fbaa"
 Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
+DocumenterCitations = "daee34ce-89f3-4625-b898-19384cb65244"
 LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
 OrdinaryDiffEq = "1dea7af3-3e70-54e6-95c3-0bf5283fa5ed"
 Plots = "91a5bcdd-55d7-5caf-9e0b-520d859cae80"

docs/make.jl

Lines changed: 6 additions & 1 deletion

@@ -1,4 +1,4 @@
-using Documenter, ReservoirComputing
+using Documenter, DocumenterCitations, ReservoirComputing
 
 cp("./docs/Manifest.toml", "./docs/src/assets/Manifest.toml"; force = true)
 cp("./docs/Project.toml", "./docs/src/assets/Project.toml"; force = true)

@@ -8,6 +8,11 @@ ENV["GKSwstype"] = "100"
 include("pages.jl")
 mathengine = Documenter.MathJax()
 
+bib = CitationBibliography(
+    joinpath(@__DIR__, "refs.bib");
+    style = :authoryear
+)
+
 makedocs(; modules = [ReservoirComputing],
     sitename = "ReservoirComputing.jl",
     clean = true, doctest = false, linkcheck = true,
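The hunk above constructs the `CitationBibliography` object, but the visible lines stop before it is handed to `makedocs`. With DocumenterCitations v1 and Documenter v1 the bibliography is registered through the `plugins` keyword; a minimal sketch of how that wiring typically looks (the page list here is a placeholder, not from this commit):

```julia
using Documenter, DocumenterCitations, ReservoirComputing

# Load the BibTeX database added in docs/refs.bib; style = :authoryear
# renders citations as "Rodan and Tino (2011)" rather than numbered brackets.
bib = CitationBibliography(
    joinpath(@__DIR__, "refs.bib");
    style = :authoryear
)

# In DocumenterCitations >= 1.0 the bibliography is passed as a plugin.
makedocs(; modules = [ReservoirComputing],
    sitename = "ReservoirComputing.jl",
    plugins = [bib],
    pages = ["Home" => "index.md"])  # placeholder page list
```

The `:authoryear` choice matches the `[Rodan2011](@cite)` links introduced in the docstring hunks below, which resolve against the keys in `refs.bib`.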

docs/refs.bib

Lines changed: 42 additions & 0 deletions

@@ -0,0 +1,42 @@
+@article{Lu2017,
+  title = {Reservoir observers: Model-free inference of unmeasured variables in chaotic systems},
+  volume = {27},
+  ISSN = {1089-7682},
+  url = {http://dx.doi.org/10.1063/1.4979665},
+  DOI = {10.1063/1.4979665},
+  number = {4},
+  journal = {Chaos: An Interdisciplinary Journal of Nonlinear Science},
+  publisher = {AIP Publishing},
+  author = {Lu, Zhixin and Pathak, Jaideep and Hunt, Brian and Girvan, Michelle and Brockett, Roger and Ott, Edward},
+  year = {2017},
+  month = apr
+}
+
+@article{Pathak2018,
+  title = {Hybrid forecasting of chaotic processes: Using machine learning in conjunction with a knowledge-based model},
+  volume = {28},
+  ISSN = {1089-7682},
+  url = {http://dx.doi.org/10.1063/1.5028373},
+  DOI = {10.1063/1.5028373},
+  number = {4},
+  journal = {Chaos: An Interdisciplinary Journal of Nonlinear Science},
+  publisher = {AIP Publishing},
+  author = {Pathak, Jaideep and Wikner, Alexander and Fussell, Rebeckah and Chandra, Sarthak and Hunt, Brian R. and Girvan, Michelle and Ott, Edward},
+  year = {2018},
+  month = apr
+}
+
+@article{Rodan2011,
+  title = {Minimum Complexity Echo State Network},
+  volume = {22},
+  ISSN = {1941-0093},
+  url = {http://dx.doi.org/10.1109/TNN.2010.2089641},
+  DOI = {10.1109/tnn.2010.2089641},
+  number = {1},
+  journal = {IEEE Transactions on Neural Networks},
+  publisher = {Institute of Electrical and Electronics Engineers (IEEE)},
+  author = {Rodan, A and Tino, P},
+  year = {2011},
+  month = jan,
+  pages = {131–144}
+}
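Besides the `@cite` links in docstrings, DocumenterCitations also needs a `@bibliography` block somewhere in the docs source so the cited entries are actually rendered. A minimal references page might look like this (a hypothetical `references.md`; no such page is shown in this commit):

````markdown
# References

```@bibliography
```
````

With no arguments, the `@bibliography` block lists every entry cited elsewhere in the documentation, formatted according to the chosen citation style.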

src/esn/esn_inits.jl

Lines changed: 23 additions & 57 deletions

@@ -49,7 +49,8 @@ end
 
 Create and return a matrix representing a weighted input layer.
 This initializer generates a weighted input matrix with random non-zero
-elements distributed uniformly within the range [-`scaling`, `scaling`] [^lu2017].
+elements distributed uniformly within the range
+[-`scaling`, `scaling`] [Lu2017](@cite).
 
 # Arguments
 

@@ -78,11 +79,6 @@ julia> res_input = weighted_init(8, 3)
  0.0  0.0   0.0577838
  0.0  0.0  -0.0562827
 ```
-
-[^lu2017]: Lu, Zhixin, et al.
-    "Reservoir observers: Model-free inference of unmeasured variables in
-    chaotic systems."
-    Chaos: An Interdisciplinary Journal of Nonlinear Science 27.4 (2017): 041102.
 """
 function weighted_init(rng::AbstractRNG, ::Type{T}, dims::Integer...;
         scaling::Number = T(0.1), return_sparse::Bool = false) where {T <: Number}

@@ -109,7 +105,7 @@ end
 Create and return a minimal weighted input layer matrix.
 This initializer generates a weighted input matrix with equal, deterministic
 elements in the same construction as [`weighted_minimal]`(@ref),
-inspired by [^lu2017].
+inspired by [Lu2017](@cite).
 
 Please note that this initializer computes its own reservoir size! If
 the computed reservoir size is different than the provided one it will raise a

@@ -188,11 +184,6 @@ julia> res_input = weighted_minimal(9, 3; sampling_type = :bernoulli_sample!)
 -0.0  -0.0  0.1
  0.0  -0.0  0.1
 ```
-
-[^lu2017]: Lu, Zhixin, et al.
-    "Reservoir observers: Model-free inference of unmeasured variables in
-    chaotic systems."
-    Chaos: An Interdisciplinary Journal of Nonlinear Science 27.4 (2017): 041102.
 """
 function weighted_minimal(rng::AbstractRNG, ::Type{T}, dims::Integer...;
         weight::Number = T(0.1), return_sparse::Bool = false,

@@ -216,7 +207,8 @@ end
     informed_init([rng], [T], dims...;
         scaling=0.1, model_in_size, gamma=0.5)
 
-Create an input layer for informed echo state networks [^pathak2018].
+Create an input layer for informed echo state
+networks [Pathak2018](@cite).
 
 # Arguments
 

@@ -234,10 +226,6 @@ Create an input layer for informed echo state networks [^pathak2018].
 - `gamma`: The gamma value. Default is 0.5.
 
 # Examples
-
-[^pathak2018]: Pathak, Jaideep, et al. "Hybrid forecasting of chaotic processes:
-    Using machine learning in conjunction with a knowledge-based model."
-    Chaos: An Interdisciplinary Journal of Nonlinear Science 28.4 (2018).
 """
 function informed_init(rng::AbstractRNG, ::Type{T}, dims::Integer...;
         scaling::Number = T(0.1), model_in_size::Integer,

@@ -281,8 +269,9 @@ end
         sampling_type=:bernoulli_sample!, weight=0.1, irrational=pi,
         start=1, p=0.5)
 
-Create a layer matrix with uniform weights determined by `weight` [^rodan2010].
-The sign difference is randomly determined by the `sampling` chosen.
+Create a layer matrix with uniform weights determined by
+`weight` [Rodan2011](@cite). The sign difference is randomly
+determined by the `sampling` chosen.
 
 # Arguments
 

@@ -358,10 +347,6 @@ julia> res_input = minimal_init(8, 3; p = 0.8)# higher p -> more positive signs
 -0.1  0.1  0.1
  0.1  0.1  0.1
 ```
-
-[^rodan2010]: Rodan, Ali, and Peter Tino.
-    "Minimum complexity echo state network."
-    IEEE transactions on neural networks 22.1 (2010): 131-144.
 """
 function minimal_init(rng::AbstractRNG, ::Type{T}, dims::Integer...;
         weight::Number = T(0.1), sampling_type::Symbol = :bernoulli_sample!,

@@ -1034,7 +1019,7 @@ end
         weight=0.1, return_sparse=false,
         kwargs...)
 
-Create and return a delay line reservoir matrix [^rodan2010].
+Create and return a delay line reservoir matrix [Rodan2011](@cite).
 
 # Arguments
 

@@ -1089,10 +1074,6 @@ julia> res_matrix = delay_line(5, 5; weight = 1)
 0.0  0.0  1.0  0.0  0.0
 0.0  0.0  0.0  1.0  0.0
 ```
-
-[^rodan2010]: Rodan, Ali, and Peter Tino.
-    "Minimum complexity echo state network."
-    IEEE transactions on neural networks 22.1 (2010): 131-144.
 """
 function delay_line(rng::AbstractRNG, ::Type{T}, dims::Integer...;
         weight::Union{Number, AbstractVector} = T(0.1), shift::Integer = 1,

@@ -1106,11 +1087,11 @@ end
 
 """
     delay_line_backward([rng], [T], dims...;
-        weight=0.1, fb_weight=0.2, return_sparse=false,
+        weight=0.1, fb_weight=0.1, return_sparse=false,
         delay_kwargs=(), fb_kwargs=())
 
 Create a delay line backward reservoir with the specified by `dims` and weights.
-Creates a matrix with backward connections as described in [^rodan2010].
+Creates a matrix with backward connections as described in [Rodan2011](@cite).
 
 # Arguments
 

@@ -1134,7 +1115,7 @@ Creates a matrix with backward connections as described in [^rodan2010].
   This can be provided as a single value or an array. In case it is provided as an
   array please make sure that the lenght of the array matches the lenght of the sub-diagonal
   you want to populate.
-  Default is 0.2.
+  Default is 0.1.
 - `fb_shift`: How far the backward connection will be from the diagonal.
   Default is 2.
 - `return_sparse`: flag for returning a `sparse` matrix.

@@ -1163,24 +1144,20 @@ Creates a matrix with backward connections as described in [^rodan2010].
 ```jldoctest
 julia> res_matrix = delay_line_backward(5, 5)
 5×5 Matrix{Float32}:
- 0.0  0.2  0.0  0.0  0.0
- 0.1  0.0  0.2  0.0  0.0
- 0.0  0.1  0.0  0.2  0.0
- 0.0  0.0  0.1  0.0  0.2
+ 0.0  0.1  0.0  0.0  0.0
+ 0.1  0.0  0.1  0.0  0.0
+ 0.0  0.1  0.0  0.1  0.0
+ 0.0  0.0  0.1  0.0  0.1
  0.0  0.0  0.0  0.1  0.0
 
 julia> res_matrix = delay_line_backward(Float16, 5, 5)
 5×5 Matrix{Float16}:
- 0.0  0.2  0.0  0.0  0.0
- 0.1  0.0  0.2  0.0  0.0
- 0.0  0.1  0.0  0.2  0.0
- 0.0  0.0  0.1  0.0  0.2
+ 0.0  0.1  0.0  0.0  0.0
+ 0.1  0.0  0.1  0.0  0.0
+ 0.0  0.1  0.0  0.1  0.0
+ 0.0  0.0  0.1  0.0  0.1
  0.0  0.0  0.0  0.1  0.0
 ```
-
-[^rodan2010]: Rodan, Ali, and Peter Tino.
-    "Minimum complexity echo state network."
-    IEEE transactions on neural networks 22.1 (2010): 131-144.
 """
 function delay_line_backward(rng::AbstractRNG, ::Type{T}, dims::Integer...;
         weight::Union{Number, AbstractVector} = T(0.1),

@@ -1201,7 +1178,7 @@ end
         cycle_weight=0.1, jump_weight=0.1, jump_size=3, return_sparse=false,
         cycle_kwargs=(), jump_kwargs=())
 
-Create a cycle jumps reservoir [^Rodan2012].
+Create a cycle jumps reservoir [Rodan2011](@cite).
 
 # Arguments
 

@@ -1266,10 +1243,6 @@ julia> res_matrix = cycle_jumps(5, 5; jump_size = 2)
 0.0  0.0  0.1  0.0  0.0
 0.0  0.0  0.1  0.1  0.0
 ```
-
-[^rodan2012]: Rodan, Ali, and Peter Tiňo.
-    "Simple deterministically constructed cycle reservoirs with regular jumps."
-    Neural computation 24.7 (2012): 1822-1852.
 """
 function cycle_jumps(rng::AbstractRNG, ::Type{T}, dims::Integer...;
         cycle_weight::Union{Number, AbstractVector} = T(0.1),

@@ -1291,7 +1264,7 @@ end
         weight=0.1, return_sparse=false,
         kwargs...)
 
-Create a simple cycle reservoir [^rodan2010].
+Create a simple cycle reservoir [Rodan2011](@cite).
 
 # Arguments
 

@@ -1344,10 +1317,6 @@ julia> res_matrix = simple_cycle(5, 5; weight = 11)
 0.0  0.0  11.0   0.0  0.0
 0.0  0.0   0.0  11.0  0.0
 ```
-
-[^rodan2010]: Rodan, Ali, and Peter Tino.
-    "Minimum complexity echo state network."
-    IEEE transactions on neural networks 22.1 (2010): 131-144.
 """
 function simple_cycle(rng::AbstractRNG, ::Type{T}, dims::Integer...;
         weight::Union{Number, AbstractVector} = T(0.1),

@@ -1424,7 +1393,7 @@ end
         return_sparse=false)
 
 Creates a true double cycle reservoir, ispired by [^fu2023],
-with cycles built on the definition by [^rodan2010].
+with cycles built on the definition by [Rodan2011](@cite).
 
 # Arguments
 

@@ -1476,9 +1445,6 @@ julia> true_double_cycle(5, 5; cycle_weight = 0.1, second_cycle_weight = 0.3)
 [^fu2023]: Fu, Jun, et al.
     "A double-cycle echo state network topology for time series prediction."
     Chaos: An Interdisciplinary Journal of Nonlinear Science 33.9 (2023).
-[^rodan2010]: Rodan, Ali, and Peter Tino.
-    "Minimum complexity echo state network."
-    IEEE transactions on neural networks 22.1 (2010): 131-144.
 """
 function true_double_cycle(rng::AbstractRNG, ::Type{T}, dims::Integer...;
         cycle_weight::Union{Number, AbstractVector} = T(0.1),
