2 changes: 1 addition & 1 deletion Project.toml
@@ -1,7 +1,7 @@
name = "ReservoirComputing"
uuid = "7c2d2b1e-3dd4-11ea-355a-8f6a8116e294"
authors = ["Francesco Martinuzzi"]
version = "0.10.13"
version = "0.11.0"

[deps]
Adapt = "79e6a3ab-5dfb-504d-930d-738a2a938a0e"
2 changes: 1 addition & 1 deletion docs/Project.toml
@@ -17,5 +17,5 @@ Documenter = "1"
OrdinaryDiffEq = "6"
Plots = "1"
PredefinedDynamicalSystems = "1"
ReservoirComputing = "0.10.5"
ReservoirComputing = "0.11.0"
StatsBase = "0.34.4"
11 changes: 11 additions & 0 deletions docs/src/api/inits.md
@@ -30,3 +30,14 @@
selfloop_forward_connection
forward_connection
```

## Building functions

```@docs
scale_radius!
delay_line!
backward_connection!
simple_cycle!
self_loop!
add_jumps!
```
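
These in-place builders let you compose a custom reservoir step by step on a
preallocated matrix. A minimal sketch of the pattern (hedged: the argument
lists below are assumptions for illustration; the authoritative signatures are
in the docstrings above):

```julia
using ReservoirComputing

# Start from an empty reservoir and layer structure onto it in place.
# NOTE: positional arguments are assumed for illustration only.
reservoir_matrix = zeros(Float32, 100, 100)
delay_line!(reservoir_matrix, 0.1)    # delay-line chain of weight 0.1 (assumed signature)
add_jumps!(reservoir_matrix, 0.2, 5)  # jumps of weight 0.2 every 5 units (assumed signature)
scale_radius!(reservoir_matrix, 1.2)  # rescale to spectral radius 1.2 (assumed signature)
```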
33 changes: 24 additions & 9 deletions docs/src/esn_tutorials/change_layers.md
@@ -1,6 +1,10 @@
# Using different layers

A great deal of efforts in the ESNs field are devoted to finding an ideal construction for the reservoir matrices. ReservoirComputing.jl offers multiple implementation of reservoir and input matrices initializations found in the literature. The API is standardized, and follows by [WeightInitializers.jl](https://github.com/LuxDL/Lux.jl/tree/main/lib/WeightInitializers):
A great deal of effort in the ESN field is devoted to finding an ideal construction
for the reservoir matrices. ReservoirComputing.jl offers multiple implementations of
reservoir and input matrix initializations found in the literature.
The API is standardized, and follows
[WeightInitializers.jl](https://github.com/LuxDL/Lux.jl/tree/main/lib/WeightInitializers):

```julia
weights = init(rng, dims...)
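# For instance, with one of the package's provided initializers
# (a hedged illustration; `scaled_rand` is exported by ReservoirComputing.jl
# and the 300x2 dimensions are made up for this example):
input_matrix = scaled_rand(rng, 300, 2)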
@@ -22,9 +26,13 @@ Custom layers only need to follow these APIs to be compatible with ReservoirComp

## Example of minimally complex ESN

Using [^rodan2012] and [^rodan2010] as references this section will provide an example on how to change both the input layer and the reservoir for ESNs.
Using [^rodan2012] and [^rodan2010] as references, this section provides an
example of how to change both the input layer and the reservoir for ESNs.

The task for this example will be the one step ahead prediction of the Henon map. To obtain the data one can leverage the package [PredefinedDynamicalSystems.jl](https://juliadynamics.github.io/PredefinedDynamicalSystems.jl/dev/). The data is scaled to be between -1 and 1.
The task for this example will be the one-step-ahead prediction of the Henon map.
To obtain the data, one can leverage the package
[PredefinedDynamicalSystems.jl](https://juliadynamics.github.io/PredefinedDynamicalSystems.jl/dev/).
The data is scaled to be between -1 and 1.

```@example minesn
using PredefinedDynamicalSystems
@@ -43,14 +51,16 @@ testing_input = data[:, (shift + train_len):(shift + train_len + predict_len - 1
testing_target = data[:, (shift + train_len + 1):(shift + train_len + predict_len)]
```

Now it is possible to define the input layers and reservoirs we want to compare and run the comparison in a simple for loop. The accuracy will be tested using the mean squared deviation msd from StatsBase.
Now it is possible to define the input layers and reservoirs we want to compare and run
the comparison in a simple for loop. The accuracy will be tested using the mean squared
deviation (`msd`) from StatsBase.

```@example minesn
using ReservoirComputing, StatsBase

res_size = 300
input_layer = [minimal_init(; weight=0.85, sampling_type=:irrational),
minimal_init(; weight=0.95, sampling_type=:irrational)]
input_layer = [minimal_init(; weight=0.85, sampling_type=:irrational_sample!),
minimal_init(; weight=0.95, sampling_type=:irrational_sample!)]
reservoirs = [simple_cycle(; weight=0.7),
cycle_jumps(; cycle_weight=0.7, jump_weight=0.2, jump_size=5)]

@@ -64,9 +74,14 @@ for i in 1:length(reservoirs)
end
```
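
The loop body above is collapsed in the diff; as a hedged sketch, a comparison
of this kind could look like the following (the `ESN`, `train`, and
`Predictive` names are the package's documented API, but the exact constructor
arguments here are assumptions rather than this tutorial's code):

```julia
for i in 1:length(reservoirs)
    # Build an ESN from the i-th input layer / reservoir pair
    # (constructor arguments assumed for illustration).
    esn = ESN(training_input, 2, res_size;
        input_layer=input_layer[i], reservoir=reservoirs[i])
    # Fit the linear readout, then predict the test set one step ahead.
    output_layer = train(esn, training_target)
    output = esn(Predictive(testing_input), output_layer)
    println(msd(testing_target, output))
end
```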

As it is possible to see, changing layers in ESN models is straightforward. Be sure to check the API documentation for a full list of reservoir and layers.
As shown, changing layers in ESN models is straightforward.
Be sure to check the API documentation for a full list of reservoirs and input layers.

## Bibliography

[^rodan2012]: Rodan, Ali, and Peter Tiňo. “Simple deterministically constructed cycle reservoirs with regular jumps.” Neural computation 24.7 (2012): 1822-1852.
[^rodan2010]: Rodan, Ali, and Peter Tiňo. “Minimum complexity echo state network.” IEEE transactions on neural networks 22.1 (2010): 131-144.
[^rodan2012]: Rodan, Ali, and Peter Tiňo.
    “Simple deterministically constructed cycle reservoirs with regular jumps.”
    Neural Computation 24.7 (2012): 1822-1852.
[^rodan2010]: Rodan, Ali, and Peter Tiňo.
    “Minimum complexity echo state network.”
    IEEE Transactions on Neural Networks 22.1 (2011): 131-144.
3 changes: 3 additions & 0 deletions src/ReservoirComputing.jl
@@ -22,6 +22,7 @@ include("predict.jl")
include("train/linear_regression.jl")

#esn
include("esn/inits_components.jl")
include("esn/esn_inits.jl")
include("esn/esn_reservoir_drivers.jl")
include("esn/esn.jl")
@@ -42,6 +43,8 @@ export rand_sparse, delay_line, delay_line_backward, cycle_jumps,
simple_cycle, pseudo_svd, chaotic_init, low_connectivity, double_cycle,
selfloop_cycle, selfloop_feedback_cycle, selfloop_delayline_backward,
selfloop_forward_connection, forward_connection
export scale_radius!, delay_line!, backward_connection!, simple_cycle!, add_jumps!,
self_loop!
export RNN, MRNN, GRU, GRUParams, FullyGated, Minimal
export train
export ESN, HybridESN, KnowledgeModel, DeepESN