# Using different layers

A great deal of effort in the ESN field is devoted to finding an ideal construction
for the reservoir matrices. ReservoirComputing.jl offers multiple implementations of
reservoir and input matrix initializations found in the literature. The API is
standardized, and follows [WeightInitializers.jl](https://github.com/LuxDL/Lux.jl/tree/main/lib/WeightInitializers):
Custom layers only need to follow these APIs to be compatible with ReservoirComputing.jl.

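As a minimal sketch of that API, a custom initializer is simply a function taking a random number generator and the matrix dimensions and returning an array. The function name and keyword below are hypothetical, not part of the package:

```julia
using Random

# Hypothetical custom input layer (`my_input_layer` is our own name, not a
# package export). Following the WeightInitializers.jl convention, an
# initializer is a callable (rng, dims...; kwargs...) -> AbstractArray.
function my_input_layer(rng::Random.AbstractRNG, dims::Integer...; scaling = 0.1)
    # weights drawn uniformly from [-scaling, scaling]
    return scaling .* (2 .* rand(rng, dims...) .- 1)
end

W_in = my_input_layer(Random.default_rng(), 300, 2; scaling = 0.1)  # 300×2 matrix
```

Any function with this shape can then be passed wherever the package expects an input layer or reservoir initializer.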
## Example of minimally complex ESN

Using [^rodan2012] and [^rodan2010] as references, this section provides an
example of how to change both the input layer and the reservoir for ESNs.

The task for this example will be the one-step-ahead prediction of the Henon map.
To obtain the data, one can leverage the package
[PredefinedDynamicalSystems.jl](https://juliadynamics.github.io/PredefinedDynamicalSystems.jl/dev/).
The data is scaled to be between -1 and 1.
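A sketch of the data preparation, assuming the `henon` constructor from PredefinedDynamicalSystems.jl and `trajectory` from the DynamicalSystems.jl ecosystem (return types may vary between versions, so check the package docs):

```julia
using PredefinedDynamicalSystems, DynamicalSystemsBase

ds = PredefinedDynamicalSystems.henon()
traj, _ = trajectory(ds, 3000)   # iterate the Henon map for 3000 steps
data = Matrix(traj)'             # variables as rows, time as columns

# rescale each variable to lie in [-1, 1]
lo = minimum(data; dims = 2)
hi = maximum(data; dims = 2)
data = @. 2 * (data - lo) / (hi - lo) - 1

# one-step-ahead prediction: targets are the inputs shifted by one sample
input_data  = data[:, 1:(end - 1)]
target_data = data[:, 2:end]
```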
Now it is possible to define the input layers and reservoirs we want to compare, and to
run the comparison in a simple for loop. The accuracy will be tested using the mean
squared deviation `msd` from StatsBase.
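The loop can be sketched as follows. The initializer names below (`minimal_init`, `simple_cycle`, `cycle_jumps`) follow recent ReservoirComputing.jl releases, and the exact `ESN` constructor keywords may differ between versions, so consult the API reference:

```julia
using ReservoirComputing, StatsBase

# minimum-complexity input weights [rodan2010], paired with the
# simple cycle reservoir [rodan2010] and cycle reservoir with jumps [rodan2012]
input_layers = [minimal_init, minimal_init]
reservoirs   = [simple_cycle, cycle_jumps]

for (input_layer, reservoir) in zip(input_layers, reservoirs)
    esn = ESN(input_data;
        input_layer = input_layer,
        reservoir = reservoir)
    output_layer = train(esn, target_data)            # ridge regression readout
    output = esn(Predictive(input_data), output_layer)
    println(msd(target_data, output))                 # mean squared deviation
end
```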
As shown, changing layers in ESN models is straightforward.
Be sure to check the API documentation for a full list of reservoirs and input layers.
## Bibliography

[^rodan2012]: Rodan, Ali, and Peter Tiňo. "Simple deterministically constructed cycle reservoirs with regular jumps." Neural Computation 24.7 (2012): 1822-1852.

[^rodan2010]: Rodan, Ali, and Peter Tiňo. "Minimum complexity echo state network." IEEE Transactions on Neural Networks 22.1 (2010): 131-144.