# ReservoirComputing.jl
ReservoirComputing.jl provides an efficient, modular, and easy-to-use implementation of Reservoir Computing models such as Echo State Networks (ESNs). For information on using this package, please refer to the [stable documentation](https://docs.sciml.ai/ReservoirComputing/stable/). Use the [in-development documentation](https://docs.sciml.ai/ReservoirComputing/dev/) to take a look at not yet released features.
To illustrate the workflow of this library, we will show how to train an ESN to learn the dynamics of the Lorenz system. As a first step we need to gather the data. For the `Generative` prediction we need the target data to be one step ahead of the training data:
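The snippet below is a minimal sketch of this step, assuming OrdinaryDiffEq.jl and the classic Lorenz-63 parameters; the shift and split lengths are illustrative choices rather than prescribed values.

```julia
using OrdinaryDiffEq

# Classic Lorenz-63 system (sigma = 10, rho = 28, beta = 8/3 assumed here)
function lorenz!(du, u, p, t)
    du[1] = 10.0 * (u[2] - u[1])
    du[2] = u[1] * (28.0 - u[3]) - u[2]
    du[3] = u[1] * u[2] - (8 / 3) * u[3]
end

prob = ODEProblem(lorenz!, [1.0, 0.0, 0.0], (0.0, 200.0))
data = Array(solve(prob, ABM54(); dt = 0.02))

# Illustrative split: the target data is the input shifted one step ahead
shift, train_len, predict_len = 300, 5000, 1250
input_data  = data[:, shift:(shift + train_len - 1)]
target_data = data[:, (shift + 1):(shift + train_len)]
test_data   = data[:, (shift + train_len + 1):(shift + train_len + predict_len)]
```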
Now that we have the data, we can initialize the ESN with the chosen parameters. Given that this is a quick example, we are going to change as few parameters as possible. For more detailed examples and explanations of the functions, please refer to the documentation.
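As a rough sketch of this step, the call below uses hyperparameter values picked purely for illustration; the constructor keywords for the reservoir, input layer, and nonlinear algorithm have changed between package versions, so check the documentation for the signature matching your release.

```julia
using ReservoirComputing

# Illustrative hyperparameters; the exact keyword API may differ across versions.
res_size = 300
esn = ESN(input_data;
    reservoir = RandSparseReservoir(res_size; radius = 1.2, sparsity = 6 / res_size),
    input_layer = WeightedLayer(),
    nla_type = NLAT2())
```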
The echo state network can now be trained and tested. If not specified, the training will always be ordinary least squares regression. The full range of training methods is detailed in the documentation:
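A minimal sketch of training and prediction, reusing the variable names from the earlier snippets: the readout is trained with the default (ordinary least squares) method and then run in `Generative` mode for `predict_len` steps.

```julia
# Train the readout with the default method, then predict autoregressively.
output_layer = train(esn, target_data)
output = esn(Generative(predict_len), output_layer)
```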
The data is returned as a matrix, `output` in the code above, that contains the predicted trajectories. The results can now be easily plotted (for the actual script used to obtain this plot, please refer to the documentation):
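A rough plotting sketch with Plots.jl, assuming the `output` and `test_data` matrices from above; it is not the exact script used in the documentation.

```julia
using Plots

# Compare each predicted coordinate against the held-out test data.
ts = 0.02 .* (1:size(output, 2))
p1 = plot(ts, [test_data[1, :] output[1, :]]; label = ["actual" "predicted"], ylabel = "x(t)")
p2 = plot(ts, [test_data[2, :] output[2, :]]; label = ["actual" "predicted"], ylabel = "y(t)")
p3 = plot(ts, [test_data[3, :] output[3, :]]; label = ["actual" "predicted"], ylabel = "z(t)")
plot(p1, p2, p3; layout = (3, 1))
```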
One can also visualize the phase space of the attractor and the comparison with the actual one:
```julia
# 3D phase-space comparison of the predicted and actual trajectories
plot(transpose(output)[:, 1], transpose(output)[:, 2],
    transpose(output)[:, 3], label = "predicted")
plot!(transpose(test_data)[:, 1], transpose(test_data)[:, 2],
    transpose(test_data)[:, 3], label = "actual")
```
## Acknowledgements
This project was possible thanks to initial funding through the [Google Summer of Code](https://summerofcode.withgoogle.com/) 2020 program. Francesco M. further acknowledges [ScaDS.AI](https://scads.ai/) and [RSC4Earth](https://rsc4earth.de/) for supporting the current progress on the library.