    &mut TrainingOptions {
        loss_fn: &mut loss, // the type of loss function that should be used for Intricate
                            // to determine how bad the Model is

        // these two functions are quite useful for a Model that needs to work with very large
        // data that will cost a lot of RAM and computing
        from_inputs_to_vectors: &(|inputs| Ok(inputs.to_vec())), // a function to
                                                                 // preprocess the inputs
        from_expected_outputs_to_vectors: &(|outputs| Ok(outputs.to_vec())), // a function to
                                                                             // preprocess the
                                                                             // expected outputs

        verbosity: TrainingVerbosity {
            show_current_epoch: true,   // show a message for each epoch like `epoch #5`
            show_epoch_progress: false, // show a progress bar of the training steps in a

            // ...
    )
    .unwrap();
```
As you can see, creating these models is extremely easy, and blazingly fast as well.
---
- add a way to send a callback closure into the training process that would be called every time an epoch, or even a step, finishes, with some useful info
- make an example, after implementing the callback above, that uses it to plot the loss in real time using a crate like `textplots`
- add embedding layers for text, such as bag of words with an expected vocabulary size
- add a more concise way to define training options, so that so much code is not needed when the defaults are fine