To actually train a model we need four things:
* A collection of data points that will be provided to the objective function.
* An [optimiser](optimisers.md) that will update the model parameters appropriately.

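Concretely, these pieces might be set up as follows. This is a minimal sketch: the model, loss, and data here are made up for illustration, and the names (`parameters`, `opt`, `datapoints`) are chosen to match the loop shown next.

```julia
using Flux

# A model whose parameters we want to train (illustrative)
model = Dense(2, 1)

# The objective (loss) function, comparing predictions to targets
loss(x, y) = Flux.Losses.mse(model(x), y)

# A collection of data points: each entry holds the arguments to `loss`
datapoints = [(rand(Float32, 2, 16), rand(Float32, 1, 16)) for _ in 1:3]

# The parameters to update, and an optimiser to update them with
parameters = Flux.params(model)
opt = Descent(0.1)
```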
Training a model is typically an iterative process, where we go over the data set,
calculate the objective function over the datapoints, and optimise that.
This can be visualised in the form of a simple loop.

```julia
for d in datapoints

  # `d` should produce a collection of arguments
  # to the loss function

  # Calculate the gradients of the parameters
  # with respect to the loss function
  grads = Flux.gradient(parameters) do
    loss(d...)
  end

  # Update the parameters based on the chosen
  # optimiser (opt)
  Flux.Optimise.update!(opt, parameters, grads)
end
```

To make it easy, Flux defines `train!`:

```@docs
Flux.Optimise.train!
```
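With `train!`, the manual loop above collapses to a single call. The following is a minimal sketch, with an illustrative made-up model and data:

```julia
using Flux

model = Dense(2, 1)
loss(x, y) = Flux.Losses.mse(model(x), y)

# Each tuple in `data` is splatted into `loss`
data = [(rand(Float32, 2, 16), rand(Float32, 1, 16)) for _ in 1:3]
opt = Descent(0.1)

# One pass over `data`, updating the model's parameters
Flux.train!(loss, Flux.params(model), data, opt)
```

Calling `train!` once performs a single epoch; run it in a loop (or use a callback) to train for longer.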

There are plenty of examples in the [model zoo](https://github.com/FluxML/model-zoo), and
more information can be found on [Custom Training Loops](../models/advanced.md).

## Loss Functions