
Commit 21383ab

typo
1 parent 1e9d5e5 commit 21383ab

1 file changed: +1 -1 lines changed

test/integration/gradient_descent.jl

Lines changed: 1 addition & 1 deletion
@@ -120,7 +120,7 @@ If `Δepochs = n - perceptron.epochs` is non-negative, then return an updated mo
 the weights and bias of the previously learned perceptron used as the starting state in
 new gradient descent updates for `Δepochs` epochs, and using the provided `newdata`
 instead of the previous training data. Any other hyperparaameter `replacements` are also
-adopted. In `Δepochs` is negative or not specified, instead return `fit(algorithm,
+adopted. If `Δepochs` is negative or not specified, instead return `fit(algorithm,
 newdata)`, where `algorithm=LearnAPI.clone(algorithm; epochs=n, replacements....)`.

 """
