@@ -111,7 +111,7 @@ point predictions with `predict(model, Point(), Xnew)`.
 
 # Warm restart options
 
-    update(model, newdata, :epochs=>n, other_replacements...; verbosity=1)
+    update(model, newdata, :epochs=>n, other_replacements...)
 
 If `Δepochs = n - perceptron.epochs` is non-negative, then return an updated model, with
 the weights and bias of the previously learned perceptron used as the starting state in
@@ -120,7 +120,7 @@ instead of the previous training data. Any other hyperparameter `replacements`
 adopted. If `Δepochs` is negative or not specified, instead return `fit(learner,
 newdata)`, where `learner=LearnAPI.clone(learner; epochs=n, replacements...)`.
 
-    update_observations(model, newdata, replacements...; verbosity=1)
+    update_observations(model, newdata, replacements...)
 
 Return an updated model, with the weights and bias of the previously learned perceptron
 used as the starting state in new gradient descent updates. Adopt any specified
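
Read together, the two docstring signatures above describe a warm-restart workflow. Below is a minimal sketch of that workflow, assuming a learner constructed with the `epochs` hyperparameter named in the docstring and training data of the form `(X, y)`; the constructor call, the data layout, and the names `X`, `y`, `Xnew`, `ynew` are illustrative assumptions, not taken from the diff.

    learner = PerceptronClassifier(epochs=50)        # hypothetical constructor call
    model = fit(learner, (X, y))                     # initial training: 50 epochs

    # Δepochs = 80 - 50 ≥ 0, so training resumes from the learned weights and bias:
    model = update(model, (X, y), :epochs => 80)

    # Δepochs < 0, so this is equivalent to fit(LearnAPI.clone(learner; epochs=20), (X, y)):
    model = update(model, (X, y), :epochs => 20)

    # Warm restart on additional observations, keeping all hyperparameters:
    model = update_observations(model, (Xnew, ynew))
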
@@ -197,7 +197,7 @@ LearnAPI.learner(model::PerceptronClassifierFitted) = model.learner
 function LearnAPI.fit(
     learner::PerceptronClassifier,
     observations::PerceptronClassifierObs;
-    verbosity=1,
+    verbosity=LearnAPI.default_verbosity(),
     )
 
     # unpack hyperparameters:
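
The only change in this hunk is the `verbosity` default. A minimal caller-side sketch of the intended effect, assuming the standard LearnAPI calling convention and a placeholder data tuple `(X, y)`:

    model = fit(learner, (X, y))                # logs at the level LearnAPI.default_verbosity() returns
    silent = fit(learner, (X, y); verbosity=0)  # an explicit keyword still overrides the default
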
@@ -233,7 +233,7 @@ function LearnAPI.update_observations(
     model::PerceptronClassifierFitted,
     observations_new::PerceptronClassifierObs,
     replacements...;
-    verbosity=1,
+    verbosity=LearnAPI.default_verbosity(),
     )
 
     # unpack data:
@@ -265,7 +265,7 @@ function LearnAPI.update(
     model::PerceptronClassifierFitted,
     observations::PerceptronClassifierObs,
     replacements...;
-    verbosity=1,
+    verbosity=LearnAPI.default_verbosity(),
     )
 
     # unpack data:
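
With these last two hunks, `update` and `update_observations` defer to the same package-wide default as `fit`. As a sketch only: if LearnAPI also provides a one-argument setter `LearnAPI.default_verbosity(level)` (an assumption; only the zero-argument getter appears in this diff), the verbosity of a whole warm-restart session can be chosen once, with no per-call keyword:

    LearnAPI.default_verbosity(0)                     # hypothetical setter; see caveat above
    model = fit(learner, (X, y))                      # silent
    model = update_observations(model, (Xnew, ynew))  # silent
    model = update(model, (X, y), :epochs => 100)     # likewise
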